
Systemantics

General Systemantics (retitled to Systemantics in its second edition and The Systems Bible in its third) is a systems engineering treatise by John Gall in which he offers practical principles of systems design based on experience and anecdotes.

General Systemantics
1977 edition
Author: John Gall
Illustrator: R. O. Blechman
Language: English
Subject: Systems science
Publisher: General Systemantics Press
Publication date: 1975/78, 1986, 2002
Media type: Print

It is offered from the perspective of how not to design systems, based on system engineering failures. The primary precept of the treatise is that large complex systems are extremely difficult to design correctly despite best intentions, so care must be taken to design smaller, less-complex systems and to do so with incremental functionality based on close and continual touch with user needs and measures of effectiveness.

History

The book was initially self-published after Gall received rejection letters from 30 publishers. After several reviews in academic journals, it was picked up by Quadrangle–The New York Times Book Company, who published it in 1977.[1]

Title origin

The term systemantics is a commentary on prior work by Alfred Korzybski called general semantics, which conjectured that all system failures could be attributed to a single root cause – a failure to communicate. Gall observes that, instead, system failure is an intrinsic feature of systems. He thereby derives the term general systemantics in deference to the notion of a sweeping theory of system failure, but one that attributes failure to an intrinsic feature governed by laws of system behavior. He observes as a side note that system antics also playfully captures the notion that systems naturally "act up."

Contents

Background

Premise

  • Systems in general work poorly or not at all.[2]

This is more a universal observation than a law. The origin of this observation is traced back via:

  1. Murphy's Law that "if anything can go wrong, it will",
  2. Alfred Korzybski's general semantics notion of failure's root cause being a communication problem,
  3. Humorist Stephen Potter's One-upmanship on ways to "game" the system for personal benefit,
  4. Historian C. Northcote Parkinson's principle called Parkinson's Law – "Work expands so as to fill the time available for its completion",
  5. Educator Lawrence J. Peter's widely cited Peter Principle – "In a hierarchy every employee tends to rise to his level of incompetence ... in time every post tends to be occupied by an employee who is incompetent to carry out its duties ... Work is accomplished by those employees who have not yet reached their level of incompetence."

Scope

By systems, the author refers to those that "...involve human beings, particularly those very large systems such as national governments, nations themselves, religions, the railway system, the post office..." though the intention is that the principles are general to any system.

Additionally, the author observes:

  1. Everything is a system.
  2. Everything is part of a larger system.
  3. The universe is infinitely systematized, both upward (larger systems) and downward (smaller systems).
  4. All systems are infinitely complex.

First principles

  • New systems mean new problems.[3]

Once a system is set up to solve some problem, the system itself engenders new problems relating to its development, operations and maintenance. The author points out that the additional energy required to support the system can consume the energy it was meant to save. This leads to the next principle:

  • The total amount of anergy in the universe is fixed.

The author defines anergy as the effort required to bring about a change. This is meant as a tongue-in-cheek analog of the law of conservation of energy.

  • Systems tend to expand to fill the known universe.

One of the problems that a system creates is that it becomes an entity unto itself that not only persists but expands and encroaches on areas beyond the original system's purview.

Why systems behave poorly

  • Complicated systems produce unexpected outcomes (Generalized Uncertainty Principle).[4]

The author cites a number of spectacular unexpected behaviors including:

  1. The Aswan Dam diverts the Nile River's fertilizing sediment into Lake Nasser (where it is useless), so the dam must operate at full electrical generating capacity to power the artificial fertilizer plants needed to replace the diverted sediment.
  2. The Vehicle Assembly Building at Kennedy Space Center, designed to protect space vehicles from the weather, is so large that it produces its own weather.

Feedback

Not only do systems expand well beyond their original goals, but as they evolve they tend to oppose even their own original goals. This is seen as a systems theory analog of Le Chatelier's principle that suggests chemical and physical processes tend to counteract changed conditions that upset equilibrium until a new equilibrium is established. This same counteraction force can be seen in systems behavior. For example, incentive reward systems set up in business can have the effect of institutionalizing mediocrity.[5] This leads to the following principle:

  • Systems tend to oppose their own proper function.[6]

What's in a name

People performing roles in systems often do not perform the role suggested by the name the system gives that person, nor does the system itself perform the role that its name suggests.

  • People in systems do not actually do what the system says they are doing (Functionary's Falsity).[7]
  • The system itself does not actually do what it says it is doing (The Operational Fallacy).

Inside systems

  • The real world is what is reported to the system (The Fundamental Law of Administrative Workings [F.L.A.W.]).[8]

In other words, the system perceives reality only through biased, filtering sensory organs, so its view is severely censored and distorted; understanding of the actual real world pales and tends to disappear. This displacement creates a kind of sensory deprivation and a hallucinogenic effect on those inside the system, causing them to lose common sense. Besides negatively affecting those inside it, the system attracts people who are optimized for the pathological environment it creates. Thus,

  • Systems attract systems-people.

Elementary systems functions

  1. A complex system cannot be "made" to work. It either works or it does not.
  2. A simple system, designed from scratch, sometimes works.
  3. Some complex systems actually work.
  4. A complex system that works is invariably found to have evolved from a simple system that works.
  5. A complex system designed from scratch never works and cannot be patched up to make it work. One has to start over, beginning with a working simple system.

Advanced systems functions

  1. The Functional Indeterminacy Theorem (F.I.T.): in complex systems, malfunction and even total non-function may not be detectable for long periods, if ever.
  2. The Newtonian Law of Systems Inertia: a system that performs a certain way will continue to operate in that way regardless of the need or of changed conditions.
  3. Systems develop goals of their own the instant they come into being.
  4. Intrasystem goals come first.

System failure

  1. The Fundamental Failure-Mode Theorem (F.F.T.): complex systems usually operate in a failure mode.
  2. A complex system can fail in an infinite number of ways. (If anything can go wrong, it will; see Murphy's law.)
  3. The mode of failure of a complex system cannot ordinarily be predicted from its structure.
  4. The crucial variables are discovered by accident.
  5. The larger the system, the greater the probability of unexpected failure.
  6. "Success" or "function" in any system may be failure in the larger or smaller systems to which the system is connected.
  7. The Fail-Safe Theorem: when a fail-safe system fails, it fails by failing to fail safe.

Practical systems design

  1. The Vector Theory of Systems: systems run better when designed to run downhill.
  2. Loose systems last longer and work better. (Efficient systems are dangerous to themselves and to others.)

Management and other myths

  1. Complex systems tend to produce complex responses (not solutions) to problems.
  2. Great advances are not produced by systems designed to produce great advances.

Other laws of systemantics

  1. As systems grow in size, they tend to lose basic functions.
  2. The larger the system, the less the variety in the product.
  3. Control of a system is exercised by the element with the greatest variety of behavioral responses.
  4. Colossal systems foster colossal errors.
  5. Choose systems with care.

Reception

A 1977 review in Etc: A Review of General Semantics states that the book's aim is unclear, commenting, "As a put-down of institutional practices it works well, as good as anything in print", but "As a slam at systems theory the book is less successful, even ambiguous."[9] A Library Journal review from 1977 comments, "Like some of its predecessors, the book pretends to rebuke people for their manifold stupidities, but is, in fact, an invitation to take pleasure in them. That's not a failing, just a fact. Recommended."[10] A 2004 review in the American Society of Safety Professionals' Professional Safety says, "It is at once deadly serious with all the outrageous contrived irony of Gary Larson's 'Far Side' cartoons" and that "the book is one continuous insight after another."[11] PCMag calls the book "small but insightful".[12]

References

  1. ^ Serrin, Judith (1977-01-05). "Why Things Just Won't Work". Detroit Free Press. pp. 1C, 5C. Retrieved 2023-09-20 – via Newspapers.com.
  2. ^ Gall, John (1978). Systemantics. Pocket Books. p. 22. ISBN 9780671819101.
  3. ^ Gall, John (1978). Systemantics. Pocket Books. p. 29. ISBN 9780671819101.
  4. ^ Gall, John (1978). Systemantics. Pocket Books. p. 40. ISBN 9780671819101.
  5. ^ Pink, Daniel (2011). Drive. Penguin. ISBN 978-1594484803.
  6. ^ Gall, John (1978). Systemantics. Pocket Books. p. 48. ISBN 9780671819101.
  7. ^ Gall, John (1978). Systemantics. Pocket Books. p. 58. ISBN 9780671819101.
  8. ^ Gall, John (1978). Systemantics. Pocket Books. p. 65. ISBN 9780671819101.
  9. ^ Quinby, David L. (December 1977). "Review: General Semantics and General Systems: An Irreverent View". Etc: A Review of General Semantics. 34 (4) – via JSTOR.
  10. ^ Anderson, A. J. (1977-05-01). "Humor: Gall, John. Systemantics: How systems work and especially how they fail". Library Journal. 102 (9): 1018 – via EBSCO.
  11. ^ Metzgar, Carl R. (October 2004). "Writing Worth Reading: Review: The Systems Bible". Professional Safety. American Society of Safety Professionals. 49 (10): 20, 72 – via JSTOR.
  12. ^ "Definition of Systemantics". PCMag. Retrieved 2023-09-20.

Sources

  • Gall, John (2003). The Systems Bible: The Beginner's Guide to Systems Large and Small (3rd ed.). Walker: General Systemantics Press. ISBN 9780961825171.
  • Gall, John (1986). SYSTEMANTICS: The Underground Text of Systems Lore. How Systems Really Work and How They Fail (2nd ed.). Ann Arbor: General Systemantics Press. ISBN 9780961825102.
  • Gall, John (1978). SYSTEMANTICS: How Systems Really Work and How They Fail (1st ed.). New York: Pocket Books. ISBN 9780671819101.

External links

  • Bart Stewart's Explanation of Systemantics
  • Commentary on the principles of "Systemantics", by Anthony Judge
  • c2 wiki entry on Systemantics
