Systemantics
Systemantics (retitled The Systems Bible in its third edition) is a text by John Gall in which he proposes a number of "laws" describing how and why systems fail. The title is a play on words: it echoes semantics while suggesting that systems display antics.
It is written in the style of a serious academic work and is often mistakenly cited as such. Its content is similar in spirit to Murphy's Law and the Peter Principle, both of which are referenced in the work.
Of course, from outside the System of Defining Serious Academic Work, this volume (and others like it) may be seen as utterly serious, merely wearing a fake nose and bushy eyebrows to distract the pedantic system-encysted. One can make a case that it fulfills the spirit of Serious Academic Work: illuminating Reality. As always, the reader should make up his or her own mind.
The final paragraph of Systemantics is as follows:
"There are some who assert that General SystemANTICS is a spoof of a serious scientific subject called General System Theory. Devotees of General System Theory attribute the founding of their science to Professor Ludwig von Bertalanffy, who noted, in the early decades of this century, that scientists had overlooked the establishment of a science of Anything and Everything and who, with Teutonic thoroughness, made up the oversight." (p. 141)
Some laws of Systemantics
- The Primal Scenario or Basic Datum of Experience: Systems in general work poorly or not at all. (Complicated systems seldom exceed five percent efficiency.)
- The Fundamental Theorem: New systems generate new problems.
- The Law of Conservation of Anergy [sic]: The total amount of anergy in the universe is constant. ("Anergy" = "human energy")
- Laws of Growth: Systems tend to grow, and as they grow, they encroach.
- The Generalized Uncertainty Principle: Systems display antics. (Complicated systems produce unexpected outcomes. The total behavior of large systems cannot be predicted.)
- Le Chatelier's Principle: Complex systems tend to oppose their own proper function; as systems grow in complexity, they tend to oppose their stated function.
- Functionary's Falsity: People in systems do not actually do what the system says they are doing.
- The Operational Fallacy: The system itself does not actually do what it says it is doing.
- The Fundamental Law of Administrative Workings (F.L.A.W.): Things are what they are reported to be. The real world is what it is reported to be. (That is, the system takes as given that things are as reported, regardless of the true state of affairs.)
- Systems attract systems-people. (For every human system, there is a type of person adapted to thrive on it or in it.)
- The bigger the system, the narrower and more specialized the interface with individuals.
- A complex system cannot be "made" to work. It either works or it doesn't.
- A simple system, designed from scratch, sometimes works.
- Some complex systems actually work.
- A complex system that works is invariably found to have evolved from a simple system that works.
- A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a working simple system.
- The Functional Indeterminacy Theorem (F.I.T.): In complex systems, malfunction and even total non-function may not be detectable for long periods, if ever.
- The Newtonian Law of Systems Inertia: A system that performs in a certain way will continue to operate in that way regardless of need or of changed conditions.
- Systems develop goals of their own the instant they come into being.
- Intrasystem [sic] goals come first.
- The Fundamental Failure-Mode Theorem (F.F.T.): Complex systems usually operate in failure mode.
- A complex system can fail in an infinite number of ways. (If anything can go wrong, it will.) (See Murphy's law.)
- The mode of failure of a complex system cannot ordinarily be predicted from its structure.
- The crucial variables are discovered by accident.
- The larger the system, the greater the probability of unexpected failure.
- "Success" or "Function" in any system may be failure in the larger or smaller systems to which the system is connected.
- The Fail-Safe Theorem: When a Fail-Safe system fails, it fails by failing to fail safe.
- Complex systems tend to produce complex responses (not solutions) to problems.
- Great advances are not produced by systems designed to produce great advances.
- The Vector Theory of Systems: Systems run better when designed to run downhill.
- Loose systems last longer and work better. (Efficient systems are dangerous to themselves and to others.)
- As systems grow in size, they tend to lose basic functions.
- The larger the system, the less the variety in the product.
- Control of a system is exercised by the element with the greatest variety of behavioral responses.
- Colossal systems foster colossal errors.
- Choose your systems with care.
Advanced systems theory
- Everything is a system.
- Everything is part of a larger system.
- The universe is infinitely systematized, both upward (larger systems) and downward (smaller systems).
- All systems are infinitely complex.
Systemantics while standing on one foot
Systemantics can be reduced to two propositions roughly equivalent to the use of abstinence and condoms:
- A priori systems are considered guilty until proven innocent, given that reason is finite.
- As evolution is the only system known to produce intelligent behaviour, it is to be preferred.
References
- Gall, John. The Systems Bible: The Beginner's Guide to Systems Large and Small (third edition of Systemantics). General Systemantics Press/Liberty, 2003. ISBN 0-9618251-7-0.
- Gall, John. Systemantics: The Underground Text of Systems Lore. How Systems Really Work and How They Fail (second edition). General Systemantics Press, 1986. ISBN 0-9618251-0-3.
- Gall, John. Systemantics: How Systems Really Work and How They Fail (first edition). Pocket, 1978. ISBN 0-671-81910-0.