Mind-benders: The Gödel Fallacy

By the "Gödel fallacy" I do not mean that Kurt Gödel was guilty of a fallacy, but that his incompleteness theorems are used by those who have no idea what they mean to make fallacious assertions about the nature of truth, knowledge, reason, and reality.

Though frequently referred to as "the Gödel theorem," there are actually two incompleteness theorems:

1. For any consistent axiomatic system capable of expressing basic arithmetic, there are true statements about the natural numbers that the system cannot prove.

2. No such system can be used to prove the consistency of the system itself.

Outside the context of symbolic logic and formal number theory, Gödel's theorems have no meaning whatsoever. Nevertheless, they are consistently used by the ignorant to make such assertions as: "the premises of all models of reality and the basis of all logical arguments are arbitrary axioms which can never be proved."

Such pronouncements reveal a gross ignorance not only of the nature of Gödel's theorems, but of the distinction between "knowledge of methods" and "knowledge of facts."

Mathematics and Symbolic Logic Are Methods

An intellectual method is a man-made system for manipulating a fixed set of symbols that may be used to represent actual concepts but that, in themselves, identify nothing objective. Knowledge about a method is real knowledge, but it is not knowledge about anything in objective reality; it is knowledge only about the man-made system. The rules and principles of a method do not apply to anything in objective reality.

Gödel's theorems, for example, apply to a subset of symbolic logic dealing exclusively with the "natural numbers," also called the positive integers (1, 2, 3, 4, ...) or the nonnegative integers (0, 1, 2, 3, 4, ...). The concepts involved in symbolic logic do not identify any facts or existents in objective reality, only the symbols and the rules for how they may be manipulated. The words "true" and "false," as used in symbolic logic, do not refer to what is metaphysically true, only to the legitimacy of the use of the symbols according to the rules of the system.

Counting Numbers

The "natural numbers" are sometimes referred to as the "counting numbers." Before men learned how to count, or even today among primitive tribes that do not have that skill, there is no certain way to determine the quantity of things, such as how many cattle one has, or how many people there are in the village.

There is often an observable metaphysical difference in the "quantity" of things, especially when the difference is great, such as between two fields of sheep, one with only a few sheep and the other with a great many. Though the difference can be "seen," exactly what the difference is cannot be directly observed.

Though the exact difference cannot be seen, the method of counting can be used to determine exactly what that difference is, and counting, like all methods, is a method with specific rules. Using a set of different symbols, always recited or recorded in the same order, and assigning a different symbol to each item in a collection, the last symbol used will indicate the total number of items in that collection.

We take this method for granted, but it required a genius to discover and develop it. The discovery was not the discovery of a scientific fact, but the discovery of the idea of order and the discovery that the method would work. The order was very important. One might create a set of different symbols, but if they were recited willy-nilly (1, 13, 7, 2, 8, ...) they would never result in a correct "number," except by accident.

Here are (example) rules for counting numbers:

1. There must be a set of symbols which are all different.

2. Each symbol must be used only once.

3. Every item being counted must be assigned a symbol.

4. The order of the symbols must be fixed.

5. The assigning of symbols must always be in the same fixed order.

Observe that none of these rules apply to any fact of reality, only to what counting numbers must be and how they must be used. There are no counting numbers in "nature." They must be invented. As counting numbers, no number identifies any particular existent, even after counting. The items counted do not have to be in any particular order, and may be counted in a totally different order with identical results. (Ordinal numbers are a totally different, though related, subject.)
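
The five rules above can be sketched as a small program. This is purely illustrative (the essay itself contains no code), and the item and symbol names are invented for the example:

```python
# Illustrative sketch of the counting method and its rules; the herd and the
# symbol list are made up for this example.

def count(items, symbols):
    # Rule 1: the symbols must all be different.
    if len(set(symbols)) != len(symbols):
        raise ValueError("symbols must all differ")
    # Rule 3: every item must be assigned a symbol, so we need enough of them.
    if len(items) > len(symbols):
        raise ValueError("not enough symbols for this collection")
    last = None
    # Rules 2, 4, 5: symbols are taken from a fixed order, each used once.
    for _item, symbol in zip(items, symbols):
        last = symbol
    # The last symbol used indicates the total number of items.
    return last

herd = ["bess", "daisy", "clover"]
print(count(herd, [1, 2, 3, 4, 5]))                  # -> 3
# The items may be taken in any order with identical results:
print(count(list(reversed(herd)), [1, 2, 3, 4, 5]))  # -> 3
```

Note that, as the paragraph above says, the result does not depend on the order in which the items are counted, only on the fixed order of the symbols.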

The exact number of items in a collection is a metaphysical fact, and the method of counting can be used to discover and identify that metaphysical fact. The method of counting works because it conforms to the facts of reality in its operation, but the fact that items in a collection can be counted is not a metaphysical characteristic of the items, or even of their relationship to each other. The metaphysical nature of those items (as discrete entities) and their relationships (as parts of the same collection) are what make them "countable."

Before methods of creating numbers by means of combinations of a fixed set of symbols (like Arabic numerals) were invented, it would have been necessary to create an entirely new symbol for every number to be counted. Another rule then might have been:

6. To count any possible number of things, there must be a potentially infinite number of symbols. (The decimal system provides that potential.)

I'm sure if there had been such a rule there would have been an ancient Gödel with a theorem proving "no system of counting numbers can count everything (because nothing can be infinite)," and a Zeno who, like the modern-day sophists, would have used the theorem to prove nothing could be known for certain by counting.

Modern day sophists who attempt to use Gödel's theorem to repudiate certain knowledge are simply ignorant of the nature of methods and their relationship to objective reality. Most of them also commit what I call the Pythagorean fallacy.

The Pythagorean Fallacy

The following is adapted from Saving Science: A Criticism of the Thesis in David Harriman's The Logical Leap: Induction in Physics:

Limits of Mathematics

The Pythagorean fallacy, or superstition, is the belief that numbers, or mathematics, are in some profound way the ultimate explanation for everything.

Pythagoras said, "all things are numbers." Modern Pythagoreans do not say all things are numbers, but do believe everything can ultimately be understood in terms of numbers or explained by mathematics. When the ancient Pythagoreans discovered incommensurables, some of them committed suicide, because that discovery showed that all they believed, the very basis of meaning in their lives, was wrong. I hope the modern Pythagoreans will not react to what I have to say with similar despair.

Objective Base of a Method

Mathematics, like language, is a method, a human invention with the purpose of dealing with certain specific attributes of the perceived physical world.

All of what is called mathematics begins with the concept of numbers. At some level, the field of mathematics merges with geometry and some aspects of logic as well, but the strictly mathematical part of even the advanced mathematical fields, such as trigonometry and the Calculus, depends on the concept of numbers.

The objective attribute of the world to which numbers pertain is multiplicity: the fact that existence consists of multiple discrete entities. That is the objective foundation of all mathematics.

Numbers are the conceptual tool of counting, as described above.

All of mathematics is an extension of that basic method of determining the number of things. Addition and subtraction are just shortcuts for counting and "counting backwards." Multiplication and division are shortcuts for performing addition and subtraction. Fractions and decimals are shortcut methods of notation for division.
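
The chain of shortcuts described above can be sketched in a few lines of illustrative Python (an illustration of the idea, not an efficient implementation):

```python
# Each operation reduced to the more primitive one, as the text describes.

def add(a, b):
    # Addition as continued counting: count b more steps past a.
    for _ in range(b):
        a = a + 1
    return a

def multiply(a, b):
    # Multiplication as a shortcut for repeated addition.
    total = 0
    for _ in range(b):
        total = add(total, a)
    return total

print(add(3, 4))       # -> 7
print(multiply(3, 4))  # -> 12
```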


Another unknown genius of ancient history discovered that numbers could also be used to identify other characteristics of things, such as length, weight, or speed, as well as relationships between things, such as distance. The technique is called measurement.

Obviously this discovery has been just as important to the development of the civilized world as counting itself. Unfortunately, it also led to one of the first great mistakes in philosophy from which philosophy has never thoroughly rid itself.

Measurement uses the method of counting to determine a "measurable" attribute. All measurement requires "a unit of measure" commensurate with the attribute to be measured. If the attribute to be measured is length, for example, the unit of measure must be some length that is chosen as a "standard" length. If the attribute to be measured is weight, the unit of measure must be some standard weight.

The method of measurement is counting, and what is counted is the number of "units of measure" that equal the measure of whatever characteristic is to be determined. If we use length as an example, one way to measure it would be to take a small stick as the unit of measure. The stick could be laid out on the length to be measured, starting at one end, then placed again where it last ended, repeating this process and counting each time the stick is laid down until the end of the length being measured is reached. If the stick is laid down 10 times, the measured length is "10 'sticks' long."

While a stick is a metaphysical existent, has length as an attribute, and its own length is a metaphysical fact, when its length is used as the "unit of measure" to measure something else, it is an arbitrary unit. As a "unit of measure" it is only a concept; there is no metaphysical existent "stick-length."

When counting entities, counting is absolute. If there are thirty-seven entities, counting will tell you exactly how many there are, that is, 37, and there are 37, absolutely.

When measuring something, the number of "units of measure" that are "counted" may or may not be the exact measure of a thing, and in fact will almost never be perfectly exact.

The main reason for this is that units of measure are discrete; they are concepts, and all concepts are discrete. But concepts have no physical existence, only mental or conceptual existence. There are no metaphysical inches, pounds, or minutes; there are only length, weight, and time, and they are all analog.

For any discrete unit of length conceived, there may be existents it can exactly measure, but there are, potentially, an infinite number of existents it cannot exactly measure. This is true of all units of measure. To suppose that everything can ultimately be known in terms of mathematics is to forget that mathematics is only a method: a method of determining the numbers of things, adapted for dealing with measurable attributes of existents and their relationships. Measurement is only the application of the method of counting to that which cannot truly be counted. With the invention of "units of measure," some of the attributes of the physical can be treated as though they had "parts" that can be counted, which, metaphysically, they do not have.
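
The stick-laying procedure, and the point that a discrete unit rarely measures an analog magnitude exactly, can be sketched as follows (illustrative Python; the particular lengths and units are arbitrary choices for the example):

```python
import math

def measure(length, unit):
    # Lay the unit along the length, counting each time it is laid down,
    # until the next laying would pass the end of the length.
    count = 0
    covered = 0.0
    while covered + unit <= length:
        covered += unit
        count += 1
    return count, length - covered  # the count, and the unmeasured remainder

# A length that is an exact multiple of the unit measures exactly:
print(measure(10.0, 1.0))  # -> (10, 0.0)

# But a length like sqrt(2) (the hypotenuse of a unit isosceles right
# triangle) leaves a remainder no matter how fine the unit is made:
for unit in (0.1, 0.01, 0.001):
    n, leftover = measure(math.sqrt(2), unit)
    print(unit, n, leftover)
```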

Pythagoras' Devastating Discovery

The disillusionment of the ancient Pythagoreans followed directly from Pythagoras' greatest discovery: that where a and b are the legs (sides next to the right angle) of a right triangle, and c is the hypotenuse (side opposite the right angle), a² + b² = c². This led immediately to the discovery that in an isosceles right triangle, where a = b, there is no commensurate unit of measure that can measure both a leg of an isosceles right triangle and the hypotenuse.

Today many such "irrational" (no ratio) relationships are known, and very close approximations, such as pi, are used in calculations where such relationships need to be measured. It is difficult not to have the impression that irrationals, like pi, actually do have an exact value if one could just carry the decimals out far enough. The ancient way of describing these irrational relationships is much clearer in demonstrating that there is no such value.

Suppose the sides of an isosceles right triangle are one inch long, and suppose the length of the hypotenuse could be represented by some fraction m/n. Since a² + b² = c², substituting 1 for both a and b, and m/n for c, yields m²/n² = 2. Divide out any common factor in m/n; now either m or n must be odd (because if both were even there would still be the common factor 2).

Multiply both sides of the equation m²/n² = 2 by n² to get m² = 2n². Therefore m² is even; therefore m is even. Suppose m = 2p (if m is even it must be 2 times something). Substituting 2p for m in the equation m² = 2n² yields 4p² = 2n². Dividing both sides by 2 yields n² = 2p²; therefore n² is even, and therefore n is even.

If there were a unit of measure that could measure both a leg and the hypotenuse of an isosceles right triangle, the length of the hypotenuse could be represented as m/n units, and either m or n, reduced to lowest terms, would have to be odd. Since both m and n can be demonstrated logically (or mathematically) to be even, there can be no unit of measure that can measure both a leg and the hypotenuse of an isosceles right triangle.
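
The proof above can be accompanied by a brute-force check (illustrative Python; a finite search cannot replace the proof, only exhibit it): for every denominator n up to a bound, 2n² is never a perfect square, so no fraction m/n squares to 2.

```python
import math

def has_rational_sqrt2(limit):
    # Look for integers m, n with m*m == 2*n*n, i.e. (m/n)^2 == 2.
    for n in range(1, limit + 1):
        m = math.isqrt(2 * n * n)   # integer square root (Python 3.8+)
        if m * m == 2 * n * n:
            return True
    return False

print(has_rational_sqrt2(100_000))  # -> False: no such fraction exists
```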

Obviously, metaphysically, there is an exact relationship between either leg of an isosceles right triangle and the hypotenuse but there is no number that can truly represent that relationship.

This discovery was enough to demonstrate to the ancient Pythagoreans not only that "all things are numbers" is not true, but that all things cannot even be described by numbers. It is even worse than that, however, for the modern Pythagoreans.

Mathematically Unknowable

There is a class of physical events described by a set of concepts called "chaos," "fractals," or "Lorenz attractors." The peculiar thing about such events is that they are determined, not randomly as "chaos" might imply, but strictly by mathematical functions, though the actual mathematical function for any real chaotic event or process can never be discovered, and the actual behavior of chaotic events and processes is impossible to predict (which is the real reason they are called "chaos").

True natural "chaos" events and phenomena are analog, not discrete, in nature, but scientists can simulate such events with digital computers using what are called iterative functions. The technology is too complex to describe here.
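
A minimal instance of such an iterative function can nevertheless be sketched. The logistic map below is a standard textbook example (it is not mentioned in the text; it is chosen here only because it is the simplest well-known one): each output is fed back in as the next input, and a difference in the eighth decimal place of the starting value soon produces completely different trajectories.

```python
def trajectory(x, steps, r=4.0):
    # Iterate x -> r*x*(1-x), feeding each output back in as the next input.
    out = [x]
    for _ in range(steps):
        x = r * x * (1.0 - x)
        out.append(x)
    return out

a = trajectory(0.30000000, 50)
b = trajectory(0.30000001, 50)  # differs in the eighth decimal place
# The tiny initial difference grows until the trajectories bear no
# resemblance to one another:
print(max(abs(p - q) for p, q in zip(a, b)))
```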

There are many aspects of the real world, however, that are examples of "chaos" theory. The human heartbeat, for example, is never absolutely even, because the electrical nature of the heart behaves, apparently, like a Lorenz attractor. It is a feedback mechanism (like taking the output of one equation and using it as the input of the next). In fact, if the heartbeat were perfectly symmetrical, it would race uncontrollably, a condition, called fibrillation, which does happen.

The almost endless patterns of snowflakes are examples of fractals. Each is completely different, because the physics that forms them, though identical, begins with a different value for each snowflake (because the particles of dust all snowflakes form on are slightly different).

Ferns are another example. While ferns all look very similar, they are never identical. Broccoli exhibits the same fractal characteristics. There are, in fact, chaotic characteristics in all life. The venous and arterial systems in a human kidney, flowers, and trees are all examples, and DNA clusters form shapes that resemble Julia sets. Non-living examples include clouds, frost and ice formations, lightning, galaxies, and ocean currents.

Perhaps the most interesting example of chaos is the weather. It was while studying weather that the famous Lorenz attractor was discovered. Edward Lorenz, an MIT meteorologist, was attempting to create the program all meteorologists dream of. It was believed that if one could map all the meteorological states of the world, one could predict all the weather, indefinitely. What he discovered was that, because weather "feeds itself," it behaves chaotically and is therefore unpredictable, which came as no surprise to anyone except meteorologists.
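
A rough numerical sketch of the Lorenz system (with its standard parameters sigma = 10, rho = 28, beta = 8/3, and a crude Euler integration step, which is only an approximation of the real equations) shows why the dream was doomed: two runs whose starting states differ by one part in a million end up nowhere near each other.

```python
def max_separation(steps=10_000, dt=0.005, eps=1e-6,
                   sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Advance one state of the Lorenz equations by a simple Euler step.
    def step(s):
        x, y, z = s
        return (x + sigma * (y - x) * dt,
                y + (x * (rho - z) - y) * dt,
                z + (x * y - beta * z) * dt)

    a = (1.0, 1.0, 1.0)
    b = (1.0, 1.0, 1.0 + eps)  # differs by one part in a million
    widest = 0.0
    for _ in range(steps):
        a, b = step(a), step(b)
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        widest = max(widest, gap)
    return widest

# The millionth-part difference grows to the size of the whole attractor:
print(max_separation())
```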

None of these limitations of mathematics in any way repudiates the value and utility of mathematics, especially in the sciences. All that men have achieved and accomplished in the fields of science and technology rests heavily on mathematics, and without knowledge of that method, all the benefits that contribute to the quality of life enjoyed by modern man in Western civilization would not be possible. But mathematics is only a tool, a method invented by man. It is when mathematics, or any other single aspect of intellectual achievement, is raised to the level of a mystic insight that will provide the answers and explanations for all things that it becomes a superstition.

Reality Is Not Mathematical

Expressions like "the mathematical nature of the universe," "the mathematical nature of space and time," "the mathematical nature of the living world," "the mathematical nature of consciousness," and "the mathematical nature of the mind" are ubiquitous, but all are untrue.

No aspect of objective reality, whether physical, living, conscious, or mental, has any mathematical attributes, because mathematics is a method, a unique kind of "language" for identifying those characteristics of objective reality that can be identified in terms of numbers. All language is a method of identifying existents and their characteristics in objective reality, but no one supposes that there is an "English" nature of the universe, a "French" nature of space and time, a "Chinese" nature of consciousness, or a "Swahili" nature of the mind. To apply the method of identifying physical attributes to the physical attributes themselves is the worst kind of reification.

There are many attributes of the physical that are not mathematical in nature at all. There is no measurement of attributes like right-handedness and left-handedness, or being mirror-imaged. If something has "handedness," it is either right or left; there are no degrees. Such attributes are not mathematical but different kinds of qualities altogether. Clockwise, counter-clockwise, and polarity are others.

Are physical states, like solid, liquid, and gas, measurable? There might be measurable attributes associated with particular substances and their states, like temperature, but the states themselves, and the concepts for them, have nothing measurable about them. What is the measurable attribute of the property "sublimes" (describing chemical substances which, at ordinary pressure, pass directly between solid and gas, like carbon dioxide and iodine)? What is the commensurate unit of measure for the attributes defining a plasma? A plasma is a plasma, period. One understands what a plasma is by description, not in terms of any measurable attributes. (Of course plasmas have measurable attributes, like temperature and charge, but those, alone, do not make a plasma what it is. Most other things have those same measurable attributes.)

None of the physical attributes science uses mathematics to measure are themselves mathematical. Before anything can be measured, it must first be identified. We first have to grasp the identity of entities, attributes (qualities), relationships, and events before we can even notice differences or similarities between them. We must first observe that things have attributes like length, weight, temperature, volume, and speed before we can discover differences in those attributes, and before we can discover there is any way to "measure" them.

An Example

Both the Gödel fallacy and the Pythagorean fallacy are common arguments used by those who try to repudiate objective reason and knowledge in attempts to put over everything from postmodernism to some kind of intellectual snake oil. There is hardly a better example than Dr. Edward De Bono.

De Bono has no idea what the Gödel theorems are but does not hesitate to repeat a common misconception based on them. I supply two of his versions:

"We have believed that teaching logic was enough and this would ensure good perception. This is totally false. Goedel's theorem shows that from within a system you can never logically prove the starting points. The starting points are arbitrary perceptions and values."

"Goedel's theorem points out that from within a situation it is impossible logically to prove the starting points - which are arbitrary perceptions, assumptions and values. So logic is not enough."

In both cases, De Bono thinks he is using Gödel as an authority to repudiate logic, apparently unaware that both of Gödel's theorems are based entirely on logic. If you are going to repudiate logic, if you are going to say "it's not enough," then you would have to repudiate Gödel's theorems themselves as "not enough."

As explained earlier, Gödel's theorems pertain only to systems of natural numbers, which are purely symbolic systems. The "logic" De Bono is referring to is not such a system; it is merely the principles of objective reasoning, which is never based on assumptions but on the observable facts of reality. Perhaps De Bono's "reasoning" is based on arbitrary assumptions and values, but for the rest of us, values are never assumed at the beginning; they come at the end of a long chain of non-contradictory objective reasoning.

It is objective reason itself that De Bono means to repudiate, which he must do in order to put over the rationally absurd notions of "thinking" he promotes. He also invokes the Pythagorean fallacy in his promotion:

"We now know there is a mathematical necessity for lateral thinking."

De Bono's so-called "lateral thinking" is meant to be a kind of thinking different from logical (correct) thinking, and is therefore, in fact, illogical (incorrect) thinking. But even if his "lateral thinking" were correct thinking, what a "mathematical necessity" for it could possibly be cannot even be imagined; the words identify nothing.

De Bono is able to get away with this because most people do entertain some form of the Pythagorean fallacy, believing that, in some way they cannot identify, mathematics is a kind of mystic force that makes things happen, or even makes things "necessary." But, of course, no method of gaining knowledge or identifying things makes anything happen. Mathematics is simply a tool of the human mind and does not even exist outside the human mind.

There is no such thing as a "mathematical necessity" outside the very "bent" mind of Dr. Edward De Bono.