11 December 2020


Please note that this is work in progress. Consequently, not all chapters are available at all times.

[link] The Elements of Truth Project
[link] Foreword


[link] 1. The basic premises of truth.
2. Truth is probabilistic.


[link] 1. Use doubt wisely.
[link] 2. Pursue the truth, not falsehoods.
[link] 3. If you want to know, you will have to find out.
[link] 4. Know what it is you want to know.
[link] 5. Distinguish between facts, beliefs, preferences, and opinions.
[link] 6. Evaluate the reliability of information.
[link] 7. Aim for the primary source of information.
[link] 8. Adopt the scientific method.


[link] 1. Never argue with lunatics.
[link] 2. Clarify the source of disagreement.
[link] 6. Use simple language.
[link] 13. Never argue against preferences.


The Elements of Truth Project

I started writing The Elements of Truth a long, long time ago, around 2008. It came from the observation that among humankind common sense is not common, and intellectual rigour is rare.

Consequently, I wanted to write a book that is to thinking what Strunk and White is to writing. Confident, concise, and clear.

But writing that way is hard work. Clarity arises from simplification, and simplification from omission. How much omission can I justify? Besides, often when I wanted to describe a piece of the cathedral upon which truth is built, I discovered that the cathedral I had in mind was just a rickety old scaffold. I then had to go and take down the old and rebuild something stronger. And because I am slow, it took time.

In any case, this book would not exist without the lives and works of many people alive and dead. For the necessary causes: Kimberley Wakil (my wife and the love of my life), Viktor Weiskopf, Gerhard Hanebeck, Johnny Rotten, Charles Darwin, Ludwig Wittgenstein, William Strunk Jr., and Carl Walters. For the sufficient causes: Enid Blyton, Heinrich Harrer, Bob Geldof, Ernest Hemingway, Daniel Kahneman, and Margarete Müllbacher. A strange group indeed.

Michael Baumann, December 2020

08 December 2020


"It is a capital mistake to theorize before one has data. Insensibly one begins to twist the facts to suit theories, instead of theories to suit facts."
-- Sherlock Holmes to Dr. Watson in Arthur Conan Doyle (1892), A Scandal in Bohemia

I have studied people for half a century.

I have observed myself, my family, my friends. I have observed children and parents, high school students and university students, teachers and professors. I have observed researchers, doctors, lawyers, engineers, businesspeople, and other professionals. I have observed public servants, politicians, and the general public.

I have observed the things we do, the things we say, and the things we say we believe. I have reached this conclusion:

Common sense is not common, and intellectual rigour is rare.

I have reached this conclusion neither quickly nor lightly.

As a child I was told that cold and wet feet will give me pneumonia and that wet hair and a draft will give me meningitis. I was told that swimming after lunch will kill me by drowning and that falling into a patch of stinging nettles will kill me by suffocation.

I was told that if I misbehave, the Krampus will take me to hell in a basket, that if I break a mirror, I will suffer from seven years of bad luck, and that if the 13th day of the month falls on a Friday, it is a bad omen.

I was told that what doesn't kill you makes you stronger, and I wondered about all the "strong" people in war zones, and famine zones, and hospitals. I was told that people use only 10% of their brain, and that this is not a metaphor. I was told that seeing a spider in the morning brings sorrow, that seeing the same spider in the evening is invigorating, and I wondered what changed in the course of the day.

I was told that old elephants migrate to a secret elephant graveyard to die, that lemmings commit mass suicide by diving from a cliff, and that ostriches stick their heads into the sand to avoid detection.

When I was eight years old, I saw a U.F.O. land in an Austrian forest. When I was twelve, I believed that paranormal things happen in the Bermuda Triangle. When I was thirteen, I believed that if 1,000 people willed a door to open, it would open. When I was fourteen, I believed that human beings could spontaneously combust. When I was eighteen, I went to church and prayed to pass my driving exam.

I watched my grandmother boil tap water after the Chernobyl nuclear accident, as if boiling could remove the radioactivity. I watched parents pray to god to save their sick child from a disease that god must have given the child in the first place. I watched university students wear good-luck charms, and I watched their professors do the same.

As an adult, I was told that Omega-3 fatty acids are a universal cure, that people are rational, and selfish, and that their tastes do not change, and that every morning at breakfast Baron Rothschild sets the interest rates for the global banking system. And I said nothing.

What a despicable collection. I am sure you have your own list.

But isn't it true that many false beliefs do not cause much harm? Isn't it better, as Blaise Pascal speculated, to believe in a god than to find yourself being punished for not believing? Isn't it better to believe that all snakes are venomous than to be bitten by one that actually is?

Maybe, possibly, and yes.

Besides, shouldn't people have the right to ruin their own lives?

Unfortunately, in a democracy the beliefs that people hold have the potential to ruin not only their own lives but other people's lives as well. And there is no reason to believe that people will inform themselves more thoroughly, make better decisions, or act more wisely when the wellbeing of others is involved.

I agree with Plato's assessment of democracy:

"It's an agreeable anarchic form of society, with plenty of variety, which treats all men as equal, whether they are equal or not."
-- Plato (ca. 375 B.C.E.)

But I also agree with Winston Churchill's:

"Indeed, it has been said that democracy is the worst form of Government except all those other forms that have been tried from time to time; but there is the broad feeling in our country that the people should rule, continuously rule, and that public opinion, expressed by all constitutional means, should shape, guide, and control the actions of Ministers who are their servants and not their masters."
-- Winston Churchill (11 November 1947)

One way out of this dilemma is to give the citizens the tools to render themselves better informed, less manipulated, smarter citizens. Education is the method of providing those tools, and this little book is my contribution to the education of the citizen.

In our journey towards the truth, I introduce each important idea by a simple rule and a brief description. Understanding is developed through examples. That said, these examples are not comprehensive analyses of particular situations; rather, my intention is to stimulate you, the reader, to think of personal experiences.

Sections I, II, and III form the foundation of thinking, i.e. investigating a question of interest, formulating a valid argument, and guarding against the intentional distortion of reality.

Section IV looks at the limitations of the human mind and their evolutionary and psychological causes. Section V is a summary of common cognitive biases and shortcuts. Section VI takes a look at the model of reality that you carry around in your mind and at the formal process of model building.

A glossary provides definitions for terms used throughout the book.

Just as understanding the rules of chess will not make you an expert player, understanding the rules of thinking will not make you an expert thinker. But the goal is not to become an expert thinker, the goal is to become a better, more rigorous thinker. And anyone can become a better, more rigorous thinker. All that is required is a single statement: "I don't believe it." The rest flows from here.

Yes, rigorous thinking is hard work and takes plenty of practice. Just as in chess, the first thing you must learn to appreciate is your own mistakes. Exploring your own mistakes is true learning.

That said, this little book is a weapon against corruption and stupidity. Use it wisely.

Michael Baumann, December 2020

30 November 2020


1. The basic premises of truth.

A few words on philosophy are in order at the beginning. Philosophy can roughly be classified into six branches.

  1. Metaphysics: Examines the nature of reality.
  2. Epistemology: Examines the nature of knowledge.
  3. Ethics: Examines the nature of human behaviour.
  4. Politics: Examines the nature of governance.
  5. Economics: Examines the nature of goods and services.
  6. Aesthetics: Examines the nature of beauty.

The study of truth falls into all six branches of philosophy, and has serious consequences in each.

That said, before we can examine truth, we must agree on a few metaphysical premises. Without these premises, truth cannot be established. And if truth cannot be established in principle, there is no point in trying to establish it or fight over it. The five premises are:

  1. There exists an external reality. This external reality exists independent of an observer.
  2. In this external reality, somewhere in space and sometime in time, there exists an observer.
  3. Parts of this external reality are accessible to the observer. Through sensation, perception, and thought the observer may construct an observed reality.
  4. Through language the observer may construct statements about an observed reality.
  5. The relationships (match, mismatch) between external reality, an observed reality, and a statement about an observed reality are as follows: if an observed reality matches external reality, the observer perceives a truth; if it does not, the observer perceives an illusion. If a statement matches the speaker's observed reality, the speaker is honest; if it does not, the speaker is lying.

(Note that human language is less specific about agreements between external reality, an observed reality, and a statement about an observed reality, than it is about disagreements. All agreements are called a truth, a fact, or a matter of fact.)

Example: Statements of witnesses to a murder.

External reality: The murderer is a woman.

Observed realities of ...
Alice: The murderer is a woman.
Bob: The murderer is a woman.
Carol: The murderer is a man.
David: The murderer is a man.

Statements of ...
Alice: "The murderer is a woman."
Bob: "The murderer is a man."
Carol: "The murderer is a man."
David: "The murderer is a woman."

Alice and Bob perceive the truth, but only Alice speaks the truth; Bob is lying. Carol and David perceive an illusion. Carol is honest, but her statement is a falsehood. David is lying, and his lie happens to be an accidental truth.
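As a sketch, the four witnesses can be classified mechanically along the two relationships of premise 5. The dictionary below simply restates the example; the field names are my own, not the book's terminology.

```python
# Classify each witness along two independent relationships:
# external reality vs. observed reality (perception of truth, or illusion), and
# observed reality vs. statement (honest, or lying).
# The data reproduce the murder-witness example above.

EXTERNAL = "woman"  # external reality: the murderer is a woman

witnesses = {
    "Alice": {"observed": "woman", "stated": "woman"},
    "Bob":   {"observed": "woman", "stated": "man"},
    "Carol": {"observed": "man",   "stated": "man"},
    "David": {"observed": "man",   "stated": "woman"},
}

results = {}
for name, w in witnesses.items():
    results[name] = {
        "perceives": "truth" if w["observed"] == EXTERNAL else "illusion",
        "honest": w["observed"] == w["stated"],
        "statement_true": w["stated"] == EXTERNAL,
    }
    print(name, results[name])
```

Note that the two relationships are independent: honesty is a property of speaker versus perception, while the truth of the statement is a property of statement versus external reality, which is why David's lie can come out true.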

Of course, there is a lot to be said about partial truths and partial falsehoods, but the point of these premises is to establish a set of basic rules to guarantee that truth can be established in principle.


29 November 2020


"Believe nothing, no matter where you read it, or who said it, no matter if I have said it, unless it agrees with your own reason and your own common sense."
-- Attributed to the Buddha (5th century B.C.E.)

"Nullius in verba."
"On no man's word."
-- Motto of the Royal Society since 1663

28 November 2020

Use doubt wisely.

We live in times where a lie is halfway around the world before the truth gets halfway out of bed.

From half-educated half-wits declaring themselves experts in anything, to the sharpening and levelling of political information, to the smearing of opponents, to the latest rumours in a crisis, to the creation of alternative facts, to targeted disinformation campaigns, how do you know that what you are being told is true?

You don't.

Almost nothing you know or believe about the world is based on your own experience. Almost everything you know or believe about the world you know on trust.

And this applies to both information regarding questions of fact and information regarding questions of cause. There is good reason for the motto of the Royal Society: Nullius in verba. On no man's word.

The foundational rule of The Elements of Investigation had to be strong, and my first worry was that people are too gullible, and are thus easy prey for manipulators of information. A strong call for doubt seemed appropriate, and my first choice reflected this: Make "I don't believe it" your maxim.

But then Dr. Dale Kolody pointed out that if we look at the full spectrum of doubt, both extremes, "doubting nothing" and "doubting everything", avoid the need for critical thinking. And this brought about my second worry: That people believe too little, believe their opinion on something they know nothing about is as valid as expert opinion, ... and are thus easy prey for manipulators of information.

Hence my choice of the foundational rule: Use doubt wisely.

Belief and doubt both need to be moderated by reason in order to find truth.

Somewhere out there, there is a level of doubt that will maximize your survival and your success in life. Too little doubt will leave you in the gullibility trough, too much doubt will leave you in the uncertainty trough. The exact shape of this curve is of course unknown.

Or as Bertrand Russell expressed it so well: "The fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt."

Doubt is often viewed with suspicion. But doubt is not synonymous with cynicism, nor is it synonymous with distrust. I would not call a person a friend who believes everything I say. Not because I am telling lies, but because there is always the possibility that I am wrong.


27 November 2020

Pursue the truth, not falsehoods.

As a scientist you learn early to abide by the words of Thomas Jefferson (1820), that "we are not afraid to follow truth wherever it may lead, nor to tolerate any error so long as reason is left free to combat it."

This rule is easy to follow in the sciences because in general the scientist has no stake in where the destination lies. (Although this may be true for the Natural Sciences more often than for the Social Sciences.) In fact, the very purpose of scientists is to have a group of people paid for not holding preconceived notions.

The non-scientific world is burdened by more complicated goals.

Philosophers and theologians have an endless capacity to argue about the nature of truth. For most of us the definition is simple: Truth is a synonym of fact, and a fact is an observable phenomenon that can be independently verified. The opposite of truth is a falsehood. (And if a falsehood is spread with the knowledge that it is a falsehood, it becomes a lie.)

Between truth and falsehood lies the indeterminant, a phenomenon that has not been or cannot be declared a truth or a falsehood.

"Planet Earth moves around the Sun." (a fact)
"Joe Biden won the 2020 U.S. Presidential Elections." (a fact)
"Unicorns exist." (an indeterminant)
"There is a heaven." (an indeterminant)
"The Earth is 4,004 years old." (a falsehood)
"The song "Edelweiss" is the national anthem of Austria." (a falsehood)
"Alfred Dreyfus sold military secrets to Germany." (a lie)
"President Barack Obama was born outside the United States." (a lie)

Someone may believe false information to be true. If she spreads this information, she is misinforming others. Someone may know false information to be false. If he spreads this information, he is disinforming others.

Keep in mind that the absence of something (God, U.F.O.s, the Deep State) cannot be proven in principle. Consequently, conspiracy theories can never be fully proven false. Keep also in mind the Sagan standard: "Extraordinary claims require extraordinary evidence." It is not up to you to prove an extraordinary claim false. It is up to the person making an extraordinary claim to provide evidence.


If you want to know the truth, you will have to find out.

If you want to know the truth, you will have to find out. Finding things out is called research. Crudely speaking, there are two types:

Records research: Records research involves reading written documents, listening to sound recordings (e.g. interviews), and viewing still photography or film footage. The communication in records research is one-way only, from the source to the sink.

Original/live research: Finding out for yourself (i.e. conducting original research), or talking to somebody who found out herself (e.g. a witness to the shooting of U.S. President Kennedy), or talking to somebody who talked to somebody who found out herself (e.g. the grandson of the last survivor of the sinking of the Titanic), and so on. Live research is interactive.

Rigorous research follows a systematic process, the scientific process. It requires a critical mind and doubt, and it produces scientific knowledge. Sloppy research is based on personal observations, personal experience, perceived wisdoms, and anecdotes. It produces traditional knowledge.


26 November 2020

Know what it is you want to know.

What is your question?

There are two types of questions: Questions of fact and questions of cause. Both types of questions can be sources for disagreement.

The focus of questions of fact may be a person (who?), a thing (what?), a point in space (where?), a point in time (when?), or a number (how many? how much? how often?). Questions of fact can often be answered by a single datum or record.

"What is the capital of Switzerland?"
"At what temperature does water boil?"

Statements are either true or false and are often based on conventions. If a statement corresponds to an observable phenomenon that can be independently verified, it is called a fact, a matter of fact, or a statement of fact. That said, a fact may be provisional.

"The capital of Germany is Berlin." (true until 1949 and after 1990)
"The capital of Germany is Bonn." (true between 1949 and 1990)

Facts may also require further definitions.

"Climate change is a fact."
"Global mean surface temperature has been rising since the early 1900s."

What is meant by climate change? What are the exact definitions of global mean surface temperature, rising, and early 1900s?

Questions of cause often address complicated causal chains of events (how? why?). Finding answers to these questions requires logically consistent arguments and a series of independent observations. Statements of cause are either invalid or valid.

A statement of cause is valid as long as no observation has been made that refutes the claim of the statement. Because a cause can never be verified, it will only ever constitute a provisional explanation.

"Childbed fever is caused by bacterial infections."
"Human activity is the cause for climate change."

Depending on scope and detail of the question and your prior knowledge, the process of answering may vary in duration from the few seconds it takes to google the question to a lifetime of scholarship. Whatever your commitment, an explicitly stated question at the beginning will save you grief and time later.

Keep in mind that not everything you want to know is actually known or knowable. We only have limited access to the world around us.


Distinguish between facts, beliefs, preferences, and opinions.

Fact/A matter of fact: An observable phenomenon that can be independently verified.

"Water boils at 100 degrees Celsius."
"Marie Curie was awarded her second Nobel Prize in 1911."

Belief: A phenomenon that is assumed to be true. The origin of a belief may stem from a combination of facts, misinformation, and/or disinformation.

"I believe climate change is happening."
"I believe God created the universe in six days."

Preference: A favoured choice. Values are preferences. The origin of a preference may be known or not -- observation, indoctrination, operant conditioning, delusion.

"I prefer a fixed-rate mortgage over a variable rate, because it gives me greater peace of mind."
"I like chocolate cake. I don't like cheesecake. I don't know why."
"Honesty is important to me."

Opinion: A judgement based on beliefs and preferences.

"In my opinion A Moveable Feast is the best book ever written."
"Socialism is superior to both Communism and Capitalism."
"In order to reduce greenhouse gas emissions, the government should tax gasoline."

(In my life I have learned three things: First, people do not like to stick to facts, they like to stick to whatever it is they believe. Second, people judge and choose even if they know nothing or only parts of the story. And last, people have two standards, one which they apply to themselves, family, and friends, and another that they apply to everybody else.)


24 November 2020

Evaluate the reliability of information.

Almost nothing you know or believe about the world is based on your own experience. Almost everything you know or believe about the world you know on trust.

Consequently, it is up to you to assess the reliability of information presented to you. Here is a list of eight questions you should ask: four regarding the information source and four regarding the information itself.

1. What is the quality of the information channel?

Did the information reach you through a peer-reviewed publication, a monograph by a professor, a textbook, an edited secondary publication, an encyclopaedia, a lecture, a presentation, a face-to-face conversation, a newspaper article, a TV broadcast, a blog, a YouTube video, or a social media post? You must gauge the quality of the information according to the quality of the information channel.

2. Who is the primary source of the information?

Is the primary source a specialist scientist, a generalist expert, a civil servant, a professional in the relevant field (an accountant, physician, banker, or lawyer), a journalist, a teacher (at a university, high school, or elementary school), a friend, a co-worker, a person "in-the-know", a politician, a salesperson? Is the author of the information competent? What are her credentials? What is his record?

3. Is the primary source of the information independent?

Who is paying the primary source’s bills? Taxpayers, a newspaper, a television station, a business, a political party, or an interest group (e.g. the pharmaceutical industry, the tobacco industry, the petroleum industry)? Follow the money.

4. What is the intent of the communication?

To encourage, to enlighten, to inform, to educate, to test, to self-aggrandize, to calm, to convince, to confuse, to mislead, to deceive, to enrage, to panic? Cui bono? Who benefits from the communication?

5. How was the knowledge obtained?

Through original research (e.g. from theory, laboratory experiments, field studies), through records research (a meta-analysis, a literature review, an exploratory review), from anecdotes, from hear-say, by guessing, as folk wisdom?

6. Does the information appear to be accurate and complete?

Is the information current? Has someone pre-selected the information for you? Has contradictory information been considered? Have inconsistencies been addressed? Have alternative interpretations been explored?

(The New York Times boasts: "All the News That's Fit to Print". It is the editors who decide on the fitness of a story. And fitness is defined by information content, entertainment value, and possibly an agenda. Editors not only determine what you read, they also determine what you do not read.)

7. Can the information be independently validated?

Was the communication peer reviewed? Are references cited and available? Are the research hypothesis, experimental design, data collection, and data analysis (for instance) described in enough detail that you could replicate the results, at least in principle?

8. Does the information appear to be unbiased?

Are the results statistically significant? Is the effect size practically relevant? Are the logic of the argument and the conclusions valid?

Thinking in general makes most people uncomfortable. Judging the reliability of information requires you to think. It is your duty as a citizen to think and be well-informed.

Eight questions, four regarding the information source and four regarding the information itself. It's an easy-enough checklist to keep in mind.


23 November 2020

Aim for the primary source of information.

Consider the children's game Chinese Whispers. Players line up such that they can whisper into the ear of their immediate neighbours. The player at the beginning of the line thinks up a phrase and whispers it to the next player. This player in turn passes on the received message to the next player, and so on. The last player in the line calls out the message she received.

Each information transfer between two children may carry changes to the phrase, i.e. a loss, a change, or an addition to the incoming message. Only a fraction of the incoming information content survives each information transfer. The information content of a message after n transfers, I(n), is a function of the fraction of information content that survives each transfer, s(i), where I(0) is the initial information content.

I(n) = s(1) * s(2) * s(3) * ... * s(n) * I(0)

The final, called-out phrase may bear little resemblance to the initial phrase. Reasons for the changes to the phrase include difficulty in uttering and understanding whispers, anxiousness, impatience, erroneous corrections, or deliberate alterations.
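The multiplicative decay described above can be sketched numerically. The survival fractions used here are illustrative assumptions; as the text notes, their real values are unknown and vary from transfer to transfer.

```python
# I(n) = s(1) * s(2) * ... * s(n) * I(0)
# The survival fractions below are illustrative, not measured values.

def information_content(survival_fractions, initial_information=1.0):
    """Return the information content remaining after all transfers."""
    remaining = initial_information
    for s in survival_fractions:
        remaining *= s
    return remaining

# Ten transfers, each preserving 80% of the incoming content:
print(round(information_content([0.8] * 10), 3))  # 0.107 -- roughly 11% survives
```

Even generous per-transfer survival compounds quickly, which is the point of the rule: every step you can skip on the information trail preserves content.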

It is these losses, changes, and additions that you want to avoid in your investigation. That is why you should aim for the primary source of information.

Most information you assimilate has a long information trail:

  • A citizen files a complaint about police brutality.
  • The Internal Affairs Department starts an investigation.
  • The investigation report is sent to the Public Relations Department.
  • A Police spokeswoman holds a press conference.
  • A journalist writes up an article on police brutality.
  • A co-worker reads the article.
  • You listen to the co-worker talking about police brutality.

Seven steps, six information transfers.

Even if you cannot get to the primary source of the information, you should aim to come as close as possible.


22 November 2020

Adopt the scientific method.

The scientific method is a systematic procedure to acquire knowledge about cause and effect in the phenomenal world.

It is nothing more -- and nothing less -- than formalized common sense. In fact, the scientific method comes so naturally to us, it is so embedded in our daily actions, that we often have difficulties detecting the individual steps.

Figure: The steps of the scientific method. Start at the bottom.

The steps are illustrated by a simple example: It is evening and getting dark outside, and you decide to turn on the light. You walk over to the lamp and turn the light switch ...

1. You make an observation: The light failed to come on.

2. You ask: Why did the light fail to come on?

3. You formulate a hypothesis: The light failed to come on, because the light bulb is burned out.

Cause: The light bulb burned out.
Effect: The light failed to come on.

(In fact, several hypotheses may come to your mind: The light failed to come on, because ... the lamp is unplugged, the light bulb is burned out, the lamp is somehow broken, a fuse has melted, there is a power failure in the building, there is a citywide blackout, ...)

4. You make a prediction: If the current light bulb is replaced by a new light bulb, the light will come on.

5. You design an experiment: You replace the current light bulb with a new light bulb.

6. You collect data: You turn the light switch, and the light comes on.

7. You analyze your data: You turned the light switch, and the light came on.

8. You compare your predictions with your data.

You predicted: If the current light bulb is replaced by a new light bulb, then the light will come on. You observed: After the light bulb was replaced, the light came on.

Hypothesis confirmation: You formulated a hypothesis, you made a prediction, you tested your prediction, and your prediction turned out to be correct. Therefore, you must provisionally accept your hypothesis.

(Hypothesis rejection: You formulated a hypothesis, you made a prediction, you tested your prediction, and your prediction turned out to be incorrect. Therefore, you must reject your hypothesis. Therefore, you must formulate a new hypothesis.)

Technically, we do not speak of proving a hypothesis to be true or false. We speak of failing to reject a hypothesis (and provisionally accepting it) or succeeding to reject a hypothesis.

Failing to reject a hypothesis does not mean that a cause for an effect is determined. For example, imagine there was a citywide blackout which was resolved in the time it took you to replace the light bulb. While you failed to reject your hypothesis and must therefore accept it, the true cause for the light failing to come on was the blackout.

Many more experiments may be necessary to confirm the hypothesis. You could replace the new light bulb with the old light bulb and see whether the light still comes on. Or you could try the old light bulb in a different lamp. Or you could thoroughly examine the old light bulb and check whether its tungsten filament is broken. ...
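The accept-or-reject logic of the eight steps can be sketched as a loop over candidate hypotheses. The hypotheses and the simulated experiment outcomes below are illustrative assumptions, not part of the light-bulb example as written.

```python
# A minimal sketch of the hypothesis-rejection loop from the light-bulb example.
# Each entry pairs a candidate hypothesis with a (simulated) experiment whose
# outcome is an assumption made for illustration.

def check_plug_experiment():
    """Prediction: after plugging the lamp in, the light comes on."""
    return False  # simulated outcome: the lamp was already plugged in

def replace_bulb_experiment():
    """Prediction: with a new bulb, the light comes on."""
    return True   # simulated outcome: the old bulb really was burned out

hypotheses = [
    ("the lamp is unplugged", check_plug_experiment),
    ("the light bulb is burned out", replace_bulb_experiment),
]

accepted = None
for hypothesis, experiment in hypotheses:
    if experiment():
        accepted = hypothesis  # failed to reject: provisionally accept
        break
    # prediction failed: reject this hypothesis and test the next one

print("Provisionally accept:", accepted)
```

Note that, as the blackout caveat above makes clear, a passing experiment only fails to reject a hypothesis; the accepted cause remains provisional.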

The knowledge produced by the scientific method is universal in space and time, empirically testable, and replicable. It is also provisional.


30 September 2020


At this stage you have made up your mind: You have stated a question, researched the problem, found reliable information, analyzed the data, and reached conclusions. All that is left to do is convince others. Not an easy task. Here are a few rules.


29 September 2020

Never argue with lunatics.

Remember the feeble but clever child who would inevitably come in last in every footrace of your childhood? And when he passed the finish line, he would yell: "Last one wins!" And from there the argument would ensue.

Without agreed-upon rules or enforced norms you cannot have a meaningful footrace. The same is true for an argument: You and your opponent must agree on some basic rules. The following is a minimum list:

  1. Facts outweigh fantasies.
  2. Actions outweigh words.
  3. A statement may be either true, or undecided, or false.
  4. Mutually exclusive statements cannot be true at the same time.
  5. The truth of a statement is independent of its proponent and the number of its proponents.
  6. The cause must precede the effect in time.
  7. Every cause is an effect of some other cause.
  8. If the premises of an argument are true, and the logic of the deduction is correct, then the conclusion must be valid.

Lunatics do not abide by these rules, either unknowingly or deliberately.

Unfortunately, in the evolution of democratic society, we have reached a stage where ignoring an opinion is considered an undemocratic act. Consequently, we patiently listen to even the most lunatic ideas, ideas that not only lack evidence but that often go against massive evidence to the contrary.

This practice has had dire consequences: The reinforcement of the lunatic's confidence that his conviction represents a legitimate position. The conclusion by the underinformed that facts are fewer and less certain than they in fact are. The folly to take the lunatic's confidence as a measure of the strength of his claims. The abuse of our good will by people with nefarious agendas. The catering of desperate politicians to lunatic superminorities. The adoption of the fear-and-anger business model in the media. The readiness of governments to disinform their own citizens and those of foreign nations. ...

And so, the Protocols of the Elders of Zion, 9-11 conspiracies, birtherism. And so, holocaust denial, climate change denial, mass shooting denial, CoViD denial. And so, QAnon, the "stolen" 2020 U.S. election, anti-vaccine delusion.

But here is the thing:


Would we give television time or newspaper space to someone who claims electricity does not work, or antibiotics, or bridges across rivers? To someone who believes water, or food, or gravity are social constructs? (Admittedly, the number of gravity deniers must be small, not because the evidence is literally just a stone's throw away, but because their lives must be so short.)

Example: Liberal vs. authoritarian

It is interesting how often authoritarians invoke their right to express their views (e.g. Ku Klux Klansmen, conservative clerics). While from a liberal viewpoint this is hypocrisy, it is a logically valid position:

The premise of liberals is that all opinions are valid. Consequently, a liberal must approve of the expression of opposing opinions and cannot engage in activities that would restrict such expression. (Consider the quote attributed to Voltaire: "I do not agree with what you have to say, but I'll defend to the death your right to say it.")

Authoritarians, on the other hand, can use a priori features (e.g. skin colour, gender, age, education) to make a judgement about the validity of an opinion. Consequently, an authoritarian may oppose attempts to silence his opinions but can attempt to silence others.

It is easy to see that a liberal often finds herself at a (self-imposed) argumentative disadvantage.

Example: Philosophical objections

Philosophical objections can be construed against any and all rules of argumentation. Consider these two from the list above:

3. A statement may be either true, or undecided, or false.
But how about the following statement: "This statement is false." If the statement is true, then it states that it is false. And if it is false, then it states that it is true. And so on ad infinitum. The problem arises from self-reference: the statement makes a claim about itself, so its truth value cannot be settled from within the statement.

The Austrian mathematician Kurt Gödel (1906-1978) demonstrated that any formal system rich enough to permit such self-reference faces a trade-off: it can be complete (every true statement is provable) or consistent (no contradictions are provable), but it cannot be both.

7. Every cause is an effect of some other cause.
The Scottish philosopher David Hume (1711-1776) pointed out that the statement "Every effect has a cause." is a belief based on past experience, not a logical certainty. There might be effects without causes. Two examples: In creationism, the creator is an effect without a cause. The radioactive decay of an individual atom appears to be random, i.e. one cannot predict when an individual radioactive atom will decay.

Keep in mind that the absence of something cannot be proven in principle. And so a perceived creator might have hatched from an egg and "earth rays" might cause an atom to decay.
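The radioactive-decay example can be made concrete with a short numerical sketch. Decay is memoryless: a single atom's lifetime follows an exponential distribution, so no record of its past survival improves a prediction of when it will decay. (The half-life below is an arbitrary illustrative value, not a physical constant.)

```python
import math
import random

# A single atom's lifetime is exponentially distributed. Knowing that it
# has survived so far tells us nothing about its remaining lifetime.
HALF_LIFE = 10.0                    # arbitrary time units (an assumption)
RATE = math.log(2) / HALF_LIFE      # decay constant lambda

random.seed(1)                      # fixed seed, for reproducibility only
lifetimes = [random.expovariate(RATE) for _ in range(5)]

# Five identical atoms, five unpredictable lifetimes.
print([round(t, 1) for t in lifetimes])
```

Each run with a different seed produces a different set of lifetimes; only the statistical pattern (the half-life) is predictable, never the individual atom.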

While these are valid objections, for all practical purposes they are immaterial.

Remember: If someone has made up her mind against you, there is nothing you can do to change it. Everyone is entitled to their own opinion, but not to their own facts.


Clarify the source of disagreement.

In order to reach agreement, you must first know what the disagreement is about. There are five types of disagreement, best illustrated with a contemporary example.

Disagreement about definitions

"Climate change is a long-term change in the average weather patterns."
"Climate change is the long-term heating of Earth’s climate system."

Disagreement about facts/indicators

"The data indicate that climate change is happening."
"The data show a natural variation in global mean surface temperature."

Disagreement about causes

"Climate change is caused by human activity, e.g. anthropogenic emissions of greenhouse gases."
"Climate change is caused by natural processes, e.g. increased solar output."

Disagreement about goals/consequences

"Climate change will have detrimental consequences for all of humanity."
"Climate change has a positive effect on primary production."

Disagreement about actions/preferences

"In order to curb the effects of climate change, we must introduce a carbon tax on fossil fuels."
"In order to curb the effects of climate change, we must plant trees."


20 September 2020

Use simple language.

The purpose of communication is the dissemination of information, not to look smart or to deceive people. So, whether you are giving a presentation or writing a report, keep your audience in mind. Here are seven simple questions you should ask yourself:

  1. What is it that I intend to say?
  2. What images do I want to invoke?
  3. What words best describe those images?
  4. How can I make it shorter?
  5. What does my listener hear, my reader read?
  6. What images do my words invoke?
  7. How are these images interpreted?

Plain language is preferable to technical jargon. Avoid acronyms at the cost of length. Also keep in mind that everybody loves a well-developed story. And storytelling is an art that you have to practice, practice, practice.

Example: The sarcasm trap

When I was a graduate student I went to talk to one Professor Pedersen. While I was waiting for him to finish a letter, I looked out the window.

"Nice weather," I commented.

"Thank you," Professor Pedersen said.

I was puzzled. Fortunately, Pedersen went on to explain that the pullover was a recent purchase.

What had happened?

Two things: Not only did Pedersen mishear my words ("nice weather" had become "nice sweater"), he also misinterpreted my inflection, assuming that a graduate student would not deride a professor's clothes.

What was intended as a scornful comment about the weather had been interpreted as an honest compliment on a garment.


13 September 2020

Never argue against preferences.

I think that Hamlet is excruciating, The Lord of the Rings unreadable, Star Wars unwatchable. I like rainy days. I like jazz. I can do without ice cream. In my opinion A Moveable Feast is the best book ever written, Breaking Bad the best television series ever made. I consider the Gettysburg Address unremarkable, democracy a bad idea, volunteering a selfish act. I prefer cats over dogs, and old cats over kittens. ...

You disagree?

So what? These are my preferences.

A preference is a favoured choice. The origin of a preference may be known or not. Values are preferences.

I think of preferences this way:

Imagine at birth you are given a preference basket or utility vault. It already contains several items, what Immanuel Kant called a priori (e.g. a preference for breast milk over tuna oil, a preference for a warm blanket over cold cement). In the course of your life you will live through certain experiences and you will associate particular actions with particular outcomes. Actions whose outcomes produce pleasure will be added to your preferences, actions whose outcomes produce pain will not.

Note that it requires incentives to change your preferences. Think what it would take for you to change your favourite ice cream flavour: Maybe if you found out that the one you prefer is also highly toxic? Or maybe if somebody paid you to switch to ox-tongue ice cream? But would that really change the preference for flavour?

Example: Economics vs. Preference

When my wife and I met, she claimed that shopping at Costco saves her money.

I knew very little about Costco, the largest warehouse club chain in the world, but what I did know was that you had to buy most items in bulk and that the per item price was relatively low.

The problem was that at the time Kimberley had very little storage space in her apartment. Consequently, when she drove out into the netherworld of low industrial rent where Costco warehouses are often located, she bought very little (e.g. one gigantic bottle of ketchup, 20 cans of tuna, and 40 rolls of toilet paper).

I told her that if she factored in gasoline and time, shopping at Costco would probably turn out to be more expensive than shopping at a Safeway nearby.

"But I like shopping at Costco," she said.

End of argument. Although it was acceptable for me to question her economic argument, I certainly had no right to challenge her preference.

There is a twist though: Kimberley also likes shopping at small, locally owned businesses, shops whose very existence is threatened by suburban, low-cost chain stores. Can I challenge an inconsistency?

Example: The Ultimatum Game

The Ultimatum Game is a tool used in experimental economics to study rational (vs. emotional) behaviour in human interactions under controlled conditions. Here are the rules:

  1. You are offered a large amount of money. There is only one condition: You have to share it with a stranger you will never meet. You have to decide how the money is split (e.g. 50/50, 60/40, 90/10), but you can only make a single offer to the stranger.
  2. Upon receiving your offer, the stranger can decide whether to accept or reject the offered split. If she accepts, you both will receive the agreed-upon share. If she rejects, neither she nor you will get anything.

Your first inclination is to offer 50/50, at least that is my hope. But upon reflection you realize that you could get away with a 60/40 split, for the responder would be a fool to forgo 40%. Or maybe even a 90/10 split? Also, you start thinking about how you would react if you were not the proposer but the responder. (Note that more than half of responders reject offers of less than 80/20.)

There are four strategic personalities in this game. Which one describes you best?

  1. The American: Proposes an acceptable minimum and accepts a low offer.
  2. The Hypocrite: Proposes an acceptable minimum and only accepts 50/50.
  3. The Pushover: Proposes 50/50 and accepts a low offer.
  4. The Englishman: Proposes 50/50 and only accepts 50/50.

After many years of presenting this question to students, I found that most undergraduate students are Pushovers and a few are Americans. Fewer than 4% are Englishmen. (I did have the occasional Hypocrite in the class, but I suspect that they did not understand the game.)

I also led discussions in which one strategic personality was to convince another to switch sides. It never succeeded.
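The game and the four strategic personalities above can be sketched as a small simulation. The pot size, the "acceptable minimum" offer of 20, and the low-offer acceptance threshold of 10 are illustrative assumptions, not figures from the text:

```python
# A minimal sketch of the Ultimatum Game. Each personality is a pair:
# (share offered to the responder, minimum share accepted as responder).
# The concrete numbers are assumptions chosen to match the descriptions.

POT = 100  # total amount to be split

PERSONALITIES = {
    "American":   (20, 10),  # proposes an acceptable minimum, accepts a low offer
    "Hypocrite":  (20, 50),  # proposes an acceptable minimum, only accepts 50/50
    "Pushover":   (50, 10),  # proposes 50/50, accepts a low offer
    "Englishman": (50, 50),  # proposes 50/50, only accepts 50/50
}

def play(proposer: str, responder: str) -> tuple[int, int]:
    """Return the payoffs (proposer, responder) for a single round."""
    offer, _ = PERSONALITIES[proposer]
    _, minimum = PERSONALITIES[responder]
    if offer >= minimum:
        return POT - offer, offer  # offer accepted: both are paid
    return 0, 0                    # offer rejected: nobody is paid

# An Englishman rejects an American's 80/20 proposal ...
print(play("American", "Englishman"))  # -> (0, 0)
# ... while a Pushover accepts it.
print(play("American", "Pushover"))    # -> (80, 20)
```

Running all sixteen pairings shows the rational puzzle at the heart of the game: the personality that maximizes your payoff depends entirely on whom you happen to face.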


30 June 2020


What is this thing called science? Think about it. How would you define science?

What we mean by science depends on the context in which we are using the word, and we can roughly categorize science into three semantic groups:

  1. Science facts: A collection of provisional facts about the natural world that have been established through the scientific method.
  2. The scientific method: A sequence of formal steps that allows us to establish a possible cause for an observed effect.
  3. Scientific institutions: Universities, research centres, scientific organizations, journals, et cetera.

Sometimes a fourth category is added, scientific spirituality or what Einstein called cosmic religiosity. It is a feeling of awe experienced when faced with a natural phenomenon.

Let us have a look at these categories in detail.


31 March 2020


Analysis paralysis: The condition in which decision making is postponed until the information is complete and/or consistent. In the real world, information is rarely complete and/or consistent.

Belief: A phenomenon that is assumed to be true. The origin of a belief may stem from a combination of facts, misinformation, and/or disinformation.

Data: Measurements of an observable phenomenon. Singular form: datum.

Fact/A matter of fact: An observable phenomenon that can be independently verified.

Hypothesis: An idea about the cause for an effect. Where your idea comes from -- logic, insight, a dream, the Bible -- does not matter.

Opinion: A judgement based on beliefs and preferences.

Preference: A favoured choice. The origin of a preference may be known or not. Values are preferences.

Truth: A fact. A term usually avoided by scientists because of the inherently provisional nature of scientific facts. See: Fact


30 March 2020


Bacon, Francis (1597), Meditationes sacrae.

Churchill, Winston (1947), Speech in the House of Commons, 11 November 1947 (https://api.parliament.uk/historic-hansard/commons/1947/nov/11/parliament-bill; Accessed: 20 Apr 2020)

Doyle, Arthur Conan (1892), A Scandal in Bohemia.

Hájek, Alan (2017), Pascal's Wager (https://plato.stanford.edu/entries/pascal-wager/; Accessed: 12 Jun 2020).

Hardin, Garrett (1985), Filters Against Folly.

Jefferson, Thomas (1820), Letter from Thomas Jefferson to William Roscoe, 27 December 1820 (https://founders.archives.gov/documents/Jefferson/98-01-02-1712; Accessed: 20 Apr 2020).

Kahneman, Daniel (2011), Thinking, Fast and Slow.

Lewontin, Richard C. (1970), The Units of Selection. Annual Review of Ecology and Systematics 1: 1 - 18.

Plato (ca. 375 B.C.E.), The Republic: 558c.

The Royal Society (1663), The motto of The Royal Society of London for Improving Natural Knowledge (https://royalsociety.org/about-us/history/; Accessed: 20 Apr 2020).

Russell, Bertrand (1931 - 1935), Mortals and Others.

Sagan, Carl (1990), Cosmos: A Personal Voyage (1990 Update).

Sigmund, Karl, Ernst Fehr, and Martin A. Nowak (2002), The Economics of Fair Play. Scientific American (January 2002): 82 - 87.