I just read an article by Maarten Boudry, where he argues that the practice of spotting and pointing out fallacies is over-used in the skeptical community.
https://maartenboudry.blogspot.com/2017/06/the-fallacy-fork-why-its-time-to-get.html
Spotting fallacies has long been popular in skeptical movements, but personally I've never had much time for it. There are so many fallacies, often with non-descriptive names and unclear descriptions, that it seems more like a game of "fit the fallacy to the argument" than actual productive critical thinking. Boudry writes:
In popular books about skepticism and in the pages of skeptical magazines such as this one, one commonly finds a concise treatment of the most common types of fallacies. The traditional classification is widely known, often by its Latin name: ad hominem, ad ignorantiam, ad populum, begging the question, post hoc ergo propter hoc. Some of them are more obscure, such as ignoratio elenchi, affirming the consequent, secundum quid, and ad verecundiam, better known as 'argument from authority'. Most of them date back to the days of Aristotle, others are relative newbies, like the slippery slope fallacy, the genetic fallacy or – for obvious reasons – the reductio ad Hitlerum.
Such lists serve a pedagogical purpose. By learning the most common types of reasoning errors, you will avoid making them yourself, and become better at spotting them when others do. It's a kind of inoculation against irrationality. If only people would learn the list of fallacies, the world would be a far more rational place!
Except this neat little story is wrong.
There's really only one fallacy worth remembering - the one that encompasses all the others: the non sequitur.
Non sequitur is Latin for "it does not follow", but it's now an English phrase too, commonly understood to mean a statement that does not follow from what came before - often just an odd utterance by the socially inept. In logic and philosophy it more formally means a conclusion or belief that is not justified by (does not follow from) the premises.
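To make "does it follow?" concrete, here's a minimal sketch in Python (my own illustration, not anything from Boudry's article) that brute-forces truth tables: an argument form is valid only if no assignment of truth values makes every premise true while the conclusion is false. Modus ponens passes; "affirming the consequent", one of the fallacies on the list, is exposed as a non sequitur by a single counterexample.

```python
from itertools import product

def implies(p, q):
    # Material conditional: "if P then Q" is false only when P is true and Q is false.
    return (not p) or q

def valid(premises, conclusion):
    # An argument form is valid iff no truth assignment makes every
    # premise true while leaving the conclusion false.
    for p, q in product([True, False], repeat=2):
        if all(f(p, q) for f in premises) and not conclusion(p, q):
            print(f"  counterexample: P={p}, Q={q}")
            return False
    return True

# Modus ponens: P -> Q, P, therefore Q. The conclusion follows.
print(valid([implies, lambda p, q: p], lambda p, q: q))   # True

# Affirming the consequent: P -> Q, Q, therefore P. A non sequitur.
print(valid([implies, lambda p, q: q], lambda p, q: p))   # False, via P=False, Q=True
```

Every formal fallacy on the traditional lists fails this same test, which is rather the point: the test is the thing, the names are just labels for common ways of failing it.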
Take the famous ad hominem fallacy: attacking the person instead of the argument. A plain reading of this tells us we must judge arguments on their own merits, not on the character, status, or history of the person making them. Yet in the real world, if we pit the arguments of a chronic liar who has tried to hurt us against those of a wise and reliable friend, we pick the latter with great justification. It does not follow that any two arguments are equally worthy of consideration.
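There's even a simple Bayesian way to see this; the reliability numbers below are made up purely for illustration. How much an assertion should move your credence depends on how likely the source is to assert the claim when it's true versus when it's false:

```python
def posterior(prior, p_assert_if_true, p_assert_if_false):
    # Bayes' rule: probability the claim is true, given that this source asserted it.
    evidence = p_assert_if_true * prior + p_assert_if_false * (1 - prior)
    return p_assert_if_true * prior / evidence

prior = 0.5  # start agnostic about the claim itself

# A wise and reliable friend: far more likely to assert true things than false ones.
print(posterior(prior, 0.9, 0.1))  # 0.9

# A chronic liar: asserts things regardless of whether they're true.
print(posterior(prior, 0.5, 0.5))  # 0.5 -- the testimony carries no weight
```

The liar's say-so leaves you exactly where you started, so discounting it is not a reasoning error at all.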
Like most people, I strive to reach correct conclusions. I know that examples of these fallacies are real, but I don't use the classification of arguments into fallacies as a tool. I simply ask "does it follow?". The answer is often complicated and messy to resolve, but I've never felt I was missing anything by failing to check against the hundreds of possible fallacies on the list.
So Maarten Boudry's article struck a chord with me. While it might not be time to "get rid of fallacy theory" in its entirety, it may well be time to rein it in and acknowledge that it's more an interesting classification scheme for non sequiturs than an actual aid to critical thinking.