So, this crank John Gabriel exploded on the Mathematical Mathematics Memes page on Facebook recently, and he’s hilarious. Now, there are cranks in every area of science of course; most notably in physics (quantum woo), biology (creationists), geology (creationists again), history (creationists again, holocaust-deniers), philosophy (theologians 😉) and of course medicine (alternative medicine, faith healing…), but in mathematics they happen to be rather rare – or at least there are few interesting ones. Or I just haven’t found their hiding place yet.
But I suspect that’s because to be a crank you either have to flat-out lie to people (and what would be the point with math?) or
- Not know enough about the subject to realize you’re wrong, while at the same time
- think you know enough to boldly proclaim your wrongness to the public.
I imagine that’s easier with e.g. physics, where people can read popular books dumbed down for a lay audience (and I don’t mean that in a derogatory way – I love pop science!) and come away thinking that they now know all the important stuff and can start drawing their own conclusions on the subject matter (Spoiler alert: No, you can’t. If you can’t solve a Schrödinger equation, you’re simply not qualified when it comes to quantum physics, period.) But with math I can imagine it being a lot harder to both think you understand something well enough to pontificate about it while at the same time not understanding it enough to realize your pontifications make no damn sense.
John Gabriel manages to do both, and it’s fascinatingly weird. He’s the perfect embodiment of the Dunning-Kruger effect on steroids: He understands so little about modern mathematics that he doesn’t even realize how little he understands, and instead thinks he’s the only one who really gets how math works. In typical crank fashion he rails against “stupid academia”, who get so hung up on useless concepts like “reason” or “making any sense whatsoever” that they just don’t realize what a genius he is.
Or it could be that he’s just wrong and makes no fucking sense. It’s a toss-up.
John, let me recite Potholer’s Trichotomy to you:
If something in science doesn’t make sense to you, you have to conclude that either
- Research scientists are all incompetent, or
- they’re all in on a conspiracy to deceive you, or
- they know something you don’t, and you need to find out what that is.
Hint: Try option three first.
– Potholer54
Interestingly enough, I had read about Gabriel before – years ago on Good Math, Bad Math, where he ended up arguing with Mark Chu-Carroll about Cantor’s second diagonal argument. That article is from 2010, but apparently about a year ago Gabriel started a YouTube channel, presumably in the hope of bringing more people over to his more enlightened (i.e. nonsensical) side and to proclaim the fact that he invented a new calculus!
That’s right, he has reinvented calculus, and his version is much better and simpler and it’s easy to understand for anyone open enough to abandon sense and rigor, unlike all those stupid academics.
And given that I’ve just been made aware of his existence again, I figured I’d give it a go and dissect that guy’s videos, because
- it’s fun (at least to me) and
- it’s as good a reason as any to explain some of the stuff he gets wrong in some more detail, and any attempt to explain math to people is time well spent in my opinion.
So let’s start with his first video:
This is just a short video on the arithmetic mean – i.e. the “average”. This isn’t as cranky as his other stuff, but it already gives a fascinating glimpse into the way Gabriel thinks. Now, as I said, the arithmetic mean is just the average of a bunch of numbers. We all know how to compute it, we all know why it’s useful – we all remember computing or getting told the average grade in exams, for example. And there is absolutely no reason why I mention that particular example. Here’s what Gabriel’s video description says about it:
The arithmetic mean is one of the most important concepts in mathematics. While just about anyone knows how to construct an arithmetic mean, almost no one understands it.
Right… the average of a bunch of numbers is really hard to grasp. I remember struggling with it in elementary school as well… no, wait, I didn’t. Maybe that’s just because I didn’t realize how awfully complicated it in fact is, after all, almost no one understands it. But Gabriel does, of course.
To compute the arithmetic mean of a bunch of numbers, we just add them all up and divide the sum by how many numbers we had. In mathspeak:
Definition: The arithmetic mean of a finite sequence of real numbers $x_1,\dots,x_n$ is given by
$$\bar{x} = \frac{x_1 + x_2 + \dots + x_n}{n} = \frac{1}{n}\sum_{i=1}^{n} x_i.$$
We’ve all done that for grades: Add up all the grades of all the students in an exam, divide the result by how many students there are and you get the average grade in that exam. Here, by contrast, is Gabriel’s “definition” (and yes, he means definition):
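That definition is so simple it fits in a couple of lines of code. Here’s a quick sketch in Python (the grade values are made up for illustration):

```python
def arithmetic_mean(values):
    """Add up all the values and divide by how many there are."""
    return sum(values) / len(values)

# Hypothetical grades for five students in an exam (illustrative values).
grades = [1, 2, 2, 3, 2]
print(arithmetic_mean(grades))  # 2.0
```

That’s it. That’s the whole concept that “almost no one understands”.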
An arithmetic mean or arithmetic average is that value which would represent all the elements of a set, if those elements are made equal through redistribution.
…now I don’t know about you, but… is that even a sentence? What does that mean? “That value which would represent all the elements of a set“? “If those are made equal…” …well, then the set only has one element, doesn’t it? (Sets have no multiplicity – either a number is in a set or it isn’t.) OK, at least then I can guess what he means by “represent”. But “through redistribution“? What does “redistribution” mean in this context?
This is not a definition. This is at best a clumsy attempt at explaining a definition. But he actually calls this a definition, and he runs with it. So here’s a beautiful example of why definitions fucking matter.
He goes on to explain that you can compute the arithmetic mean by drawing squares. He demonstrates this with three sets of squares, the first one having one square, the second two, the third three. He moves one square from the last set to the first so that every set has two squares, thus “making them equal”, hence the arithmetic mean is two.
Now at least one can understand what his so-called “definition” was supposed to mean, but the immediate problem now is: What if the total number of squares isn’t divisible by the number of sets you have? Then his “redistribution” attempt fails, so according to his definition there is no arithmetic mean in that case. But he also shows us how to compute it using “algebra“, by which he means arithmetic (pun intended – and yeah, he can’t even get that right) – i.e. summing up and dividing the result according to the definition I stated above. But that’s not what his definition says.
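To make the divisibility problem concrete, here’s my own sketch (not Gabriel’s code) of what his “redistribution” definition actually computes – it only produces an answer when the total number of squares divides evenly among the sets:

```python
def redistribute_mean(counts):
    """Gabriel-style 'mean': move whole squares between sets until all
    sets are equal. This is only possible when the total number of
    squares is divisible by the number of sets."""
    total = sum(counts)
    if total % len(counts) != 0:
        return None  # equal redistribution of whole squares is impossible
    return total // len(counts)

print(redistribute_mean([1, 2, 3]))  # 2 -- his video example works out
print(redistribute_mean([1, 2, 4]))  # None -- 7 squares over 3 sets: stuck
```

By his own “definition”, the second bunch of numbers simply has no arithmetic mean – while ordinary arithmetic happily gives 7/3.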
See what I mean when I say this guy makes no sense? But yeah, he runs with it:
A useful arithmetic mean is one where it makes sense to redistribute the values.
Example: Three friends each need $2 to buy lunch. They decide to pool their money because one of the friends may not have enough. If the total they have is $6, then it’s evident there is enough money for all three to buy lunch.
Redistribution is accomplished by sharing the money.
A useless arithmetic mean is one where it makes no sense to redistribute the values.
Example: The arithmetic mean of student grades in a given class is a senseless calculation because students cannot share their marks.
Redistribution cannot be accomplished by sharing grades.
…yup. First, notice how no arithmetic mean appears in his first example. Anywhere. Something costs $2, three friends pool their money, they need at least $6. The conclusion I’m left to draw is that a “useful arithmetic mean” is one which isn’t even used, despite the name. Quite counter-intuitive.
However, the prime example for an average – namely the average grade in an exam, something everyone has seen hundreds of times in school – is, to him, a “useless arithmetic mean“, because students can’t share grades. How does that even make sense? And don’t think that’s just a term he’s introducing, and that he doesn’t mean the word “useless” in a literal sense. Listen to the derision in his voice when he talks about the “senseless computation“.
Of course it makes sense to compute the average grade – it gives you a good baseline to compare your own grade to, a sense of how well you did in comparison to the others without needing to know everyone’s specific result (which are confidential, after all). It gives you a sense of how difficult the exam was, or how lenient it was graded. But no, that’s all meaningless because students can’t share grades.
But also, why does this matter? Math is abstract, it doesn’t care how you apply it, what you apply it to and whether the result of that application still has any meaningful interpretation in the real world!
Yeah, this is how Gabriel works in a nutshell:
- He takes a mathematical concept with a proper definition which he either doesn’t know, like or understand (or any non-empty subset of the three),
- he visualizes or interprets it in some vague way (“making things equal through redistribution“),
- he insists that his ill-defined, vague interpretation is the actual definition (even though it’s hand-wavy, vague nonsense),
- he labels everything outside of his vague interpretation as “meaningless” and therefore void and draws absurd conclusions from his “definition”,
- he proclaims that he has found the ultimate real meaning of the mathematical concept and rails against stupid academia.
It’s glorious in its arrogance and ignorance.
(Next post on John Gabriel: Calculus 101 (Convergence and Derivatives))