(Previous post on John Gabriel: The Dunning-Krüger Effect in a Nutshell)

“Few people can ever begin to match my intelligence and depth of insight. I am not arrogant or deluded.” – John Gabriel

Yeah. *That’s an actual quote.* I’ve been made aware of Gabriel’s LinkedIn page, where he wrote hilarious posts about his new calculus and his axioms for arithmetic. And (as someone on Mathematical Mathematics Memes pointed out), it’s becoming increasingly plausible that this guy has some mild form of mental illness, or at least a personality disorder. *I mean that without a hint of irony* – the narcissism and ignorance of this guy even dwarf Donald Trump’s. Here are just some further choice quotes:

“After Euclid and before me, not a single mathematics academic, ever understood what is a number. That’s quite a big statement, but I have proved it.”

“Georg Cantor, whom I consider one of the greatest fools in mathematics and the reason so many have problems with math.”

“I loathe mainstream academia and it’s hard for me to restrain myself. My tolerance for stupid people has long ceased to exist.”

“I realised many years later, they rejected my discoveries for several reasons, but the one that stood out is the fact that they did not like me personally. Truth or proofs had little to do with the rejection. They decided to libel and defame me, rather than study my ingenious work which is worthy not of one Abel prize, but of ten Abel prizes.”

“One would think that given I am helping future generations of aspiring young mathematicians, they would be grateful and welcome this new knowledge I reveal. But no, my life has all but been destroyed by the efforts and attacks of the most vile scum in mainstream academia.”

“The NC is the first and only rigorous formulation of calculus in human history. That is an incredible accomplishment given that no one before me was able to do this – not even the so-called greats such as Archimedes, Newton or anyone else. It is no longer debatable, but proven fact.”

John Gabriel – Jesus, Aristotle, Newton and Einstein all rolled into one. Praise him.

Yeah. *Verbatim*, people, *verbatim*. And I don’t think he’s a troll either – *he’s been doing this for years*, if not decades, and he takes every piece of criticism as a personal attack. He really seems to think he is god’s gift to humanity. So, let’s continue to *take him down a notch.*

In the second video on John Gabriel’s YouTube channel, he starts ranting about how calculus (unlike his *new calculus*, which is perfect in every way!) is *wrong,* which means I might as well use this opportunity to explain why it’s not and in general *how this stuff actually works*. Unfortunately (or *fortunately*, depending on your aesthetics) that means getting into *serious math territory* – many things that Gabriel gets wrong have to do with the *fundamental definitions* of e.g. convergence, the real numbers etc. However, if we want to see *how wrong Gabriel really is,* we first need to make sure that we all agree what the “official” (i.e. *“right”*) definitions of all those concepts *are,* what *motivates* these definitions and what their *implications* are.

**Disclaimer:** I will assume that we all know and somewhat agree that *rational numbers* are, like, *a thing* – that is, numbers that can be expressed as fractions p/q of integers p and q (with q ≠ 0). The set of all rational numbers is denoted as ℚ, the set of all natural numbers – i.e. the numbers 1, 2, 3, … – as ℕ. I mention this, because I *will have to talk about what “real numbers” really are* in modern mathematics – something that Gabriel really doesn’t seem to grasp. Also: Usually I prefer 0 to be a natural number, but I specifically exclude it here, just for convenience – it allows me to e.g. define a sequence like (1/n) without needing to worry about the case n = 0.
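Rational numbers really are just pairs of integers with the familiar fraction arithmetic, and that is directly computable. As a quick sketch (using Python’s standard `fractions.Fraction` type, which models exactly this), note how rational arithmetic is *exact*, while decimal notation is only an approximation:

```python
from fractions import Fraction

# A rational number is a pair of integers p/q (q != 0);
# Fraction models that with exact arithmetic, no rounding anywhere.
a = Fraction(1, 3)
b = Fraction(2, 5)

print(a + b)     # 11/15 -- exact
print(a * 3)     # 1 -- (1/3)*3 really is 1 for rationals
print(float(a))  # only the decimal approximation loses information
```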

**Second disclaimer:** I’m not a historian. I might get some, many or all of the historical details wrong. I’m writing this pretty much off the top of my head. The same holds for all definitions, proofs etc. With respect to the historical stuff, it doesn’t even matter – after all, almost everything to do with actual mathematics has changed since then, and what’s important is the motivation behind this stuff, not the precise historical development, which is why I can’t be bothered to fact check this in detail. With respect to the actual math: It’s waaay more fun to redevelop all the concepts off the top of my head, rather than looking everything up in textbooks. So don’t believe anything, check everything for yourself and see whether it works out. I’m still, like, 90% sure that all my definitions are either standard or equivalent to standard definitions, so don’t reject everything I say out of hand either.

## The Origins of Calculus

Calculus was developed by Isaac Newton and Gottfried Leibniz. It’s not quite clear who invented it first; it’s not unlikely that they invented it independently of each other, inspired by similar problems. What we *do* know is that Leibniz *published* his calculus first and it’s *his notations* that we still use today. Newton (of course) claimed *he* invented it first, and he used it to prove that an inverse square law like the one in his theory of gravity *would in fact imply elliptical planetary orbits*. It’s an astonishing feat of intellect – this guy basically came up with a *working, mathematical theory of gravity* to explain planetary orbits, and *invented completely new mathematics just to prove that it works.*

Calculus is (to quote Wikipedia) *the mathematical study of continuous change*. Its basic objects of interest are *continuous functions* on the *real numbers* (often described as *“functions whose graph can be drawn in one stroke without lifting the pen”*) and its most important notions (besides continuity) are *derivatives* and *integrals* (basically the inverse of derivatives). Nowadays, we define the latter using *limits of sequences*, and those we define using *ε–δ-criteria*, which we have to thank Augustin-Louis Cauchy and Karl Weierstrass for.

However, in Newton’s and Leibniz’ times the “limit of a sequence” *wasn’t yet a well-defined notion*; instead, they used* infinitesimal numbers* in the development of their theories. So here’s approximately their thought process:

Assume we have some continuous function f. As an example, let’s say f(x) = x². Its graph looks like this:

Question: *What is the slope of that function at the point* x = 1? I mean, obviously the function is increasing to the right, but *how fast* is it increasing? Obviously it’s not increasing “*at the same speed*” everywhere – otherwise the graph would just be a straight line. So, how can we find out *“how fast”* the function is increasing at the specific point x = 1 – and *what does that even mean*?

Well, let’s look at *two* points instead: e.g. x = 1 and x = 2. How fast does the function grow *in the interval from 1 to 2*? Now *this* we can answer: we know f(1) = 1 and f(2) = 4. So the function has grown by f(2) − f(1) = 4 − 1 = 3. That’s an *absolute growth* of 3 in the interval of length 2 − 1 = 1.

Which means: *on average* the function grows by a factor of 3 in that interval:

(f(2) − f(1)) / (2 − 1) = 3 / 1 = 3

That’s how we measure speeds in practice: Note the time t₀ at which e.g. a car passes a fixed point s₀, the time t₁ at which it passes a second point s₁, and divide the distance by the time it took, i.e. (s₁ − s₀)/(t₁ − t₀). This will give you the *average speed* in the time period from t₀ to t₁.

But of course, it doesn’t give you *the exact slope* at the singular point x = 1. But it might give you an idea how to get there: If we decrease the distance between the two points (assuming the function doesn’t do weird stuff in between), we will be somewhat closer to the exact slope. For example, if we pick x = 1.5 instead of x = 2, then f(1.5) = 2.25 and thus the average growth is (2.25 − 1)/0.5 = 2.5.
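You can watch this shrinking-interval process numerically. A minimal sketch (assuming, as in the running example, f(x) = x² and the point x = 1): the average growth over ever-smaller intervals creeps towards the exact slope.

```python
# Average growth (difference quotient) of f over [1, 1 + d] for shrinking d,
# with f(x) = x**2 as the running example function.
def f(x):
    return x * x

for d in [1.0, 0.5, 0.1, 0.01, 0.001]:
    avg = (f(1 + d) - f(1)) / d
    print(f"interval length {d}: average growth {avg}")
# The quotients start at 3 and creep towards the exact slope 2 as d shrinks.
```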

And here’s Newton’s and Leibniz’ mental leap: If we decrease the distance between the two points to the point where it is *infinitesimally small*, then we will get the *exact slope* of f at the point x = 1 (or – the difference between the slopes will *also* be infinitesimally small)!

So, let’s assume we have some *infinitesimally small* number h (whatever that means), then the **derivative of f** (i.e. the slope of f at the point x) is given by (f(x + h) − f(x))/h.

For our function, that means:

(f(x + h) − f(x))/h = ((x + h)² − x²)/h = (2xh + h²)/h = 2x + h

…and (so the reasoning goes) since h is just an *infinitesimally small number* and hence ultimately *negligible*, we can ignore it and get 2x, and hence we finally get the exact value f′(1) = 2·1 = 2.

Obviously, there are problems with that reasoning: What the hell are those *“infinitesimal numbers”* that are suddenly introduced, that I can apparently add and multiply and divide by (I mean – I *can’t* divide by zero, but I *can* divide by something that’s *“infinitely close” to zero*?), but then in the end *I just ignore them? What’s that all about? Is this supposed to make sense?* And if h is “infinitesimally small”, shouldn’t that mean that 1/h would have to be *infinitely large*? Does *that* still make sense? *What’s going on here? Aaaaaaaah!*

Well… the thing is… it *sort-of works*. At least for relatively simple functions like the example I used, it *yields meaningful results*, regardless of how weird the reasoning used to justify the method is. But infinitesimals were never quite satisfactory, which is why Cauchy and Weierstrass tried to put the whole thing on a *more solid basis*.

Interestingly enough, this whole infinitesimal stuff was actually formally grounded in a rigorous way in the 20th century (and resurrected as “non-standard calculus”). But the way *“standard”* mathematicians interpret and think about calculus and real numbers in general is in terms of *Cauchy sequences, limits and ε–δ-criteria*, so let’s explain the modern foundation for calculus now.
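To give a tiny taste of how “ignore the h² terms” can be made respectable: so-called *dual numbers* a + b·h, with the algebraic rule h·h = 0, do exactly the bookkeeping Newton and Leibniz did informally. This is only a minimal sketch (dual numbers are not non-standard analysis proper, just a related trick, nowadays used in automatic differentiation):

```python
# Dual numbers a + b·h with the rule h*h = 0: the h² term vanishes by itself,
# so no hand-waving "and now we ignore h" step is needed.
class Dual:
    def __init__(self, real, infinitesimal=0.0):
        self.re = real
        self.inf = infinitesimal

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.re + other.re, self.inf + other.inf)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b·h)(c + d·h) = ac + (ad + bc)·h, since h·h = 0
        return Dual(self.re * other.re, self.re * other.inf + self.inf * other.re)

def f(x):
    return x * x

h = Dual(0.0, 1.0)        # "the" infinitesimal
y = f(Dual(1.0) + h)      # f(1 + h) = 1 + 2·h
print(y.re, y.inf)        # 1.0 2.0 -> value f(1) = 1, slope f'(1) = 2
```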

## Sequences, Limits and Differentiability

**Definition:** A **sequence** of rationals is simply a function a : ℕ → ℚ – i.e. a function that maps each natural number n to some rational number a(n).

Sequences are usually denoted as (a_n)_{n∈ℕ} (or in short just (a_n)) and the individual elements as a_n (instead of a(n) – i.e. we just write the function argument as an index).

So, why are sequences interesting? Consider the following two examples:

- a_n = n (i.e. 1, 2, 3, 4, …) and
- b_n = 1/n (i.e. 1, 1/2, 1/3, 1/4, …).

There’s something fundamentally different about the two: Obviously, if we increase n, the first sequence will *strictly increase* as well, while the second one *strictly decreases*. Okay, that’s not too interesting, but if we look closer, we notice that the first sequence is also *unbounded*: Pick an arbitrarily large number M – at some point the first sequence will grow larger than M (just pick any natural number n larger than M, then a_n = n > M). For the second sequence however, we can give a *lower bound*; e.g. −1. Even though (b_n) strictly decreases, it will never become smaller than −1.

But of course, we can give a “better” lower bound than −1 – namely 0. This is also a lower bound, because all the elements of (b_n) are strictly positive; hence no element will ever be ≤ 0. In fact, 0 is the *largest lower bound* (or *infimum*) of the sequence, and *the larger* a natural number n we choose, *the closer the sequence element b_n will be to 0*.

It’s consequently not completely absurd to suggest that the sequence (b_n) *approaches* 0 in such a way that we may meaningfully say that 0 is the *limit* of the sequence (b_n). In contrast, (a_n) does not seem to have such a limit – the sequence just gets larger and larger with no bound in sight (we *could* say that the limit of the sequence is *“infinity”*, but infinity is not a number per se, and infinities are – *without a careful formal treatment!* – problematic anyway). We say *the sequence (b_n) converges towards 0*, and *the sequence (a_n) diverges*. Now let’s properly define those two terms:
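The two behaviours are easy to watch computationally. A small sketch (taking the two example sequences to be a_n = n and b_n = 1/n, as above):

```python
# a_n = n is unbounded; b_n = 1/n is bounded below by 0 but creeps towards it.
def a(n):
    return n

def b(n):
    return 1 / n

# a_n exceeds any bound M you pick (just take n = M + 1):
M = 10**9
assert a(M + 1) > M

# b_n stays strictly above 0 forever...
assert all(b(n) > 0 for n in range(1, 10**4))

# ...yet gets as close to 0 as you like:
print(b(10), b(1000), b(10**6))  # 0.1 0.001 1e-06 -- closing in on the infimum 0
```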

**Definition:** Let (a_n) be a sequence of rationals. Assume there is some rational number a such that the following holds:

For any arbitrarily small rational number ε > 0 there is some index N ∈ ℕ such that for any index n ≥ N the distance |a_n − a| is smaller than ε. In logical notation:

∀ε > 0. ∃N ∈ ℕ. ∀n ≥ N. |a_n − a| < ε

Then we say the sequence **converges to** a and write a_n → a or lim_{n→∞} a_n = a.

If no such a exists, we say the sequence **diverges**.

Okay, this looks a bit complicated, so let’s explain it in more detail: We say a sequence converges to some number a, if we can get *“arbitrarily close”* to a by making the index of our sequence larger. This *“arbitrarily close”* we can express formally by thinking about it as a kind of game: You tell me *how close* to a you want to be, by giving me an (arbitrarily small) distance ε. Then I’ll give you an index N in return, such that *all subsequent elements* in the sequence are *closer to a* than your chosen distance – i.e. *for all subsequent indices* n ≥ N, we have |a_n − a| < ε. If I can *always* give you such an initial index, *no matter how small* a distance ε you choose, then I can adequately say that the sequence converges towards a.

Alright? So far, so good. Now we can use limits of sequences to define the *limit of a function f at a point x₀*. Why should we? Well, look at the function g(x) = (x² − 1)/(x − 1), for example. This function is *not well-defined* at x = 1, because then the *denominator would be 0* – i.e. g(1) *“doesn’t exist”.* But, you know, here’s what this function looks like:

In fact, the function g is equal to the function x + 1 *everywhere except at* x = 1! Annoying, but if we build a sequence (a_n) that converges to 1 (for example the sequence a_n = 1 + 1/n), then we can define g(1) as the limit of the sequence (g(a_n)) (the resulting, now everywhere-defined, function is called the *continuous extension* of g), which happens to work out nicely and give us g(1) = 2. *Problem solved!*
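You can walk along such a sequence and watch the function values settle. A sketch, assuming the example function is g(x) = (x² − 1)/(x − 1) and the approaching sequence is a_n = 1 + 1/n:

```python
from fractions import Fraction

# g(x) = (x² - 1)/(x - 1): undefined at x = 1, equal to x + 1 elsewhere.
def g(x):
    return (x * x - 1) / (x - 1)

# Walk towards 1 along a_n = 1 + 1/n and watch g(a_n):
for n in [1, 10, 100, 1000]:
    x = 1 + Fraction(1, n)
    print(f"g({x}) = {g(x)}")
# g(1 + 1/n) works out to 2 + 1/n, closing in on 2 -- so the
# continuous extension sets g(1) = 2.
```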

*…eeeexcept,* of course, that this only makes sense if the sequence (g(a_n)) *converges at all,* and – more notably – that the limit *does not depend on the specific sequence* (a_n). So instead of defining the limit of a function using sequences, we will use another ε–δ-criterion:

**Definition:** Let f be a function on rationals (i.e. f : ℚ → ℚ) and x₀ ∈ ℚ. If there is some number L such that the following holds:

For every arbitrarily small rational number ε > 0, there exists some δ > 0 such that for every x with 0 < |x − x₀| < δ we have |f(x) − L| < ε. In logical notation:

∀ε > 0. ∃δ > 0. ∀x. 0 < |x − x₀| < δ ⇒ |f(x) − L| < ε

Then we call L *the limit of f at x₀* and write lim_{x→x₀} f(x) = L.

The idea is a similar game as in the definition of convergence for sequences: You tell me any arbitrarily small distance ε to the (supposed) limit L you want to have, and in return I will give you a distance δ, such that if any x is closer to x₀ than δ, then f(x) will be closer to L than ε. If I can always give you such a δ, no matter which ε you pick, then I win and L is indeed the limit of f at x₀.
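This game, too, can be played in code. A sketch for the example above (assuming g(x) = (x² − 1)/(x − 1) with limit 2 at x₀ = 1): since g(x) = x + 1 away from 1, we have |g(x) − 2| = |x − 1|, so the winning move is simply δ = ε.

```python
# The epsilon-delta game for lim_{x -> 1} g(x) = 2.
from fractions import Fraction
import random

def g(x):
    return (x * x - 1) / (x - 1)

def delta_for(eps):
    return eps  # my move; for this particular g it really is that simple

random.seed(0)
for eps in [Fraction(1, 10), Fraction(1, 10**6)]:
    d = delta_for(eps)
    for _ in range(100):
        # sample some x with 0 < |x - x0| < delta and check |g(x) - L| < eps
        x = 1 + d * Fraction(random.randint(1, 999), 1000)
        assert 0 < abs(x - 1) < d
        assert abs(g(x) - 2) < eps
print("delta = epsilon wins the game for g at x0 = 1")
```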

Alright, and now we can finally define derivatives using function limits – the idea being that instead of picking an “infinitesimal number” h, we take the function limit of the difference quotients:

**Definition:** Let f be a function on rationals and x ∈ ℚ. If the limit

f′(x) := lim_{h→0} (f(x + h) − f(x))/h

exists, we call f **differentiable at x**. If f is differentiable at every point in ℚ, we call f **differentiable**, and the function f′ : x ↦ lim_{h→0} (f(x + h) − f(x))/h the **derivative of f**.

You’ll note that this is exactly what Newton and Leibniz did; except that we got rid of those pesky infinitesimals and only used notions that are formally and rigorously defined – there’s no room for ambiguity anymore. Furthermore, all of this works perfectly and beautifully – for example, all of the following highly desirable properties (assuming all the occurring limits exist) hold and can be easily proven using the above definitions (left as an exercise):
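As a last sanity check, the limit definition reproduces the result from the infinitesimal computation earlier. A sketch (again assuming the example f(x) = x², whose derivative should be 2x): the difference quotient is exactly 2x + h, so it settles towards 2x as h shrinks, with no infinitesimals involved – just limits.

```python
# Difference quotients of f(x) = x² approach f'(x) = 2x as h -> 0.
def diff_quotient(f, x, h):
    return (f(x + h) - f(x)) / h

f = lambda x: x * x
for x in [0.0, 1.0, 3.0]:
    for h in [0.1, 0.001, 1e-6]:
        q = diff_quotient(f, x, h)
        # q equals 2x + h exactly (up to rounding), so the error shrinks with h
        assert abs(q - 2 * x) <= h + 1e-8
print("difference quotients converge to 2x, as the limit definition demands")
```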

- The limit of a convergent sequence is unique,
- lim_{x→x₀} f(x) = L if and only if for every sequence (a_n) with a_n ≠ x₀ that converges to x₀, the sequence (f(a_n)) converges to L.

…and we didn’t even touch the real numbers yet!

(Next post on John Gabriel: Calculus 102 (Cauchy Sequences and the Real Numbers))

Hello Crank!

I see you started with Euler’s Blunder but you said nothing about it – just like you’ve pretty much said nothing about anything else that follows.

Do you or don’t you agree that Euler wrote this in his Elements of Algebra?

Oh come on now hippy! Humour me. Chuckle.

Why would I care what Euler wrote? It doesn’t matter. Euler’s writings don’t define what is or isn’t mathematics. Euler was certainly capable of being mistaken; that doesn’t invalidate any modern mathematics, let alone the formal foundations of calculus, which aren’t based on Euler anyway – they’re largely based on ideas developed by Weierstrass. And Weierstrass too probably made mistakes at some points. Which is why we base mathematics on *ideas* and not on some famous guy’s authority.

@Jazzpirate

If you knew or even understood the very mainstream mathematics you claim to defend, you would know that it is due to Euler that you still peddle nonsense like 1/3 = 0.333…

Euler knew more algebra than you or any of the fools in academia the last 200 years and his Elements of Algebra very much influenced the way ALL mainstream mathematicians think or don’t think.

The formal “foundations” of mainstream mythmatics are a joke and based on ill-formed concepts which is something one like you would not be able to grasp due to your limited intellectual capacity.

Mainstream calculus is based on a bogus formulation in more ways than one. My free eBook which is the most important mathematics book ever written debunks everyone of your delusional claims:

https://drive.google.com/file/d/1CIul68phzuOe6JZwsCuBuXUR8X-AkgEO/view

“it is due to Euler that you still peddle nonsense like 1/3 = 0.333…” – No. It is entirely due to the definition of the floating point representation of numbers. There’s no sensical definition of what number “0.333…” is even supposed to represent, that preserves basic arithmetic laws and the euclidean axiom and does NOT entail that it’s equal to 1/3.

“Euler knew more algebra than you” – No. I know more algebra than even existed at Euler’s times. Because it evolved massively over the last 100 years. Also, arithmetics is not algebra.

“based on ill-formed concepts” – They’re not ill-formed. They are sufficiently well-defined to be axiomatizable and proofs on them are entirely computer-verifiable. Fragments are even decidable. Can you claim the same about your “axioms”?

Counter question, since you disagree:

– Do you agree that “0.333…” *means* (the result of evaluating) the series \sum_{i=1}^\infty 3*10^{-i}? If not, what else would it mean?

– Do you agree that an infinite series should evaluate to the limit of the sequence of its partial sums? If not, what else would an infinite series denote?

– Do you agree that the limit of a sequence should be *the* number that can be arbitrarily closely approximated by going along the sequence (assuming such a number exists)? If not, how do you define the limit of a sequence?

“it is due to Euler that you still peddle nonsense like 1/3 = 0.333…” – No. It is entirely due to the definition of the floating point representation of numbers.

Nonsense, Floating point representation (fpr) is just another way of representing rational numbers. fpr does NOT represent any bogus “real” number.

“There’s no sensical definition of what number “0.333…” is even supposed to represent,”

You don’t even know what your own mainstream theory claims. According to your flawed mainstream theory, 0.333… IS the LIMIT.

” that preserves basic arithmetic laws and the euclidean axiom and does NOT entail that it’s equal to 1/3.”

All nonsense. There are NO axioms or postulates in Euclid’s Elements. Stupid people like you did not understand and this is why they decided to believe (axiomatize).

“Euler knew more algebra than you” – No. I know more algebra than even existed at Euler’s times.

Crank! If you did, then you wouldn’t be arguing. You don’t even know half of what Euler knew.

” Because it evolved massively over the last 100 years.”

False. Classic algebra has not advanced even ONE iota past Euler. If you are talking about “abstract algebra”, well, this is NOT algebra but something entirely different.

“Also, arithmetics is not algebra.”

So? Who said these are the same?

“based on ill-formed concepts” – They’re not ill-formed.

They are ill-formed concepts as I have proved beyond any shadow of doubt.

” They are sufficiently well-defined to be axiomatizable and proofs on them are entirely computer-verifiable.”

Look stupid, FOL(first order logic) is based on bogus axioms, There are NO axioms in geometry or sound mathematics. There is no place for “belief” or “religion” in rational thought, That nonsense belongs to your flawed mainstream theory. The word “axiom” and “postulate” appear NOWHERE in the Elements of Euclid, but you have never studied the same so you don’t have a clue.

” Fragments are even decidable. Can you claim the same about your “axioms”?”

Non-sequitur. There are no axioms in the New Calculus, only in your bogus and dysfunctional mythmatics.

“Counter question, since you disagree:

– Do you agree that “0.333…” *means* (the result of evaluating) the series \sum_{i=1}^\infty 3*10^{-i}? ”

No idiot. I have never agreed. There is NO such thing as \sum_{i=1}^\infty 3*10^{-i}. Infinity is a JUNK concept. You cannot sum an infinite series. You can only find its limit if it converges and the limit happens to be a RATIONAL NUMBER. The limit of the series 0.3+0.03+… is 1/3.

“If not, what else would it mean?”

It is nonsense that YOU preach. Perhaps you should ask yourself what it means, because to my super intelligent mind, it is clearly syphilitic thinking.

“– Do you agree that an infinite series should evaluate to the limit of the sequence of its partial sums?”

NO. The limit of a CONVERGENT series is a RATIONAL NUMBER or it is an INCOMMENSURABLE MAGNITUDE (NOT an irrational number because there is NO such thing. A number by definition describes the measure of a magnitude or size. There are NO numbers that describe the measure of pi, e, sqrt(2), etc.)

Furthermore, since there is NO such thing as an “infinite series”, your question is misdirected. The limit in any case doe NOT care if the terms are all in the series or even there at all. Moreover, it is fairly easy to prove that even if 0.3+0.03+… hypothetically could be summed, that the sum WILL NEVER be 1/3 because 1/3 has no MEASURE in base 10.

” If not, what else would an infinite series denote?”

It’s nonsense because “infinite series” is a MISNOMER. A series consists only of partial sums and possible an ellipsis at the end to denote there is no last term.

“– Do you agree that the limit of a sequence should be *the* number that can be arbitrarily closely approximated by going along the sequence (assuming such a number exists)?”

No. Because there may be NO number describing the “limit”. As for arbitrary closer, well that is just syphilitic and meaningless nonsense.

“If not, how do you define the limit of a sequence?”

It can only be defined for a CONVERGENT sequence. If it is measurable, then a RATIONAL NUMBER exists which describes it as in the case of 0.3+0.03+… If it is not measurable, then the limit exists as some quantity whose measure cannot be determined – not even by the gods!

“Floating point representation (fpr) is just another way of representing rational numbers” – and how is the representation DEFINED? What does it mean to put “…” after 0.333?

“There are NO axioms or postulates in Euclid’s Elements.” – First of all: Yes, there are. Second of all, the “Euclidean axiom” is not in Euclid’s Elements. It is a name for the axiom “For each element (of some field) a there exists a natural number n such that n>a”. This is *called* the Euclidean Axiom.

“If you are talking about “abstract algebra”, well, this is NOT algebra but something entirely different.” – *sigh*. Then you need to clarify what you mean by “algebra” if you don’t mean what everyone else means.

“They are ill-formed concepts as I have proved beyond any shadow of doubt.” – you haven’t. However, every computer implementation of the axioms proves beyond any shadow of a doubt that they are well-defined.

“Infinity is a JUNK concept.” – then what is “0.333…” supposed to mean?

“It can only be defined for a CONVERGENT sequence. If it is measurable, then a RATIONAL NUMBER exists which describes it as in the case of 0.3+0.03+… If it is not measurable, then the limit exists as some quantity whose measure cannot be determined – not even by the gods!” – this is not a definition. It is entirely incomprehensible mumbling. I asked for a definition. What does “the limit of a sequence” mean in your world?

1. No, there are NO axioms or postulates in Euclid’s Elements. Whether you like this or not, it is a FACT.

2. I don’t give a crap about what you call the “Euclidean axiom”. What I know beyond any shadow of doubt is that there are NO axioms in Euclid’s Elements. I read Greek and I am a mathematician. What are you? Chuckle.

3. “It is a name for the axiom “For each element (of some field) a there exists a natural number n such that n>a”. This is *called* the Euclidean Axiom.”

That statement is true but it is not called an axiom, never mind Euclidean axiom – anywhere in the original Elements.

“They are ill-formed concepts as I have proved beyond any shadow of doubt.” – you haven’t.

Actually, I have. You are simply not intellectually capable of understanding. Hardly surprising given the morons who taught you. Rather than argue with you, I invite others to read my free eBook:

https://drive.google.com/file/d/1CIul68phzuOe6JZwsCuBuXUR8X-AkgEO/view

“However, every computer implementation of the axioms proves beyond any shadow of a doubt that they are well-defined.”

Poppycock! You don’t even have a clue what that means, much less the fact that there are no axioms.

“Infinity is a JUNK concept.” – then what is “0.333…” supposed to mean?

You tell me idiot! I don’t subscribe to nonsense created by that Swiss moron Euler.

“It can only be defined for a CONVERGENT sequence. If it is measurable, then a RATIONAL NUMBER exists which describes it as in the case of 0.3+0.03+… If it is not measurable, then the limit exists as some quantity whose measure cannot be determined – not even by the gods!” – this is not a definition. It is entirely incomprehensible mumbling. I asked for a definition. What does “the limit of a sequence” mean in your world?

It is an explanation you idiot. This discussion is over because you are evidently not able to comprehend even the simplest concepts.