dissonant wrote: Man, you need to relax. We name theorems after people for a lot of reasons. Respect is a big part of it. But a lot of the time it is just for ease of communication. I really can't think of a snappy name for Green's Theorem for instance. But I would immediately understand what you meant if you said it.
Ease of communication? How is saying 'Newton's first law of motion' any easier than 'the law of momentum-and-force'? The latter is surely a lot more descriptive, oddly enough. 'Euclid's theorem' is a lot shorter than 'the fundamental theorem of arithmetic', and both are equally vague as names. Naming theorems after people essentially causes confusion whenever a person had more than one interesting result. And the numbering, the 1 and the 3, is quite arbitrary in Newton's case anyway.
Just saying: 'the law of momentum-and-force', 'the law of gravity', and 'the law of conservative force' are quite a lot more descriptive and prevent confusion. But then you wouldn't be idolizing your mathematical idols the way a teenage girl idolizes Laiho, would you? Be it God, Raymond van Barneveld, Newton, Knuth, or Fabio Lanzoni, people always need something to idolize. You could also ask yourself why people tend to idolize people they don't really know. Could it be that... gasp... if you do know them from up close they're not gods any more, but just ordinary people?
To say Wiles proved Fermat's Last Theorem only for the glory seems completely unfounded. I have read that he came across the theorem in his youth, it was obviously a personal obsession of his. It seems naive to say that he was only in it for the fame, although I am sure that was a part of it.
I never said it was the only part. I said it mattered to him that he was the one who proved it, not merely that it got proven in the end by someone. Your sentence seems to imply we agree here. Typical, really; one sees it everywhere. Vegetarians don't care that meat is eaten; they care that they aren't the ones eating it. I have it too: I find it far worse if, say, a friend of mine cries and I was the one who caused it. It's disgusting. That the friend cries is all that should matter, not who caused it; the damage is done, and that I was the one who caused it is irrelevant. A truly disgusting aspect of people. And in the end we all delude ourselves that we are being caring in order to heal it.
Also, Perelman is surely much more famous for rejecting his Fields Medal than he would have been for accepting it.
True, ahaha. Poor guy, though; he retired from maths after that, what with the pictures taken of him on the trains and all. Ahah, he's apparently scared to leave (his mother's) house.
Newton never talked about infinitesimals; he focused on the concept of derivative, which he called "fluxions". In the Principia, he doesn't even use these; it's all Euclidean geometry and limiting arguments. There is not a single mathematical gap in the first book of the Principia (which is the more mathematical part). Even in his other, more liberal work on infinite series, saying he "divided by zero" is a gross misrepresentation of the semi-formal arguments he presented, and anachronistic to the point of irrelevance.
Newton did effectively divide by zero to produce his derivatives, and his method often returned plainly absurd results; the approach was then simply to go down another path until the absurdity disappeared. Derivatives in the days of Leibniz and Newton were never rigorous mathematics; in fact, the concept barely existed then, since it was never clearly stated which axioms were in use. They naïvely imported, explicitly and implicitly, whatever axioms they saw around them and deemed 'common sense'. There was no definition of what the 'real numbers' really were back then, but Newton used the standard addition and frequently added a certain constant p to a given x so as to leave x unchanged. That is an additive identity by definition, and in the structure he was working in, the additive identity is the only element with no multiplicative inverse. Yet he still divided by that p plenty of times to produce his derivatives. That p is nowadays just called 0; in Newton's day people had the idea that it was a different object, often called something like 'the smallest possible thing next to zero' or 'infinitely small'. But the real numbers are dense: there is no 'smallest thing next to zero'. If there is no other element between p and 0, then p = 0.
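To make that concrete, here is the classic computation written with the same p as above (a sketch in modern notation; Newton's own notation and wording differed):

\frac{(x+p)^2 - x^2}{p} = \frac{2xp + p^2}{p} = 2x + p, \quad \text{and then set } p = 0 \text{ to obtain } 2x.

The division step requires p \neq 0, while the final step treats p as 0; that double role is exactly the complaint.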
The free encyclopaedia we all love and hate says:
'The product rule and chain rule, the notion of higher derivatives, Taylor series, and analytical functions were introduced by Isaac Newton in an idiosyncratic notation which he used to solve problems of mathematical physics. In his publications, Newton rephrased his ideas to suit the mathematical idiom of the time, replacing calculations with infinitesimals by equivalent geometrical arguments which were considered beyond reproach. He used the methods of calculus to solve the problem of planetary motion, the shape of the surface of a rotating fluid, the oblateness of the earth, the motion of a weight sliding on a cycloid, and many other problems discussed in his Principia Mathematica. In other work, he developed series expansions for functions, including fractional and irrational powers, and it was clear that he understood the principles of the Taylor series. He did not publish all these discoveries, and at this time infinitesimal methods were still considered disreputable.'
The 'naïve' concept of the infinitesimal in Newton's time equals zero. An infinitesimal cannot exist as a real number; this is quite trivial. Essentially, Newton's derivatives usually worked, but not always, and to that extent it's not mathematics but engineering.
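The triviality claimed here is just density (assuming, as above, that p is meant to be a real number):

0 < p \implies 0 < \frac{p}{2} < p,

so there is no smallest positive real; any p with nothing strictly between it and 0 must itself be 0.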
Though it is true that the claim 'Newton divided by zero' is itself as naïve as his work, since he never explicitly said that dx was a real number, or even that it was a number at all, or what it really was. He never really said what he was doing, as was the norm in those days. Russell tore it down at the foundations when he showed that things can go wrong if one is not extremely precise about what one is actually doing. It is natural to assume that dx was meant to be a real number, because it really couldn't be anything else; but there is no way to prove that it was or wasn't, since he never defined dx.
Euclid proved that there is not a finite number of primes, and his proof stands perfectly today. This is not what is called the "Prime Number Theorem".
I know; I meant that Euclid gave a 'rough' proof of the fundamental theorem of arithmetic, and the prime number theorem came quite some time later. Also, his proof doesn't stand; his conclusion stands. It was dumb luck, actually. I can point out one flaw in his proof: to begin with, it depends on the existence of a total order on the naturals, and he never proved one exists. He just naïvely assumed that given two unequal natural numbers n and m, one is larger than the other; he never proved it. It makes sense to human intuition: naïve mathematics as opposed to axiomatic. The argument itself runs as sketched below.
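For reference, the argument in question, in modern dress rather than Euclid's own wording:

Given primes p_1, \dots, p_k, let N = p_1 p_2 \cdots p_k + 1. Any prime divisor q of N cannot equal any p_i, since N leaves remainder 1 on division by each p_i; hence q is a prime not on the list, so no finite list contains all primes.

The step about remainders leans on the division algorithm, which in turn rests on the ordering of the naturals, which is the sort of tacit assumption being complained about above.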
Euler's lack of rigour runs far deeper; in fact, it's essentially what filled the whole of mathematics until Russell tore it all down. The point is that Euler was never explicit about which axioms, precisely, he was proving from. And many axioms in those days were assumed implicitly because they just 'made sense' to the common naïve-realist perception. At the time there was no hard barrier against implicitly assuming the very theorem to be proved and thus begging the question, because the axioms it was being proven from were not set in stone. The only thing that stopped people was the subjective line of 'it just didn't feel right', which is hardly mathematics, of course.
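A classic illustration of that style (a standard textbook example, not one raised in this thread): Euler's solution of the Basel problem treats \sin x as if it were a polynomial determined by its roots at n\pi,

\sin x = x \prod_{n=1}^{\infty} \left(1 - \frac{x^2}{n^2 \pi^2}\right),

and comparing the x^3 coefficient with \sin x = x - \frac{x^3}{6} + \cdots yields \sum_{n=1}^{\infty} \frac{1}{n^2} = \frac{\pi^2}{6}. The conclusion is correct, but nothing in the argument says why an infinite product should behave like a finite polynomial; it just 'felt right', and the factorization was only justified much later.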