Have we ever lost mathematics?
If you study the history of modern mathematics, one of the recurring themes is the collapse of foundations: the realisation that the assumptions underlying a topic were not as strong as might be hoped. There are three classic examples, which (with a broad brush) might be described as:
- The collapse of Analysis
The problems with infinity and infinitesimals had been known to the Greeks, who discussed their paradoxes. In the seventeenth century, however, arguments and proofs involving infinitesimals became more acceptable. A powerful system started to emerge, developed most clearly by Newton and Leibniz: the Calculus. Some still had issues (Berkeley famously mocked the “ghosts of departed quantities”), but most persisted and were greatly rewarded. The problems did eventually arise in mathematical settings, for example in Fourier’s study of heat, and it took most of the nineteenth century for Cauchy and Weierstrass to give a rigorous version based on limits.
- The collapse of Geometry
With issues arising in the basic assumptions of Analysis, mathematicians looked for firmer ground. Many felt that Geometry, and in particular the great work of Euclid’s Elements, might provide this. Unfortunately there was a wrinkle in the Elements, often known as the Parallel Postulate: a statement that seemed far too complicated to be an assumption. Many had tried to show that the Parallel Postulate could be proved from the other axioms, and failed. Some had actually glimpsed something further, notably Khayyam and Saccheri. In fact, as Bolyai and Lobachevsky would eventually show, there are perfectly good geometries that obey all the axioms apart from the Parallel Postulate. With the zoo of examples that started to be considered, geometry no longer looked like such a good foundation.
- The collapse of Arithmetic
Instead of geometry, therefore, other systems were considered; ideas from logic and Cantor’s set theory were brought into play. Hilbert hoped that a system could be created that was consistent, complete and decidable. Many took on the challenge; Russell and Whitehead created the magnificent Principia Mathematica, which famously does not prove that 1+1=2 until page 362. As the ideas started to become clearer, however, the way was left clear for a deeper issue. The results of Kurt Gödel showed that, if we want to include arithmetic, we cannot hope for a system for mathematics that is both complete and consistent. Furthermore, we cannot even prove the consistency of our systems without resorting to a more powerful one (whose own consistency cannot be proved without a more powerful system still). Some even wonder whether arithmetic is consistent!
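For concreteness, the result about consistency proofs is Gödel's second incompleteness theorem, which in one standard formulation reads:

```latex
% Gödel's second incompleteness theorem (one standard formulation):
% for any consistent, effectively axiomatised theory T that
% includes enough arithmetic,
T \nvdash \mathrm{Con}(T)
% where Con(T) is the arithmetised statement "T is consistent".
% To prove Con(T) one must move to a strictly stronger theory T',
% whose own consistency Con(T') then faces the same obstacle.
```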
As these collapses hit the ideas that fields rest on, one would expect there to be some consequences: some areas that turn out to be fallacies. Yet this does not seem to be the case. The fundamental ideas of calculus remained the same, although one had to be careful about exactly which functions one was talking about. The discovery of non-Euclidean geometries simply revealed additional worlds; all the old results held, but some now needed to note an additional assumption. Even the work on undecidability led, most obviously through Alan Turing, to the theoretical underpinnings of computers. In fact, studying the deeper issues seems to open up new areas rather than harm those that have been established.
I therefore have a question:
Have we ever lost any mathematics?
Are there mathematical areas that have simply collapsed, having been widely accepted as true, even rigorous? I would like to rule out the case where an area has been rendered unimportant by the development of different techniques; in that case the results still hold, but are no longer as interesting.
Edit: 10/5/12
Some great discussion on MathOverflow, including one serious candidate: Italian algebraic geometry.
The Parallel Postulate went through periods in which there were accepted proofs of it.
When?
We are about to lose a great deal of mathematics. Specifically, we are going to lose a lot when Mathematica and Maple disappear. These are both supported by companies. And when those companies fail, the software will disappear. They will take the algorithms with them. Can’t happen? I believe it will, based on both history and reason.
For history, look at Macsyma, which was the premier computational algebra system running on Symbolics machines. Symbolics, as a company, failed. The source code for Macsyma is unavailable. Bill Schelter recovered an early version, pre-Symbolics, which is now called Maxima. But the company’s extensions are lost.
Axiom nearly disappeared. Derive is gone, buried somewhere inside Texas Instruments. Magnus is moribund.
But, you’ll say, why would Wolfram Research (Mathematica) disappear? Well, how many companies can you name that are 50 years old? 75? 100? Not many. And when a company like Wolfram Research goes out of business, the software is considered a major asset. They can’t just give it away. But it would take a company to maintain it… and that company has disappeared.
Mathematics lasts more than 100 years. But a lot of the best computational maths algorithms are proprietary and will be lost. In fact, they are already “lost” in some sense because the original author(s) never documented the code sufficiently for the algorithm to be recreated.
Ideally NIST would have fully documented computational mathematical algorithms in the Digital Library of Mathematical Functions (DLMF) (http://dlmf.nist.gov) but, so far, nobody has even tried. I believe this is vital to the future development of computational mathematics and that we need to begin now, while the original authors are still alive.
We should also make an effort to keep the original source code for systems like Derive alive. These are the “Newton’s Notebooks” of computational mathematics.
Tim Daly
daly@axiom-developer.org
On a pedantic point, I can name quite a few companies that are more than 100 years old, e.g. Shell, Ford and, if you want tech, IBM. This does not, however, undermine your point, which is a really important one, if a little different from my question. I think there is an additional issue here of compatibility: we have always been able to mix mathematics from the whole spectrum of the subject. How can we ensure that this is still possible with different computational implementations?
This is a great question, Edmund!
The appropriate standard for rigor in the math classroom is always a contentious point. I find it almost ridiculous to think that axiomatic approaches or two-column proofs are the only legitimate approaches. I think you point to a simple fact: mathematicians have been wrong (and young ones are wrong quite often), but the ideas of mathematical fields have remained more or less constant through the process of refounding.
I’m finishing complex Analysis, my last class in a long sequence of Calc 1, 2, 3, then Analysis, then complex, then more analysis, then more complex, then more analysis, and more complex. The ideas stay the same every year, but the proofs get longer. Bleh.
An important aspect of a proof is to help understand the ideas, and you should be getting something more from each layer of the onion. On the other hand, the lack of any serious loss of mathematics does suggest that mathematical usage and practice is good at removing things that are not correct.