In mathematics you don’t understand things. You just get used to them. (John von Neumann)
I heard this quote some time ago, but Csaba Szepesvari mentioned it again in his class a few days ago.
This quote is very interesting. Do I believe it? Let me explain:
One may compare mathematics with a natural language. Both have similar constructions: grammar, vocabulary, phrases, and so on.
When I try to prove something in mathematics, I manipulate assumptions and previously established theorems in order to convey (prove) a claim.
What about when I am speaking or writing in a natural language? Is it anything other than manipulating words in order to persuade others that something is true?
One may ask oneself what the differences between these two languages are. Does our brain handle the two differently?
I am not sure, but I would guess they are similar to some extent. The brain probably manipulates abstract objects in the same way (the grammar is the same).
However, the main difference may be that in natural languages we usually have an external object that words refer to and that defines their meaning, while in mathematics this is sometimes not the case, especially in those fields where a mathematician cannot find a helpful physical correspondence.
For instance, when I think about derivatives, I can relate them to the speed of an object, or I can imagine a curve in a figure and relate the derivative to its slope. Or when I think about partial differential equations with boundary conditions, I can relate them to an actual physical phenomenon, like water in a kettle heated from below.
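To make the derivative-as-speed picture concrete, here is a small sketch of my own (the falling-object setup and function names are just an illustration, not something from the quote): the derivative of a position function is approximated by the slope of a secant line, and for a falling object that slope is exactly its speed.

```python
# Illustration: the derivative as "speed".
# Position of an object in free fall (g = 9.8 m/s^2, no air resistance).
def position(t):
    return 0.5 * 9.8 * t ** 2

# Finite-difference approximation: the slope of the secant line
# through (t, f(t)) and (t + h, f(t + h)).
def finite_difference(f, t, h=1e-6):
    return (f(t + h) - f(t)) / h

t = 2.0
approx_speed = finite_difference(position, t)  # approximates 9.8 * t
exact_speed = 9.8 * t
print(approx_speed, exact_speed)
```

As h shrinks, the secant slope approaches the tangent slope, which is the instantaneous speed; this is exactly the geometric picture of relating a derivative to the slope of a curve.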
This may mean that I am actually using natural-language tools to understand mathematical phenomena. If that is true, I can say that we get used to mathematics in almost the same way we get used to a natural language, especially when we can find a corresponding real-world object. Well, maybe not!
By the way, you may like to see another quotation from von Neumann:
You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage. (Suggesting to Claude Shannon a name for his new uncertainty function.)