The development of formal systems to leverage human invention and insight has been a painful, centuries-long process. (…) In the twelfth century, the Hindu mathematician Bhaskara said, “The root of the root of the quotient of the greater irrational divided by the lesser one being increased by one; the sum being squared and multiplied by the smaller irrational quantity is the sum of the two surd roots.” This we would now express in the form of an equation, using the much more systematically manageable set of formal symbols shown below. This equation by itself looks no less opaque than Bhaskara’s description, but the notation immediately connects it to a large system of such equations in ways that make it easy to manipulate.

— Gilles Fauconnier and Mark Turner

I don’t hate math per se; I hate its current representations. Have you ever tried multiplying Roman numerals? It’s incredibly, ridiculously difficult. That’s why, before the 14th century, everyone thought that multiplication was an incredibly difficult concept, and only for the mathematical elite. Then Arabic numerals came along, with their nice place values, and we discovered that even seven-year-olds can handle multiplication just fine. There was nothing difficult about the concept of multiplication—the problem was that numbers, at the time, had a bad user interface.

— Bret Victor

Taken from Notes on Notation and Thought