The letter x, now the universal shorthand for the unknown, was not always the standard. In the 17th century, the French mathematician René Descartes reshaped algebraic notation by assigning specific letters to specific roles. Before Descartes, his countryman François Viète had used vowels for unknowns and consonants for known values, a system that was logical but cumbersome. Descartes replaced this convention, reserving the letters at the end of the alphabet, x, y, and z, for unknowns, and those at the beginning, a, b, and c, for known constants. The decision was so influential that it persists in classrooms and laboratories today, turning a simple letter into a global symbol for uncertainty and potential. Why x in particular is less certain: one popular, though disputed, account traces it to the Arabic shay, meaning "thing," the standard term for the unknown in medieval algebra, which early Spanish translators are said to have rendered with the letter x. Whatever its origin, the notation Descartes fixed in La Géométrie in 1637 helped transform algebra from a rhetorical discipline into a symbolic one, allowing mathematicians to manipulate abstract concepts as if they were concrete numbers.
Ancient Rhetoric and Geometry
Long before the invention of symbols, ancient mathematicians solved problems using words and geometry rather than equations. The Moscow Mathematical Papyrus, dating back to approximately 1850 BC, contains problems known as aha problems, aha being the Egyptian term for the unknown quantity sought, stated entirely in text. A typical problem asks the solver to find a quantity that, taken one and a half times and added to four, equals ten, all without a single symbol (the problem is restated in modern notation below).

In ancient Greece, Euclid took this further by describing algebraic relationships through geometric figures. In his Elements, written around 300 BC, Euclid presented the distributive property not as an equation but as a statement about rectangles and lines: if one of two straight lines is cut into any number of segments, the rectangle contained by the two lines is equal to the sum of the rectangles contained by the uncut line and each segment; in modern notation, a(b + c + d) = ab + ac + ad. This geometric algebra was so dominant that for centuries the concept of a variable was tied to physical shapes rather than abstract numbers. Even Diophantus of Alexandria, writing in the third century AD, used a form of syncopated algebra that introduced abbreviations for powers and the unknown, yet still lacked the modern symbols for equality and inequality, forcing readers to interpret the meaning through context and convention.
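Restated in modern symbols, which the Egyptian scribe of course did not have, the aha problem above is a one-step linear equation:

\[
\tfrac{3}{2}x + 4 = 10 \quad\Longrightarrow\quad \tfrac{3}{2}x = 6 \quad\Longrightarrow\quad x = 4.
\]

The scribe reached the same answer rhetorically, by the method of false position: guess a convenient value, observe how far the result falls short of ten, and scale the guess accordingly.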
The Calculus Revolution

The concept of a variable underwent a radical transformation in the 17th century with the birth of calculus. Isaac Newton and Gottfried Wilhelm Leibniz independently developed this new branch of mathematics to study how quantities change over time. Newton introduced the idea of a fluent, a quantity that varies continuously, and its corresponding fluxion, its rate of change, while Leibniz worked with infinitesimal differences, or differentials. In both frameworks, variables were no longer static placeholders but dynamic entities that flowed and shifted. Newton described his variables as quantities generated by motion, a concept that was intuitive but lacked a rigorous definition. It was not until the 19th century that the mathematician Karl Weierstrass formalized the notion of a limit, replacing the vague idea of a variable tending toward a value with a precise logical structure. Weierstrass's epsilon-delta definition removed the need for variables to actually vary or move, treating them instead as static symbols for which any element of a given set may be substituted. This shift from a dynamic to a static view of variables laid the foundation for modern analysis, allowing mathematicians to prove theorems about functions without relying on the physical intuition of motion or change.
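In modern notation, and in its standard textbook form rather than Weierstrass's original wording, the epsilon-delta definition of a limit reads:

\[
\lim_{x \to a} f(x) = L \iff \forall \varepsilon > 0 \;\exists \delta > 0 : 0 < |x - a| < \delta \implies |f(x) - L| < \varepsilon.
\]

Nothing in this statement moves: x does not travel toward a. The definition merely quantifies over fixed numbers, which is exactly the static view of variables described above.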