In 1557, a Welsh mathematician named Robert Recorde published a book titled The Whetstone of Witte, introducing a symbol that would eventually become the most recognized mark in all of mathematics. Before this publication, writers struggled to express equality with phrases such as "is equal to" or "is as great as", a cumbersome practice that slowed the advancement of algebra. Recorde, seeking a more efficient way to write, decided that nothing could be more equal than two parallel lines of the same length. He drew these lines, creating the equals sign, and placed it between expressions to show that they were balanced. This simple invention transformed the way humans thought about balance and relationship in numbers, turning abstract concepts into visual statements that could be manipulated with precision. The symbol appeared on the third page of his chapter on the rule of equation, commonly called Algebrers Rule, marking the birth of modern algebraic notation.
The Scale of Unknowns
An equation functions much like a weighing scale on which weights are placed in two pans to achieve balance. When equal quantities of grain are placed in the two pans, the scale stays level, and the weights are said to be equal. If some grain is removed from one pan, an equal amount must be removed from the other to keep the scale in balance. The same principle governs all equations, whether they involve simple numbers or complex variables. The expressions on the two sides of the equals sign are called the left-hand side and the right-hand side, and very often the right-hand side is taken to be zero. This does not reduce the generality of the equation, since subtracting the right-hand side from both sides puts any equation into that form; for instance, x + 3 = 7 becomes x - 4 = 0. The process of finding the solutions, or of expressing the unknowns in terms of the parameters, is called solving the equation. Such expressions of the unknowns in terms of the parameters are themselves called solutions, as are the values of the unknowns that satisfy the equality.

The Battle of Five Degrees
For centuries, mathematicians sought a general formula to solve polynomial equations of any degree, but the quest hit a wall at the fifth degree. Equations of degree one, two, three, or four can be solved algebraically, using a finite number of operations on the coefficients alone. Equations of degree five or higher, however, cannot always be solved this way, a fact established by the Abel-Ruffini theorem. This discovery changed the landscape of algebra, forcing mathematicians to develop methods for efficiently computing accurate approximations of the real or complex solutions of univariate algebraic equations. The theorem shows that although solutions exist, they cannot always be expressed in radicals, that is, by a finite combination of elementary algebraic operations and root extractions. This limitation led to the development of numerical methods and the study of polynomial root-finding, which remains an active area of research today. The inability to solve general quintic equations in radicals was a pivotal moment that separated the solvable from the unsolvable in the history of mathematics.
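The two ideas above, rewriting an equation so that its right-hand side is zero and then approximating a root numerically when no radical formula exists, can be sketched together. The quintic x^5 = x + 1 is a standard example of a degree-five equation whose real root cannot be written in radicals; the function names, starting point, and tolerance below are illustrative choices, not taken from the text.

```python
# Newton's method sketch for a quintic with no solution in radicals.
# First, move everything to the left-hand side: x^5 = x + 1 becomes
# f(x) = x^5 - x - 1 = 0, the "right-hand side zero" form.

def f(x):
    return x**5 - x - 1

def f_prime(x):
    # Derivative of f, needed for the Newton step.
    return 5 * x**4 - 1

def newton(x, tol=1e-12, max_iter=100):
    # Repeatedly replace x by x - f(x)/f'(x) until the step is tiny.
    for _ in range(max_iter):
        step = f(x) / f_prime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

root = newton(1.0)
print(root)  # close to 1.1673, the unique real root of x^5 - x - 1
```

This is exactly the trade the Abel-Ruffini theorem forces: the root exists and can be approximated to any desired accuracy, but only through iteration, not through a closed formula in the coefficients.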