Natural number
The Ishango bone, discovered in the Democratic Republic of the Congo and dating back approximately 20,000 years, bears notched marks that suggest early humans were already performing arithmetic with natural numbers long before written language existed. This artifact, now housed in the Royal Belgian Institute of Natural Sciences, serves as the earliest known physical evidence of arithmetic with natural numbers, a hint of just how ancient the practice of counting is. Before the invention of numerals, our ancestors used their fingers to represent quantities, the most primitive method of counting and still the most intuitive one. They would also put down a tally mark for each object, which allowed them to test for equality, excess, or shortage by striking out a mark and removing an object from the set. This primitive abstraction was the first step in transforming the physical world into a system of abstract relationships, turning the tangible act of holding up fingers into a conceptual understanding of quantity.
The Birth of Zero
For millennia, the number zero was a ghost in the machine of mathematics, existing as a concept of absence but lacking a symbol of its own. The Babylonians used a placeholder digit for zero as early as 700 BCE, yet they omitted it when it would have been the last symbol in a number, treating it as a positional convenience rather than a true number. It was not until the 7th century CE that the Indian mathematician Brahmagupta formally introduced the numeral 0, allowing it to function as a distinct number with its own arithmetic properties. Even earlier, the Olmec and Maya civilizations had developed 0 as a separate number, as early as the 4th century CE, but their usage remained confined to Mesoamerica and never spread to the rest of the world. In medieval Europe, the calculation of the date of Easter, known as the computus, used 0 as a number beginning with Dionysius Exiguus in 525 CE, though the value was never written with a numeral. Standard Roman numerals, which dominated Western notation for centuries, had no symbol for 0, instead employing the Latin word nulla, meaning none, to denote a zero value. The struggle to define zero as a number rather than a placeholder was a pivotal moment in the history of mathematics, transforming it from a mere absence into a fundamental building block of the natural number system.
The Philosophers of Number
The first systematic study of numbers as abstractions is usually credited to the Greek philosophers Pythagoras and Archimedes, who began to treat numbers not merely as tools for counting but as objects of philosophical inquiry. Some Greek mathematicians treated the number 1 differently from larger numbers, sometimes not as a number at all, creating a hierarchy in which the unit was distinct from the multitude. Euclid, for example, first defined a unit and then a number as a multitude of units; by this definition a unit is not a number, and there are no unique numbers, since any two of the indefinitely many units form a 2. In his definition of a perfect number, however, which comes shortly afterward, Euclid treats 1 as a number like any other: although definition VII.3 defines a part as a number, the perfect-number definition counts 1 among the parts, so that, for example, 6 = 1 + 2 + 3 is a perfect number. This tension between the unit and the multitude persisted for centuries, influencing how mathematicians approached the very nature of existence and quantity. The debate over whether 1 was a number or merely the building block of numbers shaped the trajectory of mathematical thought, leading to the rigorous definitions that would emerge in the 19th century.
The Axioms of Arithmetic
In 1889, Giuseppe Peano published a simplified version of Richard Dedekind's axioms in his book The Principles of Arithmetic Presented by a New Method, establishing a framework that would become the bedrock of modern arithmetic. This approach, now called Peano arithmetic, is based on an axiomatization of the properties of ordinal numbers: each natural number has a successor, and every non-zero natural number has a unique predecessor. The five Peano axioms state that 0 is a natural number; that every natural number has a successor which is also a natural number; that 0 is not the successor of any natural number; that if the successors of two natural numbers are equal, then the numbers themselves are equal; and, finally, the axiom of induction, which ensures that if a statement is true of 0, and its truth for a number implies its truth for that number's successor, then it is true of every natural number. These axioms do not explicitly define what the natural numbers are; rather, they comprise a list of statements that must hold of the natural numbers, however they are defined. This formalization allowed mathematicians to move beyond intuition and construct a rigorous logical system that could support the entire hierarchy of number systems, from the integers to the complex numbers.
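To make the role of the successor concrete, here is a minimal sketch in Python; the names Nat, Zero, Succ, add, and to_int are illustrative and not drawn from any standard library. Numbers are built by repeatedly applying a successor to zero, and addition is defined by recursion on that structure, echoing the way the induction axiom propagates a truth from 0 through every successor.

```python
from dataclasses import dataclass

class Nat:
    """A Peano-style natural number: either zero or the successor of one."""
    pass

@dataclass(frozen=True)
class Zero(Nat):
    pass

@dataclass(frozen=True)
class Succ(Nat):
    pred: Nat   # every non-zero number has a unique predecessor

def add(a: Nat, b: Nat) -> Nat:
    """Addition by recursion on the successor structure:
    a + 0 = a and a + S(b) = S(a + b)."""
    if isinstance(b, Zero):
        return a
    return Succ(add(a, b.pred))

def to_int(n: Nat) -> int:
    """Count the successor steps separating n from zero."""
    steps = 0
    while isinstance(n, Succ):
        steps += 1
        n = n.pred
    return steps

two = Succ(Succ(Zero()))
three = Succ(two)
print(to_int(add(two, three)))   # 5
```

Nothing in the sketch relies on built-in arithmetic except the final conversion for printing; the addition itself uses only the successor, which is the point of the axioms.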
The Set of Nothing
In the realm of set theory, the natural number 0 is defined as the empty set, a collection containing no elements, while the number 1 is defined as the set containing the empty set, and 2 as the set containing 0 and 1. This construction, proposed by John von Neumann, defines each natural number n as a set containing exactly n elements, yielding an iterative definition that satisfies the Peano axioms. In this definition each natural number is equal to the set of all natural numbers less than it, which formalizes the operation of counting the elements of a set through the existence of a bijection: the sentence "a set A has n elements" can be formally defined as "there exists a bijection from A to n", and n is less than or equal to m if and only if n is a subset of m. This order is a well-order, meaning every non-empty set of natural numbers has a least element, a property that distinguishes the natural numbers from other number systems. The rank of the natural numbers among well-ordered sets is expressed by an ordinal number, denoted omega, which represents their order type. This set-theoretic construction provides a rigorous foundation for the natural numbers, grounding them in the fundamental concepts of logic and set theory.
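The construction can be mirrored directly in code. The following sketch, plain Python with illustrative function names, builds each natural number as a frozenset of all smaller numbers, so that the number n literally contains n elements and the ordering coincides with the subset relation.

```python
def zero() -> frozenset:
    """0 is the empty set."""
    return frozenset()

def succ(n: frozenset) -> frozenset:
    """The successor of n is n together with {n}: S(n) = n ∪ {n}."""
    return n | frozenset({n})

def von_neumann(k: int) -> frozenset:
    """Build the von Neumann set representing the ordinary integer k."""
    n = zero()
    for _ in range(k):
        n = succ(n)
    return n

two, three = von_neumann(2), von_neumann(3)

assert len(two) == 2 and len(three) == 3    # n has exactly n elements
assert two <= three and not (three <= two)  # n <= m iff n is a subset of m
assert two in three                         # n < m iff n is an element of m
print("von Neumann checks passed")
```

Frozensets are used because set elements in Python must be hashable, which lets each constructed number serve as an element of the next one.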
The Limits of Division
While addition and multiplication are closed operations on the natural numbers, meaning the result of adding or multiplying two natural numbers is always a natural number, subtraction and division often lead outside the set. Subtracting a larger natural number from a smaller one results in a negative number, and dividing one natural number by another commonly leaves a remainder. The procedure of division with remainder, or Euclidean division, is available as a substitute: for any two natural numbers a and b with b not equal to 0, there exist natural numbers q and r such that a equals bq plus r, with r less than b. The number q is called the quotient and r the remainder of the division of a by b, and these numbers are uniquely determined by a and b. Euclidean division is key to several other properties, such as divisibility, and to algorithms like the Euclidean algorithm, which form the basis of number theory. The lack of additive inverses, which is equivalent to the fact that the natural numbers are not closed under subtraction, means that they form a commutative semiring rather than a ring. This limitation defines the boundaries of the natural number system, separating it from the integers and rational numbers, and highlighting the unique properties that make it the foundation of counting and ordering.
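As an illustration of how Euclidean division stays entirely inside the natural numbers, the following sketch, again in Python with illustrative names, computes the quotient by repeated subtraction and then reuses that step to obtain the Euclidean algorithm for greatest common divisors.

```python
def euclidean_division(a: int, b: int) -> tuple[int, int]:
    """Return (q, r) with a == b*q + r and 0 <= r < b, using only
    comparison and subtraction of natural numbers."""
    if b == 0:
        raise ValueError("b must be a non-zero natural number")
    q, r = 0, a
    while r >= b:        # subtract b as many times as possible
        r -= b
        q += 1
    return q, r

def gcd(a: int, b: int) -> int:
    """Euclidean algorithm: repeatedly replace (a, b) by (b, a mod b)."""
    while b != 0:
        _, r = euclidean_division(a, b)
        a, b = b, r
    return a

q, r = euclidean_division(17, 5)
print(q, r)           # 3 2, since 17 = 5*3 + 2
print(gcd(48, 18))    # 6
```

Repeated subtraction is chosen here because it uses only operations available on natural numbers; in practice Python's built-in divmod returns the same quotient and remainder directly.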
The Hierarchy of Infinity
Georg Cantor discovered at the end of the 19th century that the concepts of cardinal and ordinal numbers, which apply to finite sets, can be generalized to infinite sets, leading to two different concepts of infinite number. The most common number systems used throughout mathematics are extensions of the natural numbers, in the sense that each of them contains a subset with the same arithmetical structure as the natural numbers. If the difference of every two natural numbers is considered to be a number, the result is the integers, which include zero and the negative numbers. If the quotient of every two integers, the divisor being non-zero, is considered to be a number, the result is the rational numbers, including fractions. If every infinite decimal is considered to be a number, the result is the real numbers, and if every polynomial equation with real coefficients is considered to have a solution, the result is the complex numbers. This hierarchy, built up by purely set-theoretic means from a few simple assumptions about the natural numbers, demonstrates that they are the foundation upon which the entire structure of mathematics is constructed. As the mathematician Paul Halmos noted, numbers make up the foundation of mathematics, and the natural numbers are the first step in this grand construction, providing the basis for all subsequent number systems.
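The first step of this hierarchy, passing from natural numbers to integers, can be sketched concretely: an integer can be represented by a pair of natural numbers (a, b) standing for the difference a minus b, with two pairs (a, b) and (c, d) naming the same integer exactly when a + d = b + c. The class below is an illustrative Python sketch of that idea, not a construction from any standard library.

```python
class Int:
    """An integer represented by a pair of natural numbers (a, b), read as a - b."""

    def __init__(self, a: int, b: int):
        self.a, self.b = a, b   # both components are natural numbers

    def __eq__(self, other: "Int") -> bool:
        # (a, b) and (c, d) name the same integer iff a + d = b + c,
        # a condition expressed entirely in natural-number arithmetic.
        return self.a + other.b == self.b + other.a

    def __add__(self, other: "Int") -> "Int":
        # (a - b) + (c - d) = (a + c) - (b + d), again using only naturals.
        return Int(self.a + other.a, self.b + other.b)

minus_two = Int(1, 3)                   # stands for 1 - 3 = -2
five = Int(5, 0)                        # stands for 5
print(minus_two + five == Int(3, 0))    # True: (-2) + 5 = 3
```

The same pattern, pairs of existing numbers together with an equivalence, is how the rationals are built from the integers, which is what it means for each system in the hierarchy to contain the previous one.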