In the early 7th century, a legal code in England dictated that a wound one inch deep warranted a fine of one shilling, establishing the inch not as a geometric abstraction but as a tangible measure of human injury and value. This early reference, preserved in the Textus Roffensis manuscript compiled around 1120, reveals that the inch was once a variable unit rooted in the physical world rather than a fixed constant. The definition of the inch shifted dramatically over the centuries, evolving from the width of a man's thumb to the length of three specific grains of barley. Before the 1950s, the inch was a source of international confusion: the United States and the United Kingdom measured the same unit to slightly different lengths, a discrepancy of less than one ten-thousandth of an inch that nonetheless caused significant errors in large-scale engineering and surveying projects. The modern inch, now defined as exactly 25.4 millimeters, is the result of a complex history involving kings, mathematicians, and the desperate need for global industrial standardization.
Barleycorns and Royal Decrees
For centuries, the legal definition of the inch relied on the humble barleycorn, the grain of barley that served as the fundamental unit of English Long Measure. A statute of 1324 under King Edward II of England legally defined the inch as three grains of barley, dry and round, placed end to end lengthwise. Similar definitions appear far earlier in both English and Welsh medieval law tracts, including the Laws of Hywel Dda from the 10th century, and the barleycorn remained the nominal basis of the inch for hundreds of years. Scotland took a different approach: the Assize of Weights and Measures attributed to King David I, around 1150, defined the Scottish inch as the width of an average man's thumb at the base of the nail, the average taken across the thumbs of a small man, a medium man, and a large man. As late as 1814, the mathematics teacher Charles Butler recorded the old legal definition as three grains of sound ripe barley taken from the middle of the ear, well dried, and laid end to end. Butler noted, however, that because the length of a barleycorn could not be fixed, the inch derived from this method remained uncertain. By 1843, the legal definition had shifted to a physical standard kept in the Exchequer chamber at Guildhall, rendering the barleycorn obsolete as a primary measure.

The War of Two Inches
The 19th and 20th centuries witnessed a silent war between the United States and the United Kingdom over the exact length of the inch, with two slightly different standards coexisting for decades. In 1866, the United States adopted a conversion factor of exactly 39.37 inches to the meter, while the United Kingdom defined the inch in terms of the Imperial Standard Yard. By 1893, the US inch was effectively 25.4000508 millimeters, whereas the UK inch was 25.399977 millimeters, a difference of less than one ten-thousandth of an inch. The discrepancy became critical when Carl Edvard Johansson, a Swedish engineer, began manufacturing gauge blocks in 1912. Johansson split the difference, making his blocks to a nominal 25.4 millimeters, accurate to within a few parts per million of both official definitions. His blocks became the de facto standard for manufacturers internationally, forcing other producers to follow his definition. By 1935, industry in 16 countries had adopted this "industrial inch," effectively endorsing Johansson's pragmatic compromise and bridging the gap between the American and British standards before any official international agreement.
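The arithmetic behind those figures is easy to check. The short Python sketch below assumes nothing beyond the numbers quoted above (the American 39.37 inches to the meter, the British figure of 25.399977 mm, and the modern 25.4 mm inch fixed by international agreement in 1959); it reproduces the US value and shows the size of the transatlantic gap.

```python
# Reproduce the "war of two inches" arithmetic from the conversion
# factors quoted in the text above.

US_INCH_MM = 1000 / 39.37    # US definition (1866/1893): 1 m = 39.37 in exactly
UK_INCH_MM = 25.399977       # UK inch via the Imperial Standard Yard, as quoted
INTL_INCH_MM = 25.4          # modern international inch, exact since 1959

print(f"US inch:       {US_INCH_MM:.7f} mm")   # ~25.4000508 mm
print(f"UK inch:       {UK_INCH_MM:.7f} mm")
print(f"US - UK gap:   {US_INCH_MM - UK_INCH_MM:.7f} mm")

# Express the gap as a fraction of an inch: a few millionths,
# comfortably under the "one ten-thousandth" mentioned above.
gap_in_inches = (US_INCH_MM - UK_INCH_MM) / INTL_INCH_MM
print(f"Gap in inches: {gap_in_inches:.2e}")
```

Run as written, the sketch puts the gap at roughly three millionths of an inch, and shows Johansson's 25.4 mm compromise sitting within a few parts per million of both national definitions.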