In 1801, English chemist Charles Hatchett discovered a new element within a mineral sample sent from Connecticut, which he named columbium after the poetic name for the United States, Columbia. This discovery marked the beginning of a century-long scientific identity crisis, as the element would be mistaken for tantalum, renamed, and debated by chemists across Europe and America. Hatchett's initial finding was not pure niobium but a mixture of the new element with tantalum, leading to decades of confusion. The element's true identity remained elusive until 1846, when German chemist Heinrich Rose proved that tantalum ores contained a second element, which he named niobium after Niobe, the daughter of Tantalus, reflecting the close relationship between the two elements. Despite the chronological precedence of the name columbium, the International Union of Pure and Applied Chemistry officially adopted niobium in 1949, though the name columbium persists in American metallurgy to this day.
A Century Of Confusion
The scientific community struggled to distinguish columbium from tantalum for more than half a century: in 1809, English chemist William Hyde Wollaston incorrectly concluded that the two elements were identical, despite significant differences in the densities of their oxides. Wollaston's conclusion was disputed in 1846 by Heinrich Rose, who identified two distinct elements in tantalite samples and named them niobium and pelopium, though pelopium was later shown to be either niobium itself or a mixture of niobium and tantalum. The confusion persisted until 1864, when Christian Wilhelm Blomstrand, Henri Étienne Sainte-Claire Deville, and Louis J. Troost demonstrated unequivocally that there were only two elements, not three or more. Blomstrand was also the first to prepare the metal, in 1864, by reducing niobium chloride in an atmosphere of hydrogen, while Jean Charles Galissard de Marignac produced tantalum-free niobium on a larger scale in 1866. Articles claiming the existence of a third element, ilmenium, continued to appear until 1871, illustrating the depth of the scientific misunderstanding that surrounded niobium's identity.

The Steel That Built The Modern World
Although niobium was first used commercially in the early 20th century in incandescent lamp filaments, it was soon replaced in that role by tungsten, which has a higher melting point. The true revolution came in the 1920s, when scientists discovered that niobium improves the strength of steel, a property that remains its predominant use today. Of the estimated 130,000 tonnes of niobium mined in 2006, roughly 90% went into high-grade structural steel, where it forms niobium carbide and niobium nitride that refine grain structure, retard recrystallization, and enhance toughness, strength, formability, and weldability. These microalloyed, high-strength low-alloy steels, containing less than 0.1% niobium, are widely used in modern automobiles and pipeline construction. In some highly wear-resistant alloys, such as Crucible CPM S110V stainless steel, niobium content can reach 3%, demonstrating the element's versatility in enhancing material properties.