In 1817, a German chemist named Friedrich Stromeyer was examining a sample of zinc carbonate that turned a strange yellow color when heated. Pure zinc carbonate should yield white zinc oxide on roasting, but this impure sample shifted hues, hinting at a hidden presence within the mineral. Stromeyer was not looking for a new element; he was investigating why the zinc carbonate sold by a specific supplier was behaving differently than expected. His persistence in heating and roasting the material led to the isolation of a new metal, which he named cadmium after cadmia, the Latin name for calamine, the zinc ore in which it was found. At the same time, Karl Samuel Leberecht Hermann was studying the same discoloration in zinc oxide and at first suspected the culprit was arsenic, because the impurity formed a yellow precipitate with hydrogen sulfide. It was a race of discovery in the laboratories of Germany, where two men independently identified the same impurity that would eventually become one of the most toxic and useful metals in history. For roughly a century after its discovery, Germany remained the only significant producer of the metal, while elsewhere the element continued to pass unnoticed as an impurity in zinc supplies.
The Color of Danger
The true potential of cadmium as a pigment was recognized in the 1840s, yet early scarcity limited its use in the art world. Once it became available, cadmium sulfide and cadmium selenide gave painters the most brilliant and durable yellows, oranges, and reds ever created. These colors were so intense that manufacturers had to tone them down significantly before grinding them into oils or watercolors, lest they overwhelm a canvas. Despite their brilliance, the toxicity of the metal posed a constant threat to the artists who handled it daily. Painters developed a routine of applying barrier creams to their hands to prevent absorption through the skin, even though studies suggested that dermal absorption was less than one percent. The danger was not merely theoretical: the metal was a potent poison that could accumulate in the body over decades. By the 1970s and 1980s, demand for these vibrant pigments began to wane as environmental and health regulations tightened, forcing the industry to seek safer alternatives. The very colors that defined the masterpieces of the 19th and 20th centuries were now scrutinized as potential health hazards, turning a symbol of artistic freedom into a subject of regulatory caution.
The Battery Revolution
As industrial-scale production of cadmium began in the 1930s and 1940s, the primary application shifted from pigments to the coating of iron and steel to prevent corrosion. By 1944, 62 percent of the cadmium used in the United States was dedicated to plating, and by 1956, that figure stood at 59 percent. However, the landscape of cadmium consumption changed dramatically with the rise of the nickel-cadmium battery. In 2006, these rechargeable cells accounted for 81 percent of all cadmium consumption in the United States, compensating for the steep decline in other uses. The battery consisted of a positive nickel oxide hydroxide electrode and a negative cadmium electrode separated by an alkaline electrolyte, typically potassium hydroxide, giving a nominal cell potential of 1.2 volts. This technology powered everything from portable electronics to emergency lighting, but the environmental cost was high. The European Union eventually placed limits on cadmium in electronics, reducing the allowable content to 0.002 percent by 2006. As the world moved toward nickel-metal hydride and lithium-ion batteries, the era of the nickel-cadmium battery began to close, leaving behind a legacy of power and pollution.
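To make the electrochemistry behind that 1.2-volt figure concrete, the discharge reactions below are the ones commonly given in electrochemistry references for the nickel-cadmium cell; this is a standard textbook sketch rather than anything specific to the sources above, and the reactions run in reverse during charging.

$$\text{Cd} + 2\,\text{OH}^- \longrightarrow \text{Cd(OH)}_2 + 2\,e^- \quad \text{(negative electrode)}$$
$$2\,\text{NiO(OH)} + 2\,\text{H}_2\text{O} + 2\,e^- \longrightarrow 2\,\text{Ni(OH)}_2 + 2\,\text{OH}^- \quad \text{(positive electrode)}$$
$$\text{Cd} + 2\,\text{NiO(OH)} + 2\,\text{H}_2\text{O} \longrightarrow \text{Cd(OH)}_2 + 2\,\text{Ni(OH)}_2 \qquad E_{\text{cell}} \approx 1.2\ \text{V}$$

Note that the hydroxide ions consumed at one electrode are regenerated at the other, so the alkaline electrolyte is not used up over the life of the cell; it serves only to carry charge between the plates.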