The first stone tools, crafted by hammering flakes off a pebble approximately two million years ago, marked the beginning of a species that would eventually reshape the planet. This simple technique, arrived at through observation and trial and error, was the genesis of technology, transforming hominids from passive observers of nature into active shapers of their environment. The control of fire, described by Charles Darwin as "possibly the greatest ever made by man", followed, with evidence of continuous human fire use dating back at least 1.5 million years. This mastery of fire did more than provide warmth; it allowed early humans to cook food, increasing its digestibility and nutrient value, which in turn promoted an increase in hominid brain size. The cooking hypothesis suggests that this dietary shift was a critical driver in human evolution, enabling the development of language and more complex social structures. Archaeological evidence of hearths dating to 790 thousand years ago suggests that fire intensified human socialization, creating a focal point for community interaction that may have contributed to the emergence of language itself. As the Paleolithic era progressed, these early technological advances expanded to include clothing and shelter, with evidence of clothing dating from 90 to 120 thousand years ago and of shelter from 450 thousand years ago. These innovations allowed humanity to migrate out of Africa around 200 thousand years ago, initially moving into Eurasia and adapting to colder regions through the use of fur and hides.
The Wheel And The City
The invention of the wheel, estimated to have occurred between 5,500 and 3,000 BCE, revolutionized trade, war, and the very concept of labor. While the oldest known wooden wheel, discovered in the Ljubljana Marshes of Slovenia, dates back 5,100 to 5,350 years, the concept likely emerged independently in Mesopotamia, the Northern Caucasus, and Central Europe. The ancient Sumerians used the wheel not only for transportation but also for the potter's wheel, which enabled the mass production of pottery and the creation of more complex goods. This technological leap facilitated the transition from early Neolithic villages to the first cities, such as Uruk, and the first civilizations, such as Sumer. The wheel also served as a transformer of energy: water wheels, windmills, and treadmills harnessed the power of water, wind, and muscle to drive machinery. Alongside the wheel, the ancient world saw the construction of sophisticated infrastructure, including the stone-paved streets of Ur and the 50-kilometer Minoan road connecting the palaces of Gortyn and Knossos. The ancient Romans took this infrastructure to new heights with the Cloaca Maxima, a primary sewer begun in the sixth century BCE and still in use today, and a network of aqueducts extending over 450 kilometers to carry water across long distances. These engineering marvels brought running water into private homes, including bathtubs and flush toilets, a level of urban sophistication that would not be matched for centuries.
The Engine And The Atom
Starting in the United Kingdom in the 18th century, the harnessing of steam power set off the Industrial Revolution, a period of wide-ranging technological innovation that fundamentally altered the human experience. This era saw the application of science to practical ends in agriculture, manufacturing, mining, metallurgy, and transport, leading to the widespread adoption of the factory system. The Second Industrial Revolution, which followed in the late 19th century, brought rapid scientific discovery, standardization, and mass production, introducing technologies such as electricity, light bulbs, electric motors, railroads, automobiles, and airplanes. These advancements led to significant developments in medicine, chemistry, physics, and engineering, accompanied by consequential social changes such as the rise of skyscrapers and rapid urbanization. Communication was revolutionized by the invention of the telegraph, the telephone, radio, and television, shrinking the world and connecting distant populations. The 20th century brought a host of innovations, including the discovery of nuclear fission in the Atomic Age, which led to both nuclear weapons and nuclear power. Analog computers initially dominated the processing of complex data, but the invention of the transistor in 1947 dramatically shrank computers and drove the transition to digital computing. Information technology, particularly optical fiber and optical amplifiers, enabled simple and fast long-distance communication, ushering in the Information Age and the birth of the Internet. The Space Age began with the launch of Sputnik 1 in 1957, followed by crewed missions to the Moon in the late 1960s, marking humanity's first steps beyond its home planet.
The Cost Of Progress
While technological change has been the largest cause of long-term economic growth, it has also brought significant negative impacts, including pollution, resource depletion, and social harms such as technological unemployment. Contaminants have entered the environment since ancient times, as with the lead sulfide flux used by the Inca Empire in the smelting of ores. As technology has advanced, so has its environmental toll: the increased release of greenhouse gases, including methane, nitrous oxide, and carbon dioxide, into the atmosphere has intensified the greenhouse effect and driven global warming. Criticism of technology's environmental impact has spurred a surge of investment in solar, wind, and other forms of clean energy since the 1970s. Security and privacy concerns have also grown with the increasing reliance on technology, as seen in 2022, when North Korean hackers stole over $600 million worth of cryptocurrency from the game Axie Infinity and used the mixing service Blender.io to launder over $20.5 million of the proceeds. The impact of technology on social hierarchies has been profound, with automation both substituting for and complementing labor. While past automation has created new, higher-paying jobs to compensate for those lost, the rise of artificial intelligence has sparked debate among economists and policymakers about whether it will follow the same trend. A 2017 survey of economists found no clear consensus on whether AI would increase long-term unemployment, and the World Economic Forum's 2020 report predicted that AI would replace 85 million jobs worldwide while creating 97 million new ones by 2025.
The Mind And The Machine
The philosophy of technology has emerged as a critical discipline over the past two centuries, studying the practice of designing and creating artifacts and the nature of the things so created. Initially, technology was seen as an extension of the human organism that replicated or amplified bodily and mental faculties, but thinkers like Karl Marx framed it as a tool used by capitalists to oppress the proletariat. Second-wave philosophers like José Ortega y Gasset shifted the focus to daily life and living in a techno-material culture, arguing that technology could oppress even the members of the bourgeoisie who were its ostensible masters. Third-wave philosophers like Don Ihde and Albert Borgmann represented a turn toward de-generalization and empiricism, considering how humans can learn to live with technology. The debate between technological determinism, which asserts that technologies cause unavoidable social changes, and social constructivism, which argues that technologies are shaped by cultural values and economic incentives, continues to shape our understanding of the field. Cultural critic Neil Postman distinguished tool-using societies from technological societies and from what he called technopolies, societies that are dominated by an ideology of technological and scientific progress to the detriment of other cultural practices. The ethics of technology has become an interdisciplinary subfield that analyzes technology's ethical implications, exploring issues ranging from genetically modified organisms to the use of military robots and algorithmic bias. The field of AI ethics includes robot ethics, which deals with the ethical issues involved in the design, construction, use, and treatment of robots, as well as machine ethics, which is concerned with ensuring the ethical behavior of artificially intelligent agents.
The Future And The Risk
Futures studies is the study of social and technological progress, aiming to explore the range of plausible futures and to incorporate human values in the development of new technologies. Emerging technologies, such as nanotechnology, biotechnology, robotics, 3D printing, and blockchains, are novel technologies whose development or practical applications are still largely unrealized. In 2005, futurist Ray Kurzweil claimed the next technological revolution would rest upon advances in genetics, nanotechnology, and robotics, with robotics being the most impactful of the three. He argued that genetic engineering will allow far greater control over human biological nature through a process called directed evolution, which some thinkers believe may shatter our sense of self, and that nanotechnology will grant us the ability to manipulate matter at the molecular and atomic scale, allowing us to reshape ourselves and our environment in fundamental ways. Estimates of the advent of artificial general intelligence vary, but half of machine learning experts surveyed in 2018 believed that AI will accomplish every task better and more cheaply than humans by 2063 and automate all human jobs by 2140. This expected technological unemployment has led to calls for increased emphasis on computer science education and debates about universal basic income. Existential risk researchers analyze risks that could lead to human extinction or civilizational collapse, looking for ways to build resilience against them. Potential future risks include artificial general intelligence, biological warfare, nuclear warfare, nanotechnology, and anthropogenic climate change, though technologies may also help us mitigate asteroid impacts and gamma-ray bursts. In 2019, philosopher Nick Bostrom introduced the notion of a vulnerable world, one in which there is some level of technological development at which civilization almost certainly gets devastated by default.
The Animal And The Art
The use of basic technology is not unique to humans; tool use has been observed among chimpanzees, other primates, dolphins, and crows. Researchers have observed wild chimpanzees using tools for foraging, including pestles, levers, leaves used as sponges, and tree bark or vines used as probes for fishing termites. West African chimpanzees use stone hammers and anvils for cracking nuts, as do the capuchin monkeys of Boa Vista, Brazil. Tool use is not the only form of animal technology; beaver dams, built with wooden sticks and large stones, have dramatic impacts on river habitats and ecosystems. Humanity's relationship with technology has been explored in science-fiction literature, for example in Brave New World, A Clockwork Orange, Nineteen Eighty-Four, and Isaac Asimov's essays, as well as in movies like Minority Report, Total Recall, Gattaca, and Inception. This relationship has spawned the dystopian and futuristic cyberpunk genre, which juxtaposes futuristic technology with societal collapse, dystopia, or decay. Notable cyberpunk works include William Gibson's novel Neuromancer and the movies Blade Runner and The Matrix. The history of technology is also a history of movements, from the 1960s hippie counterculture's preference for locally autonomous, sustainable, and decentralized technology, termed appropriate technology, to the backlash against technology represented by Luddism and the Unabomber Manifesto. The transhumanism movement, founded on the idea of continuing the evolution of human life beyond its current form through science and technology, gained wider popularity in the early 21st century, while singularitarians believe that machine superintelligence will accelerate technological progress by orders of magnitude.