— Ch. 1 · Defining Climate Sensitivity —
The year 1750 marks the start of the Industrial Revolution, when humans began burning fossil fuels such as coal at scale. Since then, atmospheric carbon dioxide has risen from about 280 parts per million (ppm) to over 415 ppm by 2020. Scientists define climate sensitivity as the eventual change in Earth's surface temperature that results from a doubling of the atmospheric CO2 concentration. It is a key measure of how much warming a given radiative forcing will produce, and it anchors projections of the magnitude of future climate change.
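A minimal sketch of how the definition is applied, assuming a hypothetical sensitivity of 3 °C per doubling (a commonly cited central estimate, not a value given in this chapter) and the standard logarithmic scaling of warming with concentration:

    import math

    def warming_from_co2(c_now_ppm: float,
                         c_ref_ppm: float = 280.0,
                         sensitivity: float = 3.0) -> float:
        """Estimate eventual warming (deg C) from a CO2 rise.

        Each doubling of CO2 contributes one increment of the
        climate sensitivity; fractional doublings scale via log2.
        The 3.0 deg C default is an assumed illustrative value.
        """
        doublings = math.log2(c_now_ppm / c_ref_ppm)
        return sensitivity * doublings

    # 280 ppm (1750) to 415 ppm (2020) is ~0.57 of a doubling,
    # so an assumed 3 deg C sensitivity implies ~1.7 deg C.
    print(f"{warming_from_co2(415.0):.2f} deg C")

Under these assumptions, the observed rise to date corresponds to roughly 1.7 °C of eventual warming; the number shifts directly with whatever sensitivity value is assumed.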
Radiative Forcing Fundamentals
Energy reaching Earth as sunlight must balance the energy leaving as heat radiation; an imbalance between these two flows is what scientists call radiative forcing. Over the period 1750 to 2020, the roughly 50% increase in atmospheric carbon dioxide produced a forcing of about +2.1 watts per square meter (W/m²). A warmer planet radiates heat away faster, so the system warms until a new balance is reached at a higher temperature. Greenhouse gases, solar variability, aerosols, and land-use changes such as deforestation all contribute to radiative forcing.
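As a rough cross-check on the +2.1 W/m² figure, here is a sketch using the widely cited simplified fit for CO2 forcing, ΔF = 5.35 · ln(C/C₀) W/m²; the 5.35 coefficient comes from the published Myhre et al. (1998) fit, not from this chapter:

    import math

    def co2_forcing_wm2(c_now_ppm: float,
                        c_ref_ppm: float = 280.0,
                        coeff: float = 5.35) -> float:
        """Radiative forcing (W/m^2) from a change in CO2,
        using the simplified logarithmic fit dF = coeff * ln(C/C0)."""
        return coeff * math.log(c_now_ppm / c_ref_ppm)

    # 280 -> 415 ppm yields about +2.1 W/m^2, matching the text.
    print(f"{co2_forcing_wm2(415.0):+.2f} W/m^2")

The logarithmic form is why each successive increment of CO2 adds slightly less forcing than the last, and why sensitivity is quoted per doubling rather than per ppm.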