It’s not hard to have heat on the brain this summer, especially here in the United States. This June was the hottest June since the government began keeping records in 1895 – 2 degrees Fahrenheit above the 20th-century average. The past six months were the hottest first half of any year on record, and the 12-month period ending June 30th was likewise the hottest on record. Given that this followed a record-warm winter, and that the past decade includes a tie for the hottest year on record, climate change has once again become a topic of debate. That being the case, I thought I might review what’s often lost in discussions of climate change: the basic chemistry that underlies what we know about how it works.
The science of climate change is almost two centuries old. To the best of our knowledge, it began with Joseph Fourier, who first noted in the 1820s that, given Newton’s law of cooling, the Earth should be much colder than it actually is given its distance from the Sun. He developed a number of hypotheses for the origin of the extra heat, one of which was the possibility that the atmosphere itself traps the heat that makes life possible. Other scientists built on Fourier’s work, notably John Tyndall, who in the 1850s not only demonstrated that gases trap heat, but also determined how well each gas does so. Carbon dioxide, he found, traps heat well; the main components of the atmosphere – oxygen and nitrogen – do not.
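Fourier’s puzzle can be put in rough numbers with a simple energy-balance calculation – a sketch of my own, not something from his papers, using standard published constants. Balancing the sunlight a planet absorbs against the blackbody radiation it emits (the Stefan-Boltzmann law) gives the temperature an airless Earth "should" have:

```python
# Illustrative sketch: the equilibrium temperature of a planet with no
# heat-trapping atmosphere, from an absorbed-vs-emitted energy balance:
#   T_eff = (S * (1 - albedo) / (4 * sigma)) ** 0.25
# Constants below are standard published values.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0         # solar constant at Earth's orbit, W m^-2
ALBEDO = 0.30      # fraction of incoming sunlight Earth reflects

def effective_temperature(solar_constant, albedo):
    """Equilibrium temperature (in kelvins) of an airless planet."""
    # Divide by 4: sunlight hits a disk (pi * r^2) but is emitted
    # from the whole sphere (4 * pi * r^2).
    absorbed = solar_constant * (1 - albedo) / 4
    return (absorbed / SIGMA) ** 0.25

t_eff = effective_temperature(S, ALBEDO)
print(f"Effective temperature: {t_eff:.0f} K ({t_eff - 273.15:.0f} C)")
```

The answer comes out to roughly 255 K (about −18 °C), while Earth’s observed average surface temperature is about 288 K (15 °C). That gap of roughly 33 degrees is the “extra heat” Fourier was trying to explain – and it is what the atmosphere’s heat-trapping gases supply.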
This simple fact – that increasing the concentration of carbon dioxide in a gaseous mixture causes it to retain more heat – is undeniable. It is something you can demonstrate at your own house for a couple of bucks’ worth of materials.