The problem all along, of course, is that people jump to conclusions. Sure, concentrated CO2 exposed to infrared will get somewhat warmer than everyday air. But this only proves that everyday air (99.96% of which is nitrogen, oxygen and argon) is more transparent to IR and less apt to be heated that way. Air molecules, CO2 included, initially acquire heat by contact with warmer surfaces. Via mutual collisions and convective transport, this heat gets spread around within an airmass.
To some slight degree, CO2 also has the option of acquiring heat by radiative transfer. But, rather ironically, it cannot radiatively transfer this heat to the nitrogen, oxygen and argon molecules that surround it because, as noted, those gases are largely transparent to infrared. As a result, an excited CO2 molecule is obliged to share its heat just as the rest do: by bumping into other molecules. In short, there’s nothing special about CO2 in a real-world context. Outnumbered 2500 to 1, CO2’s energy is lost in a busy buzz of collisions, its radiative properties wasted.
Moreover, any heated gas radiates infrared — and in this case 99.96% of the gas consists of molecules other than CO2.
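The two figures quoted above are easy to sanity-check. Using the standard dry-air composition by volume (about 78.08% nitrogen, 20.95% oxygen, 0.93% argon) and a CO2 concentration of roughly 400 parts per million, a few lines of arithmetic reproduce both numbers:

```python
# Standard dry-air composition by volume (approximate, widely published values)
n2 = 78.08   # percent nitrogen
o2 = 20.95   # percent oxygen
ar = 0.93    # percent argon
co2_ppm = 400  # CO2, parts per million (about 0.04 percent)

# Share of the air that is NOT CO2-like trace gas
non_co2 = n2 + o2 + ar
print(f"N2 + O2 + Ar: {non_co2:.2f}% of the air")

# How many air molecules there are for every one CO2 molecule
ratio = 1_000_000 / co2_ppm
print(f"Other molecules per CO2 molecule: about {ratio:.0f} to 1")
```

Running this prints 99.96% for the nitrogen-oxygen-argon share and 2500 to 1 for the outnumbering ratio, matching the figures in the text.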
Yet no one seriously imagines that back-radiation from 99.96% of the air has a role in raising the earth’s surface temperature.
Only when CO2 comes up do we lose touch with reality.
Here’s a succinct point: Immersed in the vacuum of space, the earth has but one means of losing heat: radiation. And what does carbon dioxide do? It radiates.
It’s amazing that so few people have bothered to give this theory a second look.
By Alan Siddons (via email)