Thinking about recent arguments on climate, I have been trying to work out the logic of the air/sea interaction in order to make sense of the pattern of warming. My current conclusion is that, on a very simple model, what we observe is qualitatively about what we would expect if at some point around the year 2000 net energy input to the system had for some reason declined. The purpose of this post is to sketch the argument and see if people commenting see anything wrong with it.
My model is a very simple one. The whole atmosphere is one object with uniform temperature, the sea another object with uniform temperature. The atmosphere gains heat via net radiative input from the sun and loses it via conduction to the sea. At any instant the temperature of the atmosphere is its heat content divided by its heat capacity, and similarly for the sea.
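Written out, with $T_a$ and $T_s$ the two temperatures, $C_a$ and $C_s$ the heat capacities, $R$ the net radiative input, and $k$ a conduction coefficient (the symbols are my own shorthand for the paragraph above, not taken from any source), the model is:

```latex
\frac{dT_a}{dt} = \frac{R - k\,(T_a - T_s)}{C_a},
\qquad
\frac{dT_s}{dt} = \frac{k\,(T_a - T_s)}{C_s}
```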
If atmospheric temperature were constant for long enough, the two temperatures would approach equality, but atmospheric temperature has been rising due to net energy input from the sun. Heat loss to the sea by conduction is proportional to the temperature difference between atmosphere and sea. If the net input from the sun had been constant for a sufficiently long time, the equilibrium of the system would be a constant temperature difference between atmosphere and sea. That gives a constant net heat increase for the atmosphere (radiation in, conduction out) and for the sea (conduction in). Atmosphere and sea warm at the same rate because, if the atmosphere warmed faster, the temperature difference would increase, which would increase heat flow from atmosphere to sea, slowing the warming of the atmosphere and speeding that of the sea until the two rates equalized.

I am ignoring the fact that as the whole system warmed it would radiate more out to space; for simplicity I assume that effect is small over the range of temperatures I will be looking at, so that net radiative input can be treated as constant. I am also ignoring the fact that if the process went on long enough the sea would boil. I am looking at a much shorter time period than that: decades, not millennia.
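Checking the equal-rates claim against the two equations above: suppose both temperatures rise at the same rate $w$ with a constant gap $\Delta T = T_a - T_s$. Then

```latex
C_a\,w = R - k\,\Delta T \quad\text{(atmosphere: radiation in, conduction out)},
\qquad
C_s\,w = k\,\Delta T \quad\text{(sea: conduction in)}
```

and adding the two gives $w = R/(C_a + C_s)$, hence $\Delta T = R\,C_s / \bigl(k\,(C_a + C_s)\bigr)$: the whole system warms at one rate with a constant gap, which is the state the next paragraph perturbs.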
Now assume that something changes, reducing net radiative input just enough that heat coming into the atmosphere via radiation equals heat leaving it via conduction. The atmosphere stops warming. But it is still warmer than the sea; that is why it is losing heat via conduction. And since it is still losing heat by conduction, the sea continues to warm. Gradually that warming reduces the temperature difference, reducing the rate of heat transfer from atmosphere to sea and so slowing the rate at which the sea is warming. If net radiative input then remains constant, atmospheric temperature will gradually start to go up again. If, on the other hand, net radiative input from the sun declines at the same rate at which heat transfer by conduction is declining, atmospheric temperature will remain constant and sea temperature will continue to rise, but at a declining rate.
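Here is a minimal numerical sketch of those two scenarios, a forward-Euler integration of the two equations above. Every number in it is invented purely to make the qualitative behavior visible; none is meant as a physical estimate.

```python
# Forward-Euler sketch of the two-box model above. All parameter values
# are invented for illustration, not physical estimates.

C_a, C_s = 1.0, 20.0       # heat capacities: sea much larger than atmosphere
k = 0.5                    # conduction coefficient (flow = k * (T_a - T_s))
dt, steps = 0.01, 6000     # time step and number of steps

def run(track_conduction):
    """Integrate the model. At one-third of the run, net input R drops to
    equal conduction. If track_conduction, R keeps tracking conduction
    thereafter; otherwise it stays at its new constant value."""
    R = 1.0
    # Start at the steady-state gap, so atmosphere and sea warm together.
    T_a = R * C_s / (k * (C_a + C_s))
    T_s = 0.0
    hist_a, hist_s = [], []
    for i in range(steps):
        flow = k * (T_a - T_s)          # conduction, atmosphere -> sea
        if i == steps // 3:
            R = flow                    # input drops to match conduction
        elif track_conduction and i > steps // 3:
            R = flow                    # input keeps falling with conduction
        T_a += dt * (R - flow) / C_a
        T_s += dt * flow / C_s
        hist_a.append(T_a)
        hist_s.append(T_s)
    return hist_a, hist_s

flat_then_rises = run(track_conduction=False)   # scenario one
stays_flat      = run(track_conduction=True)    # scenario two
```

With track_conduction=False the atmospheric series goes flat for a while and then resumes rising; with track_conduction=True it stays flat while the sea series keeps rising at a declining rate, which is the pattern described above.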
Figure 1 is a graph of atmospheric temperature taken from skepticalscience, a pro-warming site (i.e. one that argues that AGW exists and is a very serious problem that needs to be dealt with):
Figure 1
Eyeballing it, temperatures appear to flatten out sometime between 1998 and 2002 and remain roughly constant thereafter.
Figure 2 is the same graph of ocean heat that I discussed in a previous post.
Eyeballing it, the rate of warming appears to decline around 2003.
This is a very simple model and a very simple description of the graphs. Since both graphs have a lot of noise, a simple description may be the best we can do. As should be obvious, my point is only qualitative. I have not made any calculation of how large the heat flow should be from atmosphere to sea as a function of temperature difference and I have not offered data on the actual size of the temperature difference over time.
All of that would require a much more elaborate analysis. My point is only that the observed pattern, atmospheric temperature going flat followed by ocean warming slowing but not stopping, is the pattern one would expect if net radiative input dropped. It is thus consistent with the idea of a pause in warming: not in the sense of the temperature increase of the whole system going to zero (the sea is still warming) but in the sense of the system's temperature increase slowing in the way to be expected if atmospheric warming stopped.
Two questions for commenters:
1. Have I made any mistake in my analysis of the simple model? The only problem I see is that the drop in the rate of warming in Figure 2 looks too abrupt; on my model it ought to be a gradual change. But it is a noisy graph.
2. Are there obvious ways in which making the model more realistic would change the conclusion? In particular, are there obvious improvements which would justify the claim, discussed in my earlier post, that there is no pause because the "missing heat" is going into the ocean? I take that as meaning that what has changed is not the net radiative input from the sun but the conductive loss to the sea, that the flattening of atmospheric temperature is due to an increase in the latter not a decrease in the former.
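For what it is worth, the missing-heat story corresponds in the model to raising k rather than lowering R, and a small variant of the sketch above shows what that would look like (invented numbers again, nothing physically calibrated):

```python
# Variant for question 2: instead of net input R dropping, the conduction
# coefficient k rises just enough that conduction equals input, so the
# atmosphere stops warming. Invented numbers, as before.

C_a, C_s, R = 1.0, 20.0, 1.0
k = 0.5
T_a = R * C_s / (k * (C_a + C_s))   # start at the steady-state gap
T_s = 0.0
dt, steps = 0.01, 6000
for i in range(steps):
    if i == steps // 3:
        k = R / (T_a - T_s)         # conduction jumps up to equal input
    flow = k * (T_a - T_s)
    T_a += dt * (R - flow) / C_a
    T_s += dt * flow / C_s
```

In this version the flow into the sea jumps from below R up to R at the switch, so the sea warms faster rather than slower immediately afterwards. If the sketch is right, that is one qualitative difference between the two stories that could in principle be checked against Figure 2.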
---
P.S. A couple of people commenting on this (one on G+) argue that I have the heat flow backwards: that the input from the sun mostly goes to the land and sea and is then transferred up to the atmosphere, rather than going in the other direction as in my model. I have not yet figured out the implications for making sense of the data, assuming they are correct.