Last week a research team including the Twin Cities’ John Abraham published a finding that made headlines in the New York Times, Rolling Stone and at least a dozen other major news outlets around the world:
The earth’s oceans are warming at an accelerating rate, with increases in the last four decades that are about 40 percent higher than the figures published a mere five years ago in the latest global assessment by the Intergovernmental Panel on Climate Change.
Since the ocean absorbs more than 90 percent of all the excess solar heat trapped by our manmade blanket of greenhouse gases, having an accurate fix on its rate of warming is a really big deal. Also, because water increases in volume as it warms, the ocean’s temperature trend alone is considered to have contributed one-third to one-half of the ongoing rise in sea level (the melting of the great ice sheets provides the rest).
No wonder this paper, published in the journal Science, brought more press attention than any past piece of work to Abraham, a professor of engineering at the University of St. Thomas, who has donated his considerable interest and expertise to a number of notable climate projects, including another study of ocean temperatures that we discussed here not quite two years ago.
As Abraham was quick to point out when we spoke Wednesday afternoon, the new findings do not mean that the IPCC has substantially understated the likely effects of ocean warming in its forecasts of what climate change may bring through the rest of this century.
That’s because the projected scenarios are based on extensive climate modeling, using a large suite of measurements in which ocean temperatures are just one component. And it has been a much-discussed “conundrum” in climate science, Abraham said, that the ocean data haven’t fit well with the rest.
Scientists thought this was probably because it has been so much harder to measure water temperatures around the world at depths of up to 2,000 meters than to stick a thermometer in the soil or send it aloft on a balloon. Probably, but until now not certainly, and the disparity invited pushback:
So you’ve got a case where the measurements don’t agree with the models. What do you think most people felt was the problem? Most people felt the models were wrong. I mean, are you going to believe an actual measurement or a computer simulation? And, why should we believe what the models say about the future, if they got the past wrong?
This was especially so in what some of us call the deniasphere, because the deniers want the models to be wrong. It’s a testament to the modelers that they stuck to their guns, they didn’t change the models to fit the data, even though it appeared they were wrong.
The new work looks at four recent studies that attempted, in very different ways, to correct systematic errors in ocean measurements, which until recently relied on fairly primitive instruments and now employ the so-called Argo sensors – automated cylinders that rise and fall, collecting temperature and depth data simultaneously. These are obviously more capable but also much more expensive, and therefore more sparingly deployed.
One of the four studies used complex computing to “propagate” readings from “data-rich” areas of good instrumentation to areas with little if any; another used satellite altimeters to precisely measure the rising sea surface in areas where little temperature data had been gathered directly.
What we did in this paper was combine four different studies of ocean heating. None of the data was new, but the way we looked at it together was new, and it enabled us to say: Four different studies, from four different approaches, agree that prior estimates were about 40 percent too low.
The ocean is warming faster than we thought, and the models have had it right all along. This gives us more confidence that our estimates of what’s going to happen in the future are correct.
By this point I wanted to know more about just how all that flawed data was gathered in the first place, and our conversation moved into the engineering territory that always makes talking with Abraham such a treat for me. He started by introducing me to a venerable gizmo called an “expendable bathythermograph,” or XBT.
These are little, like, torpedoes that are dropped in the ocean and they just fall, unspooling wire as they go, sending temperature information back up to the ship. They’re for one-time use; when they get to the end of the wire they break off and fall to the ocean floor.
They were invented by the U.S. Navy, which wanted to know where the thermocline is – the layer where temperature changes rapidly [from warmer above to colder below] – which is important to the Navy because submarines are hard to find under the thermocline. And the Navy didn’t need a lot of precision for either temperature or depth readings – just where the thermocline was, within a meter or two.
These are dropped in the hundreds of thousands every year, the backbone of ocean temperature measurements. When climate scientists learned about them, they said, hold on a second! You’re telling us the Navy knows ocean temperatures all over, back to the 1950s? Maybe we could use that to figure out how much the oceans have warmed.
The main issue with these devices is, they don’t have a sensor to tell you how deep they are. So that’s determined with a stopwatch: You know, they’re falling at six meters per second, so the data coming in at five seconds is coming from 30 meters down.
But here’s the problem: If you drop these things in warm ocean water, they fall faster than if you drop them in the Arctic, because cold water’s more viscous.
In addition, if you drop them from a research vessel, the researchers know they’re supposed to drop them from three meters off the water, so they’re moving at a certain rate when they hit the surface. But now a lot of researchers hire commercial vessels to drop them, like from a container ship bringing Hondas from Japan to the U.S., or a grain ship moving from South America to Africa. If you hire Honda to do it, they might be dropped from 20 meters up, and are moving a lot faster when they hit.
So these factors change what we call the fall rate, and we had to correct for that by factoring in data on how the instrument was dropped and where it was dropped.
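The stopwatch logic Abraham describes can be sketched in a few lines of Python. The quadratic fall-rate equation below is the standard published form for XBT depth estimates; the particular coefficients are illustrative values for one probe model, and the real corrections adjust them for probe type, water temperature and drop height.

```python
# Sketch of how XBT depth is inferred from elapsed time alone.
# Standard form: z(t) = a*t - b*t**2, where the probe slows slightly
# as wire pays out. The coefficients below (a ~6.472 m/s,
# b ~2.16e-3 m/s^2, published for one common probe model) are
# illustrative; corrected values depend on probe model, water
# temperature and drop height.

def xbt_depth(t_seconds, a=6.472, b=2.16e-3):
    """Estimated depth (m) of an XBT probe t seconds after hitting the water."""
    return a * t_seconds - b * t_seconds**2

# The article's back-of-the-envelope figure: at roughly 6 m/s,
# data arriving at five seconds comes from roughly 30 meters down.
depth_at_5s = xbt_depth(5.0)
```

A bias of even a few percent in the assumed fall rate shifts every temperature reading to the wrong depth, which is why the corrections mattered so much for the historical record.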
I ventured that this sounded like a job for an engineering guy with a specialty in fluid mechanics, maybe from the University of St. Thomas. Abraham laughed.
Well, that’s how I got into this work, back in 2010, at a climate conference. I went because I was interested in the topic, but wasn’t doing any research on it, and I was talking with some guys from NOAA and I asked, What’s your biggest question about ocean temperature measurement? And they said, we don’t know how fast these devices fall. And I said, I’m a fluid mechanics guy, let me go work on that – and six months later I helped solve the problem.
One of the tools that we used at the University of St. Thomas was called computational fluid dynamics, using computer simulation that’s something like a wind tunnel measuring air flow over a car or an airplane. The force between a fluid and an object is drag, and it determines the fall rate of an XBT.
So I calculated the drag on these devices for different water temperatures, different drop heights from the ship, for different weights of different versions of the devices. In addition, we performed measurements by going out into the ocean with the research vessel, and taking along super-accurate temperature sensors called CTDs, for conductivity temperature depth.
The CTDs are sensors on a wire, too, and we dropped them at a meter per second, measuring temps every fraction of a second. Meanwhile, you drop a bunch of XBTs in the same location, and compare the temperature recordings – and this enables you to refine the drag measurement, with a combination of computer simulation and real-life experiments. And that greatly improved the accuracy of data from the historical measurements.
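As a rough illustration of the physics involved, here is a minimal Python sketch of the drag balance that sets a probe's steady fall speed: net weight (weight minus buoyancy) equals drag. All the numbers are made-up placeholders, not real XBT specifications; the actual CFD work computed the drag itself and how it varies with water viscosity.

```python
import math

def terminal_fall_rate(mass_kg, volume_m3, cd, area_m2, rho=1025.0, g=9.81):
    """Steady fall speed (m/s) where drag balances net weight in seawater.

    Balance: (m - rho*V) * g = 0.5 * rho * Cd * A * v**2
    All parameter values passed in below are illustrative assumptions.
    """
    net_weight = (mass_kg - rho * volume_m3) * g  # weight minus buoyancy
    return math.sqrt(2.0 * net_weight / (rho * cd * area_m2))

# Hypothetical probe: in warm, less viscous water the effective drag
# coefficient is a bit lower, so the probe falls faster; this is the
# temperature effect the corrections target.
v_warm = terminal_fall_rate(0.9, 2.0e-4, cd=0.30, area_m2=2.5e-3)
v_cold = terminal_fall_rate(0.9, 2.0e-4, cd=0.33, area_m2=2.5e-3)
```

The CTD comparison drops Abraham describes serve as the experimental check on exactly this kind of calculation: the simulated fall rate is tuned until the XBT temperature profile lines up with the trusted CTD profile.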
Another major advance came with the Argo sensors, which Abraham says have “revolutionized climate science”:
Since 2005, we have a relatively uniform distribution of temperature sensors in the ocean, automated devices that go up and down about 2,000 meters, recording temperature all along their trajectory, then surfacing to send their data to a satellite.
But you’ve only got about 3,500 of these in the ocean, so there are thousands and thousands of miles of ocean that are unmeasured. So you have to estimate the temperature at points between a pair of sensors, and how you do this interpolation is subject to error. Our lead author, Lijing Cheng, came up with a way to do this – a reliable way of propagating data from areas of the ocean with lots of data to areas with lots of gaps – and this was our major contribution [among the four studies summarized in the new paper].
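To make the interpolation problem concrete, here is a toy Python sketch using simple inverse-distance weighting. This is not Cheng's method, which is far more sophisticated; it only shows what "estimating the temperature between a pair of sensors" means, and why the choice of method introduces error.

```python
# Toy gap-filling between sparse sensors: inverse-distance weighting.
# Each sensor is an (x, y, temperature) tuple; nearer sensors get
# more weight in the estimate.

def idw_estimate(target, sensors, power=2.0):
    """Estimate temperature at `target` (x, y) from (x, y, temp) sensors."""
    num = den = 0.0
    for x, y, temp in sensors:
        d2 = (x - target[0])**2 + (y - target[1])**2
        if d2 == 0.0:
            return temp  # exactly at a sensor: use its reading
        w = 1.0 / d2 ** (power / 2.0)
        num += w * temp
        den += w
    return num / den

# Two hypothetical floats 4 units apart; a point midway between them
# is equidistant, so the estimate is simply the mean of the two readings.
sensors = [(0.0, 0.0, 10.0), (4.0, 0.0, 12.0)]
midpoint_temp = idw_estimate((2.0, 0.0), sensors)
```

Different weighting schemes give different answers in the gaps, which is precisely why a well-validated mapping method was a major contribution.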
Having read about development of a so-called Deep Argo, I asked Abraham if getting a new layer of measurements in the really deep ocean, below 2,000 meters, was likely to bring another dramatic change in what we know about ocean warming.
I don’t think so. Deep Argo will be important, but our estimates of deep ocean warming are already pretty good and besides, only about 10 percent of the heat is going below 2,000 meters. So even if we were off by 20 percent on what’s happening in the deep ocean, it’s only a 2 percent difference in the ocean overall.
What’s important is to ensure continued funding of the projects going on now. If the U.S. government decided to stop funding Argo, that would put us back to 2005.
We’d also like to be able to go under ocean ice shelves, and to do that we’re using instrumented animals – turtles and seals with temperature sensors on them. But that, too, is just a small component of what we need. The really important thing is to maintain a continuous monitoring system without cessation.
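Abraham's deep-ocean arithmetic above works out as follows:

```python
# If ~10% of ocean heat uptake occurs below 2,000 m, then even a 20%
# error in the deep-ocean estimate shifts the whole-ocean total by
# only 10% * 20% = 2%.
deep_fraction = 0.10   # share of heat going below 2,000 m
deep_error = 0.20      # hypothetical error in the deep-ocean estimate
overall_error = deep_fraction * deep_error  # fraction of ocean-wide total
```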
The full Science paper, “How fast are the oceans warming?”, can be read here without charge.