

UST’s John Abraham talks about latest findings: that the oceans are warming faster than thought

Photo: REUTERS/Lucas Jackson
The new work looks at four recent studies that attempted, in very different ways, to correct systematic errors in ocean measurements that until recently relied on fairly primitive instruments.

Last week a research team including the Twin Cities’ John Abraham published a finding that made headlines in the New York Times, Rolling Stone and at least a dozen other major news outlets around the world:

John Abraham
The earth’s oceans are warming at an accelerating rate, with increases in the last four decades that are about 40 percent higher than the figures published a mere five years ago in the latest global assessment by the Intergovernmental Panel on Climate Change.

Since the ocean absorbs more than 90 percent of all the excess solar heat trapped by our manmade blanket of greenhouse gases, having an accurate fix on its rate of warming is a really big deal. Also, because water expands as it warms, the ocean’s temperature trend alone is considered to have contributed one-third to one-half of the ongoing rise in sea level (melting of the great ice sheets provides the rest).
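
That thermal-expansion contribution can be sketched with a simple linear estimate. The expansion coefficient and warming figure below are illustrative stand-ins, not values from the paper; the true coefficient varies with temperature, salinity and pressure:

```python
# Rough thermosteric sea-level estimate: a water column of height H warmed
# uniformly by dT expands by roughly beta * H * dT. beta here is an
# illustrative mid-range value for seawater, and dT is a hypothetical
# average warming of the upper-ocean column.
beta = 1.5e-4   # 1/K, assumed thermal expansion coefficient
H = 2000.0      # m, depth of the well-measured upper ocean
dT = 0.1        # K, hypothetical average warming of that column
rise_mm = beta * H * dT * 1000
print(f"{rise_mm:.0f} mm of sea-level rise")  # ~30 mm
```

Even a tenth of a degree averaged over the upper 2,000 meters translates into centimeters of sea-level rise, which is why small errors in ocean temperature records matter so much.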

No wonder this paper, published in the journal Science, brought Abraham more press attention than any of his past work. A professor of engineering at the University of St. Thomas, he has donated his considerable interest and expertise to a number of notable climate projects, including another study of ocean temperatures that we discussed here not quite two years ago.

As Abraham was quick to point out when we spoke Wednesday afternoon, the new findings do not mean that the IPCC has substantially understated the likely effects of ocean warming in its forecasts of what climate change may bring through the rest of this century.

That’s because the projected scenarios are based on extensive climate modeling, using a large suite of measurements in which ocean temperatures are just one component. And it has been a much-discussed “conundrum” in climate science, Abraham said, that the ocean data haven’t fit well with the rest.

Scientists thought this was probably because it has been so much harder to measure water temperatures around the world at depths of up to 2,000 meters than to stick a thermometer in the soil or send it aloft on a balloon. Probably, but until now not certainly, and the disparity invited pushback:

So you’ve got a case where the measurements don’t agree with the models. What do you think most people felt was the problem? Most people felt the models were wrong. I mean, are you going to believe an actual measurement or a computer simulation? And why should we believe what the models say about the future, if they got the past wrong?

This was especially so in what some of us call the deniasphere, because the deniers want the models to be wrong. It’s a testament to the modelers that they stuck to their guns, they didn’t change the models to fit the data, even though it appeared they were wrong.

The new work looks at four recent studies that attempted, in very different ways, to correct systematic errors in ocean measurements that until recently relied on fairly primitive instruments, and now employ the so-called Argo sensors – automated cylinders that rise and fall, collecting temperature and depth data simultaneously. These are obviously more capable but also much more expensive, therefore more sparingly deployed.

One of the four studies used complex computing to “propagate” readings from “data-rich” areas of good instrumentation to areas with little if any; another used satellite altimeters to precisely measure the rising sea surface in areas where little temperature data had been gathered directly.

What we did in this paper was combine four different studies of ocean heating. None of the data was new, but the way we looked at it together was new, and it enabled us to say: Four different studies, from four different approaches, agree that prior estimates were about 40 percent too low.

The ocean is warming faster than we thought, and the models have had it right all along. This gives us more confidence that our estimates of what’s going to happen in the future are correct.

By this point I wanted to know more about just how all that flawed data was gathered in the first place, and our conversation moved into the engineering territory that always makes talking with Abraham such a treat for me. He started by introducing me to a venerable gizmo called an “expendable bathythermograph,” or XBT.

These are little, like, torpedoes that are dropped in the ocean and they just fall, unspooling wire as they go, sending temperature information back up to the ship. They’re for one-time use; when they get to the end of the wire they break off and fall to the ocean floor.

They were invented by the U.S. Navy, which wanted to know where the thermocline is – the layer where temperature changes rapidly [from warmer above to colder below] – which is important to the Navy because submarines are hard to find under the thermocline. And the Navy didn’t need a lot of precision for either temperature or depth readings – just where the thermocline was, within a meter or two.

These are dropped in the hundreds of thousands every year, the backbone of ocean temperature measurements. When climate scientists learned about them, they said, hold on a second! You’re telling us the Navy knows ocean temperatures all over, back to the 1950s? Maybe we could use that to figure out how much the oceans have warmed.

The main issue with these devices is, they don’t have a sensor to tell you how deep they are. So that’s determined with a stopwatch: You know, they’re falling at six meters per second, so the data coming in at five seconds is coming from 30 meters down.

But here’s the problem: If you drop these things in warm ocean water, they fall faster than if you drop them in the Arctic, because cold water’s more viscous.

In addition, if you drop them from a research vessel, the researchers know they’re supposed to drop them from three meters off the water, so they’re moving at a certain rate when they hit the surface. But now a lot of researchers hire commercial vessels to drop them, like from a container ship bringing Hondas from Japan to the U.S., or a grain ship moving from South America to Africa. If you hire Honda to do it, they might be dropped from 20 meters up, and are moving a lot faster when they hit.

So these factors change what we call the fall rate, and we had to correct for it by factoring in data on how the instrument was dropped and where it was dropped.
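
The stopwatch arithmetic above, and the way a fall-rate error grows with depth, can be sketched as follows. The nominal 6 m/s rate comes from the interview; the 2 percent offset for warmer, less viscous water is purely illustrative:

```python
# Sketch of XBT depth inference: depth is fall rate times elapsed time,
# so any error in the assumed fall rate grows linearly with depth.

NOMINAL_FALL_RATE = 6.0  # m/s, rate assumed when converting time to depth

def inferred_depth(elapsed_s, fall_rate=NOMINAL_FALL_RATE):
    """Depth assigned to a reading taken elapsed_s seconds after launch."""
    return fall_rate * elapsed_s

# Suppose the probe actually falls 2% faster in warm water (illustrative).
true_rate = NOMINAL_FALL_RATE * 1.02

for t in (5, 60, 300):  # seconds after launch
    assumed = inferred_depth(t)
    actual = inferred_depth(t, true_rate)
    print(f"t={t:3d} s: assumed {assumed:6.1f} m, actual {actual:6.1f} m, "
          f"error {actual - assumed:4.1f} m")
```

Near the surface the mismatch is well under a meter, but by the end of a deep drop the same rate error has misplaced the reading by tens of meters, which is why the corrections Abraham describes were needed.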

I ventured that this sounded like a job for an engineering guy with a specialty in fluid mechanics, maybe from the University of St. Thomas. Abraham laughed.

Well, that’s how I got into this work, back in 2010, at a climate conference. I went because I was interested in the topic, but wasn’t doing any research on it, and I was talking with some guys from NOAA and I asked, What’s your biggest question about ocean temperature measurement? And they said, we don’t know how fast these devices fall. And I said, I’m a fluid mechanics guy, let me go work on that – and six months later I helped solve the problem.

One of the tools that we used at the University of St. Thomas was called computational fluid dynamics, using computer simulation that’s something like a wind tunnel measuring air flow over a car or an airplane. The force between a fluid and an object is drag, and it determines the fall rate of an XBT.

So I calculated the drag on these devices for different water temperatures, different drop heights from the ship, for different weights of different versions of the devices. In addition, we performed measurements by going out into the ocean with the research vessel, and taking along super-accurate temperature sensors called CTDs, for conductivity temperature depth.

The CTDs are sensors on a wire, too, and we dropped them at a meter per second, measuring temperatures every fraction of a second. Meanwhile, you drop a bunch of XBTs in the same location and compare the temperature recordings – and this enables you to refine the drag measurement, with a combination of computer simulation and real-life experiments. And that greatly improved the accuracy of data from the historical measurements.

Another major advance came with the Argo sensors, which Abraham says have “revolutionized climate science”:

Since 2005, we have a relatively uniform distribution of temperature sensors in the ocean, automated devices that go up and down about 2,000 meters, recording temperature all along their trajectory, then surfacing to send their data to a satellite.

But you’ve only got about 3,500 of these in the ocean, so there are thousands and thousands of miles of ocean that are unmeasured. So you have to estimate the temperature at points between a pair of sensors, and how you do this interpolation is subject to error. Our lead author, Lijing Cheng, came up with a way to do this – a reliable way of propagating data from areas of the ocean with lots of data to areas with lots of gaps – and this was our major contribution [among the four studies summarized in the new paper].
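
As a generic illustration of the gap-filling problem Abraham describes – this is not Cheng’s actual mapping method, which is far more sophisticated – inverse-distance weighting estimates an unmeasured point from nearby sensor readings:

```python
# Fill an unmeasured ocean point from nearby sensors by weighting each
# reading by 1/distance**power. Coordinates and temperatures are made up
# for illustration; real Argo mapping uses much richer statistics.
import math

def idw_estimate(target, sensors, power=2):
    """Inverse-distance-weighted temperature estimate at `target`."""
    num = den = 0.0
    for (x, y), temp in sensors:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0:
            return temp  # target coincides with a sensor
        w = 1.0 / d ** power
        num += w * temp
        den += w
    return num / den

sensors = [((0, 0), 10.0), ((4, 0), 12.0), ((0, 3), 11.0)]  # hypothetical floats
print(round(idw_estimate((1, 1), sensors), 2))  # 10.5
```

The estimate leans toward the nearest float’s reading; how exactly to weight sparse neighbors – and how to quantify the resulting error – is the hard problem the paper’s lead author worked on.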

Having read about development of a so-called Deep Argo, I asked Abraham if getting a new layer of measurements in the really deep ocean, below 2,000 meters, was likely to bring another dramatic change in what we know about ocean warming.

I don’t think so. Deep Argo will be important, but our estimates of deep ocean warming are already pretty good and besides, only about 10 percent of the heat is going below 2,000 meters. So even if we were off by 20 percent on what’s happening in the deep ocean, it’s only a 2 percent difference in the ocean overall.
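
That bound follows directly from the fractions he cites:

```python
# Abraham's back-of-envelope: ~10% of the heat goes below 2,000 m, so even
# a 20% error on that deep fraction moves the whole-ocean total by only ~2%.
deep_fraction = 0.10   # share of total ocean heating below 2,000 m
deep_error = 0.20      # hypothetical fractional error in the deep estimate
overall_error = deep_fraction * deep_error
print(f"{overall_error:.0%}")  # 2%
```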

What’s important is to ensure continued funding of the projects going on now. If the U.S. government decided to stop funding Argo, that would put us back to 2005.

We’d also like to be able to go under ocean ice shelves, and to do that we’re using instrumented animals – turtles and seals with temperature sensors on them. But that, too, is just a small component of what we need. The really important thing is to maintain a continuous monitoring system without cessation.


The full Science paper, “How fast are the oceans warming?”, can be read here without charge.

Comments (12)

  1. Submitted by Steve Rose on 01/18/2019 - 02:24 pm.

    This is as far as I got: “Since the ocean absorbs more than 90 percent of all the excess solar heat trapped by our manmade blanket of greenhouse gases,” That blanket of greenhouse gases was here long before man, and it is what makes this planet habitable. Without it, Earth would be a planet like Mars. It seems that Mars wasn’t always like Mars; it was once like Earth before it lost its atmosphere. The Washington Post published an interesting article telling about it.

    Actually, the title of this MinnPost column raised a red flag: “Warming Faster Than Thought”. In other words, until now, we didn’t know what was going on, but now we do. That flies in the face of the settled-science claims.

    • Submitted by Ron Meador on 01/18/2019 - 05:21 pm.

      Generally I don’t answer comments on my column, as I’ve already had my say, but Mr. Rose raises an important-sounding point in a wrongheaded and confusing way.

      It is certainly true that the earth has been made habitable for most of its present and historical occupants by a heat-retaining atmosphere that goes way back before man started to burn wood, let alone coal.

      However, the >90 percent reference in my piece (and the Science paper) concerns only the EXCESS heat being retained because of industrial-age emissions of CO2. Or, as the paper puts it:

      “Climate change from human activities mainly results from the energy imbalance in Earth’s climate system caused by rising concentrations of heat-trapping gases. About 93% of the energy imbalance accumulates in the ocean as increased ocean heat content (OHC).”

      Obviously the ocean is also absorbing some share of what we could call “normal” heating created by solar radiation and a naturally retentive atmosphere, but I have no idea what that number is (nor why it should matter to this discussion).

      I thought I had written all this fairly simply and clearly, but in case others are reading me the same way Mr. Rose has, I am happy to add this clarification and to thank him kindly for nudging me to do so.

      • Submitted by Steve Rose on 01/23/2019 - 10:19 am.

        Ron, in your response you referred to “EXCESS heat” and “”normal” heating”. Excess compared to what? What is normal? We know from ice core samples that long ago CO2 ppm numbers like 5600, 4000, and 2000 were normal. Now 400 ppm is considered crisis level. In the 40 years between 1959 and 1999, Earth’s population doubled. There are a lot of variables and little control; we are like little children who cannot reach the thermostat.

        I recommend an exhibit at the Minneapolis Institute of Arts, “Egypt’s Sunken Cities.” Two of Egypt’s greatest cities, major trade hubs only 1200 years ago, are now completely covered over by the Mediterranean Sea. Today, totally submerged is normal, just as it has been for hundreds of years.

    • Submitted by markb913 Bohnhorst on 01/19/2019 - 09:58 pm.

      The whole point is that the blanket is getting thicker. That’s what decades of dedicated science has shown. The instruments by which we measure the temperature of the oceans were pretty dodgy; scientists have known that. Now that we have much better instruments, and better ways of looking at the previous data, we see that the oceans really are warming fast.

      This really concerns me, and should concern everyone. About 6 months ago, I read a really interesting book, The Ends of the World. It’s about mass extinctions. It is a humbling book; it puts the short span of human existence in cosmic perspective. Chapter 4 is about the Permian extinction, which happened 252 million years ago. This was “the worst thing that ever happened in the history of life on earth.” When I read it, I thought, wow, that was really really awful; good thing we’re not anywhere near that.

      Then I heard about a study from several months ago that dealt with the warming of the oceans. As I recall, the study found that ocean warming was the main culprit in the End-Permian mass extinction, and the mechanism of action was the loss of oxygen in the oceans. When water warms, it holds a lot less oxygen; and if it warms enough, essentially all the life dies. The study said that, if we keep on our current course, by 2300 we will have gotten halfway to the conditions that caused the End-Permian mass extinction. So, four or five hundred years of a fossil-fuel based economy will get us halfway to “the worst thing that ever happened to life on earth.”

      Ron, I understand you normally do not respond to comments, but if you do, could you confirm whether or not my reading of this new research is true?


  2. Submitted by Steven Bailey on 01/18/2019 - 07:47 pm.

    I am in my 50’s, and every single prediction of environmental and climate damage has been shown to be underestimated. The climate is changing far worse than was feared. Toxic chemicals are leaching into waters that were never supposed to be a problem. We have radioactive waste leaking from failed containment sites all over the U.S. from our nuclear weapons programs, and that is just here. Glyphosate is now being found in every blood sample that is tested, along with PFOA. We have problems which may be unsolvable in the best of circumstances. The deniers are no longer a bunch of ignorant rubes to be laughed at or ridiculed; they are a containment issue.

    • Submitted by Steve Rose on 01/22/2019 - 03:57 pm.

      Really? Here is a partial list of failed climate predictions, including some epic ones.

      My personal favorite, “Senior members of the UN’s climate science body admit a claim that Himalayan glaciers could melt away by 2035 was unfounded”.

      • Submitted by Eric Flesch on 01/23/2019 - 07:53 pm.

        Steve, the first “prediction” your link claims is failed says,

        “Under the IPCC ‘Business as Usual’ emissions of greenhouse gases the average rate of increase of global mean temperature during the next century is estimated to be 0.3°C per decade ”

        A century has not passed and so why would this be a fail? Your source is garbage, Steve. And you have bought into the garbage and are repeating it.

        • Submitted by Steve Rose on 01/24/2019 - 04:02 pm.

          There is a link within the first link to a 2015 article, a quarter century after UN’s 1990 IPCC report, repeated here for your convenience:

          “According to 1990 IPCC Report, warming since 1990 is still within natural variability”

          “According to the 1990 IPCC Report, an additional 0.5C global warming would need to be observed before natural variability could be distinguished with high confidence from an “enhanced greenhouse effect” due to man-made emissions.”

          After 30 years of that not occurring at all, ever, not once, it’s time to call it into question.

  3. Submitted by Ray J Wallin on 01/19/2019 - 04:56 pm.

    That our oceans have warmed 40% faster than expected over a short span of a decade may have nothing to do with the theories that the UST professor refers to. Perhaps warming is not as smooth as we think.

    The chaotic behavior of our climate is rarely talked about, much less included in a theory. Theories have ocean and atmospheric temperatures rising at a continual rate or acceleration, with no wobbles or jumps in the predictions.

    But chaotic behavior is why we cannot predict our weather more than a few days out. It is why Minnesota warms from January to June every year, but not smoothly. Chaotic behavior also describes ice-core temperature samples going as far back as research can take us. Yet climate change predictions remain smooth.

    I realize that the mass of the ocean smooths out data, but even given this, should we always expect smooth temperature gradations? Maybe the ocean’s temperature will rise a bit more this decade and a bit less next decade. Can we know?

  4. Submitted by markb913 Bohnhorst on 01/19/2019 - 10:43 pm.

    The research goes back to the 1950s. This is not a “short span.”
    The research is about actual measurements. It turns out that the actual measurements, over a span of decades, are consistent with fundamental understandings (“theories”) that are based on physics.
    The assertion that “theories” all assume “smooth” changes is unsupported and false. The post includes a classic “denialist” confusion between climate and weather. Chaotic behavior of weather is talked about all the time. Furthermore, at the level of the climate models, variations based on different inputs (e.g., volcanic activity, El Niño events, solar radiation increases/decreases) result in a range of outputs, not the single smooth curve that denialists posit. The average may look like a smooth curve, but the range of possible outcomes from year to year is not smooth at all. Inexorably, however, as we add multiple Hiroshimas’ worth of energy to our planet every hour, the atmosphere and the oceans are warming; and averaged over multiple years and decades, we are finding that the increases indeed are consistent with the basic physics on which climate science rests.
