Recently, we have been following a lot of climate change related stories, from the unprecedented rainfall of Hurricane Harvey, to California’s devastating wildfires, to the record-setting cold of the bomb cyclone.
WATCH: Miles, after surviving the bomb cyclone, ventures out into snowy Boston to figure out what happened and what can be done in the future.
In putting together these pieces, one thing we repeatedly marveled at was the data collected by satellites used to monitor and forecast weather. For example, check out this amazing footage of the center of the bomb cyclone captured by NOAA’s GOES East satellite:
The crisp images helped us wrap our heads around the science of what’s happening in these extreme weather events. But they also got us thinking–how do all these satellites work?
Launch of the satellite era
In the days before satellites, weather and climate knowledge was rudimentary. The state of the art was equal parts old wives’ tales and basic temperature, pressure, and wind speed readings telegraphed between weather stations–not quite enough to create a field with robust predictive power.
Only once the first weather satellites started going up in the 1960s was there enough data being collected to develop effective multi-day forecasts.
This flood of data would have been overwhelming if not for another development around the same time: the rise of computers.
Before computers, it was rather difficult to crunch the numbers necessary to produce forecasts. British mathematician Lewis Fry Richardson, who helped establish the field of numerical weather prediction, reportedly spent months trying to do this by hand only to be rewarded with a “wildly inaccurate six-hour forecast.” Richardson hypothesized that effective forecasting would be possible if he had a few helpers.
“Richardson describes a scheme for predicting the weather before it actually happens, a scheme involving a roomful of people, each computing separate sections of the equations, and a system for transmitting the results as needed from one part of the room to another,” reads a NASA historical sketch. “Unfortunately, Richardson’s estimated number of human calculators needed to keep pace with weather developments was 64,000, all located in one very large room.”
That never came to fruition, but numerical forecasting using computers took off in the 1950s and expanded once weather satellites started going up.
Nowadays, the United States operates 16 near-Earth weather satellites through the National Oceanic and Atmospheric Administration (NOAA). These provide much of the data for the country’s short- and long-term weather forecasting and climate change monitoring. How do they do this? We will look at two satellites: Suomi NPP, which tracks sea surface temperatures, and Jason-3, which measures sea level height.
Taking the Earth’s temperature
Suomi NPP is a new NOAA/NASA polar-orbiting satellite used to “address the challenge of acquiring a wide range of land, ocean, and atmospheric measurements for Earth system science while simultaneously preparing to address operational requirements for weather forecasting.”
Swooping along a polar orbit at a height of 512 miles allows Suomi to image all of the Earth’s surface about once every 12 hours. Twelve operational NOAA satellites are polar-orbiting, allowing for data-gathering on a level unattainable with ground-based weather stations or even weather balloons. Four other satellites are geostationary, meaning their orbit exactly matches the rotation of the Earth so they appear to always hover over a single point (including one over the Atlantic and one over the Pacific).
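That coverage claim follows from basic orbital mechanics. Kepler’s third law gives the orbital period for a 512-mile-high orbit–a back-of-the-envelope sketch with rounded constants, not NOAA’s actual flight numbers:

```python
import math

# Rough constants (SI units)
MU_EARTH = 3.986e14          # Earth's gravitational parameter, m^3/s^2
EARTH_RADIUS = 6.371e6       # mean Earth radius, m
ALTITUDE = 512 * 1609.34     # 512 miles, converted to meters

# Kepler's third law: T = 2*pi*sqrt(a^3 / mu), where a is the orbital radius
a = EARTH_RADIUS + ALTITUDE
period_s = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)

print(f"Orbital period: {period_s / 60:.0f} minutes")
print(f"Orbits per day: {86400 / period_s:.1f}")
```

That works out to roughly 101 minutes per orbit, or about 14 orbits a day–and because the imager’s swath is wide enough for adjacent passes to overlap, the whole globe gets covered about twice daily.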
WATCH: A lovely retro animation showing how a polar-orbiting satellite can image the whole world. Credit: EUMETSAT.
But how can Suomi, circling 512 miles above the Earth, take the temperature of an ocean’s surface? It can’t exactly dip a thermometer into the water all the way from space…
Instead, Suomi relies on a state-of-the-art infrared camera. Basically, if you’ve seen the movie Predator, you know how infrared vision can detect heat.
What we humans feel as heat is actually just an emission of light our eyes cannot see, below the frequency we perceive as red. Hence: infrared, which means “below red.”
If our eyes, like those of the Predator or a snake, had evolved to see infrared, we would be able to tell the temperature of objects just by looking at them. Being the clever creatures we are, we have expanded our view beyond what our eyes evolved to see with devices like night vision cameras.
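The link between an object’s temperature and the color of its invisible glow can be sketched with Wien’s displacement law–a textbook simplification, not the far more involved retrieval algorithm an actual satellite instrument uses:

```python
# Wien's displacement law: the wavelength at which a warm body glows
# brightest is inversely proportional to its temperature.
WIEN_B = 2.898e-3  # Wien's displacement constant, meter-kelvins

def peak_wavelength_microns(temp_kelvin: float) -> float:
    """Peak emission wavelength of an ideal blackbody, in micrometers."""
    return WIEN_B / temp_kelvin * 1e6

# The Sun (~5800 K) glows brightest in visible light...
print(f"Sun:   {peak_wavelength_microns(5800):.2f} um")  # ~0.50 um (visible)
# ...while an ocean surface (~288 K, or 15 C) peaks deep in the infrared.
print(f"Ocean: {peak_wavelength_microns(288):.1f} um")   # ~10.1 um (infrared)
```

So measuring how brightly the sea glows at those long wavelengths is, in effect, taking its temperature from orbit.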
In essence, that’s what Suomi uses to take Earth’s surface temperature. Its Visible Infrared Imaging Radiometer Suite (VIIRS) instrument is perhaps the world’s most complicated and precise infrared camera, sensitive enough to pick out a heat source with the power of just two incandescent light bulbs.
VIIRS might also be the most expensive infrared camera in the world: its manufacturer, defense contractor giant Raytheon, charges roughly half a billion dollars per camera. A lot of money, sure, but worth every penny for the amazing real-time data it collects for the good of humanity.
For example, monitoring the temperature of the ocean’s surface helps predict how hurricanes will develop, since these storms draw their energy from warm surface waters. And tracking how the oceans are warming allows researchers to follow the effects of climate change on ecosystems like coral reefs and on the growth of algal blooms.
Measuring the Earth’s rising sea levels
Jason-3 is another polar-orbiting satellite managed by NOAA. This one is used primarily to log the height of the world’s oceans.
Measuring the height of surface water is tougher–you need a very accurate distance measurement, something a more passive camera like Suomi’s VIIRS can’t provide. Instead, Jason-3 has to use a more active approach.
On board the spacecraft is a radar instrument called Poseidon-3B. This dish looks down at the Earth and sends out radio waves, listening for their echo as they bounce off the water below.
Since scientists know how fast these radio waves travel (at the speed of light), timing how long the signal takes to make the round trip to the Poseidon-3B dish gives them the distance to the water below. Subtract that distance from the satellite’s precisely known altitude, and you have the height of the sea surface.
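The arithmetic behind that round trip is simple enough to sketch–with made-up numbers, and none of the many corrections the real processing applies:

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def sea_surface_height(echo_time_s, satellite_altitude_m):
    """Sea surface height above the reference level, from the radar echo's
    round-trip time and the satellite's (precisely tracked) altitude."""
    range_to_water = SPEED_OF_LIGHT * echo_time_s / 2  # one-way distance
    return satellite_altitude_m - range_to_water

# Hypothetical check: Jason-3 orbits roughly 1,336 km up (about 830 miles).
# If the sea surface sat exactly 2 m above the reference level, the echo
# would take this long to come back:
altitude = 1_336_000.0
round_trip = 2 * (altitude - 2.0) / SPEED_OF_LIGHT  # ~8.9 milliseconds
print(f"Recovered height: {sea_surface_height(round_trip, altitude):.2f} m")  # 2.00 m
```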
Easy, right? Well, there are a few things that can skew the measurement.
For example, the scientists need to know exactly where the Jason-3 satellite is when it sends and receives these signals, otherwise they can’t calculate the distance the radar waves traveled with any appreciable precision. This is solved by shooting a laser from the ground, which bounces off a mirror on the satellite and returns to Earth, giving an accurate measurement of distance. This, along with cross-referencing Jason-3’s GPS signals with Earth-based stations as well as other satellites, gives scientists an incredibly accurate position for the spacecraft.
Another issue is that the speed of radio waves is affected by the atmosphere they have to pass through–in particular, charged particles in the ionosphere slow the signal down. This is accounted for by actually sending two different radio signals, of different frequencies. Because the delay depends on the frequency of the signal (lower frequencies are slowed more), the slight difference in echo return times allows scientists to calculate the delay and subtract it out. As a bonus, comparing the two frequencies can also show whether it’s raining below!
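The correction itself falls out of a little algebra: if the extra delay scales as one over the frequency squared, two measurements of the same distance at two frequencies can be combined so the delay cancels exactly. Here is a simplified sketch–the frequencies are illustrative values in the Ku and C radar bands that altimeters like Poseidon-3B operate in, and the real processing chain is more elaborate:

```python
# Two radar frequencies (hertz), roughly the Ku and C bands used by
# satellite altimeters. Exact values here are illustrative assumptions.
F_KU = 13.575e9
F_C = 5.3e9

def delay_free_range(range_ku, range_c):
    """Combine two apparent ranges, each inflated by a delay scaling as
    1/f^2, so the frequency-dependent delay cancels out."""
    return (F_KU**2 * range_ku - F_C**2 * range_c) / (F_KU**2 - F_C**2)

# Simulate two measurements: true range plus a frequency-dependent delay.
true_range = 1_336_000.0   # meters (hypothetical)
k = 4.0e18                 # delay coefficient (hypothetical magnitude)
apparent_ku = true_range + k / F_KU**2  # ~2 cm of extra apparent range
apparent_c = true_range + k / F_C**2    # ~14 cm of extra apparent range
print(f"Corrected range: {delay_free_range(apparent_ku, apparent_c):.3f} m")
```

Since each apparent range is the true range plus k/f², scaling each by its f² and subtracting eliminates the unknown k entirely, recovering the true range.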
All of these corrections work together to produce an incredibly precise reading of sea level height: Jason-3, orbiting at 830 miles above sea level, can tell the height of the ocean with an accuracy of less than an inch.
Having this accurate, real-time measurement of sea level height helps identify the stormiest seas so that maritime operations can be alerted. It also shows where and how sea levels are rising due to climate change.
Living with eyes in the skies
It’s hard to overstate just how much satellites have improved our understanding of weather and climate. But, perhaps just as importantly, they’ve changed how we visualize the world.
Check out this image, one of the first taken by the weather satellite that started it all, TIROS-1.
This photo was so important and high-value that prints were airlifted to President Eisenhower as soon as they were developed.
Compare that to this image, taken just this past December by a version of VIIRS on the brand new NOAA-20 satellite.
CAPTION: The first image of the new VIIRS device on the NOAA-20 satellite sees smoke from the Thomas fire in California. Credit: NASA/NOAA.
Nowadays, we sometimes take these images for granted. But being able to see our planet so clearly is an incredible human accomplishment, one to celebrate. Perhaps we can band together and use the data provided by these satellites to achieve an even bigger and more important goal: curbing climate change.
Banner image credit: NOAA.