In the next decade, US Navy scientists will be able to predict the weather as far as 90 days into the future with the help of mathematical models, satellites, and submarine drones.
The mathematical models are the most important element in the ocean and weather prediction cocktail. But making those models reliable that far into the future requires data from everywhere, including more places under the sea. That’s where the submarine drones make the difference.
Improved data from drones is one of the key elements of making naval environmental forecasting significantly better in the years ahead, Navy Research Lab scientist Gregg Jacobs said.
Today, the Slocum glider is the most recognizable drone that the Navy and others use in research. These five-foot-long sea robots collect data on their environment every few seconds and can descend to depths of 4,000 feet. The Navy plans to increase its fleet of the drones from 65 to 150 by 2015.
Submarine drones like the Slocum collect data on salinity and temperature at various spots in the ocean. For the Navy, that’s key to figuring out where to park submarines, since temperature and salinity help determine how fast sound travels through water. Finding the right spot can make a parked submarine much more difficult to detect. But the bigger value of the undersea drones is all the data they’ll contribute to ocean models and our ability to predict future weather.
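The article doesn’t name a formula, but the relationship it describes is commonly captured by empirical equations such as Mackenzie’s (1981) nine-term fit for sound speed in seawater. A minimal sketch, with illustrative inputs:

```python
def sound_speed(T, S, D):
    """Approximate sound speed in seawater (m/s), Mackenzie (1981) equation.

    T: temperature in deg C (valid roughly 2-30)
    S: salinity in parts per thousand (valid roughly 25-40)
    D: depth in meters (valid roughly 0-8000)
    """
    return (1448.96 + 4.591 * T - 5.304e-2 * T**2 + 2.374e-4 * T**3
            + 1.340 * (S - 35) + 1.630e-2 * D + 1.675e-7 * D**2
            - 1.025e-2 * T * (S - 35) - 7.139e-13 * T * D**3)

# Warmer, saltier, or deeper (higher-pressure) water carries sound faster,
# which is why a glider's temperature and salinity profiles matter to sonar.
print(sound_speed(10.0, 35.0, 0.0))     # surface water at 10 C, ~1490 m/s
print(sound_speed(4.0, 35.0, 1000.0))   # colder water at 1,000 m depth
```

The depth and temperature terms pull in opposite directions, which is what creates the sound channels submarines exploit.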
The Slocum isn’t the only underwater drone the military is developing. In its fiscal year 2015 budget request, the Defense Advanced Research Projects Agency wants $19 million for its Upward Falling Payload Program to “develop forward-deployed unmanned distributed systems [drones] that can provide non-lethal effects or situational awareness over large maritime areas.” That’s a spending increase of nearly 60 percent over last year.
Today, researchers use separate models to forecast for the ocean, atmosphere, waves and ice. This approach is inconsistent, according to Jacobs. He says that bringing together lots of different models and methods of measurement “in a single system modeling the whole earth environment will bring consistency and extended range forecasts out to 90 days” within the next decade.
The Navy is trying to make that happen in a couple of ways. First, there’s the Navy Ocean Forecast System, a complex computer program that uses meteorology, oceanography, satellite and sensor data to see into the future of the ocean, allowing a detailed view into the physics of water. The Navy uses this information specifically to predict the behavior of eddies, or big swaths of ocean currents. They work somewhat the way atmospheric cold and warm fronts do, but while cold fronts are often the size of continents and move over a span of days, eddies are hundreds of kilometers across and move over periods of months. They can also be extremely deep and hard to analyze.
The Navy recently announced a deal to share the Navy Ocean Forecast System software with the National Oceanic and Atmospheric Administration.
Not only will sea bots help researchers understand the ocean in greater detail, they’ll also allow the Navy to know how much confidence to put into a forecast at any one time. That’s key, since knowing what the weather might be in three months is less important than knowing when your model is breaking down.
“Forecasts have errors,” Jacobs said, “sometimes large and sometimes small and the errors vary across different areas and throughout the forecast time. Taking the uncertainty in the forecast into account is critical in operation decisions from tactical to strategic. This tells planners where and when significant risk lies for the operations because of larger uncertainty in the forecast.”
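One common way to quantify the uncertainty Jacobs describes is to run an ensemble of forecasts from slightly different starting conditions and treat the spread among members as a confidence signal. A minimal sketch with made-up numbers (the article doesn’t specify the Navy’s method):

```python
import statistics

def ensemble_summary(members):
    """Return the mean forecast and its spread (sample standard deviation)."""
    return statistics.mean(members), statistics.stdev(members)

# Hypothetical sea-surface temperature forecasts (deg C) for the same day,
# each from a perturbed initial state.
tight = [18.1, 18.3, 18.2, 18.4, 18.2]   # members agree: high confidence
loose = [15.0, 21.5, 17.8, 24.1, 12.9]   # members diverge: model breaking down

for name, ens in [("tight", tight), ("loose", loose)]:
    mean, spread = ensemble_summary(ens)
    print(f"{name}: mean={mean:.1f} C, spread={spread:.1f} C")
```

A planner reading the second ensemble wouldn’t trust the mean; the spread itself is the warning Jacobs is talking about.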
That may not sound terribly exciting until you consider that the ability to rapidly update weather predictions, and to avoid overconfidence in bad ones, helped the Allies win perhaps the most famous battle of 1944. Meteorologist Sverre Petterssen demonstrated the feasibility of short-term weather prediction based on the precise collection of weather data, and he was instrumental in helping the Allies successfully launch the D-Day assault on the beaches of Normandy.
“During the World War II D-Day invasion planning, he and other meteorologists convinced Dwight Eisenhower to postpone the operation by one day rather than the proposed 14 days. The acceptance of the one day delay saved lives. This work was followed by Lewis Richardson who proposed a method of forecasting weather by solving the governing equations through a computational technique. At the time the process was proposed in 1922, there was not computational power to apply the method. The proposed method has provided the basis for what is conducted in environment forecasting now,” Jacobs said.
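Richardson’s idea, stepping the governing equations forward on a grid, can be shown with a toy example (this is an illustrative sketch, not his actual 1922 scheme): one-dimensional advection of a disturbance, du/dt + c·du/dx = 0, solved with a first-order upwind difference.

```python
# Toy numerical forecast: carry a disturbance downstream on a periodic grid.
# All parameters are illustrative; c*dt/dx = 0.5 keeps the scheme stable.
c, dx, dt, steps = 1.0, 1.0, 0.5, 40
u = [0.0] * 50
u[10] = 1.0   # initial "weather" disturbance at grid cell 10

for _ in range(steps):
    # Upwind finite difference: each cell is updated from its upstream neighbor.
    # The comprehension reads the old list, so all cells update simultaneously.
    u = [u[i] - c * dt / dx * (u[i] - u[i - 1]) for i in range(len(u))]

# The disturbance should have traveled roughly c * dt * steps = 20 cells.
peak = max(range(len(u)), key=lambda i: u[i])
print(peak)   # index near 30
```

Richardson tried to do this by hand for the real atmosphere, a computation so large he imagined a “forecast factory” of thousands of human calculators; electronic computers finally made it practical.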
The success of Petterssen’s prediction changed history. But that Allied win wasn’t the only outcome of his prognosticating powers. It’s also the reason we have computers.
With the feasibility of short-term weather prediction effectively proven, Hungarian-born mathematician John von Neumann and RCA engineer Vladimir Zworykin arrived at the office of Admiral Lewis Strauss in the fall of 1945 to request money for a grand project, a machine that could perform the calculations necessary to predict future weather patterns.
Von Neumann had been an important player in the successful completion of the Manhattan Project and was developing a reputation in Washington, D.C., as one of the nation’s most important scientific minds. What von Neumann and Zworykin were after was a machine that would allow the user to input any date or set of coordinates, Tokyo in the year 2010 for instance, and receive a weather forecast.
More provocatively, for von Neumann and Zworykin, the ability to predict the weather was a necessary first step to using it like a weapon. In his Outline of Weather Proposal, Zworykin laid out how “The eventual goal to be attained is the international organization of means to study weather phenomena as global phenomena and to channel the world’s weather.”
Von Neumann agreed: “This would provide a basis for a scientific approach for influencing the weather… This control would be achieved by perfectly timed and calculated explosions of energy.”
The research they did with the money that they received played a key role in the development of random access memory, or RAM, which forms the basis of modern-day computing. But we never did make a machine that could predict the weather infinitely into the future, much less control it.
Today, we know that the weather is too chaotic a system to be predicted with the sort of precision that von Neumann envisioned. But, thanks to better simulations, we can create a much clearer picture of the weather and, more importantly, how the weather is evolving on a minute-by-minute basis. The Navy now uses massive super computers to run data to allow for rapidly updating forecasts and projections. Exponentially increasing computer power at decreasing cost is another reason weather prediction will get much better in the next decade, Jacobs said.
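The chaos at issue can be illustrated with the Lorenz (1963) system, a classic toy model of atmospheric convection not mentioned in the article but standard for showing why forecasts degrade: two runs that start a hair apart end up in completely different states.

```python
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One Euler time step of the Lorenz equations; crude, but fine for a demo.
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)   # perturbed by one part in 100 million

max_gap = 0.0
for _ in range(3000):        # 30 model time units of "weather"
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    max_gap = max(max_gap, abs(a[0] - b[0]))

# The gap between the runs grows from 1e-8 to the full size of the attractor.
print(max_gap)
```

This sensitivity to initial conditions is exactly why rapidly updated observations, from drones or satellites, matter more than raw computing power alone.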
Despite all this progress, weather data is a strategic advantage that we’re on the verge of losing, and not because of the machines in the water, but the ones in space. Six of NASA’s 13 earth-monitoring satellites will no longer be in operation by 2016. This will likely result in a gap of earth-monitoring capability that could persist through 2017 or even beyond. President Barack Obama’s FY15 budget requests $2 billion for NOAA satellites. But satellite spending is hardly safe. The budget request cuts spending for Navy satellite communications to $41,829,000 from $66,196,000 and cuts the Navy Satellite Control Network to $20,806,000 from $35,657,000.
The 2015 Air Force budget requests money to begin research on polar weather satellites to replace the current aging system. But the new satellites likely would not be ready until 2020.
So we’ll have more drones but fewer satellites, at least in the near term.
A 2011 Government Accountability Office report warned that without improvements to our earth-monitoring capabilities, we “will not be able to provide key environmental data that are important for sustaining climate and space weather measurements.” The GAO updated the report last year and found a lot of improvement, but also further cause for alarm, stating: “Potential gaps in environmental satellite data beginning as early as 2014 and lasting as long as 53 months have led to concerns that future weather forecasts and warnings—including warnings of extreme events such as hurricanes, storm surges, and floods—will be less accurate and timely.”
Perhaps it goes to show that for all the money and brain power the government has spent trying to understand fronts, winds and even undersea eddies, the hardest place to predict the weather is Washington, DC.
This originally appeared on Defense One.