On Wednesday, May 12, 2010, Northrop Grumman Corporate Vice President and Chief Technology Officer Alexis Livanos addressed the AIAA Annual Conference in Crystal City, Virginia. Below are his remarks.

Climate Change: Practical Actions for Practical Applications

Let's start with what we know about climate change. To begin, we know that climate change is not so much a single, current problem as it is a collection of problems cresting the horizon. Some of these problems will be large, some small. Some will be provincial, some global. Some we understand. Others are poorly understood, and some are not yet even foreseen.

These problems will affect areas as divergent as national security concerns associated with population shifts, agriculture, fisheries, forestry, transportation, water management, infrastructure construction, emergency response preparedness, air quality, even homeland security, and many others. Each of these problems requires practical solutions. What do I mean by "practical"? Well, within the context of climate change, a practical solution is one that is available, and useful, to local or regional policy-makers to address local or regional climate change problems and challenges.  Furthermore, because local or regional policy-makers are laypersons in the worlds of science and climate physics, practical solutions must be accessible to people operating at all levels of climate change understanding.

This is a tall order because each of those many and varied practical solutions requires an immense amount of data to be analyzed – data to identify the problem; data to identify the best solutions; data to execute those solutions our policy-makers settle on; data to monitor the progress and effectiveness of solutions long-term.  So, here is something else we can say about practical climate change solutions: They require a lot of data from a lot of independent sources – data that all needs to be reliable, accurate, and sustained over time.

This requirement for access to data, per se, is not problematic. We already collect tremendous amounts of it – not all that we need by any means, but tremendous amounts nonetheless. And much of it is directly related or applicable to the many problems of climate change. 

Governments, universities, private laboratories and other entities around the world collect vast amounts of Earth-monitoring data, climate information, and other related scientific data which are analyzed through a patchwork of national agencies, laboratories, and other data providers. We have an alphabet soup of organizations that provide enormous amounts of data: NASA, NOAA, USGS, EPA, DOE, JPL and others in the United States, and many more internationally.  Governmental or non-governmental, commercial, or academic, all of the data collected is important, and the science and policy communities have established requirements for other types of data that are not yet available operationally.

Valuable data is currently being collected by capable systems. And there is a great deal that we can do with that data to mitigate many of the coming problems associated with climate change. 

But before we can optimize the exploitation of current data, our policy-makers must have access to a very particular kind of tool set – tools that allow them to identify, execute, and monitor the best policy or measure for a given problem. Let’s discuss this tool set in terms of a supply chain, and let’s take an inventory of the supply chain’s components to find out what we have, what we lack, and what we need to acquire before we can place it into action.

The supply chain I'm talking about has three components. The first of these comprises the observing systems. There are too many to list them all here. Fortunately, the World Meteorological Organization (WMO) and the Committee on Earth Observation Satellites (CEOS) collaborate to maintain a current inventory of space-based systems and sensors.  Polar-orbiting and geostationary satellites monitor most of the 26 essential climate variables identified by the Intergovernmental Panel on Climate Change (IPCC).  Current space sensors collect data on everything from coastline erosion to sea level rise, from ozone to air quality, from surface temperatures to volcanoes, from sea ice to agricultural production.

Increasingly, space-based systems are augmented by the growing sophistication and use of high altitude, long endurance air-breathing platforms. NASA’s GLOPAC program is an exciting current example of the synergy between space and air. Instruments NASA has mounted on a Global Hawk unmanned aircraft are generating an enormous body of data on our planet’s water cycle and air quality, which is correlated with data collected by sensors on the space-based Aqua and Aura platforms. This is technological symbiosis, and it is serving the interests of climate science in very important ways.

In addition, air-breathers are doing incredibly important work in their own right.  Let me explain how.

There are about 30,000 different data sets that the Intergovernmental Panel on Climate Change has identified to establish a sufficient knowledge base for understanding how climate change impacts life on our planet.  Of that 30,000, only a very small number come from the tropics. This knowledge gap is all the more astonishing when you consider that 2 billion people – and at least half of all species on the planet – depend directly on tropical forests to survive. And we all understand how important the forests of these zones are as carbon sinks. The data returned from over-flights of Costa Rica have already proven the value of those sensors to our understanding of carbon sequestration.

If climate understanding is to be anything more than an academic exercise, we must come to understand things like the drought patterns and the migration of humans and other species that these changes will induce.

Today, that represents a pretty tall order. Data from the tropics comes from either a biologist who hand-counts species on site – one square meter at a time – or from that vast, space-based global view that I alluded to a moment ago. That leaves an enormous gap in the kind of ecosystems monitoring central to future mitigation efforts. Such technologies as multi-spectrum sensing and laser radar – or LIDAR – are critical to filling that gap. And airborne missions like GLOPAC and missions conducted by 3001 for Conservation International are critical for getting those sensors over the areas to be monitored.

But sensing gaps are not the only problem. In the aggregate, data provided by these sensors possess a diffuse and patchwork quality that makes their full potential difficult to realize. And this brings us to the second component of the supply chain – the computer models that consume, assimilate and establish trends from sensor data.

Earth science models developed to provide both trend and forecast products are extremely important to our understanding of our climate and to the progress of applying climate science to forecast future scenarios. We can certainly give due credit to the DOE, EPA, NASA, NOAA, NSF and affiliated universities for the constant improvements and evolution of these models. A general trend for climate models is toward enabling broader practical utility by providing forecasts on a spatial and temporal resolution conducive to use at the regional to local scale.

For all their progress and improvement in forecast skill, model developers are striving to overcome the limitation of only being able to generate illustrations of the climate on a global scale and to a generalized timeline. The drive is to make the models versatile enough for practical uses at the local level. All in all, the ocean of sensor data – and the computer models these data feed – and the good science they enable, are increasingly ready to contribute benefits beyond the research domain and into the practical applications domain.

As the research community continues to advance the practical benefits of the second component of the policy-maker’s supply chain – the computer models – the applied sciences community is working on the third component – decision support.  

The goal of the decision support component of the policy-maker’s supply chain is to synthesize and integrate climate data and modeling products into practical, decision-quality knowledge – knowledge that is accessible on a practical basis and supports individual policy-makers and managers with the best available information to guide their decisions and actions.

Currently, too much of the data generated by the many sensors I have mentioned are segregated from each other – and a similar situation exists for the computer models they feed.  Integrating products from sensors and models and consolidating the output into something that supports the decisions of local and regional policy-makers is a key benefit to everyone involved. This kind of informed decision support opens up a new world of benefits and opportunities that we are just beginning to imagine.

We are all familiar with the Global Earth Observing System of Systems, or GEOSS. It provides a framework that is enabling integration of the Earth observing systems around the globe and making Earth information universally available for the benefit of our world’s people. GEOSS is a “macro” approach. As such, it serves to inform participating governments and organizations on what is working, and what is needed, including enhancements to overcome limitations for the local or regional policy-maker trying to mitigate a local or regional climate change problem.

What I suggest is needed is something less “macro” and more “micro”: something that bridges the gap between the mass of distributed scientific data on one hand, and the ability to translate that data into reliable, practical, decision-quality knowledge on the other.  We need the third component of the policy-maker’s supply chain to be readily accessible tools and portals – enabling a practical decision-support component.

So what does such decision support look like? There are any number of ways this vision can be made manifest.  One way is through the establishment of what might be called Climate Knowledge Integration Centers. Think of these as the information portals – terminals to tailored information solutions – that are broadly accessible to national, regional, local, and private decision makers.

One can foresee such centers being staffed with indigenous analysts and experts, equipped with high-powered computer infrastructure. The professional staffs, operational processes, and even the broadband networks of these centers can be closely interfaced with federal, state, and local government agencies, as well as international and public institutions. In this way, these centers access the wealth of international data and integrate it into useable information relevant to a specific problem or decision in a specific region. These centers can house local experts to verify and validate the right solutions, in the right places, at the right time.

Imagine the advantages of such decision-quality information in the hands of our regional and local policy-makers. Take public health, for example. The instructive decision support supplied by these Climate Knowledge Integration Centers might be delivered in the form of regional heat forecasts. Let’s look at Washington D.C.

Data from the IPCC’s models give us the difference between the mean number of oppressive days per year currently, and the number projected fifty years from now.  An oppressive day is defined as any day over 105 degrees. Currently, the mean number of such days is 10.1 per year for this city. Fifty years from now, the IPCC forecasts that number to be 39.4 – an increase of 29.3 oppressive days per year. Now, at an historic rate of 26.37 deaths per million, Washington’s 5.3 million residents could suffer as many as 36 more deaths per million fifty years from now.
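The arithmetic behind that heat projection is simple enough to sketch. The figures below are the ones just quoted; the assumption that the per-million excess can be scaled across the stated population is mine, purely for illustration.

```python
# Illustrative sketch of the heat-mortality arithmetic described above.
# All figures are the speaker's; scaling the per-million excess to an
# absolute regional total is an assumption for illustration only.

current_oppressive_days = 10.1    # mean days/year over 105 degrees today
projected_oppressive_days = 39.4  # IPCC-model projection, fifty years out
population_millions = 5.3         # Washington, D.C. area residents
excess_deaths_per_million = 36    # projected additional deaths per million

extra_days = projected_oppressive_days - current_oppressive_days
extra_deaths = excess_deaths_per_million * population_millions

print(f"{extra_days:.1f} more oppressive days per year")
print(f"~{extra_deaths:.0f} additional deaths across the region")
```

A decision-support portal would of course replace these hand-entered constants with downscaled model output for the planner's own region.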

Agricultural stress is another important subject for mitigation by these knowledge centers. We know that agricultural stress can be quantified based on water availability versus irrigation use. Based on this, a Climate Knowledge Integration Center could take a satellite image of a given area and break it down by grid cells – each cell about 25 kilometers squared. Then, for each grid cell used for agriculture, dividing irrigation water use by the water availability over that cell yields a stress ratio. Any cell with a ratio greater than one could be portrayed as brown, indicating an area of projected agricultural stress. These methods of forecasting regional conditions are collectively called Regional Climate Downscaling.
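A minimal sketch of that grid-cell screening might look like the following. The cell values and the simple use-to-availability ratio are invented for illustration; they are not an actual downscaling product.

```python
# Sketch of the agricultural-stress screening described above.
# Values are invented for illustration; a real system would derive
# irrigation use and water availability per grid cell from satellite
# imagery and downscaled climate model output.

irrigation_use = [      # water used by agriculture, per cell (arbitrary units)
    [0.8, 1.4, 0.0],
    [2.1, 0.9, 1.1],
]
water_availability = [  # projected water available, per cell (same units)
    [1.0, 1.0, 1.0],
    [1.5, 1.2, 0.9],
]

def stress_map(use, avail):
    """Mark a cell 'brown' where the use/availability ratio exceeds one,
    else 'ok'. Cells with no agriculture (zero use) are never stressed."""
    return [
        ["brown" if u > 0 and u / a > 1.0 else "ok"
         for u, a in zip(row_u, row_a)]
        for row_u, row_a in zip(use, avail)
    ]

for row in stress_map(irrigation_use, water_availability):
    print(row)
```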

They would be the forte of the Climate Knowledge Integration Centers and could allow local policy-makers to inform the public to prepare for additional heat stress in urban and agricultural areas, avoiding unnecessary casualties and crop losses that could otherwise ensue.  Lives can be saved and local economies spared.

Or consider the critical task of water management. The effect of sea-level rise and storm surge will be important issues for coastal area planning. Simulations of storm frequency and intensity from regional climate models, coupled with coastal inundation models can provide planners with critical adaptation information. For example, using Regional Climate Downscaling, it is possible to predict the average number of years between flooding events for a given area.

By taking the average years between events and modeling them with forecasted sea level rises, we can get a good idea of just how much we can expect the time between such events to be reduced. This would give regional planners the adaptation information they need to prepare for, and mitigate, flood damage.
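As a sketch of how such a reduction could be estimated: if we assume extreme water levels follow an exponential tail (a common simplification in flood-frequency analysis; the parameter values below are invented), then a sea-level rise of s shortens the return period of a fixed flood height by a factor of exp(-s/scale).

```python
import math

# Sketch: effect of sea-level rise on flood return periods, assuming an
# exponential tail for extreme water levels. All parameter values are
# invented for illustration, not taken from any real coastal study.

def new_return_period(t0_years, slr_m, scale_m):
    """Return period of today's t0-year flood level after slr_m meters of
    sea-level rise, given an exponential tail with scale scale_m meters."""
    return t0_years * math.exp(-slr_m / scale_m)

t0 = 100.0   # today's "100-year" flood
slr = 0.3    # assumed sea-level rise, meters
scale = 0.2  # assumed exponential scale of extreme water levels, meters

print(f"Former {t0:.0f}-year flood recurs about every "
      f"{new_return_period(t0, slr, scale):.0f} years")
```

The point for planners is the direction and rough magnitude of the shift, not the exact figure, which depends heavily on the local water-level statistics.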

As you can understand, there is an enormous difference to people’s lives between being able to perform these functions on a global scale, and being able to reliably perform and deliver them regionally and locally. Currently the gap between these two capabilities is wide. Climate Knowledge Integration Centers could fill it to a large degree.

Fortunately, many of the technologies, and a great deal of the sciences that are needed, already exist to attack this problem. Some of them can be harvested from national security applications developed over the years. I’ll share an example.

The pace of data-gathering technology initially offered our military commanders unprecedented situational awareness of the battle space on one hand, but with the penalty of severe information overload on the other. Commanders who needed to forward specific information relevant to a unit as small as an infantry squad or a pair of attack jets were required to “sip water from a fire hose,” as they put it. They had to be able to pull out that tiny thread of relevant information from a mountain of data – and quickly. What they needed was the ability to access and integrate their data, make sense of it, and share it on a network among their forces. The trick is to only deliver the right data, to the right people, at the right time.

The solution was a combination of computing power and expertise that allowed different categories of intelligence – imagery, communications intercepts, and others – to be combined and correlated into a massive body of data. The breakthrough was the ability of the commander to tailor his information search by time and map coordinates. All the while, other commanders in other places were able to do the same thing at the same time for their specific needs. This problem has been a tough nut to crack, and the technologies designed to solve it continue to progress and improve. But I contend that they are effective in more than just military applications for national security; they are adaptable to the needs of climate change decision support as well.

Today, the global debate on climate change is in sharp contrast to our technical abilities to understand how to adapt and mitigate impacts. Waiting for consensus to address the evolving problems and challenges associated with climate change is not a safe course of action. Once data collection is streamlined and rationalized to feed climate models; once validated computer models are systematically delivering forecasts to decision tools, then we will have arrived at a critical juncture and humankind can fully realize the benefits. For how we choose to solve the problems associated with climate change will largely foreordain the result. Those solutions that will have the greatest impact on people’s lives will be those enacted in the shortest, most responsive time, at the most localized level. We will need to take care not to give in to the easy, but futile, expectation of awaiting some global consensus to form an ark that will carry us to our rescue.

In addition, the solutions we choose to implement will have to be solutions that are consistent with our nature as humans and as Americans. As such, they will have to harness, not reject, the ingenuity that marks our species. The solutions will have to harness, not reject, the free market system that has proven so successful at generating prosperity. Those solutions will work best that spring from what we do best, which is innovate. It may now be within our power to ensure that our responses to the coming climate changes are not merely reactive, but proactive. The climate changes that crest the horizon may be beyond our immediate control. How we choose to meet them is not.

My colleagues and I at Northrop Grumman stand ready to work together with our community to bring innovative, effective, solutions that serve people across the country and around the world.