WASHINGTON, DC, March 20, 2014 (ENS) – The White House, federal agencies and private partners Wednesday unveiled the Climate Data Initiative, a project that gives local leaders across the country information to plan for the impacts of climate change while building resilience to rising temperatures already affecting Americans.
The Climate Data Initiative makes use of the federal government’s extensive, freely available climate-relevant data resources to stimulate innovation and private-sector entrepreneurship in support of national climate-change preparedness, the White House said in a statement.
“Climate change is a fact. And when our children’s children look us in the eye and ask if we did all we could to leave them a safer, more stable world, with new sources of energy, I want us to be able to say yes, we did,” said President Barack Obama in his State of the Union Address on January 28, a sentence the White House quoted in launching the new initiative.
The Climate Data Initiative also fulfills a promise in Obama’s Climate Action Plan announced last June – ensuring that communities across America have access to the information and tools they need to protect themselves from harm today and potential damage in the future.
This means connecting regional and city planners, resource managers, farmers, hospitals, and businesses with data-driven tools to help them better understand, manage, and prepare for the real-world impacts associated with climate change.
With leadership from the National Oceanic and Atmospheric Administration, NOAA, and the National Aeronautics and Space Administration, NASA, the administration is launching climate.data.gov.
This new climate-focused section of Data.gov, the federal government’s open data platform hosted by the General Services Administration, will make federal data about the climate more open, accessible, and useful to citizens, researchers, entrepreneurs, and innovators.
First, the initiative will focus on providing data and tools related to sea-level rise and coastal flooding. For instance, maps of future sea-level rise can help builders decide where to break ground out of harm’s way. Other online tools can help water utility operators identify potential threats to the local water supply.
NOAA and NASA are launching an innovation challenge to encourage entrepreneurs, technologists, and developers to create and deploy data-driven visualizations and simulations that help people understand their exposure to coastal-inundation hazards and other vulnerabilities.
This Coastal Flooding Challenge will culminate in a two-day event on April 12-13, as part of broader activities around the International Space Apps Challenge – a global mass collaboration inviting teams of problem solvers to leverage publicly available data in their design solutions for global challenges.
The U.S. Geological Survey, U.S. Department of Homeland Security, U.S. Department of Defense, and National Geospatial-Intelligence Agency are releasing a collection of datasets containing mapping information about hundreds of thousands of the nation’s infrastructure units and geographical features, including bridges, roads, railroad tunnels, canals, and river gauges.
These data, which have been reviewed by all the agencies involved and deemed “non-sensitive,” are being made available through user-friendly mapping services on Geoplatform.gov and climate.data.gov.
But it’s one thing to release data and another to make it easy to access. Of the 20 terabytes of data NOAA gathers every day, only a small percentage is easily accessible to the public.
NOAA now is seeking input from industry, nonprofits, research laboratories, universities, and private-sector partners on ways to make all this data available to the public in a rapid, scalable manner.
The release of massive amounts of government data will be supported by the philanthropic and private sectors.
Google is committing one petabyte (1,000 terabytes) of cloud storage to house satellite observations, digital elevation data, and climate and weather model datasets drawn from government open data and contributed by scientists, along with 50 million hours of high-performance cloud computing on the Google Earth Engine geospatial analysis platform.
To leverage these resources, Google announced new partnerships with the Desert Research Institute, the University of Idaho, and the University of Nebraska to provide drought mapping and monitoring for the entire continental United States in near real-time, and to model water consumption by vegetation across the planet.
Intel, Microsoft, and Esri will create maps, apps, and other tools and programs to help local officials and other stakeholders understand the climate risks specific to their communities.
Esri, a mapping giant, is embarking on a two-part initiative. First, Esri will partner with 12 cities to develop and publish a series of free and open “maps and apps” that address the most urgent climate-related needs shared among thousands of users of Esri’s ArcGIS platform, such as preparing for droughts, heat waves, or flooding.
Esri also announced Wednesday a climate-focused geo-collaboration portal to discover, contribute, and share resources on confronting the impacts of climate change.
Additionally, Esri is hosting a Climate Resilience App Challenge to inspire more than 2,500 developers to create mapping and analytical tools that help communities see, understand, and prepare for climate risks. Prizes will be awarded and the resulting apps will be openly shared in July.
Microsoft Research announced a new program to provide 12 months of free cloud computing resources to 40 awardees selected from project proposals submitted by June 15, 2014. Each award provides up to 180,000 hours of free cloud computing time and 20 terabytes of cloud storage.
Intel Corporation, as part of its Code for Good program, is sponsoring three regional partnerships including “hackathon” events focused on climate resilience in the Chesapeake Bay, New Orleans, and San Jose. In each place, Intel and local partners will convene teams of engineering and computer science specialists and students to develop new software applications and tools focused on climate-change resilience.
Environmental groups are making contributions of their own.
The World Resources Institute says its annual hackathon event, EcoHack, will feature a dedicated climate track leading to a climate data visualization, app, or website.
In response to the new Climate Data Initiative, the Environmental Defense Fund and the UCLA Luskin Center for Innovation announced the newest version of the Los Angeles Solar and Efficiency Report, or LASER, a climate mapping tool to help local leaders identify opportunities to invest in clean energy jobs and strengthen climate resiliency in vulnerable communities.
Using data from multiple climate models and other public data, LASER maps illustrate what climate change is going to look like in the Los Angeles region in a few decades, with a focus on impacts on the 3.6 million Los Angeles County residents living in environmentally vulnerable communities already burdened by air pollution and other environmental hazards.
“EDF is proud to answer President Obama’s Climate Data Initiative call to action, and along with the UCLA Luskin Center for Innovation, we look forward to joining a national movement that leverages data-driven insights to help communities effectively prepare for the impacts of climate change,” said Jorge Madrid, coordinator, partnerships and alliances with the Environmental Defense Fund.