U.S. Geological Survey News Feed

New Simulations of 1811-1812 New Madrid Earthquakes Show Strong and Prolonged Ground Shaking in Memphis and Little Rock

U.S. Geological Survey News Feed - July 30, 2015 - 4:00pm
Summary: Computer simulations of earthquake shaking, replicating the quakes that occurred in 1811-1812 in the New Madrid seismic zone (NMSZ), indicate that future large earthquakes there would produce major, prolonged ground shaking

Contact Information:

Heidi Koontz ( Phone: 303-202-4763 );



Computer simulations of earthquake shaking, replicating the quakes that occurred in 1811-1812 in the New Madrid seismic zone (NMSZ), indicate that future large earthquakes there would produce major, prolonged ground shaking. The 1811-1812 events were some of the largest in the United States since its settlement by Europeans, and the NMSZ spans portions of seven states: Illinois, Indiana, Missouri, Arkansas, Kentucky, Tennessee and Mississippi. 

Scientists from the Universidad Nacional Autónoma de México, the U.S. Geological Survey, San Diego State University, AECOM (formerly URS Corporation), and the University of Memphis simulated a set of 20 hypothetical, yet plausible earthquakes located along two currently active faults in the NMSZ. The hypothetical earthquake scenarios range in magnitude from 7.0 to 7.7, and consider various possible epicenters. 

“Based on our simulations, were the 1811-1812 earthquakes to repeat today, more than 8 million people living and working near the New Madrid seismic zone would experience potentially damaging ground shaking at modified Mercalli intensities ranging from VI to VIII,” said Leonardo Ramirez-Guzman, lead author of the paper that appears in the July 30 edition of the Bulletin of the Seismological Society of America.

“Strong ground shaking in the greater Memphis metropolitan area could last from 30 seconds to more than 60 seconds, depending on the magnitude and epicenter of a potential seismic event,” said Ramirez-Guzman, a professor at Universidad Nacional Autónoma de México and former USGS contract scientist.

The simulations also demonstrate the importance of fault rupture directivity (seismic energy focused along the direction of faulting), especially when combined with the wave channeling effects of the Reelfoot rift, a buried, northeast-southwest trending geologic valley in the NMSZ. In particular, future large earthquakes on the approximately 80-mile-long NMSZ fault show strong shaking at vibration frequencies that pose a risk for mid-rise to high-rise buildings and tall bridges. This fault is thought to be responsible for the December 16, 1811, magnitude 7-7.7 earthquake. Some of the earthquake simulations showed strong shaking focused to the northeast as far as 100-200 miles away near Paducah, Kentucky, and Evansville, Indiana, and to the southwest 150 miles toward Little Rock, Arkansas. An example of this earthquake shaking focusing effect can be seen here.
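The concern for mid-rise and high-rise structures reflects resonance: taller buildings have longer natural periods (lower natural frequencies), which can coincide with the long-period shaking that travels efficiently along the rift. The sketch below illustrates the idea using a common 0.1-second-per-story rule of thumb and an assumed 0.2-1.0 Hz shaking band; both numbers are illustrative assumptions, not values from the simulations.

```python
# Illustrative sketch: compare a building's approximate resonant frequency with an
# assumed band of strong long-period ground shaking. The 0.1 s-per-story rule of
# thumb and the 0.2-1.0 Hz band are assumptions for illustration only.

def fundamental_period_s(stories: int) -> float:
    """Rough rule of thumb: about 0.1 seconds of natural period per story."""
    return 0.1 * stories

def resonates(stories: int, band_hz=(0.2, 1.0)) -> bool:
    """True if the building's natural frequency falls inside the shaking band."""
    freq_hz = 1.0 / fundamental_period_s(stories)
    return band_hz[0] <= freq_hz <= band_hz[1]

for stories in (2, 10, 30):
    print(stories, "stories:", "at risk of resonance" if resonates(stories) else "outside band")
```

Under these assumptions a two-story house sits well above the band, while mid-rise and high-rise buildings fall inside it, which is why long-period shaking channeled along the rift matters most for taller structures.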

While it’s not possible to know which direction a fault will rupture once an earthquake starts, knowing that there is an increased chance of strong shaking along these geologically-defined corridors is a valuable aid in better characterizing seismic hazard and minimizing earthquake risk.

Earthquakes pose a significant risk to nearly 150 million Americans. The USGS and its partners in the multi-agency National Earthquake Hazards Reduction Program are working to improve earthquake monitoring and reporting capabilities via the USGS Advanced National Seismic System (ANSS). More information about ANSS can be found on the ANSS website.

Peak ground-motion variability for a magnitude 7.7 earthquake. Warmer colors indicate stronger ground motions. The stronger ground motions extend farther northeast and southwest because of the channeling effect of the Reelfoot rift (RFR). The fault is displayed as a thick black continuous straight line, with the epicenter indicated by the triangle. (high resolution image 1.3 MB)

USGS Awards $4 Million to Support Earthquake Early Warning System in California and Pacific Northwest

U.S. Geological Survey News Feed - July 30, 2015 - 2:00pm
Summary: The U.S. Geological Survey has awarded approximately $4 million this week to four universities – California Institute of Technology, University of California, Berkeley, University of Washington and University of Oregon – to support transitioning the “ShakeAlert” earthquake early warning system toward a production stage

Contact Information:

Leslie Gordon, USGS ( Phone: 650-329-4006 ); Deborah Williams-Hedges, Caltech ( Phone: 626-395-3227 ); Robert Sanders, UCB ( Phone: 510-643-6998 );



Additional Contacts: Hannah Hickey, UW, 206-543-2580, hickeyh@uw.edu and Jim Barlow, UO, 541-346-3481, jebarlow@uoregon.edu

RESTON, Va.— The U.S. Geological Survey has awarded approximately $4 million this week to four universities – California Institute of Technology, University of California, Berkeley, University of Washington and University of Oregon – to support transitioning the “ShakeAlert” earthquake early warning (EEW) system toward a production stage. A functioning early warning system can give people a precious few seconds to stop what they are doing and take precautions before the severe shaking waves from an earthquake arrive.

The USGS has additionally spent about $1 million to purchase new sensor equipment for the EEW system. These efforts are possible because of a $5 million increase to the USGS Earthquake Hazards Program for EEW approved by Congress earlier this year.

Under the new cooperative agreements, the USGS and its four university partners will collaborate to improve the ShakeAlert EEW system across the west coast of the United States, and will continue to coordinate across regional centers in southern California, northern California, and the Pacific Northwest. The USGS and its university partners will continue development of scientific algorithms to rapidly detect potentially damaging earthquakes, more thoroughly test the system, and improve its performance. In addition, they will upgrade and construct approximately 150 seismic sensors to improve the speed and reliability of the warnings. They will also develop user training and education, and add additional test users. There are currently 70 organizations that are test users, from sectors such as utilities and transportation, emergency management, state and city governments, and industry.

In 2006 the USGS began funding multi-institutional, collaborative research to start the process of testing earthquake early warning algorithms on real-time seismic networks within the USGS Advanced National Seismic System. Today, the ShakeAlert demonstration EEW system is issuing alerts to the group of test users across the U.S. west coast in California, Oregon and Washington. In California, this is a joint effort, where state legislation was passed directing the California Office of Emergency Services and USGS to partner on development of an early warning system. The new awards will expand the number of end users and are another step toward improving the speed and reliability of ShakeAlert.

During the August 2014 magnitude-6.0 South Napa earthquake, an alert was issued providing a nine-second warning to the City of San Francisco. During a May 3 magnitude-3.8 event in Los Angeles, an alert was issued 3.3 seconds after the earthquake began, meaning the warning was sent before the secondary, or “S,” waves, which have the potential for the strongest shaking, had even reached the Earth’s surface. An electronic alert message that travels at the speed of light can outrun the slower earthquake S-waves, providing valuable seconds of warning. Those few seconds of warning can be enough time to stop a commuter train or an elevator, open firehouse doors, stop delicate surgery and “duck, cover, and hold on.”
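To make the arithmetic behind those warning times concrete, the sketch below compares S-wave travel time against a fixed detection-and-alerting delay. The S-wave speed and latency used here are assumed round numbers for illustration, not measured ShakeAlert performance figures.

```python
# Minimal sketch: an alert can outrun S waves because, after a short detection and
# telemetry delay, it arrives essentially instantly, while S waves travel at only a
# few km/s. The speed and latency below are assumptions, not ShakeAlert values.

S_WAVE_SPEED_KM_S = 3.5   # assumed average crustal S-wave speed
ALERT_LATENCY_S = 5.0     # assumed time to detect the quake and issue the alert

def warning_time_s(epicentral_distance_km: float) -> float:
    """Seconds of warning at a site; negative means the S waves arrive first."""
    s_wave_arrival_s = epicentral_distance_km / S_WAVE_SPEED_KM_S
    return s_wave_arrival_s - ALERT_LATENCY_S

for distance_km in (20, 60, 100):
    print(f"{distance_km} km from the epicenter: ~{warning_time_s(distance_km):.0f} s of warning")
```

Under these assumptions, sites close to the epicenter get little or no warning, while sites tens of miles away can receive several seconds to tens of seconds.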

The plans for ShakeAlert were evaluated by a scientifically rigorous peer-review process: a panel of experts praised the progress achieved and recommended the proposed improvements. The successes of this effective ShakeAlert collaboration among the USGS and the universities led Congress to appropriate $5 million to the USGS in fiscal year 2015 to accelerate the process of migrating towards a public EEW system. In addition to USGS and university partners, the ShakeAlert system involves the participation of state and local governments, end users, and private-sector partners.

New England Maps Adding Trails

U.S. Geological Survey News Feed - July 30, 2015 - 11:30am
Summary: Several of the new US Topo quadrangles for New Hampshire and Vermont now display parts of the Appalachian National Scenic Trail (A.T.) and other selected public trails. Newly released US Topo maps for New Hampshire, Vermont, Connecticut, Massachusetts and Rhode Island now feature selected trails and other substantial updates

Contact Information:

Mark Newell, APR ( Phone: 573-308-3850 ); Larry Moore ( Phone: 303-202-4019 );



Several of the new US Topo quadrangles for New Hampshire and Vermont now display parts of the Appalachian National Scenic Trail (A.T.) and other selected public trails. Also, parts of the new maps for Connecticut and Massachusetts feature segments of the New England National Scenic Trail as well as sections of the A.T. Further, all of these revised New England maps, including the new US Topo maps for Rhode Island, highlight significant additions to the new quads such as map symbol redesign, enhanced railroad information and new road source data.

“US Topo maps are the ‘gold standard’ for mapped information,” said Fred Dieffenbach, who coordinates environmental monitoring along the A.T. for the National Park Service. “And the inclusion of the Appalachian National Scenic Trail in this latest update illustrates the significance of this prized resource to the American public.”

For East Coast residents, recreationists and visitors who want to explore the featured New England trails by bike, on foot, on horseback or by other means, the new trail features on the US Topo maps will be useful.

The Appalachian NST is a public footpath that traverses more than 2,100 miles of the Appalachian Mountains and valleys between Katahdin, Maine (northern terminus), and Springer Mountain, Georgia (southern terminus). The Trail winds through scenic, wooded, pastoral, wild, and culturally resonant lands along this ancient mountain range. With more than 99% of the A.T.’s corridor on Federal or State land, it is the longest continuously marked, maintained, and publicly protected trail in the United States.

“The National Park Service has committed significant resources to understanding the environmental health of the lands and resources that characterize the Appalachian Trail along its entire length,” Dieffenbach continued. “It is extremely gratifying to know that its inclusion in the most recent update was a high priority, and clearly validates the efforts of all the people involved with the management of the A.T.”

The New England NST covers 215 miles from Long Island Sound across long ridges to scenic mountain summits in Connecticut and Massachusetts. The trail offers panoramic vistas and close-ups of New England’s natural and cultural landscape: trap rock ridges, historic village centers, farmlands, unfragmented forests, quiet streams, steep river valleys and waterfalls.

The USGS partnered with the National Park Service, the Appalachian Trail Conservancy and other organizations to incorporate the trail data onto the updated New England US Topo maps. These two NSTs join the Ice Age National Scenic Trail, the Pacific Northwest National Scenic Trail, the North Country National Scenic Trail, the Pacific Crest National Scenic Trail, and the Arizona National Scenic Trail as trails featured on the new US Topo quads. The USGS hopes to eventually include all National Scenic Trails in The National Map products.

Some of the other data for new trails on the maps are provided to the USGS through a nationwide “crowdsourcing” project managed by the International Mountain Biking Association (IMBA). This unique crowdsourcing venture has increased the amount of trail data available through The National Map mobile and web apps, and the revised US Topo maps.

During the past two years the IMBA, in a partnership with the MTB Project, has been building a detailed national database of trails. This activity allows local IMBA chapters, IMBA members, and the public to provide trail data and descriptions through their website. MTB Project and IMBA then verify the quality of the trail data provided, ensure accuracy and confirm the trail is legal. 

These new maps replace the first edition US Topo maps for these eastern states and are available for free download from The National Map, the USGS Map Locator & Downloader website, or several other USGS applications.

To compare change over time, scans of legacy USGS topo maps, some dating back to the late 1800s, can be downloaded from the USGS Historical Topographic Map Collection.

For more information on US Topo maps: http://nationalmap.gov/ustopo/

Updated 2015 version of the Mount Washington, New Hampshire quadrangle with orthoimage turned on. (1:24,000 scale) (high resolution image 1.1 MB) Scan of the 1893 USGS quadrangle of the Mount Washington, New Hampshire area from the USGS Historical Topographic Map Collection. (1:62,500 scale) (high resolution image 1.8 MB) Updated 2015 version of the Mount Washington, New Hampshire quadrangle with orthoimage turned off to better see the various trail networks. (1:24,000 scale) (high resolution image 1.2 MB)

The National Trails System was established by Act of Congress in 1968. The Act grants the Secretary of the Interior and the Secretary of Agriculture authority over the National Trails System. The Act defines four types of trails. Two of these types, the National Historic Trails and National Scenic Trails, can only be designated by Act of Congress. National scenic trails are extended trails so located as to provide for maximum outdoor recreation potential and for the conservation and enjoyment of nationally significant scenic, historic, natural, and cultural qualities of the area through which such trails may pass.

There are 11 National Scenic Trails:

  • Appalachian National Scenic Trail
  • Pacific Crest National Scenic Trail
  • Continental Divide National Scenic Trail
  • North Country National Scenic Trail
  • Ice Age National Scenic Trail
  • Potomac Heritage National Scenic Trail
  • Natchez Trace National Scenic Trail
  • Florida National Scenic Trail
  • Arizona National Scenic Trail
  • New England National Scenic Trail
  • Pacific Northwest National Scenic Trail

Landslides Triggered by Nepal Earthquakes

U.S. Geological Survey News Feed - July 28, 2015 - 4:18pm
Summary: A new report from the U.S. Geological Survey provides critical landslide-hazard expertise to Nepalese agencies and villages affected by the April 25 magnitude-7.8 earthquake that shook much of central Nepal. A Scientific Look at What Happened and What Could Happen this Monsoon Season

Contact Information:

Dave Frank ( Phone: (509) 368-3107 ); Leslie Gordon ( Phone: (650) 329-4006 );



Villagers in Kerauja, Nepal, standing below a large rock slide that resulted in one fatality. (high resolution image 8.7 MB)

MENLO PARK, Calif. — A new report from the U.S. Geological Survey provides critical landslide-hazard expertise to Nepalese agencies and villages affected by the April 25 magnitude-7.8 earthquake that shook much of central Nepal. The earthquake and its aftershocks triggered thousands of landslides in the steep topography of Nepal, and caused nearly 8,900 fatalities. Hundreds of those deaths were due to landslides, which also blocked vital road and trail lifeline routes to affected villages.

Landslides caused by the earthquakes continue to pose both immediate and long-term hazards to villages and infrastructure within the affected region. Several landslides blocked rivers, creating temporary dams, which were a major concern for villages located downstream. The report provides a rapid assessment of landslide hazards for use by Nepalese agencies during this current monsoon season.

With support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance, and in collaboration with earthquake-hazard organizations from both the United States and Nepal, the USGS responded to this landslide crisis by providing expertise to Nepalese agencies and affected villages. In addition to collaborating with an international group of remote-sensing scientists to document the extent and spatial distribution of landsliding in the first few weeks following the earthquake, the USGS conducted in-country landslide hazard assessments for 10 days in May and June. Much of the information obtained by the USGS in Nepal was conveyed directly to affected villages and government agencies as opportunities arose. Upon returning to the United States, the team immediately began organizing, interpreting and synthesizing the data in order to publish a final report.

This new report provides a detailed account of the assessments performed in May and June, with a particular focus on valley-blocking landslides because they have the potential to pose considerable hazard to many villages in Nepal. The results include an overview of the extent of landsliding, a presentation of 74 valley-blocking landslides identified during the work, and a description of helicopter-based video resources that provide over 11 hours of high resolution footage of approximately 1,000 km (621 miles) of river valleys and surrounding areas affected by the earthquakes. A description of site-specific landslide-hazard assessments conducted while in Nepal and detailed descriptions of five noteworthy case studies are also included. The report ends with an assessment of the expectation for additional landslide hazards in the summer monsoon season following the earthquakes.

The full report, USGS OFR 2015-1142, “Assessment of Existing and Potential Landslide Hazards Resulting from the April 25, 2015 Gorkha, Nepal Earthquake Sequence” is available online, as well as the video footage collected during the research.

Aerial photographs showing landslides triggered by the April and May 2015 Gorkha earthquake sequence in central Nepal. A, Widespread ridgetop landsliding in Gorkha district. The Kerauja rock slide (cover image of report) is the wide scar on the ridge visible in the photograph background (arrow). B, Partially breached Gogane landslide dam in Rasuwa district of Nepal. Top of scarp below village (arrow) is approximately 400 m above river level. C, Rock falls in the Urkin Kangari Valley, Sindhupalchok district. Image shows approximately 1,200 m relief between top of foreground cliffs and valley floor. (high resolution image 3 MB) Photographs showing the Langtang, Nepal debris avalanche, which destroyed the entire village of Langtang. An estimated 200 people were killed in this single event. A, Oblique northwest view of deposit with cliff from which the debris became airborne. Homes in foreground were pushed over by the ensuing airblast. B, Aerial view of debris avalanche deposit showing location of the Langtang River tunnel through ice and debris. (high resolution image 2.2 MB)

New Magnolia State Maps Adding Trails

U.S. Geological Survey News Feed - July 28, 2015 - 11:30am
Summary: Several of the 772 new US Topo quadrangles for Mississippi now display parts of the Natchez Trace National Scenic Trail and other selected public trails. Further significant additions to the new quadrangles include map symbol redesign, enhanced railroad information and new road source data. Newly released US Topo maps for Mississippi now feature selected trails and other substantial updates

Contact Information:

Mark Newell, APR ( Phone: 573-308-3850 ); Larry Moore ( Phone: 303-202-4019 );



Several of the 772 new US Topo quadrangles for Mississippi now display parts of the Natchez Trace National Scenic Trail and other selected public trails. Further significant additions to the new quadrangles include map symbol redesign, enhanced railroad information and new road source data. For Gulf Coast residents, recreationists and visitors who want to explore the featured Mississippi trails by bike, on foot, on horseback or by other means, the new trail features on the US Topo maps will be useful.

Historically, the 450-mile foot trail that became known as the Natchez Trace was the lifeline through the Old Southwest. The Old Natchez Trace footpath ran through Choctaw and Chickasaw lands, connecting Natchez, Mississippi, to Nashville, Tennessee. Today, the current trail network consists of five separate trails totaling more than 60 miles.

"The inclusion of the Natchez Trace National Scenic Trail onto the US Topo maps will be an excellent tool for publicizing the trail to visitors,” said Greg Smith, Natchez Trace National Scenic Trail Coordinator for the National Park Service. “ The trail traverses three states and provides an opportunity for users to experience the unique cultural and natural aspects of the Old Natchez Trace."

The USGS partnered with the National Park Service to incorporate the trail data onto the Mississippi US Topo maps. The Natchez Trace National Scenic Trail joins the Ice Age National Scenic Trail, the Pacific Northwest National Scenic Trail, the North Country National Scenic Trail, the Pacific Crest National Scenic Trail, and the Arizona National Scenic Trail as trails featured on the new US Topo quads. The USGS plans to eventually include all National Scenic Trails in The National Map products.

Some of the other data for new trails on the maps are provided to the USGS through a nationwide “crowdsourcing” project managed by the International Mountain Biking Association (IMBA). This unique crowdsourcing venture has increased the amount of trail data available through The National Map mobile and web apps, and the revised US Topo maps.

During the past two years the IMBA, in a partnership with the MTB Project, has been building a detailed national database of trails. This activity allows local IMBA chapters, IMBA members, and the public to provide trail data and descriptions through their website. MTB Project and IMBA then verify the quality of the trail data provided, ensure accuracy and confirm the trail is legal. 

These new maps replace the first edition US Topo maps for the Magnolia State and are available for free download from The National Map, the USGS Map Locator & Downloader website, or several other USGS applications.

To compare change over time, scans of legacy USGS topo maps, some dating back to the late 1800s, can be downloaded from the USGS Historical Topographic Map Collection.

For more information on US Topo maps: http://nationalmap.gov/ustopo/

Updated 2015 version of the Tupelo, Mississippi US Topo quadrangle with orthoimage turned on. (1:24,000 scale) (high resolution image 1.4 MB) Scan of the 1921 legacy topographic map quadrangle of the Tupelo, Mississippi area from the USGS Historical Topographic Map Collection. (high resolution image 2 MB) Updated 2015 version of the Tupelo, Mississippi US Topo quadrangle with orthoimage turned off. (1:24,000 scale) (high resolution image 1.2 MB)

The National Trails System was established by Act of Congress in 1968. The Act grants the Secretary of the Interior and the Secretary of Agriculture authority over the National Trails System. The Act defines four types of trails. Two of these types, the National Historic Trails and National Scenic Trails, can only be designated by Act of Congress. National scenic trails are extended trails so located as to provide for maximum outdoor recreation potential and for the conservation and enjoyment of nationally significant scenic, historic, natural, and cultural qualities of the area through which such trails may pass.

There are 11 National Scenic Trails:

  • Appalachian National Scenic Trail
  • Pacific Crest National Scenic Trail
  • Continental Divide National Scenic Trail
  • North Country National Scenic Trail
  • Ice Age National Scenic Trail
  • Potomac Heritage National Scenic Trail
  • Natchez Trace National Scenic Trail
  • Florida National Scenic Trail
  • Arizona National Scenic Trail
  • New England National Scenic Trail
  • Pacific Northwest National Scenic Trail

Mount McKinley Elevation Survey Results Coming Soon

U.S. Geological Survey News Feed - July 23, 2015 - 11:30am
Summary: A team of four climbers has recently returned from the highest point in North America with new survey data to determine a more precise summit height of Mount McKinley. It is anticipated the new elevation finding will be announced in late August. Climbers return from the top of Mount McKinley to begin data analysis

Contact Information:

Mark Newell, USGS ( Phone: 573-308-3850 ); Sue Mitchell, UAF GI ( Phone: 907-474-5823 ); Vicki Childers, NOAA/NGS ( Phone: 301-713-3211 x161 );



A team of four climbers has recently returned from the highest point in North America with new survey data to determine a more precise summit height of Mount McKinley. It is anticipated the new elevation finding will be announced in late August.

The ability to establish a much more accurate height has grown with advances in surveying technologies since 1953, when the last official survey of Mount McKinley was recorded. The new elevation will eventually replace the currently accepted elevation of 20,320 feet.

“Determining an updated elevation for the summit of Mount McKinley presents extraordinary challenges,” said Suzette Kimball, acting director of the USGS. “The USGS and its partners are using the best available modern GPS survey equipment and techniques to ensure the new elevation will be determined with a high level of accuracy and confidence.”

Unique circumstances and variables such as the depth of the snow pack and establishing the appropriate surface that coincides with mean sea level must be taken into account before the new Mount McKinley elevation can be determined.

In 2013, an elevation was calculated for Mount McKinley using a technology known as Interferometric Synthetic Aperture Radar (ifsar). The 2013 elevation was slightly lower than the summit’s 20,320-foot height. Ifsar is an extremely effective tool for collecting map data in challenging areas such as Alaska, but it does not provide precise spot or point elevations. This new survey used GPS instruments that were placed directly on the summit to measure a specific point on the surface, thus giving a more defined spot elevation.
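GPS receivers measure heights relative to a reference ellipsoid, whereas the published summit elevation is referenced to mean sea level, which is one reason the team must establish the appropriate sea level surface before announcing a result. The sketch below shows the standard conversion between the two; the numeric values are placeholders, not survey data.

```python
# Standard conversion from a GPS (ellipsoidal) height to an orthometric height,
# i.e. the height above the mean-sea-level (geoid) surface:
#     H = h - N
# where h is the ellipsoidal height and N is the geoid undulation at the site.
# The numbers below are placeholders, not values from the McKinley survey.

def orthometric_height_m(ellipsoidal_height_m: float, geoid_undulation_m: float) -> float:
    return ellipsoidal_height_m - geoid_undulation_m

h = 6200.0   # hypothetical GPS ellipsoidal height of the summit, meters
N = 10.0     # hypothetical geoid undulation from a geoid model, meters
print(f"Orthometric (mean-sea-level) height: {orthometric_height_m(h, N):.1f} m")
```

The snow depth at the summit is handled separately; it affects which surface the GPS antenna actually measured, not the conversion itself.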

The USGS, NOAA’s National Geodetic Survey (NGS), and the University of Alaska Fairbanks (UAF) are the primary partners supporting the survey of McKinley’s summit. The survey party included three GPS experts and mountaineers from CompassData (a subcontractor for Dewberry), and a scientist/climber from UAF’s Geophysical Institute.

Now that the data collection expedition is completed, the NGS, UAF, USGS and CompassData are in the process of analyzing the data.

"CompassData was honored to help the USGS and NOAA on this nationally important project,” said Blaine Horner, a member of the climbing team. “Our experience surveying around the world put us in a unique position to perform this work."

The team began their ascent, with the needed scientific instruments in tow, on June 16. With diligent work and mostly favorable weather, the team safely returned to their starting point ahead of schedule.

"We had nearly perfect weather on the mountain,” said Tom Heinrichs, Director of the UAF Geographic Information Network of Alaska and part of the climbing team. “The logistics on the mountain all went well. The summit survey was successful and our preliminary look at the data indicates we will get a good solution for the summit elevation."

Mount McKinley is part of Denali National Park. The Park hosts more than 530,000 visitors each year, about 1,200 of whom attempt to climb Mount McKinley. In a typical year, about half of those who begin a McKinley climb reach the summit. The six-million-acre park will celebrate its 100th anniversary in 2017. The mountain was first summited in 1913.

Agustin (Udi) Karriere (front) and Rhett Foster from CompassData establishing the 11,000-foot camp, preparing to move to the next camp and summit ascent. (Photo: Tom Heinrichs, UAF) (Larger image) Rhett Foster from CompassData on a ridge leading to the 17,000-foot base camp. (Photo: Tom Heinrichs, UAF) (Larger image) Tom Heinrichs from the University of Alaska Fairbanks and Agustin (Udi) Karriere from CompassData traveling low on the mountain towards the next base camp, towing needed science and camp equipment. (Photo: Rhett Foster, CompassData) (Larger image) On top of North America! Blaine Horner from CompassData poses with GPS equipment on the top of Mount McKinley. (Photo: Agustin Karriere, CompassData) (Larger image)

Climate Change Reduces Coral Reefs' Ability to Protect Coasts

U.S. Geological Survey News Feed - July 22, 2015 - 3:00pm
Summary: Coral reefs, under pressure from climate change and direct human activity, may have a reduced ability to protect tropical islands against wave attack, erosion and salinization of drinking water resources, which help to sustain life on those islands

Contact Information:

Mariska van Gelderen, Deltares ( Phone: +31 (0)6 13 67 13 70 ); Leslie Gordon, USGS ( Phone: 650-329-4006 ); Nanci Bompey, AGU ( Phone: 202-777-7524 );



Aerial photograph of Kwajalein Atoll showing its low-lying islands and coral reefs. (High resolution image)

SANTA CRUZ, Calif. — Coral reefs, under pressure from climate change and direct human activity, may have a reduced ability to protect tropical islands against wave attack, erosion and salinization of drinking water resources, which help to sustain life on those islands. A new paper by researchers from the Dutch independent institute for applied research Deltares and the U.S. Geological Survey gives guidance to coastal managers to assess how climate change will affect a coral reef’s ability to mitigate coastal hazards.  

About 30 million people living on low-lying coral islands and atolls depend on coral reefs for protection. At present, some of these islands experience flooding due to wave events a few times per decade. It is expected that this rate of flooding will increase due to sea level rise and coral reef decay, as the remaining dead corals are generally smoother in structure, and do less to dissipate wave energy. Loss of coral cover not only causes increased shoreline erosion but also affects the sparse drinking water resources on these islands, which may eventually make these islands uninhabitable. In order to prevent or mitigate these impacts, coastal managers need to know to what extent their reef system may lose its protective function so that they can take action. The new study gives guidance on a local reef’s sensitivity to change. The new research has been accepted for publication in “Geophysical Research Letters,” a journal of the American Geophysical Union.

To gain insight into the effects of changing conditions on coral reefs, the study authors used XBeach, an open-source wave model. The computer model was first validated using field measurements obtained on Kwajalein Atoll in the Marshall Islands in the Pacific Ocean, and was then used to investigate what the effects on water levels, waves, and wave-driven runup would be if certain reef properties change. Reef roughness, steepness, width and the total water level on the reef platform are all important factors for coastal managers to consider when planning mitigating measures.
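Validation here means comparing modeled water levels and wave heights against what instruments on the reef actually recorded. Below is a minimal sketch of the kind of agreement statistics such a comparison typically reports; the observed and modeled values are placeholders, not the Kwajalein measurements.

```python
# Sketch of the comparison behind "validated using field measurements": compute
# bias and root-mean-square error (RMSE) between observed and modeled values.
# The arrays below are placeholders, not the Kwajalein Atoll observations.

import math

observed = [1.10, 1.35, 0.95, 1.60, 1.25]   # e.g. observed wave heights on the reef (m)
modeled  = [1.05, 1.40, 1.00, 1.50, 1.30]   # e.g. wave heights from the XBeach run (m)

bias = sum(m - o for m, o in zip(modeled, observed)) / len(observed)
rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(modeled, observed)) / len(observed))
print(f"bias = {bias:+.3f} m, RMSE = {rmse:.3f} m")
```

Small bias and RMSE relative to the measured values give confidence that the model can then be run with altered reef roughness, width, steepness and water level to explore future scenarios.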

The results suggest that coasts fronted by relatively narrow reefs with steep faces and deeper, smoother reef flats are expected to experience the highest wave runup and thus potential for island flooding. Wave runup increases for higher water levels (that are expected with sea level rise), higher waves, and lower bed roughness (as coral degrades and becomes smoother), which are all expected effects of climate change. Rising sea levels and climate change will have a significant negative impact on the ability of coral reefs to mitigate the effects of coastal hazards in the future.

The research paper, “The influence of coral reefs and climate change on wave-driven flooding of tropical coastlines,” is published as an open-access paper and available online.

Quataert, E., C. Storlazzi, A. van Rooijen, O. Cheriton, and A. van Dongeren (2015), The influence of coral reefs and climate change on wave-driven flooding of tropical coastlines, Geophysical Research Letters, 42, doi:10.1002/2015GL064861

Deltares is an independent institute for applied research in the field of water and subsurface. Visit http://www.deltares.nl and follow us on Twitter @deltares or LinkedIn.

Detailed Flood Information Key to More Reliable Coastal Storm Impact Estimates

U.S. Geological Survey News Feed - July 21, 2015 - 5:30pm
Summary: CORAM, N.Y. -- A new study that looked in part at how damage estimates evolve following a storm puts the total amount of building damage caused by Hurricane Sandy for all evaluated counties in New York at $23 billion. Study Looks at NY Sandy Impacts and Losses by County

Contact Information:

Christopher Schubert ( Phone: 631-736-0783 x109 ); Ronald Busciolano ( Phone: 631-736-0783 x104 ); Vic Hines ( Phone: 813-855-3125 );



CORAM, N.Y. -- A new study that looked in part at how damage estimates evolve following a storm puts the total amount of building damage caused by Hurricane Sandy for all evaluated counties in New York at $23 billion. Estimates of damage by county ranged from $380 million to $5.9 billion.

The U.S. Geological Survey study, done in cooperation with the Federal Emergency Management Agency, marks the first time the agency has done this type of analysis and cost estimation for a coastal storm.

"We looked at how estimates of building damage change depending on the amount of information available at the time of the estimate, looking at three time periods -- storm landfall, two weeks later, and then three months later," said Chris Schubert, a USGS hydrologist and lead author of the study. "What we found was that the biggest jump in estimate reliability comes between the initial estimate and the two-week mark, but that the additional information available three months after an event greatly help refine the estimates even further."

The USGS researcher called the study a "proof of concept" that really showcased the value of gathering storm data before and after a storm.

"FEMA funded the sensor placement we did prior to the storm and our assessment of how high the water reached after the storm," Schubert said. "The results from this new study demonstrated how the additional resolution and accuracy of flood depictions resulting from these efforts greatly improved the damage estimates."

Damage estimates can be used by FEMA and other stakeholders to help prioritize relief and reconstruction efforts following a storm. The results can also assist with resiliency planning that helps communities prepare for future storms.

The researchers came up with the estimates by using census data and FEMA’s HAZUS modeling software program.  The HAZUS program is used to estimate potential loss from disasters such as earthquakes, wind, hurricanes and floods.  This program allows for an assessment of building loss on a block-by-block level.
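HAZUS combines census-based building inventories with engineering damage functions; the toy sketch below only illustrates the block-by-block idea, mapping a flood depth to a damage fraction and multiplying by a block's building value. The damage curve and the block records are hypothetical, not HAZUS curves or Sandy data.

```python
# Toy illustration of block-by-block flood loss estimation: for each census block,
# look up a damage fraction from a depth-damage curve and multiply by the block's
# building exposure. The curve and the block data are hypothetical, not HAZUS values.

def damage_fraction(depth_ft: float) -> float:
    """Hypothetical depth-damage curve: no damage at or below 0 ft, capped at 60%."""
    if depth_ft <= 0:
        return 0.0
    return min(0.6, 0.15 * depth_ft)

blocks = [
    {"id": "block-001", "building_value_usd": 25_000_000, "flood_depth_ft": 0.0},
    {"id": "block-002", "building_value_usd": 40_000_000, "flood_depth_ft": 2.5},
    {"id": "block-003", "building_value_usd": 18_000_000, "flood_depth_ft": 5.0},
]

total = 0.0
for b in blocks:
    loss = b["building_value_usd"] * damage_fraction(b["flood_depth_ft"])
    total += loss
    print(f'{b["id"]}: estimated loss ${loss:,.0f}')
print(f"Total estimated building loss: ${total:,.0f}")
```

Better flood depictions, such as those built from the USGS sensors and high-water marks, refine the depth assigned to each block, which is why the estimates improved so much at the two-week and three-month marks.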

Hurricane Sandy marked the first time in recent memory, and in the historical record, that coastal water levels reached the heights they attained in many places in the state of New York. Flood effects of Hurricane Sandy, in comparison to those from Tropical Storm Irene in 2011, were significantly more extensive, with most water levels rising at least 2.5 feet higher than in the 2011 storm.

With the latest USGS analysis, a comprehensive picture of the magnitude of Sandy’s impact is now available. Without the sensor placement before the storm, and assessment of high-water marks after, this level of understanding wouldn’t be possible.

"This is the first time USGS has done this type of analysis and cost estimation for a coastal storm," said Schubert. "The effort incorporates what we learned from previous storms going back to Katrina, and the storm-tide information we provided to FEMA in the immediate aftermath of Sandy is one of the building blocks for this research. The additional fidelity of the damage estimate underscores the tremendous value of the dataset for this storm."

Interpretation of storm-tide data from a variety of tools such as tide gauges, stream gauges, and temporary sensors combined with high-water marks showed the extreme nature of storm-tide flooding and, at some sites, the severity and arrival time of the storm surge.  Storm surge is the height of water above the normal astronomical tide level due to a storm. Storm tide is the storm surge in addition to the regular tide.
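Those definitions lend themselves to a simple calculation: subtracting the predicted astronomical tide from the water level a sensor records gives the surge, and the recorded level itself is the storm tide. The hourly readings below are placeholders, not actual Sandy gauge data.

```python
# Using the definitions above: storm surge = observed water level - predicted
# astronomical tide, and storm tide = the total observed water level. The hourly
# readings below are placeholders, not actual Sandy gauge records.

predicted_tide_ft = [1.0, 2.0, 3.0, 2.5, 1.5]   # astronomical tide prediction
observed_level_ft = [2.0, 4.5, 8.5, 7.0, 4.0]   # water level recorded by a sensor

surge_ft = [obs - tide for obs, tide in zip(observed_level_ft, predicted_tide_ft)]
peak_hour = max(range(len(surge_ft)), key=surge_ft.__getitem__)

print("hourly surge (ft):", surge_ft)
print(f"peak surge {surge_ft[peak_hour]:.1f} ft arrived at hour {peak_hour}, "
      f"when the storm tide was {observed_level_ft[peak_hour]:.1f} ft")
```

The timing point in the next quote follows directly: the same surge added to a high tide produces a much higher storm tide than it would at low tide.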

"Timing matters, though every storm is different," said Schubert. "Throughout southeastern New York, we saw that timing of the surge arrival determined how high the storm tide reached. The worst flooding impacts occurred along the Atlantic Ocean-facing parts of New York City and western Long Island, where the peak storm surge arrived at high tide. So the resulting storm tide was five to six feet higher than it would have been had the peak surge arrived at low tide."

The new research is available online in "Analysis of Storm-Tide Impacts from Hurricane Sandy in New York" (SIR 2015-5036), by C.E. Schubert, R. Busciolano, P.P. Hearn Jr., A.N. Rahav, R. Behrens, J. Finkelstein, J. Monti Jr., and A.E. Simonson. It examined damage estimates from those counties with depictions of flood extent available from FEMA and the National Hurricane Center.

The USGS is also conducting a study in New Jersey that examines similar topics, including the estimated flood frequency of documented peak storm-tide elevations, comparisons of Sandy to historic coastal storms, the timing of storm surge, and changes in HAZUS damage estimates with the use of USGS sensor and high-water-mark data.   That study is expected to be completed and released later this year.

As Climate Warms Hawaiian Forest Birds Lose More Ground to Mosquitoes

U.S. Geological Survey News Feed - July 17, 2015 - 3:00pm
Summary: Hawai‘i: the name alone elicits images of rhythmic traditional dancing, breathtaking azure sea coasts and scenes of vibrant birds flitting through lush jungle canopy. Unfortunately, the future of many native Hawaiian birds looks grim as diseases carried by mosquitoes are due to expand into higher elevation safe zones

Contact Information:

Wei Liao ( Phone: 608-265-2130 ); David Helweg ( Phone: 808-342-7606 ); Ryan McClymont ( Phone: 503-251-3237 );



ISLAND OF HAWAI‘I, Hawaii — Hawai‘i: the name alone elicits images of rhythmic traditional dancing, breathtaking azure sea coasts and scenes of vibrant birds flitting through lush jungle canopy. Unfortunately, the future of many native Hawaiian birds looks grim as diseases carried by mosquitoes are due to expand into higher elevation safe zones.

A new study published in Global Change Biology, by researchers at the U.S. Geological Survey and the University of Wisconsin-Madison, assesses how global climate change will affect future malaria risk to native Hawaiian bird populations in the coming century.

Mosquito-carried diseases such as avian pox and avian malaria have been devastating native Hawaiian forest birds. A single mosquito bite can transfer malaria parasites to a susceptible bird, where the death rate may exceed 90 percent for some species. As a result, many already threatened or endangered native birds now only survive in disease-free refuges found in high-elevation forests where mosquito populations and malaria development are limited by colder temperatures. Unlike continental bird species, island birds cannot move northward in response to climate change or increased disease stressors, but must adapt or move to less hospitable habitats to survive.

“We knew that temperature had significant effects on mosquitoes and malaria, but we were surprised that rainfall also played an important role,” said USGS Wisconsin Cooperative Wildlife Research Unit scientist Michael Samuel. “Additional rainfall will favor mosquitoes as much as the temperature change.”

With warming temperatures, mosquitoes will move farther upslope and increase in number. The authors expect high-elevation areas to remain mosquito-free, but only until mid-century when mosquito-friendly temperatures will begin to appear at higher elevations. Future increases in rainfall will likely benefit the mosquitoes as well.
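A rough sense of how far upslope the disease zone can move comes from the rate at which air temperature falls with elevation. The sketch below uses the standard average lapse rate; the warming amounts are illustrative and are not projections from the study.

```python
# Sketch of why warming shifts the mosquito/malaria zone upslope: with temperature
# falling by roughly 6.5 C per km of elevation (a standard average lapse rate), a
# given amount of warming moves any temperature threshold upward by
# warming / lapse_rate. The warming values below are illustrative only.

LAPSE_RATE_C_PER_M = 6.5 / 1000.0  # assumed average environmental lapse rate

def upslope_shift_m(warming_c: float) -> float:
    """Elevation gain of an isotherm (e.g. the malaria-limiting temperature)."""
    return warming_c / LAPSE_RATE_C_PER_M

for warming in (1.0, 2.0, 3.0):
    print(f"+{warming:.0f} C warming -> threshold rises ~{upslope_shift_m(warming):.0f} m upslope")
```

Because Hawaiian mountains have a fixed summit height, each upward shift of the threshold removes a band of disease-free forest that cannot be replaced, which is why the high-elevation refuges shrink rather than migrate.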

Scientists know that historically, malaria has caused bird extinctions, but changing climates could affect the bird-mosquito-disease system in unknown ways. “We wanted to figure out how climate change impacts birds in the future,” said Wei Liao, a postdoctoral researcher at the University of Wisconsin-Madison and lead author of the article.

As more mosquitoes move up the mountainside, disease-free refuges will no longer provide a safe haven for the most vulnerable species. The rate of disease infection is likely to speed up as the numbers of mosquitoes increase and more diseased birds become hosts to the parasites, continuing the cycle of infection to healthy birds.

Researchers conclude that future global climate change will cause substantial decreases in the abundance and diversity of remaining Hawaiian bird communities. Without significant intervention many native Hawaiian species, like the scarlet ‘I‘iwi with its iconic curved bill, will suffer major population declines or extinction due to increasing risk from avian malaria during the 21st century.

There is hope for the birds. Because these effects are unlikely to appear before mid-century, natural resource managers have time to implement conservation strategies to protect these unique species from further decimation. Land managers could work toward preventing declines in forest bird numbers by restoring and improving habitat for the birds, reducing mosquitoes on a large scale and controlling predators of forest birds.

“Hawaiian forest birds are some of the most threatened forest birds in the world,” said Samuel. “They are totally unique to Hawai‘i and found nowhere else. They are also important to the Hawaiian culture. And at this point, we still don’t fully understand what role they play as pollinators and in forest dynamics.”

The article, “Will a Warmer and Wetter Future Cause Extinction of Native Hawaiian Forest Birds?” can be found in the online edition of Global Change Biology.

The work was supported by the Department of the Interior Pacific Islands Climate Science Center, which is managed by the USGS National Climate Change and Wildlife Science Center. The center is one of eight that provide scientific information to help natural resource managers respond effectively to climate change.

40 Years of North Pacific Seabird Survey Data Now Online

U.S. Geological Survey News Feed - July 16, 2015 - 2:30pm
Summary: The U.S. Geological Survey today released the North Pacific Pelagic Seabird Database — a massive online resource compiling the results of 40 years of surveys by biologists from the United States, Canada, Japan and Russia

Contact Information:

John Piatt ( Phone: 360-774-0516 ); Ryan McClymont ( Phone: 503-251-3237 );



ANCHORAGE, Alaska — The U.S. Geological Survey today released the North Pacific Pelagic Seabird Database (NPPSD) — a massive online resource compiling the results of 40 years of surveys by biologists from the United States, Canada, Japan and Russia. The database documents the abundance and distribution of 160 seabird and 41 marine mammal species over a 10 million-square-mile region of the North Pacific.

“The database offers a powerful tool for analysis of climate change effects on marine ecosystems of the Arctic and North Pacific, and for monitoring the impact of fisheries, vessel traffic and oil development on marine bird communities over a vast region,” said Dr. John Piatt, head of the Seabird and Forage Fish Ecology Research Program at the USGS Alaska Science Center. “It also creates an unprecedented opportunity to study the biogeography and marine ecology of dozens of species of seabirds and marine mammals throughout their range in continental shelf waters of the United States.” 

Hundreds of scientists and observers conducted surveys, gathering data on more than 350,000 transects ranging from the Channel Islands of southern California westward to the coast of South Korea, and from the Hawaiian Islands northward to the North Pole. The majority of data collection occurred over the U.S. continental shelves stretching from California to Arctic Alaska, where concerns over the possible impact of human activities at sea have long fueled wildlife research and monitoring efforts.

The surveys were conducted over four decades as part of focused studies, for various purposes and in specific regions within the North Pacific.  Hundreds of observers from dozens of international, federal and state wildlife agencies, universities and consulting companies contributed data. Because similar observational methods were used, the data could be compiled into a single database, shedding light on broader patterns of seabird distribution and abundance.

USGS scientists started compiling the data into the NPPSD in 2001 and published the first version in 2005.  This is the first time the database has been made available online.  The current version includes surveys conducted in the last decade and from additional regions. The compilation of data from surveys spanning 40 years makes the NPPSD one of the largest marine wildlife censuses ever conducted in terms of the number of animals observed and spatial extent of the survey area.
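The compilation works because the contributed surveys record comparable fields, such as transect area and species counts, so densities can be computed the same way across decades and regions. The sketch below shows that idea with a handful of hypothetical records; the field names and values are made up and are not the NPPSD schema or data.

```python
# Hypothetical sketch of how comparable transect records can be combined into a
# single density estimate (birds per square kilometer). The field names and the
# records themselves are made up; they are not the NPPSD schema or data.

transects = [
    {"transect_id": "T1", "species": "Common Murre", "count": 42, "area_km2": 0.9},
    {"transect_id": "T2", "species": "Common Murre", "count": 15, "area_km2": 1.1},
    {"transect_id": "T3", "species": "Common Murre", "count": 0,  "area_km2": 1.0},
]

total_birds = sum(t["count"] for t in transects)
total_area = sum(t["area_km2"] for t in transects)
print(f"Common Murre density: {total_birds / total_area:.1f} birds per km^2 "
      f"over {total_area:.1f} km^2 surveyed")
```

Standardizing records this way is what lets analyses span the full 40 years and 10 million square miles instead of being confined to a single survey's footprint.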

“Contributors to the NPPSD can now examine large-scale phenomena that were previously impossible for individual studies to assess because they were conducted on smaller temporal and spatial scales,” said Dr. Gary Drew, database manager for the Seabird and Forage Fish Ecology Research Program at the USGS Alaska Science Center.

The value of the NPPSD for understanding the ecology of the North Pacific and the impacts of human activities in this region has just begun to be realized. Recent analyses using NPPSD data included a risk analysis of shipping traffic on seabirds in the heavily traveled Aleutian Islands conducted by the U.S. Fish and Wildlife Service, and a study commissioned by the National Audubon Society to identify “Important Bird Areas” from California to Alaska.  Future analysis of the database by USGS scientists aims to yield many insights into the status of seabird and marine mammal populations, while the live online database meets the Obama Administration’s directive of "Expanding Public Access to the Results of Federally Funded Research."

The NPPSD and Users Guide are available from the USGS Alaska Science Center website.

Power of Prediction: Avian Fatalities at Wind Facilities

U.S. Geological Survey News Feed - July 8, 2015 - 11:31am
Summary: The U.S. Geological Survey, in collaboration with the U.S. Fish and Wildlife Service, has released a study that will enable ecologists, managers, policy makers, and industry to predict bird fatalities at a wind facility prior to its construction

Contact Information:

Leslie New ( Phone: 360-546-9309 ); Brian Milsap ( Phone: 505-559-3963 ); Hannah Hamilton ( Phone: 703-648-4356 );



The U.S. Geological Survey, in collaboration with the U.S. Fish and Wildlife Service, has released a study that will enable ecologists, managers, policy makers, and industry to predict bird fatalities at a wind facility prior to its construction.

The study used golden eagles as a case study because they are susceptible to collisions with wind turbines, in part owing to their soaring and hunting behavior.

Bird fatalities due to collisions with rotating turbine blades are a leading concern for wildlife and wind facility managers. This new model builds upon previous approaches by directly acknowledging uncertainty inherent in predicting these fatalities. Furthermore, the computer code provided makes it possible for other researchers and managers to readily apply the model to their own data. 

The model looks at only three parameters: hazardous footprint, bird exposure to turbines and collision probability. “This simplicity is part of what makes the model accessible to others,” said Leslie New, assistant professor of statistics at Washington State University, who led the research project as a USGS postdoctoral fellow. “It also allows wind facility developers to consider ways to reduce bird fatalities without having to collect a complicated set of data.”
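The published model is Bayesian; the sketch below only illustrates the three-parameter structure described above, with expected fatalities computed as exposure within the hazardous footprint times a collision probability, and with uncertainty carried by drawing those inputs from assumed distributions rather than the authors' actual priors.

```python
# Illustrative Monte Carlo sketch of the three-parameter structure: expected
# fatalities = (hazardous footprint) x (exposure per unit footprint) x (collision
# probability). The distributions below are assumptions, not the paper's priors.

import random

random.seed(1)
N_DRAWS = 10_000

draws = []
for _ in range(N_DRAWS):
    footprint_km2 = random.uniform(4.0, 6.0)       # assumed hazardous area swept by turbines
    exposure_per_km2 = random.uniform(20.0, 60.0)  # assumed eagle passages per km^2 per year
    collision_prob = random.betavariate(2, 300)    # assumed chance a passage ends in collision
    draws.append(footprint_km2 * exposure_per_km2 * collision_prob)

draws.sort()
print(f"median ~{draws[N_DRAWS // 2]:.1f} eagle fatalities/year, "
      f"80% interval ({draws[N_DRAWS // 10]:.1f}, {draws[9 * N_DRAWS // 10]:.1f})")
```

Reporting a full distribution rather than a single number is what lets the predictions acknowledge uncertainty before construction, and the distribution can later be narrowed once fatality monitoring data from the operating facility become available.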

High rates of bird fatalities do not occur at every wind facility. The geographic location, local topographic features, the bird species and its life history, as well as other factors all play a role in the number of fatalities.

Taking advantage of publicly available information, research scientists incorporated a wealth of biological knowledge into their model to improve fatality predictions.

“Uncertainty in this model can be reduced once data on the actual number of fatalities are available at an operational wind facility,” said New.

To establish the utility of their approach, the scientists applied their model to golden eagles at a Wyoming wind facility. The eagles’ long life span, combined with delayed reproduction and small brood size, means that there are potential population-level effects from this additional source of mortality.

Golden eagles are protected under the Bald and Golden Eagle Protection Act and the Migratory Bird Treaty Act. The combination of law, conservation concerns, and renewable-energy development led the USFWS to develop a permitting process for wind facilities.  The USFWS permitting process requires that fatality predictions be made in advance of a wind facility’s construction. This allows the facility’s impact to be assessed and any mitigation measures related to turbine placement on the landscape to be taken. The new model was developed specifically for the purpose of assessing take as part of the preconstruction permitting process.

The study supports a conservative approach, and the researchers’ model is used to inform this permitting process and balance the management of eagle fatalities.

The article, “A collision risk model to predict avian fatalities at wind facilities: an example using golden eagles, Aquila chrysaetos,” by L.F. New, E. Bjerre, B. Millsap, M. Otto and M. Runge, is available online in PLOS ONE.

About the Golden Eagle:

The golden eagle has a vast range, from the tundra through grassland, forested habitat and woodland brushland south to arid deserts including Death Valley, California. They are aerial predators that build nests on cliffs or in the largest trees of forested stands that often afford an unobstructed view of the surrounding habitat.

Northern Alaska Coastal Erosion Threatens Habitat and Infrastructure

U.S. Geological Survey News Feed - July 1, 2015 - 2:30pm
Summary: In a new study published today, scientists from the U.S. Geological Survey found that the remote northern Alaska coast has some of the highest shoreline erosion rates in the world

Contact Information:

Ann Gibbs ( Phone: 831-460-7540 ); Paul Laustsen ( Phone: 650-329-4046 );



This oblique aerial photograph from 2006 shows the Barter Island long-range radar station landfill threatened by coastal erosion. The landfill was subsequently relocated farther inland; however, the coastal bluffs continue to retreat. (High resolution image)

ANCHORAGE, Alaska — In a new study published today, scientists from the U.S. Geological Survey found that the remote northern Alaska coast has some of the highest shoreline erosion rates in the world. Analyzing over half a century of shoreline change data, scientists found the pattern is extremely variable, with most of the coast retreating at rates of more than 1 meter a year.

“Coastal erosion along the Arctic coast of Alaska is threatening Native Alaskan villages, sensitive ecosystems, energy and defense related infrastructure, and large tracts of Native Alaskan, State, and Federally managed land,” said Suzette Kimball, acting director of the USGS.

Scientists studied more than 1,600 kilometers of the Alaskan coast between the U.S.-Canadian border and Icy Cape and found the average rate of shoreline change, taking into account beaches that are both eroding and expanding, was -1.4 meters per year. Among eroding beaches, the most extreme case exceeded 18.6 meters per year.

“This report provides invaluable objective data to help native communities, scientists and land managers understand natural changes and human impacts on the Alaskan coast,” said Ann Gibbs, USGS Geologist and lead author of the new report.

Coastlines change in response to a variety of factors, including changes in the amount of available sediment, storm impacts, sea-level rise and human activities. How much a coast erodes or expands in any given location is due to some combination of these factors, which vary from place to place. 

"There is increasing need for this kind of comprehensive assessment in all coastal environments to guide managed response to sea-level rise and storm impacts," said Dr. Bruce Richmond of the USGS. "It is very difficult to predict what may happen in the future without a solid understanding of what has happened in the past. Comprehensive regional studies such as this are an important tool to better understand coastal change. ” 

Compared to other coastal areas of the U.S., where four or more historical shoreline data sets are available, generally back to the mid-1800s, shoreline data for the coast of Alaska are limited. The researchers used two historical data sources from the 1940s and 2000s, such as maps and aerial photographs, as well as modern data like lidar, or “light detection and ranging,” to measure shoreline change at more than 26,567 locations.
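The simplest such measurement at a single location is an end-point rate: the change in shoreline position between two dated surveys along a fixed transect, divided by the elapsed time. The sketch below illustrates this with placeholder positions and dates; the published assessment applies a standardized method across tens of thousands of transects.

```python
# Sketch of an end-point shoreline-change rate: the change in shoreline position
# between two dated surveys divided by the elapsed time. Negative rates indicate
# erosion (landward retreat). The positions and dates below are placeholders.

def end_point_rate_m_per_yr(pos_early_m, year_early, pos_late_m, year_late):
    """Positions are measured seaward along a fixed transect from an onshore baseline."""
    return (pos_late_m - pos_early_m) / (year_late - year_early)

# Example: shoreline 250 m from the baseline in 1948 and 160 m from it in 2008.
rate = end_point_rate_m_per_yr(250.0, 1948, 160.0, 2008)
print(f"shoreline change rate: {rate:.2f} m/yr")   # -1.50 m/yr, i.e. erosion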

There is no widely accepted standard for analyzing shoreline change. The impetus behind the National Assessment project was to develop a standardized method of measuring changes in shoreline position that is consistent on all coasts of the country. The goal was to facilitate the process of periodically and systematically updating the results in a consistent manner.

The report, titled “National Assessment of Shoreline Change: Historical Shoreline Change Along the North Coast of Alaska, U.S.-Canadian Border to Icy Cape,” is the 8th Long-Term Coastal Change report produced as part of the USGS’s National Assessment of Coastal Change Hazards project. A comprehensive database of digital vector shorelines and rates of shoreline change for Alaska, from the U.S.-Canadian border to Icy Cape, is presented along with this report. Data for all 8 long-term coastal change reports are also available on the USGS Coastal Change Hazards Portal.

 

Greenhouse Gas Emissions Remain the Primary Threat to Polar Bears

U.S. Geological Survey News Feed - June 30, 2015 - 3:30pm
Summary: Greenhouse gas emissions remain the primary threat to the preservation of polar bear populations worldwide. This conclusion holds true under both a reduced greenhouse gas emission scenario that stabilizes climate warming and another scenario where emissions and warming continue at the current pace, according to updated U.S. Geological Survey research models

Contact Information:

Paul Laustsen ( Phone: 650-329-4046 ); Catherine Puckett ( Phone: 352-377-2469 );



ANCHORAGE, Alaska — Greenhouse gas emissions remain the primary threat to the preservation of polar bear populations worldwide. This conclusion holds true under both a reduced greenhouse gas emission scenario that stabilizes climate warming and another scenario where emissions and warming continue at the current pace, according to updated U.S. Geological Survey research models.  

Under both scenarios, the outcome for the worldwide polar bear population will very likely worsen over time through the end of the century.

The modeling effort examined the prognosis for polar bear populations in the four ecoregions (see map) comprising their range using current sea ice projections from the Intergovernmental Panel on Climate Change for two greenhouse gas emission scenarios. Both scenarios examined how greenhouse gas emissions may affect polar bears: one looked at stabilization in climate warming by century’s end because of reduced GHG emissions, and the other looked at unabated (unchanged) rates of GHG emissions, leading to increased warming by century’s end.

“Addressing sea ice loss will require global policy solutions to reduce greenhouse gas emissions and likely be years in the making,” said Mike Runge, a USGS research ecologist. “Because carbon emissions accumulate over time, there will be a lag, likely on the order of several decades, between mitigation of emissions and meaningful stabilization of sea ice loss.”

Under the unabated emission scenario, polar bear populations in two of four ecoregions were projected to reach a greatly decreased state about 25 years sooner than under the stabilized scenario. Under the stabilized scenario, GHG emissions peak around 2040, decline through 2080, and then stabilize through the end of the century. In this scenario, USGS projected that all ecoregion populations will greatly decrease except for the Archipelago Ecoregion, located in the high-latitude Canadian Arctic, where sea ice generally persists longer in the summer. These updated modeling outcomes reinforce earlier suggestions of the Archipelago’s potential as an important refuge for ice-dependent species, including the polar bear.

The models, updated from 2010, evaluated specific threats to polar bears such as sea ice loss, prey availability, hunting, and increased human activities, and incorporated new findings on regional variation in polar bear response to sea ice loss.

“Substantial sea ice loss and expected declines in the availability of marine prey that polar bears eat are the most important specific reasons for the increasingly worse outlook for polar bear populations,” said Todd Atwood, research biologist with the USGS, and lead author of the study. “We found that other environmental stressors, such as trans-Arctic shipping, oil and gas exploration, disease and contaminants, sustainable harvest, and defense-of-life takes, had only negligible effects on polar bear populations compared to the much larger effects of sea ice loss and associated declines in their ability to access prey.”

Additionally, USGS researchers noted that if the summer ice-free period lengthens beyond 4 months – as forecasted to occur during the last half of this century in the unabated scenario – the negative effects on polar bears will be more pronounced. Polar bears rely on ice as the platform for hunting their primary prey – ice seals – and when sea ice completely melts in summer, the bears must retreat to land where their access to seals is limited. Other research this year has shown that terrestrial foods available to polar bears during these land-bound months are unlikely to help polar bear populations adapt to sea ice loss.

USGS scientists’ research found that managing threats other than greenhouse gas emissions could slow the progression of polar bear populations to an increasingly worse status. The most optimistic prognosis for polar bears would require immediate and aggressive reductions of greenhouse gas emissions that would limit global warming to less than 2°C above preindustrial levels.

The U.S. Fish and Wildlife Service listed the polar bear as threatened under the Endangered Species Act in 2008 due to the threat posed by sea ice loss. The polar bear was the first species to be listed because of climate change. A plan to address recovery of the polar bear will be published in the Federal Register by the USFWS for public review on July 2, 2015.

The updated forecast for polar bears was developed by USGS as part of its Changing Arctic Ecosystems Initiative, together with collaborators from the U.S. Forest Service and Polar Bears International. The polar bear forecasting report is available online.

Polar Bear Ecoregions: In the Seasonal Ice Ecoregion (see map), sea ice melts completely in summer and all polar bears must be on land. In the Divergent Ice Ecoregion, sea ice pulls away from the coast in summer, and polar bears must be on land or move with the ice as it recedes north. In the Convergent Ice and Archipelago Ecoregions, sea ice is generally retained during the summer. (High resolution image)

Water Used for Hydraulic Fracturing Varies Widely Across United States

U.S. Geological Survey News Feed - June 30, 2015 - 12:00pm
Summary: The amount of water required to hydraulically fracture oil and gas wells varies widely across the country, according to the first national-scale analysis and map of hydraulic fracturing water usage detailed in a new USGS study accepted for publication in Water Resources Research, a journal of the American Geophysical Union. USGS Releases First Nationwide Map of Water Usage for Hydraulic Fracturing

Contact Information:

Anne Berry Wade (USGS) ( Phone: 703-648-4483 ); Leigh Cooper (AGU) ( Phone: 202-777-7324 ); Tanya Gallegos ( Phone: 703-648-6181 );



The amount of water required to hydraulically fracture oil and gas wells varies widely across the country, according to the first national-scale analysis and map of hydraulic fracturing water usage detailed in a new USGS study accepted for publication in Water Resources Research, a journal of the American Geophysical Union. The research found that average water volumes for hydraulic fracturing, summarized by watershed across the United States, range from as little as 2,600 gallons to as much as 9.7 million gallons per well.

This map shows the average water use in hydraulic fracturing per oil and gas well in watersheds across the United States. (High resolution image)

In addition, from 2000 to 2014, median annual water volume estimates for hydraulic fracturing in horizontal wells increased from about 177,000 gallons per oil and gas well to more than 4 million gallons per oil well and 5.1 million gallons per gas well. Meanwhile, median water use in vertical and directional wells remained below 671,000 gallons per well. For comparison, an Olympic-sized swimming pool holds about 660,000 gallons.
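
As a rough illustration of how medians like these are tabulated, the sketch below groups per-well water volumes by completion year and drilling orientation and takes the median of each group. The column names and the handful of values are hypothetical, not the study's data or processing code.

```python
# Minimal sketch: median hydraulic fracturing water use per well, grouped by
# year and well orientation. Entirely hypothetical records for illustration.
import pandas as pd

wells = pd.DataFrame({
    "year":        [2000, 2000, 2014, 2014, 2014],
    "orientation": ["vertical", "horizontal", "horizontal", "horizontal", "vertical"],
    "water_gal":   [150_000, 210_000, 4_200_000, 5_600_000, 480_000],
})

medians = wells.groupby(["year", "orientation"])["water_gal"].median()
print(medians)
```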

“One of the most important things we found was that the amount of water used per well varies quite a bit, even within a single oil and gas basin,” said USGS scientist Tanya Gallegos, the study’s lead author. “This is important for land and resource managers, because a better understanding of the volumes of water injected for hydraulic fracturing could be a key to understanding the potential for some environmental impacts.”

This map shows the percentage of oil and gas wells that use horizontal drilling in watersheds across the United States. (High resolution image)

Horizontal wells are those that are first drilled vertically or directionally (at an angle from straight down) to reach the unconventional oil or gas reservoir and then laterally along the oil or gas-bearing rock layers. This is done to increase the contact area with the reservoir rock and stimulate greater oil or gas production than could be achieved through vertical wells alone.

However, horizontal wells also generally require more water than vertical or directional wells. In fact, in 52 out of the 57 watersheds with the highest average water use for hydraulic fracturing, over 90 percent of the wells were horizontally drilled.

Although there has been an increase in the number of horizontal wells drilled since 2008, about 42 percent of new hydraulically fractured oil and gas wells completed in 2014 were still either vertical or directional. The ubiquity of the lower-water-use vertical and directional wells explains, in part, why the amount of water used per well is so variable across the United States.

The watersheds where the most water was used to hydraulically fracture wells on average coincided with parts of the following shale formations:

  • Eagle Ford (within watersheds located mainly in Texas)
  • Haynesville-Bossier (within watersheds located mainly in Texas & Louisiana)
  • Barnett (within watersheds located mainly in Texas)
  • Fayetteville (within watersheds located in Arkansas)
  • Woodford  (within watersheds located mainly in Oklahoma)
  • Tuscaloosa  (within watersheds located in Louisiana & Mississippi)
  • Marcellus & Utica (within watersheds located in parts of Ohio, Pennsylvania, West Virginia and within watersheds extending into southern New York)

Shale gas reservoirs are often hydraulically fractured using slick water, a fluid type that requires a lot of water. In contrast, tight oil formations like the Bakken (in parts of Montana and North Dakota) often use gel-based hydraulic fracturing treatment fluids, which generally contain lower amounts of water.

This research was carried out as part of a larger effort by the USGS to understand the resource requirements and potential environmental impacts of unconventional oil and gas development. Prior publications include historical trends in the use of hydraulic fracturing from 1947-2010, as well as the chemistry of produced waters from hydraulically fractured wells.

The report is entitled “Hydraulic fracturing water use variability in the United States and potential environmental implications,” and has been accepted for publication in Water Resources Research. More information about this study and other USGS energy research can be found at the USGS Energy Resources Program. Stay up to date on USGS energy science by signing up for our quarterly Newsletter or following us on Twitter!

Past Water Patterns Drive Present Wading Bird Numbers

U.S. Geological Survey News Feed - June 24, 2015 - 4:59pm
Summary: Wading bird numbers in the Florida Everglades are driven by water patterns that play out over multiple years according to a new study by the U.S. Geological Survey and Florida Atlantic University

Contact Information:

James Beerens ( Phone: 561-809-9793 ); Gabrielle Bodin ( Phone: 337-266-8655 ); Hannah Hamilton ( Phone: 703-648-4356 );



Wading bird numbers in the Florida Everglades are driven by water patterns that play out over multiple years, according to a new study by the U.S. Geological Survey and Florida Atlantic University. Previously, existing water conditions were seen as the primary driving factor affecting numbers of birds, but this research shows that the preceding years’ water conditions and availability are equally important.

“We’ve known for some time that changes in water levels trigger a significant response by wading birds in the Everglades,” said James Beerens, the study’s lead author and an ecologist at USGS.  “But what we discovered in this study is the importance of history. What happened last year can tell you what to expect this year.”

From 2000 to 2009, scientists examined foraging distribution and abundance data for wading bird populations, including Great Egrets, White Ibises, and threatened Wood Storks.  To do the research, they conducted reconnaissance flights across the Greater Everglades system, an area that includes Big Cypress National Preserve and Everglades National Park. They found climate and water management conditions going as far back as three years influenced current bird population numbers and distribution.

“We know wading birds depend on small fish and invertebrates for food,” said Dale Gawlik, director of FAU’s Environmental Science Program and study coauthor. “What is interesting is the ‘lag effect’; wet conditions that build up invertebrate and fish numbers may not immediately result in increased bird numbers until after several more wet years.”
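
One common way to capture a lag effect like this is to include prior years' water conditions as additional predictors alongside current conditions. The sketch below fits a simple linear model with current-year, one-year-lag, and two-year-lag covariates on synthetic data; it only illustrates the idea and is not the study's actual habitat selection model.

```python
# Minimal sketch: regressing annual bird counts on current and lagged
# water-condition indices (synthetic data, illustrative only).
import numpy as np

rng = np.random.default_rng(0)
years = 20
water = rng.normal(size=years + 2)  # hypothetical annual water-condition index

# Synthetic bird counts that depend on current conditions and the two prior years
birds = 100 + 5 * water[2:] + 8 * water[1:-1] + 3 * water[:-2] + rng.normal(size=years)

# Design matrix: intercept, current year, 1-year lag, 2-year lag
X = np.column_stack([np.ones(years), water[2:], water[1:-1], water[:-2]])
coefs, *_ = np.linalg.lstsq(X, birds, rcond=None)
print(coefs)  # estimated intercept and current/lagged effects
```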

This new information has allowed scientists to improve existing wading bird distribution models, providing a more accurate tool to estimate wading bird numbers under the climate change and hydrological restoration scenarios proposed for the Everglades.

In the Everglades, food items such as small fish and crayfish are concentrated from across the landscape into pools as water levels recede throughout the dry season.  It does not always work that way anymore due to a lack of water and loss of habitat in Everglades marshes. This new research shows that under the right dry season conditions following a water pulse in previous years, wading bird food is even further concentrated in near-perfect water depths, setting off a boom in the numbers of young wading birds that add to the population.

Beerens and computer scientists from the USGS have also developed publicly available software as an extension to this work that predicts wading bird numbers in the Everglades based on real-time, current conditions, in addition to historical settings. This new model allows managers to simulate the effect of various management strategies that can have an impact on future bird numbers. The number and distribution of wading birds serve as an important indicator of ecosystem health in the Everglades. Beerens further explained that “increased seasonal water availability in drier areas of the Everglades stimulates the entire ecosystem, as reflected in the wading birds.”

Altered water patterns resulting from land-use and water management changes have reduced wading bird numbers throughout the Everglades by about 90 percent since the turn of the 20th Century. This research shows that current management and use of water is equally important.

“Our findings also suggest that we can continue to improve the Everglades and its wading bird community by restoring water availability to areas that are over drained,” said Beerens. “There is increasing understanding that water availability and proper management make this entire ecological and economic engine work.”

Florida generates more than $3 billion in annual revenue from resident and nonresident wildlife watchers according to estimates from the U.S. Fish and Wildlife Service. Of the 1.9 million people who view wildlife in Florida while ‘away-from-home’ each year, more than 1.3 million watch wading birds and other water-dependent birds.

The study, “Linking Dynamic Habitat Selection with Wading Bird Foraging Distributions across Resource Gradients,” was published in the journal PLOS ONE and can be found online.

Scientists Expect Slightly Below Average Chesapeake Bay 'Dead Zone' this Summer

U.S. Geological Survey News Feed - June 23, 2015 - 1:00pm
Summary: Scientists are expecting that this year’s Chesapeake Bay hypoxic low-oxygen zone, also called the “dead zone,” will be approximately 1.37 cubic miles – about the volume of 2.3 million Olympic-size swimming pools. Low river flow and nutrient loading reason for smaller predicted size

Contact Information:

Jon Campbell, USGS ( Phone: 703-648-4180 ); Ben  Sherman, NOAA ( Phone: 202-253-5256 ); Jim  Erickson, UMich. ( Phone: 734-647-1842 );



Scientists are expecting that this year’s Chesapeake Bay hypoxic low-oxygen zone, also called the “dead zone,” will be approximately 1.37 cubic miles – about the volume of 2.3 million Olympic-size swimming pools. While still large, this is 10 percent lower than the long-term average as measured since 1950. 
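
The swimming-pool comparison is a straightforward unit conversion, sketched below as a quick check; the 660,000-gallon Olympic pool figure is the one cited elsewhere in these releases, and the conversion is ours, not part of the forecast models.

```python
# Quick check of the volume comparison: cubic miles -> gallons -> Olympic pools.
CUBIC_MILE_GALLONS = 5280**3 * 7.48052   # cubic feet per cubic mile * gallons per cubic foot
OLYMPIC_POOL_GALLONS = 660_000           # approximate volume of an Olympic-size pool

hypoxic_volume_mi3 = 1.37
pools = hypoxic_volume_mi3 * CUBIC_MILE_GALLONS / OLYMPIC_POOL_GALLONS
print(f"about {pools / 1e6:.1f} million pools")  # ~2.3 million
```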

The anoxic portion of the zone, which contains no oxygen at all, is predicted to be 0.27 cubic miles in early summer, growing to 0.28 cubic miles by late summer. Low river flow and low nutrient loading from the Susquehanna River this spring account for the smaller predicted size. 

This is the ninth year for the Bay outlook, which, because of the shallow nature of large areas of the estuary, focuses on water volume, or cubic miles, instead of the square mileage used in the Gulf of Mexico dead zone forecast announced last week. The history of hypoxia in the Chesapeake Bay since 1985 can be found at EcoCheck, a website from the University of Maryland Center for Environmental Science.

The Bay’s hypoxic and anoxic zones are caused by excessive nutrient pollution, primarily from human activities such as agriculture and wastewater. The nutrients stimulate large algal blooms that deplete oxygen from the water as they decay. The low oxygen levels are insufficient to support most marine life and habitats in near-bottom waters and threaten the Bay’s production of crabs, oysters and other important fisheries. 

The Chesapeake Bay Program coordinates a multi-year effort to restore the Bay’s water and habitat quality and enhance its productivity. The forecast and oxygen measurements taken during summer monitoring cruises are used to test and improve our understanding of how nutrients, hydrology, and other factors affect the size of the hypoxic zone. They are key to developing effective hypoxia reduction strategies.

The predicted “dead zone” size is based on models that forecast three features of the zone to give a comprehensive view of expected conditions: midsummer volume of the low-oxygen hypoxic zone, early-summer oxygen-free anoxic zone, and late-summer oxygen-free anoxic zone. The models were developed by NOAA-sponsored researchers at the University of Maryland Center for Environmental Science and the University of Michigan. They rely on nutrient loading estimates from the U.S. Geological Survey.
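
The release does not give the model equations, but the underlying idea of forecasting hypoxia from nutrient loading can be sketched as a simple regression of past hypoxic volumes on spring nitrogen loads. The numbers and the linear form below are illustrative only; the actual UMCES and University of Michigan models are considerably more sophisticated.

```python
# Minimal sketch: fit a line relating spring nitrogen load to midsummer hypoxic
# volume, then forecast from this year's load. All historical values are made up.
import numpy as np

load_mlbs = np.array([80.0, 95.0, 60.0, 110.0, 70.0])   # hypothetical Jan-May loads (million lbs)
hypoxia_mi3 = np.array([1.6, 1.9, 1.2, 2.1, 1.4])       # hypothetical hypoxic volumes (cubic miles)

slope, intercept = np.polyfit(load_mlbs, hypoxia_mi3, 1)
this_year_load = 58.0  # Jan-May 2015 nitrogen load reported by USGS (million lbs)
print(f"forecast: {slope * this_year_load + intercept:.2f} cubic miles")
```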

"These ecological forecasts are good examples of the critical environmental intelligence products and tools that NOAA is providing to stakeholders and interagency management bodies such as the Chesapeake Bay Program," said Kathryn D. Sullivan, Ph.D., under secretary of commerce for oceans and atmosphere and NOAA administrator. “With this information, we can work collectively on ways to reduce pollution and protect our marine environments for future generations.” 

The hypoxia forecast is based on the relationship between nutrient loading and oxygen. Aspects of weather, including wind speed, wind direction, precipitation and temperature, also affect the size of dead zones. For example, in 2014, sustained winds from Hurricane Arthur mixed Chesapeake Bay waters, delivering oxygen to the bottom and dramatically reducing the size of the hypoxic zone to 0.58 cubic miles.

"Tracking how nutrient levels are changing in streams, rivers, and groundwater and how the estuary is responding to these changes is critical information for evaluating overall progress in improving the health of the Bay,” said William Werkheiser, USGS associate director for water. "Local, state and regional partners rely on this tracking data to inform their adaptive management strategies in Bay watersheds." 

The USGS provides the nutrient runoff and river stream data that are used in the forecast models. USGS estimates that 58 million pounds of nitrogen were transported to the Chesapeake Bay from January to May 2015, which is 29 percent below average conditions. The Chesapeake data are funded through a cooperative agreement between USGS and the Maryland Department of Natural Resources. USGS operates more than 400 real-time stream gages and collects water quality data at numerous long-term stations throughout the Chesapeake Bay basin to track how nutrient loads are changing over time. 

"Forecasting how a major coastal ecosystem, the Chesapeake Bay, responds to decreasing nutrient pollution is a challenge due to year-to-year variations and natural lags," said Dr. Donald Boesch, president of the University of Maryland Center for Environmental Science, "But we are heading in the right direction."  

Later this year researchers will measure oxygen levels in the Chesapeake Bay. The final measurement in the Chesapeake will come in October following surveys by the Chesapeake Bay Program's partners from the Maryland Department of Natural Resources (DNR) and the Virginia Department of Environmental Quality. Bimonthly monitoring cruise updates on Maryland Bay oxygen levels can be found on DNR’s Eyes on the Bay website.

NOAA, Partners Predict an Average 'Dead Zone' for Gulf of Mexico

U.S. Geological Survey News Feed - June 17, 2015 - 1:00pm
Summary: Scientists are expecting that this year’s Gulf of Mexico hypoxic zone, also called the “dead zone,” will be approximately 5,483 square miles or about the size of Connecticut — the same as it has averaged over the last several years. Outlook incorporates multiple hypoxia models for first time

Contact Information:

Jon Campbell, USGS ( Phone: 703-648-4180 ); Ben  Sherman, NOAA ( Phone: 202-253-5256 ); Jim  Erickson ( Phone: 734-647-1842 );



Additional contacts:  Dave Malmquist, VIMS, 804-684-7011, davem@vims.edu  and William (Matt) Shipman, NCSU, 919-515-6386, matt_shipman@ncsu.edu

Scientists are expecting that this year’s Gulf of Mexico hypoxic zone, also called the “dead zone,” will be approximately 5,483 square miles or about the size of Connecticut — the same as it has averaged over the last several years. 

The dead zone in the Gulf of Mexico affects nationally important commercial and recreational fisheries and threatens the region's economy. Hypoxic zones hold very little oxygen, and are caused by excessive nutrient pollution, primarily from activities such as agriculture and wastewater. The low oxygen levels cannot support most marine life and habitats in near-bottom waters. 

This year marks the first time the results of four models were combined. The four model predictions ranged from 4,344 to 5,985 square miles, and had a collective predictive interval of 3,205 to 7,645 square miles, which takes into account variations in weather and oceanographic conditions.

The NOAA-sponsored Gulf of Mexico hypoxia forecast has improved steadily in recent years, a result of advancements of individual models and an increase in the number of models used for the forecast. Forecasts based on multiple models are called ensemble forecasts and are commonly used in hurricane and other weather forecasts. 
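
In its simplest form, an ensemble forecast just combines the individual model predictions and reports their spread, as sketched below. The individual forecasts here are made up (bracketed by the 4,344 to 5,985 square mile range quoted above); the actual ensemble weighting and predictive interval calculation are not described in the release.

```python
# Minimal sketch: combine several model forecasts of hypoxic-zone area.
import statistics

model_forecasts_sq_mi = [4344, 5985, 5200, 5400]  # hypothetical individual model outputs

ensemble_mean = statistics.mean(model_forecasts_sq_mi)
model_range = (min(model_forecasts_sq_mi), max(model_forecasts_sq_mi))
print(f"ensemble forecast: {ensemble_mean:.0f} sq mi, model range {model_range}")
```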

The ensemble models were developed by NOAA-sponsored modeling teams and researchers at the University of Michigan, Louisiana State University, Louisiana Universities Marine Consortium, Virginia Institute of Marine Science/College of William and Mary, Texas A&M University, North Carolina State University, and the U.S. Geological Survey (USGS). The hypoxia forecast is part of a larger NOAA effort to deliver ecological forecasts that support human health and well-being, coastal economies, and coastal and marine stewardship.

“NOAA, along with our partners, continues to improve our capability to generate environmental data that can help mitigate and manage this threat to Gulf fisheries and economies,” said Kathryn D. Sullivan, Ph.D., under secretary of commerce for oceans and atmosphere and NOAA administrator. “We are adding models to increase the accuracy of our dead zone forecast." 

The Gulf of Mexico hypoxia forecast is based on nutrient runoff and river stream data from the USGS. The USGS operates more than 3,000 real-time stream gauges and 50 real-time nitrate sensors, and collects water quality data at long-term stations throughout the Mississippi River basin to track how nutrient loads are changing over time.

The USGS estimates that 104,000 metric tons of nitrate and 19,300 metric tons of phosphorus flowed down the Mississippi and Atchafalaya rivers into the Gulf of Mexico in May 2015. This is about 21 percent below the long-term (1980-2014) average for nitrogen and 16 percent above the long-term average for phosphorus. 

"Real-time nitrate sensors are advancing our understanding of how nitrate is transported in small streams and large rivers, including the main stem of the Mississippi River,” said William Werkheiser, USGS associate director for water. “Long-term monitoring is critical for tracking how nutrient levels are changing in response to management actions and for improving modeling tools to estimate which sources and areas are contributing the largest amounts of nutrients to the Gulf. " 

The confirmed size of the 2015 Gulf hypoxic zone will be released in early August, following a monitoring survey led by the Louisiana Universities Marine Consortium from July 28 to August 4.

Highest Peak in North America to be Surveyed

U.S. Geological Survey News Feed - June 15, 2015 - 11:30am
Summary: A new GPS survey of Mount McKinley, the highest point in North America, will update the commonly accepted elevation of McKinley’s peak, 20,320 ft. The last survey was completed in 1953. Several agency partners plan to update the height of Mount McKinley this summer

Contact Information:

Sue Mitchell, UAF GI ( Phone: 907-474-5823 ); Vicki Childers, NOAA/NGS ( Phone: 301-713-3211 x161 ); Mark Newell, USGS ( Phone: 573-308-3850 );



A new GPS survey of Mount McKinley, the highest point in North America, will update the commonly accepted elevation of McKinley’s peak, 20,320 ft. The last survey was completed in 1953.

The USGS, along with NOAA’s National Geodetic Survey (NGS) and the University of Alaska Fairbanks (UAF), is supporting a Global Positioning System (GPS) survey of the Mount McKinley apex. Surveying technology and processes have improved greatly since the last survey, and the ability to establish a much more accurate height now exists. With the acquisition of new elevation (ifsar) data in Alaska as part of the 3D Elevation Program, there have been inquiries about the height of the summit. The survey party is being led by CompassData, a subcontractor for Dewberry on a task awarded under the USGS’s Geospatial Products and Services Contract (GPSC).

Using modern GPS survey equipment and techniques, along with better gravity data to improve the geoid model in Alaska, the partners will be able to report the summit elevation with a much higher level of confidence than has been possible in the past. It is anticipated the newly surveyed elevation will be published by the National Geodetic Survey in late August.
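
The reason improved gravity data and a better geoid model matter is that GPS measures height above a reference ellipsoid, whereas a published summit elevation is given above the geoid, which approximates mean sea level. A minimal sketch of that conversion is shown below; the numbers are placeholders, not survey results.

```python
# Minimal sketch: orthometric (map) height = ellipsoidal height from GPS
# minus the geoid undulation from a geoid model. Placeholder values only.
def orthometric_height_m(ellipsoidal_height_m, geoid_undulation_m):
    return ellipsoidal_height_m - geoid_undulation_m

h_gps = 6200.0   # hypothetical ellipsoidal height from GPS, meters
n_geoid = 12.0   # hypothetical geoid undulation at the summit, meters
print(orthometric_height_m(h_gps, n_geoid))  # elevation above the geoid
```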

An experienced team of four climbers, one from UAF and three from CompassData, will start the precarious trek to the summit in mid-June, with the needed scientific instruments in tow. They plan to return on or before July 7 and then begin work with the University of Alaska Fairbanks and NGS to process the data and arrive at the new summit elevation.

At 20,320 feet, Mount McKinley is North America’s highest peak. (Photo courtesy of Todd Paris, UAF). (High resolution image)
Climbing Mount McKinley, North America’s highest peak, is a daunting task for even the most experienced mountaineers at Denali National Park in Alaska. (Photo courtesy of National Geographic). (High resolution image)
The Mount McKinley survey team, and their equipment, are expected to face temperatures well below zero, high winds and frequent snow. Current forecast, courtesy of NOAA. (Photo courtesy of Todd Paris, UAF). (High resolution image)

Tectonic Model Shows North America May Once Have Been Linked to Australia or Antarctica

U.S. Geological Survey News Feed - June 8, 2015 - 12:00pm
Summary: North America may have once been attached to Australia, according to research just published in Lithosphere and spearheaded by U.S. Geological Survey geologist James Jones and his colleagues at Bucknell University and Colorado School of Mines

Contact Information:

Aleeza  Wilkins ( Phone: 703-648-6106 ); James  Jones ( Phone: 907-786-7442 );



North America may have once been attached to Australia, according to research just published in Lithosphere and spearheaded by U.S. Geological Survey geologist James Jones and his colleagues at Bucknell University and Colorado School of Mines.

Approximately every 300 million years, the Earth completes a supercontinent cycle wherein continents drift toward one another and collide, remain attached for millions of years, and eventually rift back apart. Geologic processes such as subduction and rifting aid in the formation and eventual break-up of supercontinents, and these same processes also help form valuable mineral resource deposits. Determining the geometry and history of ancient supercontinents is an important part of reconstructing the geologic evolution of Earth, and it can also lead to a better understanding of past and present mineral distributions.

North America is a key component in reconstructions of many former supercontinents, and there are strong geological associations between the western United States and Australia, which is one of the world’s leading mineral producers.

In this study, Jones and others synthesized mineral age data from ancient sedimentary rocks in the Trampas and Yankee Joe basins of Arizona and New Mexico. They found that the ages of many zircon crystals—mineral grains that were eroded from other rocks and embedded in the sedimentary deposits—were approximately 1.6 to 1.5 billion years old, an age range that does not match any known geologic age provinces in the entire western United States.

This surprising result actually mirrors previous studies of the Belt-Purcell basin (located in Montana, Idaho and parts of British Columbia, Canada) and a recently recognized basin in western Yukon, Canada, in which many zircon ages between 1.6 and 1.5 billion years old are common despite the absence of matching potential source rocks of this age.
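
A very simple way to see whether a basin's detrital zircon ages cluster in the 1.6 to 1.5 billion-year window is to compute the fraction of grain ages falling in that interval. The sketch below does so with made-up ages and is only an illustration, not the study's analysis.

```python
# Minimal sketch: fraction of detrital zircon grain ages in the 1.6-1.5 Ga window.
import numpy as np

grain_ages_ma = np.array([1520, 1555, 1610, 1440, 1580, 1700, 1490, 1530])  # hypothetical ages (Ma)
in_window = (grain_ages_ma >= 1500) & (grain_ages_ma <= 1600)
print(f"{in_window.mean():.0%} of grains fall between 1.5 and 1.6 billion years old")
```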

However, the distinctive zircon ages in all three study locations do match the well-known ages of districts in Australia and, to a lesser-documented extent, Antarctica.

This publication marks the first time a complete detrital mineral age dataset has been compiled to compare the Belt basin deposits to strata of similar age in the southwestern United States.  “Though the basins eventually evolved along very different trajectories, they have a shared history when they were first formed,” said Jones. “That history gives us clues as to what continents bordered western North America 1.5 billion years ago.”

The tectonic model presented in this paper suggests that the North American sedimentary basins were linked to sediment sources in Australia and Antarctica until the breakup of the supercontinent Columbia. The dispersed components of Columbia ultimately re-formed into Rodinia, perhaps the first truly global supercontinent in Earth’s history, around 1.0 billion years ago. Continued sampling and analysis of ancient sedimentary basin remnants will remain a critical tool for further testing global supercontinent reconstructions.

The paper can be accessed online. For information about USGS mineral resource modeling efforts, visit the USGS Mineral Resources Program Web site or follow us on Twitter.

Heat Accelerates Dry in California Drought

U.S. Geological Survey News Feed - May 28, 2015 - 3:56pm
Summary: Although record low precipitation has been the main driver of one of the worst droughts in California history, abnormally high temperatures have also played an important role in amplifying its adverse effects, according to a recent study by the U.S. Geological Survey and university partners

Contact Information:

Jon Campbell ( Phone: 703-648-4180 ); Chris  Funk ( Phone: 805-893-4223 );



Although record low precipitation has been the main driver of one of the worst droughts in California history, abnormally high temperatures have also played an important role in amplifying its adverse effects, according to a recent study by the U.S. Geological Survey and university partners.

Experiments with a hydrologic model for the period Oct. 2013-Sept. 2014 showed that if the air temperatures had been cooler, similar to the 1916-2012 average, there would have been an 86 percent chance that the winter snowpack would have been greater, the spring-summer runoff higher, and the spring-summer soil moisture deficits smaller.

To gauge the effect of high temperatures on drought, lead author Shraddhanand Shukla (University of California – Santa Barbara, UCSB) devised two sets of modeling experiments that compared climate data from water year 2014 (Oct. 2013-Sept. 2014) to similar intervals during 1916-2012.

In the first simulation set, Shukla substituted 2014 temperature values with the historical temperatures for each of the study’s 97 years, while keeping the 2014 precipitation values. In the second simulation set, he combined the observed 2014 temperatures with the historical precipitation values for each of those same years, 1916-2012.
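
In effect, the experimental design described above reruns the hydrologic model many times with mixed-and-matched forcings. The sketch below shows that substitution loop in schematic form; the model function and the climate values are placeholders, not the study's actual model or data.

```python
# Minimal sketch of the substitution experiments: pair water-year-2014 forcings
# with each historical year's forcings, one variable at a time. Placeholder model.
def run_hydrologic_model(temperature_c, precipitation_mm):
    # A real hydrologic model would simulate snowpack, runoff, and soil moisture.
    return {"snowpack_proxy": precipitation_mm / max(temperature_c, 1.0)}

temp_2014, precip_2014 = 12.5, 300.0                 # hypothetical WY2014 forcings
temp_hist = {y: 10.0 for y in range(1916, 2013)}     # hypothetical historical temperatures
precip_hist = {y: 550.0 for y in range(1916, 2013)}  # hypothetical historical precipitation

# Set 1: historical temperatures paired with 2014 precipitation
set1 = [run_hydrologic_model(temp_hist[y], precip_2014) for y in temp_hist]

# Set 2: 2014 temperatures paired with historical precipitation
set2 = [run_hydrologic_model(temp_2014, precip_hist[y]) for y in precip_hist]
print(len(set1), len(set2))  # 97 simulations in each set
```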

“This experimental approach allows us to model past situations and tease out the influence of temperature in preceding drought conditions,” said Chris Funk, a USGS scientist and a co-author of the investigation. “By crunching enough data over many, many simulations, the effect of temperature becomes more detectable.  We can’t do the same in reality, the here and now, because then we only have a single sample.” Funk, an adjunct professor at UCSB, helps coordinate research at the university that supports USGS programs.  

High heat has multiple damaging effects during drought, according to the study, increasing the vulnerability of California’s water resources and agricultural industry. Not only does high heat intensify evaporative stress on soil, it also has a powerful effect in reducing snowpack, a key to reliable water supply for the state. In addition to decreasing snowpack, higher temperatures can cause the snowpack to melt earlier, dramatically reducing the amount of water available for agriculture in summer when it is most needed.

Although the study did not directly address the issue of long-term climate change, the implications of higher temperatures are clear.

“If average temperatures keep rising, we will be looking at more serious droughts, even if the historical variability of precipitation stays the same,” Shukla said. “The importance of temperature in drought prediction is likely to become only more significant in the future.”

The research was published online in Geophysical Research Letters, a journal of the American Geophysical Union.

For more information about drought in California, visit the USGS California Water Science Center online.

Drought effects at Trinity Lake, a major California reservoir located about 60 miles NW of Redding, California. USGS photo, Tim Reed, Feb. 2014. Photo source: CA Water Science Center