U.S. Geological Survey News Feed

Citizen Scientists Submit More Than 100,000 Map Points

U.S. Geological Survey News Feed - January 29, 2015 - 11:00am
Summary: The U.S. Geological Survey citizen science project, The National Map Corps, has seen a remarkable response. In less than two years, the volunteer-based project has harvested more than 100,000 “points.” The USGS “crowd-sourcing” map project reaches a major milestone.

Contact Information:

Elizabeth McCartney ( Phone: 573-308-3696 ); Mark Newell ( Phone: 573-308-3850 );



The U.S. Geological Survey citizen science project, The National Map Corps, has seen a remarkable response. In less than two years, the volunteer-based project has harvested more than 100,000 “points.” Hundreds of volunteer cartographers are making significant additions to the USGS’s ability to provide accurate mapping information to the public.

Each point represents a structure or manmade feature on a map that has been verified and updated, and then submitted to support The National Map and US Topo maps.

Using crowd-sourcing techniques, the USGS Volunteer Geographic Information project known as The National Map Corps (TNMCorps) encourages citizen volunteers to collect manmade structure data in an effort to provide accurate and authoritative spatial map data for the USGS National Geospatial Program’s web-based map products.

“I am 80 years old. I work three days a week for a golf course trapping moles and gophers”, said a very prominent citizen scientist volunteer who goes by the handle of “Mole Trapper”. “I spent 11 years volunteering for a fish and wildlife agency. When the big landslide at Oso, Washington happened, I went on the USGS website and discovered the map corps. I worked summers while in high school for a surveyor who was very precise and he told me an inaccurate survey is worthless. I hate inaccurate maps, so this program was just right for me. I hope my work is as accurate as it can be, but if it isn't, I plead old age.”

Structures being updated include schools, hospitals, post offices, police stations and other important public buildings. The data currently being collected by volunteers becomes part of The National Map structures dataset, which is made available to users free of charge.

"I am retired from an unrelated field, but I have loved maps and travel all my life,” explained other highly active volunteer who goes by “fconley”. “When I saw that USGS was looking for volunteers I immediately joined, first of all working with paper maps and quads. As digital mapping, satellite imagery, and GPS became more available I was enthralled. With the imagery now accessible it is almost like being able to travel sitting at my desk. At times, locating structures seems similar to solving puzzles or detective work. This whole project is not only enjoyable but it makes me feel that I am making a lasting and useful contribution. I am thankful for the opportunity to be involved in this fascinating endeavor."

Beginning as a series of pilot projects in 2011, The National Map Corps has grown state-by-state to include the entire U.S. By August of 2013, volunteers were editing in every state in the country and the US territories. To date, the number of active volunteers has grown to 930 individuals, including some extremely energetic participants who have collected in excess of 6,000 points.

To show appreciation of the volunteers’ efforts, The National Map Corps has instituted a recognition program that awards “virtual” badges to volunteers. Each edit that is submitted is worth one point towards the badge level. The badges depict a series of antique surveying instruments and images that trace the evolution of land surveying through to early aerial observation of the Earth’s surface, such as pigeon-mounted cameras and hot-air balloons. Additionally, volunteers are publicly acknowledged (with permission) via Twitter, Facebook, and Google+.

Tools on the TNMCorps website explain how a volunteer can edit any area, regardless of familiarity with the selected structures. Becoming a volunteer for TNMCorps is easy: go to The National Map Corps website to learn more and to sign up. If you have access to the Internet and are willing to dedicate some time to editing map data, we hope you will consider participating.

Squadron of Biplane Spectators badge, currently the highest recognition award, is given to volunteers who submit more than 6,000 points.

Family of Floating Photogrammetrists badge is one of the new awards, which is given to volunteers who submit more than 3,000 points.

Badges awarded for submitting edits, shown from first to last: Order of the Surveyor’s Chain (25-49 points), Society of the Steel Tape (50-99 points), Pedometer Posse (100-199 points), Surveyor’s Compass (200-499 points), Stadia Board Society (500-999 points), Alidade Alliance (1,000-1,999 points), and the Theodolite Assemblage (2,000-2,999 points).
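
The badge tiers amount to a simple threshold lookup from a volunteer’s cumulative edit count to a recognition level. The sketch below is illustrative only (it is not USGS software); the thresholds are taken from this article, and the function and variable names are invented.

```python
# Minimal sketch, not USGS software: badge names and point thresholds come from
# this article; the function and variable names are invented for illustration.
from typing import Optional

BADGE_TIERS = [
    (6000, "Squadron of Biplane Spectators"),
    (3000, "Family of Floating Photogrammetrists"),
    (2000, "Theodolite Assemblage"),
    (1000, "Alidade Alliance"),
    (500, "Stadia Board Society"),
    (200, "Surveyor's Compass"),
    (100, "Pedometer Posse"),
    (50, "Society of the Steel Tape"),
    (25, "Order of the Surveyor's Chain"),
]

def badge_for_points(points: int) -> Optional[str]:
    """Return the highest badge earned for a given number of submitted edits."""
    for threshold, name in BADGE_TIERS:
        if points >= threshold:
            return name
    return None  # fewer than 25 submitted edits: no badge yet

print(badge_for_points(130))   # -> Pedometer Posse
print(badge_for_points(6200))  # -> Squadron of Biplane Spectators
```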

Historical Hydraulic Fracturing Trends and Data Unveiled in New USGS Publications

U.S. Geological Survey News Feed - January 27, 2015 - 2:35pm
Summary: Two new U.S. Geological Survey publications that highlight historical hydraulic fracturing trends and data from 1947 to 2010 are now available

Contact Information:

Heidi  Koontz ( Phone: 303-202-4763 );



Two new U.S. Geological Survey publications that highlight historical hydraulic fracturing trends and data from 1947 to 2010 are now available.

Hydraulic fracturing is presently the primary stimulation technique for oil and gas production in unconventional resource reservoirs. Comprehensive, published, and publicly available information regarding the extent, location, and character of hydraulic fracturing in the United States is scarce. 

“These national-scale data and analyses will provide a basis for making comparisons of current-day hydraulic fracturing to historical applications,” said USGS scientist and lead author Tanya Gallegos.

“We now have an improved understanding of where the practice is occurring and how hydraulic fracturing characteristics have changed over time.” 

This national analysis of data on nearly 1 million hydraulically fractured wells and 1.8 million fracturing treatment records from 1947 through 2010 is used to identify hydraulic fracturing trends in drilling methods and use of proppants (sand or similar material suspended in water or other fluid to keep fissures open), treatment fluids, additives, and water in the United States. These trends are compared to peer-reviewed literature in an effort to establish a common understanding of the differences in hydraulic fracturing and provide a context for understanding the costs and benefits of increased oil and gas production. The publications also examine how newer technology has affected the amount of water needed for the process and where hydraulic fracturing has occurred at different points in time. Although hydraulic fracturing is in widespread use across the United States in most major oil and gas basins for the development of unconventional oil and gas resources, historically, Texas had the highest number of records of hydraulic fracturing treatments and associated wells documented in the datasets. 

These datasets also illustrate the rapid expansion of water-intensive horizontal/directional drilling that has increased from 6 percent of new hydraulically fractured wells drilled in the United States in 2000 to 42 percent of new wells drilled in 2010. Increased horizontal drilling also coincided with the emergence of water-based “slick water” fracturing fluids. This is one example of how the most current hydraulic fracturing materials and methods are notably different from those used in previous decades and have contributed to the development of previously inaccessible unconventional oil and gas production target areas, namely in shale and tight-sand reservoirs. 
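
For readers who work with the companion Data Series, the horizontal/directional drilling trend described above reduces to a per-year share calculation. The following is a hypothetical sketch only; the file name and column names are assumptions made for the example, not the published schema of the USGS release.

```python
# Hypothetical sketch: "fractured_wells.csv", "spud_year", and "drill_type" are
# assumed names for illustration, not the actual schema of the USGS Data Series.
import pandas as pd

wells = pd.read_csv("fractured_wells.csv")

# Share of newly drilled, hydraulically fractured wells that were horizontal or
# directional, by year (the trend described above: roughly 6% in 2000, 42% in 2010).
is_horizontal = wells["drill_type"].isin(["horizontal", "directional"])
share_by_year = is_horizontal.groupby(wells["spud_year"]).mean() * 100

print(share_by_year.round(1))
```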

The USGS Scientific Investigations Report is available online, along with the companion Data Series.

Natural Breakdown of Petroleum Underground Can Lace Arsenic into Groundwater

U.S. Geological Survey News Feed - January 26, 2015 - 11:36am
Summary: In a long-term field study, U.S. Geological Survey (USGS) and Virginia Tech scientists have found that changes in geochemistry from the natural breakdown of petroleum hydrocarbons underground can promote the chemical release (mobilization) of naturally occurring arsenic into groundwater

Contact Information:

Jon Campbell ( Phone: 703-648-4180 ); Isabelle Cozzarelli ( Phone: 703-648-5899 );



In a long-term field study, U.S. Geological Survey (USGS) and Virginia Tech scientists have found that changes in geochemistry from the natural breakdown of petroleum hydrocarbons underground can promote the chemical release (mobilization) of naturally occurring arsenic into groundwater. This geochemical change can result in potentially significant arsenic groundwater contamination. 

While arsenic is naturally present in most soils and sediments at various concentrations, it is not commonly a health concern until it is mobilized by a chemical reaction and dissolves into groundwater. Elevated arsenic levels in groundwater used for drinking water are a significant public health concern since arsenic, a toxin and carcinogen, is linked to numerous forms of skin, bladder, and lung cancer.

For the past 32 years, a collaborative group of government, academic, and industry-supported scientists have studied the natural attenuation (biodegradation over time) of a 1979 petroleum spill in the shallow, glacial aquifer at the National Crude Oil Spill Fate and Natural Attenuation Research Site, near Bemidji, Minnesota.  

Working at this intensively surveyed site, the researchers in this USGS-led investigation focused on a specific question: whether naturally occurring arsenic found in the glacial aquifers in this area might be mobilized in the presence of hydrocarbons because of chemical interactions involving iron hydroxides which also occur naturally. To address this question, arsenic concentrations were measured for several years in groundwater and in sediment up-gradient, within, and down-gradient from the hydrocarbon plume at Bemidji. 

Carefully measured samples from the field reveal that arsenic concentrations in the hydrocarbon plume can reach 230 micrograms per liter — 23 times the current drinking water standard of 10 micrograms per liter. Arsenic concentrations fall below 10 micrograms per liter both up-gradient and down-gradient from the plume. 
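
The stated factor of 23 follows directly from the reported concentrations:

```latex
\frac{230\ \mu\mathrm{g/L}\ \text{(plume maximum)}}{10\ \mu\mathrm{g/L}\ \text{(drinking water standard)}} = 23
```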

The scientists attributed the elevated arsenic in the hydrocarbon plume to a series of interrelated geochemical and biochemical processes that involve arsenic and iron oxides (both are commonly found in sediments across the country) and the metabolization of carbon-rich petroleum by microbes in anoxic (low oxygen) conditions. The complex chemical process is explained further at this USGS website and in the published research article.

The results from this work also suggest that the arsenic released in the plume may reattach to aquifer sediments down-gradient from the plume. This reattachment could be considered good news for limiting the extent of the arsenic contamination in the groundwater. However, the chemical reattachment process may also be reversible, highlighting the need for long-term monitoring of arsenic and other chemicals that pose a water quality concern in areas associated with petroleum hydrocarbon leaks and spills. 

The presence and amount of naturally occurring arsenic and iron oxides and the condition of the groundwater in the study area are fairly typical of many geologic settings across the nation, suggesting that the process of arsenic mobilization that was observed in the presence of hydrocarbons is not geographically limited.  

This research was supported by the USGS Toxic Substances Hydrology Program and Hydrologic Research and Development Program, the Virginia Polytechnic Institute and State University, and the National Crude Oil Spill Fate and Natural Attenuation Research Site, a collaborative venture of the USGS, the Enbridge Energy Limited Partnership, the Minnesota Pollution Control Agency, and Beltrami County, Minnesota. By law, the USGS, a science bureau of the U.S. Department of the Interior, does not have any regulatory authority or responsibility. 

Learn more 

Cozzarelli, IM; Schreiber, ME; Erickson, ML; and Ziegler, BA. “Arsenic cycling in hydrocarbon plumes: Secondary effects of natural attenuation,” Groundwater, 21 Jan 2015. 

USGS Toxic Substances Hydrology Program 

USGS National Research Program (Water)

More Global Topographic Data to Aid Climate Change Research

U.S. Geological Survey News Feed - January 26, 2015 - 11:00am
Summary: Improved global topographic (elevation) data are now publicly available for most of Asia (India, China, southern Siberia, Japan, Indonesia), Oceania (Australia, New Zealand), and western Pacific Islands. Enhanced elevation data for most of Asia and Oceania; third of four releases.

Contact Information:

Jon Campbell ( Phone: 703-648-4180 );



Improved global topographic (elevation) data are now publicly available for most of Asia (India, China, southern Siberia, Japan, Indonesia), Oceania (Australia, New Zealand), and western Pacific Islands. See diagram below for geographic coverage.   

Similar data were previously released by USGS for most of Africa (in September 2014) and the Western Hemisphere (December). 

The data are being released following the President’s commitment at the United Nations to provide assistance for global efforts to combat climate change. The broad availability of more detailed elevation data across the globe through the Shuttle Radar Topography Mission (SRTM) will improve baseline information that is crucial to investigating the impacts of climate change on specific regions and communities. 

“We are pleased to offer improved elevation data to scientists, educators, and students worldwide. It’s free to whomever can use it,” said Suzette Kimball, acting USGS Director, at the initial release of SRTM30 data for Africa in September. “Elevation, the third dimension of maps, is critical in understanding so many aspects of how nature works. Easy access to reliable data like this advances the mutual understanding of environmental challenges by citizens, researchers, and decision makers around the globe.” 

The SRTM30 datasets resolve to 30 meters and can be used worldwide to improve environmental monitoring, advance climate change research, and promote local decision support. The previous global resolution for these data was 90 meters.
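
Because the resolution improved by a factor of three in each horizontal dimension, every former 90-meter grid cell is now covered by nine 30-meter cells:

```latex
\left(\frac{90\ \mathrm{m}}{30\ \mathrm{m}}\right)^{2} = 9\ \text{elevation samples per former grid cell}
```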

SRTM30 elevation data are increasingly being used to supplement other satellite imagery. In India, for example, SRTM30 elevation data can be used to track changes to the Gangotri Glacier, a major source of water for the Ganges River. Changes to this glacier, which has retreated 345 meters over the past 25 years, directly affect the water resources for hundreds of millions of people on the Indian subcontinent. 

USGS online poster of the Gangotri Glacier

The National Aeronautics and Space Administration (NASA) and the National Geospatial-Intelligence Agency (NGA) worked collaboratively to produce the enhanced SRTM data, which have been extensively reviewed by relevant government agencies and deemed suitable for public release. SRTM flew aboard the Space Shuttle Endeavour in February 2000, mapping Earth's topography between 56 degrees south and 60 degrees north of the equator. During the 11-day mission, SRTM used imaging radar to map the surface of Earth numerous times from different perspectives. 

The USGS, a bureau of the U.S. Department of the Interior, distributes SRTM30 data free of charge via its user-friendly Earth Explorer website. NASA also distributes SRTM data versions through the Land Processes Distributed Active Archive Center (LPDAAC) operated by USGS along with descriptions of the various versions and processing options. 

Enhanced 30-meter resolution SRTM data for the remainder of the globe (at less than 60 deg. latitude) are scheduled for the last of the four releases, in August 2015.

NASA press release on SRTM data

Shaded grid over most of Asia, Japan, and Australia indicates the coverage of the third of four releases of improved topographic (elevation) data now publicly available through USGS archives. (High resolution image)

USGS Statement Regarding Avian Flu Found in Washington State Green-Winged Teal

U.S. Geological Survey News Feed - January 23, 2015 - 7:55pm
Summary: Some media are reporting that the Asian H5N1 strain of highly pathogenic avian influenza has now entered the United States. This is incorrect

Contact Information:

Jonathan Sleeman ( Phone: 608-270-2401 ); Catherine Puckett ( Phone: 352-278-0165 ); Marisa Lubeck ( Phone: 303-526-6694 );



Some media are reporting that the Asian H5N1 strain of highly pathogenic avian influenza has now entered the United States. This is incorrect. The avian flu recently found in a green-winged teal in Washington state is a different strain, one that incorporates genes from North American waterfowl-associated viruses and is not known to harm humans. Unlike the Asian H5N1 strain, which has been found in Asia, Europe, and Africa, the Washington state strain has been found only in wild waterfowl; it has not been associated with human illness, nor has it been found in domestic poultry.

Culprit Identified in Decline of Endangered Missouri River Pallid Sturgeon

U.S. Geological Survey News Feed - January 23, 2015 - 3:30pm
Summary: BOZEMAN – Pallid sturgeon come from a genetic line that has lived on this planet for tens of millions of years; yet it has been decades since anyone has documented any of the enormous fish successfully producing young that survive to adulthood in the upper Missouri River basin.

Contact Information:

Catherine Puckett, USGS ( Phone: 352-377-2469 );



BOZEMAN – Pallid sturgeon come from a genetic line that has lived on this planet for tens of millions of years; yet it has been decades since anyone has documented any of the enormous fish successfully producing young that survive to adulthood in the upper Missouri River basin.

Now, fisheries scientists with the U.S. Geological Survey, Montana State University and the U.S. Fish and Wildlife Service have shown why, detailing for the first time the biological mechanism that has caused the long decline of pallid sturgeon in the Missouri River and led to its being placed on the endangered species list 25 years ago.

In a paper published this week in the journal Fisheries, the scientists show that oxygen-depleted dead zones between dams in the upper Missouri River are directly linked with the failure of endangered pallid sturgeon hatched embryos to survive to adulthood.

 “This research is a notable breakthrough in identifying the reason why pallid sturgeon in the Missouri River have been declining for so many decades,” said Suzette Kimball, acting director of the USGS. “By pinpointing the biological mechanism responsible for the species’ decline, resource managers have vital information they can use as a focus of pallid sturgeon conservation.”

“We certainly think this is a significant finding in the story of why pallid sturgeon are failing to recruit in the upper Missouri River,” said Christopher Guy, the assistant unit leader with the USGS Montana Cooperative Fishery Research Unit and the MSU professor who was the lead author on the paper. “We’re basically talking about a living dinosaur that takes 20 years to reach sexual maturity and can live as long as the average human in the U.S. After millions of years of success, the pallid sturgeon population stumbled and now we know why. From a conservation perspective, this is a major breakthrough.”

The study is the first to make a direct link among dam-induced changes in riverine sediment transport, the subsequent effects of those changes on reduced oxygen levels and the survival of an endangered species, the pallid sturgeon.

“This research shows that the transition zone between the freely flowing river and reservoirs is an ecological sink – a dead zone – for pallid sturgeon,” Guy said. “Essentially, hatched sturgeon embryos die in the oxygen-depleted sediments in the transition zones.”

Guy said fisheries biologists long suspected that the Missouri River’s massive reservoirs were preventing hatched embryonic pallid sturgeon from surviving to the juvenile stage. But early attempts to tie the problem to low levels of dissolved oxygen were unsuccessful. 

“The reason for that is we hadn’t sampled deep enough,” Guy said. “It wasn’t until we sampled water down at the bottom, where those sediments are being deposited, that we found there was no dissolved oxygen. Because hatched pallid sturgeon embryos are negatively buoyant, they tend to sink into that hostile environment.”  

“The lack of oxygen is a function of high microbial activity in the sediment-laden area,” said Eric Scholl, a Ph.D. student at Montana State University and a co-author on the study.

Hilary Treanor, an MSU research associate working with Guy, said they were able to show just how hostile these transition zones between riverine environment and reservoir could be to hatched sturgeon embryos. 

In experiments at the U.S. Fish and Wildlife Service Fish Technology Center in Bozeman with coauthors Molly Webb, Kevin Kappenman, and Jason Ilgen, Treanor said hatched embryos of different ages were exposed to water with varying levels of dissolved oxygen. The lowest level they could recreate – 1.5 milligrams of oxygen per liter of water – was still higher than samples pulled from the bottom at the upper end of Fort Peck Reservoir. 

At those depleted levels, the hatched sturgeon embryos suffered almost immediately.

“We saw changes in their behavior fairly quickly. They became disoriented and weren’t able to move the way they should have,” Treanor said. “Within an hour we started to see mortality. By the end of the experiment they were all dead.”

"Pallid sturgeon, native to the Missouri and Mississippi rivers, were listed as an endangered species in 1990. The species has a lifespan of as much as a century. According to the U.S. Fish and Wildlife Service, fewer than 175 wild-spawned pallid sturgeon – all adults – live in the free-flowing Missouri River above Lake Sakakawea. Since 1990, not a single wild-spawned pallid sturgeon is known to have survived to a juvenile, despite intensive searching.

 In the past 5 years, researchers identified the most important reason for pallid sturgeon population declines in the Upper Missouri River: the lack of survival of naturally produced hatched sturgeon embryos.

Guy said this most recent study of sturgeon built on research conducted by USGS fisheries biologist Patrick Braaten, which demonstrated that hatched pallid sturgeon embryos in the upper Missouri River do not have enough drift distance before they enter the reservoirs.

Before dams, hatched pallid sturgeon embryos would drift for hundreds of miles, eventually settling out of the river’s current in areas with low flow where they matured enough to negotiate the river’s flow.

“This team has shown how much we can do when we have a collaboration between MSU, USGS and world-renowned reproductive physiologists Molly Webb and Kevin Kappenman with the U.S. Fish and Wildlife Service,” Guy said. “In the process of doing this research, we’ve trained a dozen MSU graduate students and a number of undergraduate field and lab techs.”

Given the new evidence that hatched pallid sturgeon embryos are deprived of oxygen in these transition zones, the authors of the paper propose that officials will need to consider innovative approaches to managing Missouri River reservoirs if pallid sturgeon conservation is to have a chance. The findings also could provide guiding principles for the construction of new dams around the world, Guy said.

Melting Glaciers Increase the Flow of Carbon to Downstream Ecosystems

U.S. Geological Survey News Feed - January 20, 2015 - 3:00pm
Summary: Melting glaciers are not just affecting sea level; they are also changing the flow of organic carbon to the world’s oceans, according to new research that provides the first-ever global-scale estimates for the storage and release of organic carbon from glaciers

Contact Information:

Ryan  McClymont ( Phone: 503-251-3237 ); Katie Bausler ( Phone: 907-796-6530 );



ANCHORAGE, Alaska — Melting glaciers are not just affecting sea level; they are also changing the flow of organic carbon to the world’s oceans, according to new research that provides the first-ever global-scale estimates for the storage and release of organic carbon from glaciers. 

The research, published in the Jan. 19 issue of Nature Geoscience, is crucial to better understand the role glaciers play in the global carbon cycle, especially as climate warming continues to reduce glacier ice stores and release ice-locked organic carbon into downstream freshwater and marine ecosystems.

“This research makes it clear that glaciers represent a substantial reservoir of organic carbon,” said Eran Hood, the lead author on the paper and a scientist with the University of Alaska Southeast (Juneau).  “As a result, the loss of glacier mass worldwide, along with the corresponding release of carbon, will affect high-latitude marine ecosystems, particularly those surrounding the major ice sheets that now receive fairly limited land-to-ocean fluxes of organic carbon.”

Polar ice sheets and mountain glaciers cover roughly 11 percent of the Earth’s land surface and contain about 70 percent of Earth’s fresh water. They also store and release organic carbon to downstream environments as they melt.  Because this glacier-derived organic carbon is readily metabolized by microorganisms, it can affect productivity in aquatic ecosystems.

“This research demonstrates that the impacts of glacier change reach beyond sea level rise,” said U.S. Geological Survey research glaciologist and co-author of the research Shad O’Neel. “Changes in organic carbon release from glaciers have implications for aquatic ecosystems because this material is readily consumed by microbes at the bottom of the food chain.”

Due to climate change, glacier mass losses are expected to accelerate, leading to a cumulative loss of nearly 17 million tons of glacial dissolved organic carbon by 2050 — equivalent to about half of the annual flux of dissolved organic carbon from the Amazon River.

These estimates are the first of their kind, and thus have high uncertainty, the scientists wrote, noting that refining estimates of organic carbon loss from glaciers is critical for improving the understanding of the impacts of glacier change. The U.S. Department of the Interior Alaska Climate Science Center and USGS Alaska Science Center plan to continue this work in 2015 and beyond with new efforts aimed at studying the biophysical implications of glacier change.

This project highlights ongoing collaboration between academic and federal research and the transformative results that stem from such funding partnerships.  Other institutions involved in the research include Ecole Polytechnique Fédérale de Lausanne and Florida State University.

The work was supported by the National Science Foundation, the USGS Alaska Science Center, and the DOI Alaska Climate Science Center. The Alaska Climate Science Center provides scientific information to help natural resource managers and policy makers respond effectively to climate change.

New Nebraska Maps Feature Trails

U.S. Geological Survey News Feed - January 13, 2015 - 12:00pm
Summary: Newly released US Topo maps for Nebraska now feature trails provided to the USGS through a “crowdsourcing” project operated by the International Mountain Biking Association (IMBA)

Contact Information:

Mark Newell, APR ( Phone: 573-308-3850 ); Larry  Moore ( Phone: 303-202-4019 );



Newly released US Topo maps for Nebraska now feature trails provided to the USGS through a “crowdsourcing” project operated by the International Mountain Biking Association (IMBA). Several of the 1,376 new US Topo quadrangles for the state now display trails along with other improvements such as redesigned map symbols and new road source data.

"As an avid cyclist I look forward to exploring the new US Topo maps for bike trails as I plan my trips," said Jim Langtry, National Map Liaison for Nebraska. "I look forward to the expansion of the trail network and hope this encourages the crowdsourcing effort to add and maintain trails for future updates.  It would be great to see the Cowboy Trail, the nation’s longest rails-to-trail trek along the northern tier of Nebraska, included on the next update. You can hike, bike, or horseback ride a total of 195 miles on the completed trail from Norfolk to Valentine. Enjoy the small towns along the way, beautiful scenery and pristine air on the Cowboy Trail."

For Nebraska residents and visitors who want to explore the rolling “cornhusker” landscape on a bicycle seat, the new trail features on the US Topo maps will come in handy. The data are provided through a partnership with IMBA and MTB Project. During the past two years, the IMBA has been building a detailed national database of mountain bike trails with the aid and support of the MTB Project. This activity allows local IMBA chapters, IMBA members, and the public to provide trail data and descriptions through their website. MTB Project and IMBA then verify the quality of the trail data provided, ensure accuracy, and confirm that the trail is legal. This unique crowdsourcing venture has increased the trail data available through The National Map mobile and web apps and the revised US Topo maps.

These new maps replace the first edition US Topo maps for Nebraska and are available for free download from The National Map, the USGS Map Locator & Downloader website, or several other USGS applications.

To compare change over time, scans of legacy USGS topo maps, some dating back to the late 1800s, can be downloaded from the USGS Historical Topographic Map Collection.

For more information on US Topo maps: http://nationalmap.gov/ustopo/

New version of the North Platte, Nebraska US Topo quadrangle: 2014, with orthoimage turned on. (1:24,000 scale) (high resolution image 1.2 MB)
1902 historic version of the North Platte, Nebraska US Topo quadrangle at 1:25,000 scale. (high resolution image 1.8 MB)

Oso Landslide Research Paves Way for Future Hazard Evaluations

U.S. Geological Survey News Feed - January 12, 2015 - 3:00pm
Summary: VANCOUVER, Wash. — The large landslide that occurred on March 22, 2014 near Oso, Washington was unusually mobile and destructive. The first published study from U.S. Geological Survey investigations of the Oso landslide (named the “SR530 Landslide” by Washington State) reveals that the potential for landslide liquefaction and high mobility are influenced by several factors, and the landslide process at Oso could have unfolded very differently (with much less destruction) if initial conditions had been only subtly different. 

Contact Information:

Carolyn  Driedger ( Phone: 360-993-8907 ); Leslie  Gordon ( Phone: 650-329-4006 );



VANCOUVER, Wash. — The large landslide that occurred on March 22, 2014 near Oso, Washington was unusually mobile and destructive. The first published study from U.S. Geological Survey investigations of the Oso landslide (named the “SR530 Landslide” by Washington State) reveals that the potential for landslide liquefaction and high mobility are influenced by several factors, and the landslide process at Oso could have unfolded very differently (with much less destruction) if initial conditions had been only subtly different. 

A major focus of the research reported this week is to understand the causes and effects of the landslide’s high mobility. High “mobility” implies high speeds and large areas of impact, which can be far from the landslide source area. Because high-mobility landslides overrun areas that are larger than  normal, they present a significant challenge for landslide hazard evaluation. Understanding of the Oso event adds to the knowledge base that can be used to improve future hazard evaluations.

Computer reconstructions of the landslide source-area geometry make use of high-resolution digital topographic (lidar) data, and they indicate that the Oso landslide involved about 8 million cubic meters (about 18 million tons, or almost 3 times the mass of the Great Pyramid of Giza) of material.  The material consisted of sediments deposited by ancient glaciers and in streams and lakes near the margins of those glaciers. The landslide occurred after a long period of unusually wet weather. Prolonged wet weather increases groundwater pressures, which act to destabilize slopes by reducing frictional resistance between sediment particles.
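
As a rough consistency check (the bulk density here is an assumption, not a figure from the study), 8 million cubic meters of water-saturated glacial sediment at about 2.2 tonnes per cubic meter corresponds to the stated mass, which is roughly three times the commonly cited 6-million-ton mass of the Great Pyramid:

```latex
8\times10^{6}\ \mathrm{m^{3}} \times 2.2\ \mathrm{t/m^{3}} \approx 1.8\times10^{7}\ \mathrm{t},
\qquad
\frac{1.8\times10^{7}\ \mathrm{t}}{6\times10^{6}\ \mathrm{t}} \approx 3
```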

The slope that failed at Oso on March 22, 2014 had a long history of landslides, but these earlier slides had not exhibited exceptional mobility.

The area overrun by the March 22 landslide was about 1.2 square kilometers (one-half square mile), mostly on the nearly flat floodplain of the North Fork Stillaguamish River. Additional areas were affected by upstream flooding along the river, which was partially dammed by the landslide. Eyewitness accounts and seismic energy radiated by the landslide indicate that slope failure occurred in two stages over the course of about 1 minute. During the second stage of slope failure, the landslide greatly accelerated, crossed the North Fork Stillaguamish River, and mobilized to form a high-speed debris avalanche. The leading edge of the wet debris avalanche probably acquired additional water as it crossed the North Fork Stillaguamish River. It transformed into a water-saturated debris flow (a fully liquefied slurry of quicksand-like material) that entrained and transported virtually all objects in its path.

Field evidence and mathematical modeling indicate that the high mobility of the debris avalanche resulted from liquefaction at the base of the slide, driven by pressures generated by the landslide itself. The physics of landslide liquefaction has been studied experimentally and is well understood, but the complex nature of natural geological materials complicates efforts to predict which landslides will liquefy and become highly mobile.

Results from a suite of computer simulations indicate that the landslide’s liquefaction and high mobility were very sensitive to its initial porosity and water content. Landslide mobility may have been far less if the landslide material had been slightly denser and/or drier. Computer simulations that best fit field observations and seismological interpretations indicate that the fast-moving landslide crossed the entire 1-km-wide river floodplain in about one minute, implying an average speed of about 40 miles per hour.  Maximum speeds were even higher.
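
The stated average speed follows from the distance and travel time:

```latex
\frac{1\,000\ \mathrm{m}}{60\ \mathrm{s}} \approx 17\ \mathrm{m/s} \approx 60\ \mathrm{km/h} \approx 37\ \mathrm{mph}
```

which rounds to the reported figure of about 40 miles per hour.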

Only one individual landslide in U.S. history (an event in Mameyes, Puerto Rico in 1985 that killed at least 129) caused more fatalities than the 43 that occurred in the 2014 landslide near Oso.

The full paper, “Landslide mobility and hazards: implications of the 2014 Oso disaster,” by R.M. Iverson et al., is published in the journal Earth and Planetary Science Letters and is freely available online. 

Oso landslide simulation screen shot. (High resolution image) (Video)

[Access images for this release at: http://gallery.usgs.gov/tags/NR2015_01_12]

By Bike, Foot or Hoof: New Arizona Maps Feature Trails

U.S. Geological Survey News Feed - January 7, 2015 - 12:00pm
Summary: Newly released US Topo maps for Arizona now feature mountain bike trails, segments of the Arizona National Scenic Trail and Public Land Survey System data

Contact Information:

Mark Newell, APR ( Phone: 573-308-3850 ); Larry  Moore ( Phone: 303-202-4019 );



Newly released US Topo maps for Arizona now feature mountain bike trails, segments of the Arizona National Scenic Trail and Public Land Survey System data.  Several of the 1,880 new US Topo quadrangles for the state now display these selected new features along with other improved data layers.

“Having recently returned to Arizona, I am excited to re-explore our state using the new USGS Arizona Topo maps,” said Curtis Pulford, Arizona State Cartographer.  “Detailed topographic maps are one of the best ways I know to visualize the terrain one is planning to examine.  All who use these will appreciate the newly updated reference features, such as BLM Public Lands Survey System, roadways, schools, fire and police stations, post offices, and hospitals.   Mountain bikers will appreciate the addition of International Mountain Biking Association trails.  And the addition of the 817 mile, border to border, Arizona National Scenic Trail will be an outstanding resource for nature enthusiasts, hikers and equestrians.” 

For Arizona residents and visitors who want to explore the landscape on a bicycle seat, the new mountain bike trails will come in handy. The mountain bike trail data are provided through a partnership with the International Mountain Biking Association (IMBA) and MTB Project. During the past two years, the IMBA has been building a detailed national database of mountain bike trails with the aid and support of the MTB Project. This activity allows local IMBA chapters, IMBA members, and the public to provide trail data and descriptions through their website. MTB Project and IMBA then verify the quality of the trail data provided, ensure accuracy, and confirm that the trail is legal. This unique “crowdsourcing” project has made mountain bike trail data available through mobile and web apps and the revised US Topo maps.

National Scenic Trail enthusiasts can now find the “Arizona Trail” on new US Topo map segments. The Arizona National Scenic Trail stretches more than 800 miles from the Mexican border to Utah to connect deserts, mountains, canyons, wilderness, history, communities and people.  Rugged, wild and challenging, this trail showcases Arizona’s diverse vegetation, wildlife, scenery, and historic and prehistoric sites in a way that provides a unique and unparalleled Arizona experience.

“For more than 20 years the Arizona Trail Association’s members have been creating, maintaining, and mapping the Arizona National Scenic Trail,” said Aaron Seifert, GIS Director for the Arizona Trail Association. “Since the trail was designated as a National Scenic Trail in 2009 and completed in 2011, it is very exciting to display the entire trail on the new set of US Topo maps for many more to discover the diverse landscape of Arizona from this amazing trail.”

The USGS partnered with the U.S. Forest Service and the Arizona Trail Association to incorporate the trail data onto the Arizona US Topo maps. This NST joins the Ice Age National Scenic Trail, the Pacific Northwest National Scenic Trail, the North Country National Scenic Trail, the Pacific Crest National Scenic Trail, and the Appalachian National Scenic Trail in being featured on the new US Topo quads. The USGS hopes to eventually include all National Scenic Trails in The National Map products.

Another important addition to the new Arizona US Topo maps is the inclusion of Public Land Survey System (PLSS) data. PLSS is a way of subdividing and describing land in the US. All lands in the public domain are subject to subdivision by this rectangular system of surveys, which is regulated by the U.S. Department of the Interior.

These new maps replace the first edition US Topo maps for Arizona and are available for free download from The National Map, the USGS Map Locator & Downloader website, or several other USGS applications.

To compare change over time, scans of legacy USGS topo maps, some dating back to the late 1800s, can be downloaded from the USGS Historical Topographic Map Collection.

For more information on US Topo maps: http://nationalmap.gov/ustopo/

New (2014) Black Canyon City, Arizona US Topo quadrangle with orthoimage turned on. (1:24,000 scale) (high resolution image 1.3 MB)
Historical USGS topographic map of the Prescott, Arizona area (1887). 1:250,000 scale. (high resolution image 1.6 MB)
Zoom of the Black Canyon City, Arizona, US Topo quadrangle. The Black Canyon Trail (BCT) is denoted by a dashed line on the left side of the graphic. (high resolution image 1.2 MB)

Fewer Large Earthquakes in 2014

U.S. Geological Survey News Feed - January 7, 2015 - 11:00am
Summary: While the number of large earthquakes fell to 12 in 2014, from 19 in 2013, several moderate temblors hit areas relatively new to seismicity, including Oklahoma and Kansas, according to the U.S. Geological Survey

Contact Information:

Heidi  Koontz ( Phone: 303-202-4763 );



While the number of large earthquakes fell to 12 in 2014, from 19 in 2013, several moderate temblors hit areas relatively new to seismicity, including Oklahoma and Kansas, according to the U.S. Geological Survey. Worldwide, 11 earthquakes reached magnitude 7.0-7.9 and one registered magnitude 8.2, in Iquique, Chile, on April 1. This is the lowest annual total of earthquakes magnitude 7.0 or greater since 2008, which also had 12.

Earthquakes were responsible for about 664 deaths in 2014, with 617 having perished in the magnitude 6.1 Ludian Xian, Yunnan, China, event on August 3, as reported by the United Nations Office for Coordination of Humanitarian Affairs. Deadly quakes also occurred in Chile, Nicaragua, Papua New Guinea, and the United States.

A magnitude 6.0 quake struck American Canyon, California (South Napa) in the early hours of August 24, triggering more than 41,300 responses via the USGS Did You Feel It? website. One woman died from her injuries 12 days later. This temblor also represents northern California’s strongest earthquake since the October 1989 Loma Prieta event.

The biggest earthquake in the United States, and the second largest quake of 2014, was a magnitude 7.9 event in the Aleutian Islands of Alaska on June 23. Several quakes below magnitude 5.0 rattled Oklahoma, Texas, Kansas, Arkansas and Arizona throughout the year. The USGS estimates that several million earthquakes occur throughout the world each year, although most go undetected because they have very small magnitudes or hit remote areas.

On average, the USGS National Earthquake Information Center (NEIC) publishes the locations for about 40 earthquakes per day, or about 14,500 annually. The USGS NEIC publishes worldwide earthquakes with a magnitude of 4.0 or greater, or U.S. earthquakes of magnitude 2.5 or greater. Since about 1900, an average of 18 earthquakes per year have had a magnitude of 7.0 or higher.
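
The annual figure is the daily average scaled to a full year:

```latex
40\ \text{earthquakes/day} \times 365\ \text{days} = 14\,600\ \text{per year}
```

consistent with the roughly 14,500 locations published annually.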

To monitor earthquakes worldwide, the USGS NEIC receives data in real-time from about 1,700 stations in more than 90 countries. These stations include the 150-station Global Seismographic Network, which is jointly supported by the USGS and the National Science Foundation, and is operated by the USGS in partnership with the Incorporated Research Institutions for Seismology (IRIS) consortium of universities. Domestically, the USGS partners with 13 regional seismic networks operated by universities that provide detailed coverage for the areas of the country with the highest seismic risk. 

In the U.S., 42 of the 50 states, plus Puerto Rico, may experience damaging ground shaking from an earthquake in 50 years, the nominal lifetime of a building. The USGS and its partners in the multi-agency National Earthquake Hazard Reduction Program are working to improve earthquake monitoring and reporting capabilities through the development of the USGS Advanced National Seismic System (ANSS). More information about ANSS can be found on the ANSS website.

Read a USGS feature story to learn more about other natural hazards in 2014.

Polar Bears Shifting to Areas with More Sea Ice -- Genetic Study Reveals

U.S. Geological Survey News Feed - January 6, 2015 - 4:00pm
Summary: Editors: B-roll footage of polar bear research is available for your use.

Contact Information:

Paul Laustsen ( Phone: 650-329-4046 );



Editors: B-roll footage of polar bear research is available for your use.

ANCHORAGE, Alaska — In a new polar bear study published today, scientists from around the Arctic have shown that recent generations of polar bears are moving towards areas with more persistent year-round sea ice.

Research scientists, led by the U.S. Geological Survey, found that the 19 recognized subpopulations of polar bears group into four genetically-similar clusters, corresponding to ecological and oceanographic factors. These four clusters are the Eastern Polar Basin, Western Polar Basin, Canadian Archipelago, and Southern Canada.

The scientists also detected directional gene flow towards the Canadian Archipelago within the last 1-3 generations. Gene flow of this type can result from populations expanding and contracting at different rates or directional movement and mating over generations. The findings of spatial structure (clusters) and directional gene flow are important because they support the hypothesis that the species is coalescing to the region of the Arctic most likely to retain sea ice into the future.

“The polar bear’s recent directional gene flow northward is something new,” said Elizabeth Peacock, USGS researcher and lead author of the study. “In our analyses that focused on more historic gene flow, we did not detect movement in this direction.” The study found that the predominant gene flow was from Southern Canada and the Eastern Polar Basin towards the Canadian Archipelago where the sea ice is more resilient to summer melt due to circulation patterns, complex geography, and cooler northern latitudes.

Projections of future sea ice extent in light of climate warming typically show greater retention of sea ice in the northern Canadian Archipelago than in other regions.

“By examining the genetic makeup of polar bears, we can estimate levels and directions of gene flow, which represents the past story of mating and movement, and population expansion and contraction,” said Peacock. “Gene flow occurs over generations, and would not be detectable by using data from satellite-collars which can only be deployed on a few polar bears for short periods of time.”

The authors also found that female polar bears showed higher fidelity to their regions of birth than did male polar bears. Data to allow comparison of the movements of male and female polar bears is difficult to obtain because male bears cannot be collared as their necks are wider than their heads.

The study also confirmed earlier work that suggests that modern polar bears stem from one or several hybridization events with brown bears. No evidence of current polar bear-brown bear hybridization was found in the more than 2,800 samples examined in the current study. Scientists concluded that the hybrid bears that have been observed in the Northern Beaufort Sea region of Canada represent a recent and currently localized phenomenon. Scientists also found that polar bear populations expanded and brown bear populations contracted in periods with more ice. In periods with less ice, the opposite was true.

The goal of the study was to see how genetic diversity and structure of the worldwide polar bear population have changed over the recent dramatic decline in their sea-ice habitat. The USGS and the Government of Nunavut led the study with scientists from 15 institutions representing all five nations with polar bears (U.S., Canada, Greenland, Norway, and Russia).  

This circumpolar, multi-national effort provides a timely perspective on how a rapidly changing Arctic is influencing the gene flow and likely future distribution of a species of worldwide conservation concern.  

The paper “Implications of the circumpolar genetic structure of polar bears for their conservation in a rapidly warming Arctic” was published today in the journal PLOS ONE.

Endangered Salmon Population Monitored with eDNA for First Time

U.S. Geological Survey News Feed - January 5, 2015 - 4:00pm
Summary: CORVALLIS, Ore. — Scientists from the U.S. Geological Survey and Washington State University have discovered that endangered Chinook salmon can be detected accurately from DNA they release into the environment. The results are part of a special issue of the journal Biological Conservation on use of environmental DNA to inform conservation and management of aquatic species.

Contact Information:

Susan Kemp ( Phone: 541-750-1047 ); Paul Laustsen ( Phone: 650-329-4046 );



CORVALLIS, Ore. — Scientists from the U.S. Geological Survey and Washington State University have discovered that endangered Chinook salmon can be detected accurately from DNA they release into the environment. The results are part of a special issue of the journal Biological Conservation on use of environmental DNA to inform conservation and management of aquatic species.

The special issue contains eleven papers that move the detection of aquatic species using eDNA from concept to practice and include a thorough examination of the potential benefits, limitations and biases of applying eDNA methods to research and monitoring of animals. 

“The papers in this special edition demonstrate that eDNA techniques are beginning to realize their potential contribution to the field of conservation biology worldwide,” said Caren Goldberg, Assistant Professor at Washington State University and lead editor of the special issue.

DNA, or deoxyribonucleic acid, is the hereditary material that contains the biological instructions to build and maintain all life forms; eDNA is the DNA that animals release into the environment through normal biological processes from sources such as feces, mucous, skin, hair, and carcasses. Research and monitoring of rare, endangered, and invasive species can be done by analyzing eDNA in water samples.

A paper included in the special issue by USGS ecologists Matthew Laramie and David Pilliod, and Goldberg, looked at the potential for eDNA analysis to improve detection of Chinook salmon in the Upper Columbia River in Washington, USA and British Columbia, Canada. This is the first time eDNA methods have been used to monitor North American salmon populations. The successful project also picked up evidence of Chinook in areas where they have not been previously observed.

“The results from this study indicate that eDNA detection methods are an effective way to determine the distribution of Chinook across a large area and can potentially be used to document the arrival of migratory species, like Pacific salmon, or colonization of streams following habitat restoration or reintroduction efforts,” said Laramie.

Spring Chinook of the Upper Columbia River are among the most imperiled North American salmon and are currently listed as endangered under the Endangered Species Act. Laramie has been working with the Confederated Tribes of the Colville Reservation Fisheries Program in the use of eDNA to document the success of reintroduction of Spring Chinook into the Okanogan Basin of the Upper Columbia River.

The papers of the special issue focus on techniques for analyzing eDNA samples, eDNA production and degradation in the environment and the laboratory, and practical applications of eDNA techniques in detecting and managing endangered fish and amphibians.

The co-editors, Goldberg, Pilliod, and WSU researcher Katherine Strickler, open the special issue with an overview on the state of eDNA science, a field developed from the studies of micro-organisms in environmental samples and DNA collected from ancient specimens such as mummified tissues or preserved plant remains.

“Incorporating eDNA methods into survey and monitoring programs will take time, but dedicated professionals around the world are rapidly advancing these methods closer to this goal,” said Goldberg.

Strickler, Goldberg, and WSU Assistant Professor Alexander Fremier authored a paper that quantified the effects of ultraviolet radiation, temperature, and pH on eDNA degradation in aquatic systems. Using eDNA from bullfrog tadpoles, the scientists determined that DNA broke down faster at warmer temperatures and under higher levels of ultraviolet-B light. 
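
eDNA persistence is commonly described with a first-order (exponential) decay model. The sketch below is generic and illustrative; it is not the model or the rate constants from the Strickler et al. study, and the numbers are placeholders chosen only to show how faster decay (as in warmer, higher-UV water) shortens the detection window.

```python
# Generic first-order decay sketch (C(t) = C0 * exp(-k * t)); the rate constants
# below are placeholders, not values from the study.
import math

def edna_remaining(c0: float, k_per_day: float, days: float) -> float:
    """Concentration remaining after `days`, assuming first-order decay."""
    return c0 * math.exp(-k_per_day * days)

c0 = 100.0  # arbitrary starting concentration (e.g., copies per liter)
for label, k in [("cooler, low UV-B (placeholder k = 0.05/day)", 0.05),
                 ("warmer, high UV-B (placeholder k = 0.30/day)", 0.30)]:
    remaining = edna_remaining(c0, k, days=10)
    print(f"{label}: {remaining:.1f} remaining after 10 days")
```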

“We need to better understand how long DNA can be detected in water under different conditions. Our work will help improve sampling strategies for eDNA monitoring of sensitive and invasive species,” said Strickler.

“These papers lead the way in advancing eDNA sample collection, processing, analysis, and interpretation,” said Pilliod, “eDNA methods have great promise for detecting aquatic species of concern and may be particularly useful when animals occur in low numbers or when there are regulatory restrictions on the use of more invasive survey techniques.”

How Does White-Nose Syndrome Kill Bats?

U.S. Geological Survey News Feed - January 5, 2015 - 2:00pm
Summary: For the first time, scientists have developed a detailed explanation of how white-nose syndrome (WNS) is killing millions of bats in North America, according to a new study by the U.S. Geological Survey and the University of Wisconsin. New Science Helps Explain Hibernation Disease

Contact Information:

Marisa Lubeck ( Phone: 303-202-4765 ); Gail Moede Rogall ( Phone: 608-270-2438 );



For the first time, scientists have developed a detailed explanation of how white-nose syndrome (WNS) is killing millions of bats in North America, according to a new study by the U.S. Geological Survey and the University of Wisconsin. The scientists created a model for how the disease progresses from initial infection to death in bats during hibernation. 

“This model is exciting for us, because we now have a framework for understanding how the disease functions within a bat,” said University of Wisconsin and USGS National Wildlife Health Center scientist Michelle Verant, the lead author of the study. “The mechanisms detailed in this model will be critical for properly timed and effective disease mitigation strategies.” 

Scientists hypothesized that WNS, caused by the fungus Pseudogymnoascus destructans, makes bats die by increasing the amount of energy they use during winter hibernation. Bats must carefully ration their energy supply during this time to survive without eating until spring. If they use up their limited energy reserves too quickly, they can die. 

The USGS tested the energy depletion hypothesis by measuring the amounts of energy used by infected and healthy bats hibernating under similar conditions. They found that bats with WNS used twice as much energy as healthy bats during hibernation and had potentially life-threatening physiologic imbalances that could inhibit normal body functions. 

Scientists also found that these effects started before there was severe damage to the wings of the bats and before the disease caused increased activity levels in the hibernating bats.

“Clinical signs are not the start of the disease — they likely reflect more advanced disease stages,” Verant said. “This finding is important because much of our attention previously was directed toward what we now know to be bats in later stages of the disease, when we observe visible fungal infections and behavioral changes.” 

Key findings of the study include:

  • Bats infected with P. destructans had higher proportions of lean tissue to fat mass at the end of the experiment compared to the non-infected bats. This finding means that bats with WNS used twice as much fat as healthy control bats over the same hibernation period. The amount of energy they used was also higher than what is expected for normal healthy hibernating little brown bats.
  • Bats with mild wing damage had elevated levels of dissolved carbon dioxide in their blood resulting in acidification and pH imbalances throughout their bodies. They also had high potassium levels, which can inhibit normal heart function.  

The study, “White-nose syndrome initiates a cascade of physiologic disturbances in the hibernating bat host,” is published in BMC Physiology. Learn more about WNS, ongoing research and actions that are being taken here:

White-nose Syndrome Images

Interior Department Announces Funding for Climate Change Studies

U.S. Geological Survey News Feed - December 18, 2014 - 3:18pm
Summary: U.S. Secretary of the Interior Sally Jewell announced today that the Department of the Interior’s regional Climate Science Centers and the United States Geological Survey (USGS) National Climate Change and Wildlife Science Center are awarding nearly $6 million to universities and other partners for 50 new research projects to better prepare communities for impacts of climate change

Contact Information:

Arlene Compher ( Phone: 703-648-4282 );



WASHINGTON, D.C. — U.S. Secretary of the Interior Sally Jewell announced today that the Department of the Interior’s regional Climate Science Centers and the United States Geological Survey (USGS) National Climate Change and Wildlife Science Center are awarding nearly $6 million to universities and other partners for 50 new research projects to better prepare communities for impacts of climate change.

Highly Pathogenic H5 Avian Influenza Confirmed in Wild Birds in Washington State
H5N2 Found in Northern Pintail Ducks & H5N8 Found in Captive Gyrfalcons

U.S. Geological Survey News Feed - December 17, 2014 - 2:58pm
Summary: WASHINGTON, Dec. 17, 2014 — The United States Department of Agriculture's (USDA) Animal and Plant Health Inspection Service (APHIS) confirmed the presence of highly pathogenic (HPAI) H5 avian influenza in wild birds in Whatcom County, Washington. Two separate virus strains were identified: HPAI H5N2 in northern pintail ducks and HPAI H5N8 in captive gyrfalcons that were fed hunter-killed wild birds. Neither virus has been found in commercial poultry anywhere in the United States and no human cases with these viruses have been detected in the United States, Canada or internationally. There is no immediate public health concern with either of these avian influenza viruses. Neither virus found in commercial poultry in U.S.; no public health concern at this time

Contact Information:

Marisa Lubeck, USGS ( Phone: 303-526-6694 ); Joelle Hayden, APHIS ( Phone: 301-851-4040 ); CDC Press ( Phone: 404-639-3286 );



WASHINGTON, Dec. 17, 2014 — The United States Department of Agriculture's (USDA) Animal and Plant Health Inspection Service (APHIS) confirmed the presence of highly pathogenic (HPAI) H5 avian influenza in wild birds in Whatcom County, Washington. Two separate virus strains were identified: HPAI H5N2 in northern pintail ducks and HPAI H5N8 in captive gyrfalcons that were fed hunter-killed wild birds. Neither virus has been found in commercial poultry anywhere in the United States and no human cases with these viruses have been detected in the United States, Canada or internationally. There is no immediate public health concern with either of these avian influenza viruses.

Both H5N2 and H5N8 viruses have been found in other parts of the world and have not caused any human infection to date. While neither virus has been found in commercial poultry, federal authorities with the U.S. Department of Agriculture also emphasize that poultry, poultry products and wild birds are safe to eat when properly handled and cooked to a temperature of 165 degrees Fahrenheit, even if the birds carry the disease.

The finding in Whatcom County was reported and identified quickly due to increased surveillance for avian influenza in light of HPAI H5N2 outbreaks affecting commercial poultry farms in British Columbia, Canada. The northern pintail duck samples were collected by officials from the Washington Department of Fish and Wildlife following a waterfowl die-off at Wiser Lake, Washington, and were sent to the U.S. Geological Survey (USGS) National Wildlife Health Center for diagnostic evaluation and initial avian influenza testing. The U.S. Department of the Interior's USGS, which also conducts ongoing avian influenza testing of wild bird mortality events, identified the samples as presumptive positive for H5 avian influenza and sent them to USDA for confirmation. The gyrfalcon samples were collected after the falconer reported signs of illness in his birds.

Following existing avian influenza response plans, USDA is working with the U.S. Department of the Interior and the U.S. Department of Health and Human Services as well as state partners on additional surveillance and testing of both commercial and wild birds in the nearby area.

Wild birds can be carriers of HPAI viruses without the birds appearing sick. People should avoid contact with sick/dead poultry or wildlife. If contact occurs, wash your hands with soap and water and change clothing before having any contact with healthy domestic poultry and birds.

HPAI would have significant economic impacts if detected in U.S. domestic poultry. Commercial poultry producers follow strict biosecurity practices and raise their birds in very controlled environments. Federal officials emphasize that all bird owners, whether commercial producers or backyard enthusiasts, should continue practicing good biosecurity. This includes preventing contact between your birds and wild birds, and reporting sick birds or unusual bird deaths to State/Federal officials, either through your state veterinarian or through USDA's toll-free number at 1-866-536-7593. Additional information on biosecurity for backyard flocks can be found at healthybirds.aphis.usda.gov.

CDC considers the risk to people from these HPAI H5 infections in wild birds to be low because (like H5N1) these viruses do not now infect humans easily, and even if a person is infected, the viruses do not spread easily to other people.

Avian influenza (AI) is caused by influenza type A viruses which are endemic in some wild birds (such as wild ducks and swans) which can infect poultry (such as chickens, turkeys, pheasants, quail, domestic ducks, geese and guinea fowl). AI viruses are classified by a combination of two groups of proteins: hemagglutinin or "H" proteins, of which there are 17 (H1–H17), and neuraminidase or "N" proteins, of which there are 10 (N1–N10). Many different combinations of "H" and "N" proteins are possible. Each combination is considered a different subtype, and can be further broken down into different strains. AI viruses are further classified by their pathogenicity—the ability of a particular virus to produce disease in domestic chickens.
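
As a simple illustration of the naming scheme described above (a hypothetical sketch, not code from any USGS or USDA system), the snippet below enumerates the subtype labels that 17 "H" proteins and 10 "N" proteins allow:

```python
# Hypothetical illustration: each avian influenza subtype pairs one
# hemagglutinin (H) protein with one neuraminidase (N) protein.
h_proteins = [f"H{i}" for i in range(1, 18)]   # H1 through H17
n_proteins = [f"N{j}" for j in range(1, 11)]   # N1 through N10

subtypes = [h + n for h in h_proteins for n in n_proteins]

print(len(subtypes))        # 170 possible H/N combinations
print("H5N2" in subtypes)   # True - the strain found in the northern pintail ducks
print("H5N8" in subtypes)   # True - the strain found in the captive gyrfalcons
```

A label such as H5N2 is therefore one of 170 possible H/N pairings; as noted above, each subtype can be further broken down into strains and classified by pathogenicity.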

For more information on avian influenza and wild birds, please visit the USGS National Wildlife Health Center. For other information, visit the USDA avian influenza page and the USDA APHIS avian influenza page.

Urban Stream Contamination Increasing Rapidly Due to Road Salt

U.S. Geological Survey News Feed - December 15, 2014 - 5:04pm
Summary: Average chloride concentrations often exceed toxic levels in many northern United States streams due to the use of salt to deice winter pavement, and the frequency of these occurrences nearly doubled in two decades.

Contact Information:

Marisa Lubeck ( Phone: 303-526-6694 ); Steve Corsi ( Phone: 608-821-3835 );



Average chloride concentrations often exceed toxic levels in many northern United States streams due to the use of salt to deice winter pavement, and the frequency of these occurrences nearly doubled in two decades.

Chloride levels increased substantially in 84 percent of urban streams analyzed, according to a U.S. Geological Survey study of monitoring records that began as early as 1960 at some sites and ended as late as 2011. Levels were highest during the winter, but increased during all seasons over time at the northern sites, including near Milwaukee, Wisconsin; Chicago, Illinois; Denver, Colorado; and other metropolitan areas. The report was published today in the journal Science of the Total Environment.

"Some freshwater organisms are sensitive to chloride, and the high concentrations that we found could negatively affect a significant number of species," said Steve Corsi, USGS scientist and lead author of the study. “If urban development and road salt use continue to increase, chloride concentrations and associated toxicity are also likely to increase.”

The scientists analyzed water-quality data from 30 monitoring sites on 19 streams near cities in Wisconsin, Illinois, Colorado, Michigan, Ohio, Pennsylvania, Maryland, Texas and the District of Columbia. Key findings include:

  • Twenty-nine percent of the sites exceeded the U.S. Environmental Protection Agency’s chronic water-quality criterion (230 milligrams per liter) by an average of more than 100 days per year from 2006 through 2011, almost double the number of days from 1990 through 1994 (a minimal counting sketch follows this list). This increase occurred at sites such as the Menomonee and Kinnickinnic Rivers near Milwaukee and Poplar Creek near Chicago.
  • The lowest chloride concentrations were in watersheds that had little urban land use or cities without much snowfall, such as Dallas, Texas.
  • In 16 of the streams, winter chloride concentrations increased over the study period.
  • In 13 of the streams, chloride concentrations increased over the study period during non-deicing periods such as summer. This finding suggests that chloride infiltrates the groundwater system during the winter and is slowly released to the streams throughout the year.
  • Chloride levels increased more rapidly than development of urban land near the study sites.
  • The rapid chloride increases were likely caused by increased salt application rates, increased baseline conditions (the concentrations during summer low-flow periods) and greater snowfall in the Midwest during the latter part of the study.
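
To make the exceedance statistic in the first bullet concrete, here is a minimal counting sketch: given a daily chloride record for one site, count the days each year above the EPA chronic criterion of 230 milligrams per liter and average across years. The daily values below are fabricated for illustration; they are not data from the study, and the study's actual methods are more involved.

```python
# Hypothetical sketch: average days per year that a site exceeds the EPA
# chronic chloride criterion (230 mg/L). The daily record is made up.
from collections import defaultdict

CHRONIC_CRITERION_MG_L = 230.0

# (year, day_of_year) -> chloride concentration in mg/L (illustrative values)
daily_chloride = {
    (2006, 20): 410.0, (2006, 45): 290.0, (2006, 200): 55.0,
    (2007, 15): 510.0, (2007, 60): 180.0, (2007, 350): 245.0,
}

exceedance_days = defaultdict(int)
for (year, _day), concentration in daily_chloride.items():
    if concentration > CHRONIC_CRITERION_MG_L:
        exceedance_days[year] += 1

years = sorted({year for year, _day in daily_chloride})
average = sum(exceedance_days[y] for y in years) / len(years)
print({y: exceedance_days[y] for y in years})               # {2006: 2, 2007: 2}
print(f"average exceedance days per year: {average:.1f}")   # 2.0
```

Averaging the per-year counts over 2006 through 2011 and repeating the calculation for 1990 through 1994 yields the "almost double" comparison reported in the study.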

"Deicing operations help to provide safe winter transportation conditions, which is very important,” Corsi said. “Findings from this study emphasize the need to consider deicer management options that minimize the use of road salt while still maintaining safe conditions."

Road deicing by cities, counties and state agencies accounts for a significant portion of salt applications, but salt is also used by many public and private organizations and individuals to deice parking lots, walkways and driveways. All of these sources are likely to contribute to these increasing chloride trends.

Other major sources of salt to U.S. waters include wastewater treatment, septic systems, farming operations and natural geologic deposits. However, the new study found deicing activity to be the dominant source in urban areas of the northern U.S. 

The USGS conducted this study in cooperation with the Milwaukee Metropolitan Sewerage District. For more information about winter runoff and water quality, please visit the USGS Wisconsin Water Science Center website.

[Access images for this release at: http://gallery.usgs.gov/tags/NR2010_09_02]

New Scientific Study Supports the Safety of Capture-based Research for Polar Bears

U.S. Geological Survey News Feed - December 15, 2014 - 9:53am
Summary: ANCHORAGE, Alaska — A polar bear capture and release-based research program had no adverse long-term effects on feeding behavior, body condition, and reproduction, according to a new study by the U.S. Geological Survey.

Contact Information:

Paul Laustsen ( Phone: 650-329-4046 );



ANCHORAGE, Alaska — A polar bear capture and release-based research program had no adverse long-term effects on feeding behavior, body condition, and reproduction, according to a new study by the U.S. Geological Survey.

The study used over 40 years of capture-based data collected by the USGS from polar bears in the Alaska portion of the southern Beaufort Sea. Scientists looked for short- and long-term effects of capture and release and of deploying various types of satellite transmitters.

"We dug deeply into one of the most comprehensive capture-based data sets for polar bears in the world looking for any signs that our research activities might be negatively affecting polar bears," said Karyn Rode, lead author of the study and scientist with the USGS Polar Bear Research Program.  

The study found that, following capture, transmitter-tagged bears returned to near-normal rates of movement and activity within 2-3 days, and that the presence of tags had no effect on a bear's subsequent physical condition, reproductive success, or ability to successfully raise cubs.

"Importantly, we found no indication that neck collars, the primary means for obtaining critical information on polar bear movement patterns and habitat use, adversely affected polar bear health or reproduction," said Rode.

The study also found that repeated capture, three or more times, was not associated with effects on health and reproduction.

"We care about the animals we study and want to be certain that our research efforts are not contributing to any negative effects," said Rode. "I expected we might find some sign that certain aspects of our studies, such as repeated capture, would negatively affect bears, and I was pleased that we could not find any negative implications."

Efforts to conserve polar bears will require a greater understanding of how populations are responding to the loss of sea ice habitat. Capture-based methods are required to assess individual bear health and to deploy transmitters that provide information on bear movement patterns and habitat use. These methods have been used for decades in many parts of the polar bear’s range. New, less invasive techniques have been developed to identify individuals via hair and biopsy samples, but these techniques do not provide complete information on bear health, movements or habitat use. Capture is likely to continue to be an important technique for monitoring polar bears. This study provides reassurance that capture, handling, and tagging can be used as research and monitoring techniques with no long-term effects on polar bear populations.

The paper "Effects of capturing and collaring on polar bears: findings from long-term research on the southern Beaufort Sea population" was published today in the journal Wildlife Research.

Visit the USGS Polar Bear Research website for more information. 

Fault "Crossroads" May Have Been Origin Point for 2011 Virginia Earthquake

U.S. Geological Survey News Feed - December 11, 2014 - 1:20pm
Summary: The 2011 east coast earthquake felt by people from Georgia to Canada likely originated from a fault “junction” just outside of Mineral, Virginia, according to new U.S. Geological Survey research published in the Geological Society of America’s Special Papers.

Contact Information:

Heidi  Koontz ( Phone: 303-202-4763 ); Hannah Hamilton ( Phone: 703-648-4356 );



The 2011 east coast earthquake felt by people from Georgia to Canada likely originated from a fault “junction” just outside of Mineral, Virginia, according to new U.S. Geological Survey research published in the Geological Society of America’s Special Papers.

Following the August 23, 2011 event, USGS scientists conducted low-altitude geophysical (gravity and magnetic) flight surveys in 2012 over the epicenter, located about eight miles from the quake’s namesake. Maps of the earth’s magnetic field and gravitational pull show subtle variations that reflect the physical properties of deeply buried rocks. More research may reveal whether geologic crossroads such as this are conducive to future earthquakes in the eastern United States.

Caption: In map view, magnetic data were filtered (colors) to highlight geologic features near the earthquake depth. One contrast (blue dotted line) is aligned with aftershocks (black dots). The other crosses at an angle. They suggest that the earthquake (yellow star) occurred near a “crossroads,” or a complex intersection of different types of rock.

“These surveys unveiled not only one fault, which is roughly aligned with a fault defined by the earthquake’s aftershocks, but a second fault or contact between different rock types that comes in at an angle to the first one,” said USGS scientist and lead investigator, Anji Shah. “This visual suggests that the earthquake occurred near a ‘crossroads,’ or junction, between the fault that caused the earthquake and another fault or geologic contact.”

Deep imaging tools were specifically chosen because the earthquake occurred about five miles beneath the Earth's surface. Looking at faults in this way can help scientists better understand earthquake hazards in the eastern United States.

The USGS and partner scientists are also interested in why seismic events occur in certain parts of the central and eastern United States, like the Central Virginia seismic zone, since there are no plate boundaries there, unlike the San Andreas Fault in California, or the Aleutian Trench in Alaska.

USGS scientists still have questions: Could this happen elsewhere? How common are such crossroads? Shah and other scientists are also trying to understand whether and why a junction like this might be an origin point for earthquakes.

“Part of it might be the complex stress state that arises in such an area. Imagine you have a plastic water bottle in your hand, and it has a cut (fault) in it the long way. When you squeeze the bottle, it pops (ruptures) where the cut is.  The long cut is comparable to an ancient fault – it’s an area of weakness where motion (faulting and earthquakes) is more likely to happen. Multiple intersecting cuts in that bottle produce zones of weakness where fault slip is more likely to happen, especially where two cuts intersect,” said Shah.

The situation near the fault on which the magnitude 5.8 Mineral earthquake occurred is more complex than that. For example, the fault may separate different types of rocks with varying densities and strengths, as suggested by the gravity data. This contributes to a complex stress field that could also be more conducive to slip.

Additional science data about the 2011 Mineral, Virginia, earthquake may be found online.

NASA-USGS Climate Data App Challenge: An Invitation for Innovation

U.S. Geological Survey News Feed - December 11, 2014 - 1:00pm
Summary: NASA in partnership with the U.S. Geological Survey (USGS) is offering more than $35,000 in prizes to citizen scientists for ideas that make use of climate data to address vulnerabilities faced by the United States in coping with climate change. 

Contact Information:

Jon Campbell, USGS ( Phone: 703-648-4180 ); Steve Cole, NASA ( Phone: 202-358-0918 );



NASA in partnership with the U.S. Geological Survey (USGS) is offering more than $35,000 in prizes to citizen scientists for ideas that make use of climate data to address vulnerabilities faced by the United States in coping with climate change. 

The Climate Resilience Data Challenge, conducted through the NASA Tournament Lab, a partnership with Harvard University hosted on Appirio/Topcoder, kicks off Monday, Dec. 15 and runs through March 2015. 

The challenge supports the efforts of the White House Climate Data Initiative, a broad effort to leverage the federal government’s extensive, freely available climate-relevant data resources to spur innovation and private-sector entrepreneurship in order to advance awareness of and preparedness for the impacts of climate change. The challenge was announced by the White House Office of Science and Technology Policy Dec. 9. 

According to the recent National Climate Assessment produced by more than 300 experts across government and academia, the United States faces a number of current and future challenges as the result of climate change. Vulnerabilities include coastal flooding and weather-related hazards that threaten lives and property, increased disruptions to agriculture, prolonged drought that adversely affects food security and water availability, and ocean acidification capable of damaging ecosystems and biodiversity. The challenge seeks to unlock the potential of climate data to address these and other climate risks. 

“Federal agencies, such as NASA and the USGS, traditionally focus on developing world-class science data to support scientific research, but the rapid growth in the innovation community presents new opportunities to encourage wider usage and application of science data to benefit society,” said Kevin Murphy, NASA program executive for Earth Science Data Systems in Washington. “We need tools that utilize federal data to help our local communities improve climate resilience, protect our ecosystems, and prepare for the effects of climate change.” 

“Government science follows the strictest professional protocols because scientific objectivity is what the American people expect from us,” said Virginia Burkett, acting USGS associate director for Climate Change and Land Use. “That systematic approach is fundamental to our mission. With this challenge, however, we are intentionally looking outside the box for transformational ways to apply the data that we have already carefully assembled for the benefit of communities across the nation.”

The challenge begins with an ideation stage for data-driven application pitches, followed by storyboarding and, finally, prototyping of concepts with the greatest potential. 

The ideation stage challenges competitors to imagine new applications of climate data to address climate vulnerabilities. This stage is divided into three competitive classes based on data sources: NASA data, federal data from agencies such as the USGS, and any open data. The storyboarding stage allows competitors to conceptualize and design the best ideas, followed by the prototyping stage, which carries the best ideas into implementation. 

The Climate Resilience Data Challenge is managed by NASA's Center of Excellence for Collaborative Innovation at NASA Headquarters, Washington. The center was established in coordination with the Office of Science and Technology Policy to advance open innovation efforts for climate-related science and extend that expertise to other federal agencies. 

For additional information and to register (beginning Dec. 15), visit the Climate Resilience Data Challenge website.   

Learn more 

National Climate Assessment
USGS Climate and Land Use Change
USGS Core Science Systems
NASA challenges and citizen science