11th December 2013
Powerful new greenhouse gas discovered
A newly discovered greenhouse gas, perfluorotributylamine, has 7,100 times the heat-trapping ability of CO2, molecule for molecule, over a 100-year period.
Scientists from the University of Toronto have discovered a novel chemical lurking in the atmosphere that appears to be a long-lived greenhouse gas (LLGHG). The chemical – perfluorotributylamine (PFTBA) – is among the most radiatively efficient chemicals found to date.
Radiative efficiency describes how much warming a molecule produces per unit of atmospheric concentration. This value is multiplied by the gas's concentration in the atmosphere to determine its total climate impact.
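The relationship can be sketched numerically. The radiative efficiency and concentration values below are illustrative assumptions, not figures quoted in this article, chosen only to show the units involved:

```python
# Total climate impact (instantaneous radiative forcing) is the
# radiative efficiency of a gas multiplied by its atmospheric
# concentration. The PFTBA-like values below are assumptions for
# illustration, not numbers from the article.

def radiative_forcing(efficiency_w_m2_ppb, concentration_ppb):
    """Forcing (W/m^2) = efficiency (W/m^2 per ppb) x concentration (ppb)."""
    return efficiency_w_m2_ppb * concentration_ppb

# Hypothetical example: a highly efficient gas at a tiny concentration
# still yields only a small absolute forcing.
efficiency = 0.86          # W/m^2 per ppb (assumed value)
concentration = 0.18e-3    # 0.18 parts per trillion, expressed in ppb (assumed)
print(radiative_forcing(efficiency, concentration))  # ~1.5e-4 W/m^2
```

This is why a gas can be extremely potent per molecule yet, at trace concentrations, contribute far less total warming than CO2.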
PFTBA has been in use since the mid-20th century for various applications in electrical equipment, and is currently used in thermally and chemically stable liquids marketed for electronic testing and as heat transfer agents. It does not occur naturally: it is exclusively human-made. No known processes destroy or remove PFTBA from the lower atmosphere, so it has a very long lifetime there, possibly up to 500 years, before eventually being destroyed in the upper atmosphere.
"Global warming potential is a metric used to compare the cumulative effects of different greenhouse gases on climate over a specified time period," said Cora Young who was part of the research team, along with Angela Hong and their supervisor, Scott Mabury. Time is incorporated in the global warming potential metric as different compounds stay in the atmosphere for different lengths of time, which determines how long-lasting the climate impacts are.
Carbon dioxide (CO2) is used as the baseline for comparison, since it is the most important greenhouse gas responsible for human-induced climate change. "PFTBA is extremely long-lived in the atmosphere and has very high radiative efficiency; the result of this is a very high global warming potential. Calculated over a 100-year timeframe, a single molecule of PFTBA has the equivalent climate impact as 7,100 molecules of CO2," said Hong. "There are no policies that control its production, use, or emission. It is not being regulated by any type of climate policy."
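For a rough sense of scale, the per-molecule figure quoted above can be converted into a per-mass equivalence using standard molar masses. This is back-of-envelope arithmetic derived from the article's number, not a figure reported by the researchers:

```python
# Converting the article's per-molecule comparison into a per-mass one.
# The 7,100 figure is from the article; the molar masses are standard
# chemistry values (PFTBA is C12F27N).

GWP_PER_MOLECULE = 7100   # CO2 molecules equivalent to 1 PFTBA molecule (100-yr)
M_CO2 = 44.01             # g/mol
M_PFTBA = 671.09          # g/mol for C12F27N

# A kilogram of PFTBA contains far fewer molecules than a kilogram of
# CO2, so the per-mass equivalence is smaller than the per-molecule one.
kg_co2_per_kg_pftba = GWP_PER_MOLECULE * M_CO2 / M_PFTBA
print(round(kg_co2_per_kg_pftba))  # ~466 kg CO2-equivalent per kg of PFTBA
```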
Dr Drew Shindell, a climatologist at NASA: "This is a warning to us that this gas could have a very, very large impact on climate change – if there were a lot of it. Since there is not a lot of it now, we don't have to worry about it at present, but we have to make sure it doesn't grow and become a very large contributor to global warming."
7th December 2013
Regions identified where centuries of industrial CO2 could be stored
Researchers at the University of Southampton have identified regions beneath the oceans where igneous rocks of the upper ocean crust could safely store huge volumes of carbon dioxide.
The burning of fossil fuels such as coal, oil, and natural gas has led to dramatically increased levels of CO2 in our planet's atmosphere. These industrial emissions reached 36 billion tonnes annually in 2013 – over 100 times greater than natural CO2 output from all of the world's volcanoes, according to the US Geological Survey. The overwhelming majority of climate scientists agree this is causing climate change and ocean acidification. Although technologies are now being developed to capture CO2 from power stations and other sources, this will only avoid further warming if that CO2 is then safely locked away from the atmosphere for centuries.
Chiara Marieni, PhD, from the National Oceanography Centre in Southampton, investigated the physical properties of CO2 to develop global maps of the ocean floor and estimate where CO2 can be safely stored.
At high pressures and low temperatures, like those in deep oceans, CO2 occurs as a liquid that is denser than seawater. Estimating temperatures in the upper ocean crust, Chiara and her colleagues identified regions where it may be possible to stably store large volumes of CO2 in the basalts. These fractured rocks have high proportions of open space and over time may also react with the CO2 so it becomes locked into solid calcium carbonate – permanently preventing its release back into the oceans or atmosphere. As a precaution, Chiara refined her locations to areas that have the additional protection of thick blankets of impermeable sediments to prevent gas escape.
The team identified five potential regions in off-shore Australia, Japan, Siberia, South Africa and Bermuda, ranging in size from ½ million square kilometres to almost four million square kilometres.
"We found regions that have the potential to store decades to hundreds of years of industrial carbon dioxide emissions," said Chiara, "although the largest regions are far off-shore. However, further work is needed in these regions to accurately measure local sediment conditions and sample the basalt beneath before this potential can be confirmed."
Her work, published this week in Geophysical Research Letters, shows that previous studies, which concentrated on the effects of pressure to liquefy CO2 – but ignored temperature – pointed to the wrong locations, where high temperatures mean the CO2 will have a low density, and thus be more likely to escape.
3rd December 2013
Underwater kites: a new generation of tidal power
A new tidal energy device can operate cost-effectively in deep waters with low-velocity currents. A full-scale demonstration of this innovative technology is now planned for 2015.
Founded in 2007, Minesto is a marine energy company based in Sweden and Ireland. Its patented technology aims to generate green electricity from tidal and ocean currents. Known as "Deep Green", the system is based on underwater kites, each consisting of a wing and turbine attached by a tether to a fixed point on the ocean bed. As water flows over the hydrodynamic wing, a lift force is generated that moves the device smoothly through the water and spins the turbine, thereby generating electricity.
The wing is designed to create high loads, which requires a stiff, lightweight structure with sufficient fatigue and material properties, as well as watertight compartments to ensure a 20-year lifetime. It contains a buoyancy system, batteries and pressure sensors.
The tether is mainly a force-bearing element designed to take the high loads created by the wing, but it also accommodates power cables from the generator and a signal cable to the control system.
Minesto claims that Deep Green is the only marine power plant in the world that will operate cost-effectively in areas with low-velocity currents – as opposed to other technologies, which are restricted to tidal "hot spot" locations. Since it can operate economically in deep waters at velocities below 2.5 m/s, the number of potentially suitable sites is huge.
Other advantages of Deep Green include:
– A robust anchorage system, since no tower is needed
– Low maintenance cost, since only attachment and detachment has to be done offshore
– Minimal visual and environmental impact; Deep Green is always at 20 metres or more below the water surface
– Predictability, since tidal currents are extremely regular
– Small size and low weight: less than 7 tonnes per 500 kW unit
– Gearless turbine-generator system
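A minimal sketch of why a tethered kite helps in slow currents: the power available to a water turbine scales with the cube of the flow speed, and a kite sweeping across the current presents its turbine with flow several times faster than the current itself. The speed-up factor, turbine area and efficiency below are illustrative assumptions, not Minesto specifications:

```python
# Water turbine power: P = 0.5 * rho * A * v^3 * Cp.
# A kite flying across a current sees an effective flow speed of
# roughly (speedup * current_speed); because power goes as v^3,
# even a modest speed-up multiplies output enormously.
# All numeric inputs below are illustrative assumptions.

RHO_SEAWATER = 1025.0  # kg/m^3

def turbine_power_kw(current_speed, speedup, area_m2, cp):
    """Power in kW for a turbine seeing speedup * current_speed flow."""
    v = current_speed * speedup
    return 0.5 * RHO_SEAWATER * area_m2 * v**3 * cp / 1000.0

# A fixed 1 m^2 turbine in a 1.5 m/s current vs a kite sweeping 8x faster:
print(turbine_power_kw(1.5, 1.0, 1.0, 0.35))   # ~0.6 kW
print(turbine_power_kw(1.5, 8.0, 1.0, 0.35))   # ~310 kW (8^3 = 512x more)
```

This cubic scaling is why a small, light kite can extract useful power from currents too slow for conventional fixed turbines.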
Vinnova, the Swedish Innovation Agency, yesterday awarded Minesto a grant for a pre-study of its planned ocean trials. Scale-model versions of Deep Green have already been trialled at 1:4 scale at Strangford Lough, Northern Ireland, and the next step is to develop a full-size prototype and test it in the ocean. The project funded by Vinnova will enable Minesto to assess the feasibility and budget in preparation for these full-scale trials in 2015.
Anders Jansson, CEO and Co-Founder: "Ocean currents are the hidden treasure of renewable energy sources. With their almost continuous water flows they carry large amounts of renewable energy over the globe, and with a high load factor compared to weather-dependent sources like wind or solar power. This resource is predictable and feasible for providing base grid power, and has minimal environmental impact."
"The challenge has been that the currents are too slow and the sites are too deep for most available marine power plants. Deep Green solves that problem. Minesto's technology will contribute to making countries like the USA, Japan and Taiwan carbon neutral and independent energy producers, instead of hugely dependent on fossil-based and imported energy."
"Just to take one example: Taiwan claims that 50 per cent of their energy can be supplied from the ocean currents along the coast if they just find a viable technology, and we believe that Deep Green is that technology. Today Taiwan depends on 98 per cent imported energy, which is a threat to the country's economy."
26th November 2013
Even if emissions stop, carbon dioxide could warm Earth for centuries
Even if carbon dioxide emissions came to a sudden halt, the carbon dioxide already in Earth's atmosphere would continue to warm our planet for hundreds of years, according to Princeton University-led research published in the journal Nature Climate Change. The study suggests that it might take a lot less carbon than previously thought to reach the global temperature scientists deem unsafe.
A graph from the paper, showing how global temperature drops after emissions cease, but rises again after 100 years (red line). The blue line, which represents a global model that does not account for a gradual decline in ocean heat uptake, shows a slow decline in temperature over the same time. Source: Frölicher et al. (2013)
The researchers simulated an Earth on which, after 1,800 billion tons of carbon entered the atmosphere, all emissions suddenly stopped. Scientists commonly use the scenario of emissions screeching to a stop to gauge the heat-trapping staying power of carbon dioxide. In this simulated shutoff, the carbon itself faded steadily with 40 percent absorbed by Earth's oceans and landmasses within 20 years, 60 percent within 100 years and 80 percent within 1,000 years.
By itself, this decrease of atmospheric carbon dioxide should lead to cooling. But the heat trapped by the carbon dioxide took a divergent track. After a century of cooling, the Earth warmed by 0.37°C (0.66°F) during the next 400 years as the ocean absorbed less and less heat. While the resulting temperature spike seems slight, a little heat goes a long way here. For context, Earth has warmed by only 0.85°C (1.5°F) since pre-industrial times.
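The simulated drawdown quoted above, 40% absorbed within 20 years, 60% within 100 years and 80% within 1,000 years, can be roughly interpolated, for instance linearly in log-time. This is an illustrative curve, not the study's actual carbon-cycle model:

```python
# Rough log-time interpolation of the simulated ocean/land uptake of
# the 1,800 billion tons of emitted carbon. The three anchor points
# are from the article; the interpolation scheme is an assumption.
import math

# (years after emissions stop, fraction of emitted carbon absorbed)
POINTS = [(20, 0.40), (100, 0.60), (1000, 0.80)]

def fraction_absorbed(years):
    if years <= POINTS[0][0]:
        return POINTS[0][1]
    if years >= POINTS[-1][0]:
        return POINTS[-1][1]
    for (t0, f0), (t1, f1) in zip(POINTS, POINTS[1:]):
        if t0 <= years <= t1:
            # interpolate linearly in log(time) between anchor points
            w = (math.log(years) - math.log(t0)) / (math.log(t1) - math.log(t0))
            return f0 + w * (f1 - f0)

# Of 1,800 billion tons emitted, how much is still airborne after 500 years?
remaining = 1800 * (1 - fraction_absorbed(500))
print(round(remaining))  # ~468 billion tons still in the atmosphere
```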
The Intergovernmental Panel on Climate Change (IPCC) estimates that global temperatures a mere 2°C (3.6°F) higher than pre-industrial levels would dangerously interfere with the climate system. Avoiding that point would mean keeping cumulative carbon dioxide emissions below 1,000 billion tons of carbon – about half of which has already been put into the atmosphere since the dawn of industry.
The lingering warming effect the researchers found, however, suggests that the 2-degree point may be reached with much less carbon, said first author Thomas Frölicher, who conducted the work as a postdoctoral researcher in Princeton's Program in Atmospheric and Oceanic Sciences.
"If our results are correct, the total carbon emissions required to stay below 2 degrees of warming would have to be three-quarters of previous estimates, only 750 billion tons instead of 1,000 billion tons of carbon," he said. "Thus, limiting the warming to 2 degrees would require keeping future cumulative emissions below 250 billion tons – only half of the already emitted amount of 500 billion tons."
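The budget arithmetic in the quote, spelled out (all figures in billions of tons of carbon, taken from the article):

```python
# Carbon budget arithmetic from the quote, in billions of tons of carbon.

previous_budget = 1000                     # earlier estimate for staying below 2 C
revised_budget = previous_budget * 3 / 4   # three-quarters of the previous estimate
already_emitted = 500                      # emitted since the dawn of industry

remaining = revised_budget - already_emitted
print(revised_budget)   # 750.0
print(remaining)        # 250.0 -- half of what has already been emitted
assert remaining == already_emitted / 2
```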
The researchers' work contradicts the scientific consensus that global temperature would remain constant or decline if emissions were suddenly cut to zero. But previous research did not account for a gradual reduction in the oceans' ability to absorb heat from the atmosphere, especially in the polar oceans, Frölicher said. Although atmospheric CO2 steadily declined in the simulation, the oceans that draw heat out of the atmosphere gradually took up less of it. Eventually, this residual heat offset the cooling driven by the dwindling carbon dioxide.
Photo courtesy of Eric Galbraith, McGill University
Frölicher and his co-authors showed that the change in ocean heat uptake in the polar regions has a larger effect on global average temperature than a change in low-latitude oceans, a mechanism known as "ocean-heat uptake efficacy." This mechanism was first explored in a 2010 paper by Frölicher's co-author, Michael Winton.
"The regional uptake of heat plays a central role," Frölicher explained. "Previous models have not really represented that very well."
"Scientists have thought that the temperature stays constant or declines once emissions stop, but now we show that the possibility of a temperature increase cannot be excluded," he continued. "This is illustrative of how difficult it may be to reverse climate change — we stop the emissions, but still get an increase in the global mean temperature."
23rd November 2013
Driving economic growth into the Solar System
Planetary Resources, Inc. was co-founded in 2010 by Peter Diamandis and Eric C. Anderson. This startup aims to address one of the paramount problems faced on Earth: resource scarcity. It plans to achieve this by developing a robotic asteroid mining industry, enabled by reduced fuel costs. Prospecting and mining asteroids could drive economic growth into the Solar System, where potentially trillions of dollars' worth of metals and minerals lie. Planetary Resources has already signed an agreement with Virgin Galactic for payload services. In early 2014, the company plans to launch "Arkyd-3", a testbed for the larger Arkyd-100 spacecraft that will hunt for asteroids.
18th November 2013
The most detailed ever map of global deforestation
Using a vast database of high-resolution satellite imagery, researchers at the University of Maryland have produced the most detailed ever map of forest loss and gain between 2000 and 2012. This has been made freely available on Google Earth.
A total of 654,000 cloud-free images were obtained from Landsat 7 – an orbiting satellite managed and operated by the U.S. Geological Survey. Each of the 143 billion pixels on the map covers an area roughly 30 metres on a side, giving researchers enough detail to determine whether an area had lost or gained tree cover. It was estimated that 888,000 sq mi (2.3 million sq km) of forest were lost, while around 309,000 sq mi (0.8 million sq km) regrew – a net loss of 579,000 sq mi (1.5 million sq km). This resulted from a combination of logging, wildfires, windstorms and insect pests.
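The scale of the dataset can be sanity-checked from the figures above:

```python
# Back-of-envelope check of the map's coverage, using figures from
# the article: 143 billion pixels, each roughly 30 m x 30 m.

PIXELS = 143e9
PIXEL_AREA_KM2 = (30 / 1000) ** 2        # 0.0009 km^2 per pixel

mapped_area = PIXELS * PIXEL_AREA_KM2
print(round(mapped_area / 1e6, 1))       # ~128.7 million km^2 -- close to
                                         # Earth's total land area (~149M km^2)

# Net forest change 2000-2012, in millions of km^2 (from the article):
lost, regrown = 2.3, 0.8
print(round(lost - regrown, 1))          # 1.5 million km^2 net loss
```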
Key to the project was collaboration with team members from Google Earth Engine, who reproduced in the Google Cloud the models developed at the University of Maryland for processing and characterising Landsat data. Prior to this study, country-to-country comparisons of forestry data were not possible at this level of accuracy, because different countries define forests differently – making global comparisons from existing inventories unreliable.
Professor of Geographical Sciences, Matthew Hansen, who led the study: "When you put together datasets that employ different methods and definitions, it's hard to synthesise. But with Landsat, as a polar-orbiting instrument that takes the same quality pictures everywhere, we can apply the same algorithm to forests in the Amazon, in the Congo, in Indonesia, and so on. It's a huge improvement in our global monitoring capabilities."
"Losses or gains in forest cover shape many important aspects of an ecosystem – including climate regulation, carbon storage, biodiversity and water supplies, but until now there has not been a way to get detailed, accurate, satellite-based and readily available data on forest cover change from local to global scales."
Jeff Masek, Landsat project scientist at NASA: "Since the first Landsat satellite launched 41 years ago, scientists have been improving their land cover analysis as computers have become more powerful. Projects like Hansen's took a big leap forward once USGS made the data freely available on the Internet in 2008."
"This is the first time somebody has been able to do a wall-to-wall, global Landsat analysis of all the world's forests – where they're being cleared, where they're regrowing, and where they're subject to natural disturbances," he added, noting that the maps could be routinely updated to help in carbon accounting and other studies of land cover change.
Subtropical forests were found to have the highest rates of change, largely due to intensive forestry land uses. The rate of disturbance for North American subtropical forests, located in the Southeast United States, was found to be four times that of South American rainforests. More than 31 percent of U.S. southeastern forest cover was either lost or regrown. Landowners in this area harvest trees for timber and quickly plant replacements, treating them like crops, which makes it a highly intensive region for tree loss and gain.
Brazil's recent efforts to reduce its deforestation were offset by increasing forest loss in other tropical regions including Angola, Bolivia, Indonesia, Malaysia, Paraguay and Zambia. The country with the largest relative increase in forest loss was Indonesia, more than doubling its annual figure to almost 20,000 sq km (7,722 sq mi). Other countries with high rates of deforestation included Russia and Canada.
To view the forest cover maps in Google Earth Engine, visit the following link:
An even more detailed study – the BIOMASS mission – is planned for 2019. This observatory will be powerful enough to determine the height and wood content of individual trees.