Despite claims of a recent slowdown in global mean temperature, the number of local temperature extremes has "dramatically and unequivocally increased in number and area", according to researchers at the University of New South Wales. This has also occurred despite the complete absence of a strong El Niño since 1998.
This image shows a time series of temperature anomalies for hot extremes over land (red) and global mean temperature (black, blue). The anomalies are computed with respect to the 1979-2012 time period. The time series are based on the ERA-Interim 95th percentile of the maximum temperature over land (Txp95_Land, red) and the global (ocean + land) mean temperature (Tm_Glob) in ERA-Interim (blue) and HadCRUT4 (black).
Extremely hot temperatures over land have dramatically and unequivocally increased in number and area despite claims that the rise in global average temperatures has slowed over the past 10 to 20 years during what some public commentators have called a global warming hiatus period.
Scientists from the University of New South Wales (UNSW) ARC Centre of Excellence for Climate System Science and international colleagues made the finding when they focused their research on the rise of temperatures at the extreme end of the spectrum, where impacts are felt the most.
“It quickly became clear that the 'hiatus' in global average temperatures did not stop the rise in the number, intensity and area of extremely hot days,” said one of the paper’s authors, Dr Lisa Alexander.
“Our research has found a steep upward tendency in the temperatures and number of extremely hot days over land and the area they impact, despite the complete absence of a strong El Niño since 1998.”
The researchers examined the extreme end of the temperature spectrum because this is where global warming impacts are expected to occur first and are most clearly felt. As Australians saw this summer and the last, extreme temperatures in inhabited areas have major impacts on society. The observations also show that extremely hot events are now affecting, on average, more than twice the area when compared to similar events 30 years ago.
To get their results, which are published in the journal Nature Climate Change, the researchers examined hot days starting from 1979. Temperatures of every day throughout the year were compared against temperatures on that exact same calendar day from 1979-2012. The hottest 10 per cent of all days over that period were classified as hot temperature extremes.
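The calendar-day percentile classification described above can be sketched in a few lines of Python. This is an illustrative reconstruction based only on the description in the text, not the study's actual code; the array shapes, the synthetic data and the 90th-percentile cutoff are assumptions.

```python
import numpy as np

def classify_hot_extremes(tmax, pct=90.0):
    """Flag daily maximum temperatures that count as hot extremes.

    tmax: array of shape (n_years, 365), the daily maximum temperature
    for one location over the 1979-2012 base period (hypothetical input).
    A day is classified as extreme when it exceeds the given percentile
    of all values recorded on that same calendar day across the period.
    """
    # One threshold per calendar day: the 90th percentile across years
    thresholds = np.percentile(tmax, pct, axis=0)   # shape (365,)
    # Boolean mask marking the hottest ~10% of days per calendar date
    return tmax > thresholds

# Synthetic demo: 34 years (1979-2012) of random daily maxima in °C
rng = np.random.default_rng(42)
tmax = 20.0 + 10.0 * rng.random((34, 365))
extremes = classify_hot_extremes(tmax)
# By construction, roughly 10% of days exceed their calendar-day threshold
print(f"{extremes.mean():.0%} of days classified as hot extremes")
```

Applied to real gridded data, the mask could then be summed per year to track how the count and spatial extent of extremes change over time.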
Globally, on average, regions normally expect around 36.5 extremely hot days in a year. The observations showed that during the period from 1997-2012, regions that experienced 10, 30 or 50 extremely hot days above this average saw the greatest upward trends in extreme hot days over time and the area they impacted. The consistently upward trend persisted right through the “hiatus” period from 1998-2012.
“Our analysis shows there has been no pause in the increase of warmest daily extremes over land and the most extreme of the extreme conditions are showing the largest change,” said Dr Markus Donat.
“Another interesting aspect of our research was that those regions that normally saw 50 or more excessive hot days in a year saw the greatest increases in land area impact and the frequency of hot days. In short, the hottest extremes got hotter and the events happened more often.”
While global annual average near-surface temperatures are a widely used measure of climate change, this latest research reinforces that they do not account for all aspects of the climate system. A stagnation in the increase of global annual mean temperatures, over a relatively short period of 10 to 20 years, does not imply that global warming has stopped. Other measures – such as extreme temperatures, ocean heat content and disappearance of land-based ice – all show continuous changes that are consistent with a warming world.
“It is important when we take global warming into account, that we use measures that are useful in determining the impacts on our society,” said Professor Sonia Seneviratne from ETH Zurich, who led the study while on sabbatical at the ARC Centre. “Global average temperatures are a useful measurement for researchers, but it is at the extremes where we will most likely find those impacts that directly affect all of our lives. Clearly, we are seeing more heat extremes over land more often as a result of enhanced greenhouse gas warming.”
Over 20,000 crop samples originating from 100 countries will arrive this week at the Svalbard Global Seed Vault (SGSV), in time for its sixth anniversary.
Credit: Svalbard Global Seed Vault
The Svalbard Global Seed Vault (SGSV) was opened on 26th February, 2008. This secure facility is located on the Norwegian island of Spitsbergen in the remote Arctic Svalbard archipelago, 810 miles (1,300 km) from the North Pole. Intended to preserve a huge variety of plant seeds and their genetic codes, it is designed to last for centuries and to survive all man-made and natural disasters – everything from climate change, to nuclear war, or even asteroid impacts. Permafrost and thick rock ensure that, even in the case of a power cut, the seed samples will remain frozen.
The new samples added this week include a collection of barley from earthquake-rattled Japan, crucial to everything from beer and whiskey to miso soup and summertime tea; an untamed assortment of wild relatives of rice, maize and wheat; exotic red okra from Tennessee via the Cherokee; and, from Brazil, a humble bean that launched a national cuisine.
The addition of this cornucopia of crops to the "Doomsday Seed Vault" – as some call it – means there are now a total of 820,619 individual samples or "accessions" of food crops and their wild relatives, stored deep within an Arctic mountain. Their donation also coincides with the 10th anniversary of the Global Crop Diversity Trust, which maintains the seed vault in partnership with the Norwegian government and the Nordic Genetic Resources Centre.
Credit: Svalbard Global Seed Vault
"Our annual gatherings at the seed vault are a sort of winter Olympics of crop diversity – only we are not competing against each other but against the wide array of threats, natural and manmade, ranged against the diversity of food crops, diversity that is so crucial to the future of human civilization," said Marie Haga, the Crop Trust's executive director. "We are particularly excited to be welcoming our first seed deposits from Japan, which has been very active globally in the preservation of a wide array of crop species."
The seed vault is a backup, housing duplicates of living crop diversity collections kept in "genebanks" around the world that are widely and regularly shared with plant breeders.
"If something bad happened to our genebank, these resources could be damaged permanently," said Prof. Kazuhiro Sato, from Okayama University in Japan. "Barley is very important not just for Japan but for the food security of the world – we have varieties that are productive even in dry conditions and in saline soils – so we need to do everything we can to ensure they always will be available to future generations."
Credit: Svalbard Global Seed Vault
The shipments arriving this week at Svalbard also illustrate important progress in the global effort to collect and protect the wild relatives of domesticated crops, many of which could be important sources of genetic traits such as heat and drought tolerance and disease and pest resistance. These traits will be needed to help farmers adapt to stresses that are being intensified by climate change.
"CIMMYT alone already has sent 123,000 maize and wheat accessions and we are well on our way to having 100 percent of our collection duplicated in the seed vault by 2021," said Denise Costich, head of CIMMYT's Maize Germplasm Bank.
"Each and every single deposit into the vault provides an option for the future," added Haga. "At a time of unprecedented demands on our natural environment, it is critical to conserve plant genetic resources for food and agriculture. This will guarantee farmers and plant breeders continued access to the raw materials they need to improve and adapt crops. Conserving crop diversity guarantees that the foundation of our agriculture is secure for the future. Drawing on a global coalition of governments and private donors, the Crop Trust is building an Endowment Fund, which will safeguard the diversity of the major food crops of the world in genebanks when complete."
Injecting sulfate particles into the atmosphere to reflect sunlight and cool the Earth is one potential solution to global warming. If this process were ended too suddenly, however, the results would be catastrophic, according to research from the University of Washington.
Credit: Hughhunt (CC BY-SA 3.0)
Carrying out geoengineering for several decades and then stopping would trigger warming at a rate far exceeding that projected from greenhouse gas emissions alone, according to a study published this week in Environmental Research Letters.
"The absolute temperature ends up being roughly the same as it would have been – but the rate of change is so drastic, that ecosystems and organisms would have very little time to adapt to the changes," said lead author Kelly McCusker.
The study looks at solar radiation management, a proposed method of geoengineering in which sulfur-based particles are sprayed into the upper atmosphere to reflect sunlight. This is similar to what happens after a major volcanic eruption, and many experts believe the technique is economically and technically feasible. But continuous implementation over decades would depend on technical functioning, sustained funding, international agreement and the absence of negative side effects.
The team used a global climate model to simulate a business-as-usual emissions trend until 2035, when geoengineering would be implemented for 25 years – then suddenly stopped at 2060. The model showed that global temperatures could jump by 4°C in the following three decades (shown by the yellow lines in the graphs below), a rate more than double what it would have been otherwise, and far exceeding historical temperature trends.
These results build on recent work led by UK researchers pointing to the risk of implementing and then abruptly stopping geoengineering. That study compared several climate models, showing that the result is not specific to any one model. This new study used a single model with a more realistic scenario, where instead of simply decreasing the Sun's strength, they actually simulated sulfate particles to stabilise the temperature (shown in blue), allowing a more precise look at the geographic and seasonal patterns.
"The changes that will be needed to adapt to a warmer climate are really profound," said co-author David Battisti, professor of atmospheric sciences. "The faster the climate changes, the less time farmers have to develop new agricultural practices, and the less time plants and animals have to move or evolve."
The total amount of warming after stopping geoengineering would be largest in winter near the poles – but compared to typical historical rates of change, the most extreme effects would be seen at the tropics during summertime, where there is usually very little temperature variation.
“According to our simulations, tropical regions like South Asia and sub-Saharan Africa would be hit particularly hard, the very same regions that are home to many of the world’s most food-insecure populations,” McCusker said. “The potential temperature changes also pose a severe threat to biodiversity.”
The researchers looked at different variables and found that the rate of warming is largely determined by the length of time that geoengineering is deployed and the amount of greenhouse gases emitted during that time, rather than by how sensitive the climate is to changes in greenhouse-gas concentrations.
“If we must geoengineer, it does not give us an excuse to keep emitting greenhouse gases,” she added. “On the contrary – our results demonstrate that if geoengineering is ever deployed, it’s imperative that greenhouse gases be reduced at the same time to reduce the risk of rapid warming.”
Even a dramatic reduction in carbon emissions may prove insufficient, however. The momentum already locked into the climate system will drive warming for centuries to come, according to a recent study published in Nature. The evidence seems to be accumulating that we need to actually reverse CO2 levels – not just slow emissions – likely through a range of additional measures such as carbon capture and storage.
By embracing conservation measures and renewable energy, China can transition to an 82 percent renewable electric power system by 2050, at far less cost than relying on coal, according to a new report from WWF-US.
As a result, China’s carbon emissions from power generation could be 90% less than currently projected levels in 2050 without compromising the reliability of the electric grid or slowing economic growth.
The report, China’s Future Generation, was prepared by the Energy Transition Research Institute (Entri) for WWF and uses robust computer modeling to simulate four scenarios based on today’s proven technology: Baseline, High Efficiency, High Renewables and Low-Carbon Mix. To develop its findings, Entri examines China’s electricity supply and demand on an hour-by-hour basis through 2050 using its advanced China Grid Model.
“By fully embracing energy conservation, efficiency and renewables, China has the potential to demonstrate to the world that economic growth is possible while sharply reducing emissions that drive unhealthy air pollution and climate change,” said WWF’s China Climate and Energy Program Director, Lunyan Lu. “This research shows that with strong political will, China can prosper while eliminating coal from its power mix within the next 30 years.”
In addition to ramping up development of renewable power sources, the world’s most populous and energy-hungry nation will need to simultaneously pursue aggressive energy efficiency initiatives to reduce electricity demand. These efficiencies – including bold standards for appliances and industrial equipment that would set the gold standard for such products globally – could cut annual power consumption in 2050 by almost half, making the shift to a renewables-based power system possible.
“This research allows Chinese leaders to put the questions of technical feasibility and economic viability aside. Instead, it is time to focus on how to enact the right policies and establish the right institutions to ensure that China’s citizens and economy are receiving clean, renewable electricity,” said Lu. “The report shows that today’s technology can get China within striking distance of a future powered solely by renewable energy.”
The analysis also describes recent Chinese regulatory efforts and challenges to increasing the percentage of renewable electricity in the country, while providing a set of targeted recommendations for Chinese leaders and policy makers on energy efficiency, prioritising low-carbon electricity supply investments, allowing price changes to reflect the true cost of service, and prioritising collection and analysis of key power usage data.
“Both China and the United States are at a crossroads – where leaders need to choose between a future where healthy communities are powered by clean, renewable energy, or a future darkened by air pollution and the dangerous effects of climate change,” said Lou Leonard, WWF’s US vice president for climate change. “This year, as all countries develop new national climate targets in advance of talks in Paris, our leaders need to choose that brighter future. For Chinese leaders the choice is simple. This report shows that renewables are doable. China can meet bold new targets with today’s technologies while cutting energy costs.”
Genetically modified potatoes that resist late blight – the disease that caused the Great Irish Famine – have been developed by the John Innes Centre and the Sainsbury Laboratory in Britain.
GM (left) and non-GM (right) potato plants, a month after infection with blight. Credit: The Sainsbury Laboratory (TSL)
The Great Famine is considered the worst tragedy in the history of Ireland. This period of mass starvation and disease lasted from 1845-1852, leaving over a million dead and forcing many more to leave the country. The proximate cause of the famine was Phytophthora infestans – a fungus-like pathogen responsible for the serious potato disease commonly known as late blight.
During 2012, the third year of the trial, the potatoes experienced ideal conditions for late blight. The scientists did not inoculate any plants, but waited for disease strains circulating in the UK to blow in.
Non-GM potato plants of the Desiree variety were 100% infected by August, while all GM plants remained fully resistant throughout the experiment. There was also a notable difference in yield, with tubers from the 16 transgenic plants weighing 6-13 kg, while the non-GM tubers weighed only 1.6-5 kg per block.
The introduced gene, from a South American wild relative of potato, triggers the plant’s natural defence mechanisms by enabling it to recognise the pathogen. Cultivated potatoes possess around 750 resistance genes – but in most varieties, late blight is able to elude them.
“Breeding from wild relatives is laborious and slow and by the time a gene is successfully introduced into a cultivated variety, the late blight pathogen may already have evolved the ability to overcome it,” said Professor Jonathan Jones from The Sainsbury Laboratory (TSL).
“With new insights into both the pathogen and its potato host, we can use GM technology to tip the evolutionary balance in favour of potatoes and against late blight.”
Blight causes annual worldwide losses of about £3.5 billion ($5.8 billion), according to TSL. As well as the economic costs, frequent chemical sprays lead to soil compaction from tractor journeys and CO2 emissions from diesel fuel. In northern Europe, farmers often spray a potato crop 10-15 times, or up to 25 times in a bad year. Scientists hope to replace chemical control with genetic control.
The researchers have licensed their technology to an American company, Simplot, which plans to grow the potatoes in the US.
"I think it is unfortunate that American farmers are going to benefit from the fruits of European taxpayers' funded work way before Europeans," Jones told the BBC. "This kind of product will likely be on the US market within a couple of years and if we are lucky within eight to 10 years in Europe."
With new satellite technology, it is becoming possible to count individual whales and to automatically estimate their population size. Using Very High Resolution (VHR) satellite imagery, alongside image processing software, researchers were able to detect and count the number of whales breeding off the coast of Argentina.
Satellite images compared with aerial photograph (top right)
The new method, published this week in the journal PLoS ONE, could revolutionise how whale populations are estimated. Marine mammals are extremely difficult to count on a large scale and traditional methods – such as counting from platforms or land – can be costly and inefficient.
Lead author Peter Fretwell from the British Antarctic Survey (BAS) said: “This is a proof of concept study that proves whales can be identified and counted by satellite. Whale populations have always been difficult to assess; traditional means of counting them are localized, expensive and lack accuracy. The ability to count whales automatically, over large areas and in a cost-effective way, will be of great benefit to conservation efforts for this and potentially other whale species.”
Previous attempts to count whales by satellite have had limited success, but image quality has improved in recent years. The BAS team began by taking a single WorldView2 satellite image of a bay where southern right whales gather to calve and mate. Driven to near extinction, these whales have made a limited recovery following a whaling ban. In recent years, however, many deaths have been seen on their nursery grounds at Peninsula Valdes. Their current population size is unknown, and with this sharp increase in calf mortality, reliable estimates are needed.
The enclosed bays in this region contain shallow, calm waters – increasing the chance of spotting the whales from space. Three main criteria were used to identify whales: objects visible in the image should be the right size and shape; they should be in the right place (where whales would be expected to be) and there should be no (or few) other types of objects that could be mistaken as whales.
Whales in the image were manually counted, yielding 55 probable whales, 23 possible whales and 13 sub-surface features. Several automated methods were then tested against these numbers. A ‘thresholding’ of the Coastal Band of the WorldView2 image gave the greatest accuracy. This part of the image uses light from the far blue end of the spectrum, which penetrates deeper into the water column and reveals more whales. This technique found 89% of the probable whales identified in the manual count.
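In outline, a threshold-and-count detector of this kind keeps pixels brighter than some cutoff, groups connected pixels into objects, and filters the objects by size. The sketch below illustrates that idea only; the threshold value, size limits and toy image are invented for demonstration and are not taken from the study.

```python
import numpy as np

def count_whale_candidates(band, threshold, min_px=5, max_px=50):
    """Count whale-like objects in one satellite band by thresholding.

    Pixels brighter than `threshold` are kept, connected pixels are
    grouped into objects with a flood fill, and objects are filtered by
    pixel footprint so only whale-scale targets remain. The parameter
    values here are illustrative, not the study's.
    """
    mask = band > threshold                  # bright objects on dark water
    visited = np.zeros_like(mask, dtype=bool)
    rows, cols = mask.shape
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                # Flood-fill one connected object, counting its pixels
                stack, size = [(r, c)], 0
                visited[r, c] = True
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                if min_px <= size <= max_px:
                    count += 1
    return count

# Toy image: dark water, two whale-sized blobs and one single-pixel speck
img = np.zeros((100, 100))
img[10:14, 10:18] = 1.0   # 32-pixel blob: counted
img[50:53, 60:70] = 1.0   # 30-pixel blob: counted
img[80, 80] = 1.0         # 1-pixel speck: filtered out
print(count_whale_candidates(img, threshold=0.5))  # → 2
```

The "semi-automated" step the researchers describe corresponds to a human choosing the best `threshold` for a given scene rather than fixing it in advance.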
This semi-automated technique needs some user input to identify the best threshold. Future satellite platforms, however, will provide much higher quality imagery and Worldview3 is planned to be launched later this year. This will allow for greater confidence in identifying whales and differentiating mother and calf pairs. Such technological advancements may also allow scientists to apply this method to other species.
India's government has signed a deal with six companies to build a 4 gigawatt (GW) solar power plant – by far the world's largest.
This facility – described by officials as an "ultra mega" project – is equivalent to four nuclear reactors and double the nation's entire current solar capacity. It will be 10 times bigger than any plant of its kind in the world. Located west of Jaipur, close to Sambhar Lake, it will be constructed in two phases over seven years: 1,000 MW in phase 1, followed by the remaining 3,000 MW.
In 2010, India launched a "solar mission" initiative, aiming to deliver 20 GW of solar capacity by 2022. This new project will be a significant step towards achieving that goal. The nation has an even more ambitious plan to reach 100 GW by 2030, enough to supply 200 million people.
With its high levels of sunlight, India is well-placed to exploit solar energy. Combined with plummeting installation costs and improving efficiency, solar is becoming a more attractive option with each passing year. Solar power now costs 7.50 rupees per kilowatt-hour, down from 17 rupees just three years ago. By comparison, natural gas is roughly 5.5 rupees per kWh, nuclear is 3 rupees per kWh and coal is 2.50 rupees per kWh. It won't be long before solar matches these cheaper forms of energy, and then things could get really interesting. Solar has the potential to be a highly disruptive technology – especially when combined with smart grids and local storage. Longer term, there is the possibility of using solar within continent-wide supergrids.
This week, the Australian government approved plans to dump five million cubic metres of sediment near the Great Barrier Reef, as part of an expansion to create the world's largest coal port.
Visible from outer space, the Great Barrier Reef is the world's largest coral reef system and the biggest single structure made by living organisms. It is home to a staggering diversity of marine life – including more than 1,500 fish species alongside birds, sea turtles, sea snakes, dolphins, whales, dugongs, and molluscs such as the giant clam; not to mention thousands of different plants like seagrasses and seaweeds. It has been labelled as one of the seven natural wonders of the world and was chosen as a World Heritage Site in 1981. Tourism is an important economic activity for the region, generating over $6 billion per year.
With its delicate ecosystem, the Great Barrier Reef is highly sensitive and vulnerable to sudden environmental changes. According to a recent study, it has lost more than half its coral cover since 1985, with two-thirds of the loss occurring since 1998. Among the human-caused threats are climate change, pollution, overfishing, shipping accidents and oil spills. Natural causes include tropical cyclones, disease and invasions by crown-of-thorns starfish. Much of the reef could be wiped out by the middle of this century, based on current trends.
Despite its already fragile state, the Great Barrier Reef now faces additional harm in the form of Abbot Point – a coal port being expanded to provide new export facilities from the Galilee Basin in Queensland. When shipments begin in 2016, it will become the largest port of its kind in the world. To allow ships into the port, a massive dredging project is needed, with a disposal site for the sludge located 16 miles (25 km) to the north-east. An investigation zone is being assessed for alternative locations, as shown below.
The Great Barrier Reef Marine Park Authority (GBRMPA) notes that "the seafloor of the approved disposal area consists of sand, silt and clay and does not contain coral reefs or seagrass beds," claiming the operation will be "subject to strict environmental conditions."
Federal minister for the environment, Greg Hunt, said that water quality would actually improve in the region, due to conditions on the development that include programmes to support the health of the reef: "The conditions put in place for these projects will result in an improvement in water quality and strengthen the Australian government's approach to meeting the challenges confronting the reef into the future."
Scientists have been expressing a different opinion, however. A coral reef ecologist from the University of Queensland, Selina Ward, dismissed Hunt's remarks as "ridiculous" and explained that a huge amount of work had already gone into improving the water quality in recent years. To offset the damage arising from dredging operations of this size would take "unimaginable effort."
When sediment is dumped in this way, it can expand and travel outward, carried by ocean currents. The food chain is disrupted as seagrass and other plants die, in turn killing off the animal populations that rely on them. Coral is weakened as the increased sediment clouds the water and reduces the amount of sunlight getting through, harming the algae that live symbiotically within it. Substantial quantities of carbon are stored beneath seagrass meadows and can be released when the plants die – these meadows are currently disappearing at a global annual rate of 1.5 per cent, adding almost 300 million tons of carbon back into the environment each year as a result, according to Nature Geoscience. Another study concludes that seagrass is 35 times more efficient at absorbing carbon than rainforests.
There are further impacts to consider. Expanding the port will lead to a rise in ship traffic, increasing the chance of a collision with the reef or with other marine life. Humpback whale mothers and calves have been observed resting in the shallow waters around Abbot Point during migration. Green and flatback turtles on Abbot Beach will have their egg laying disrupted as they are confused by all the noise, lights and construction activity nearby. Directly behind the port itself is Caley Valley Wetland, home to several threatened bird species.
A group of 233 scientists had urged the Authority to reject the expansion, with a joint letter to chairman Russell Reichelt that stated: "The best available science makes it very clear that expansion of the port at Abbot Point will have detrimental effects on the Great Barrier Reef. Sediment from dredging can smother corals and seagrasses and expose them to poisons and elevated nutrients."
Last year, UNESCO had warned that the Great Barrier Reef might be placed on its list of World Heritage sites in danger unless action was taken to safeguard the region. A recent poll showed that 91 percent of Australians think protecting the Great Barrier Reef is the country's most important environmental issue – a number of huge petitions had been submitted prior to this week's decision.
The expansion of Abbot Point could be just the beginning, however. Several other massive dredging projects may emerge along the north-east coast, with Queensland's state government fast-tracking mega ports along the reef and dumping potentially 140 million tons of sediment by 2025, according to researchers based at James Cook University in Queensland. Abbot Point itself could be expanded further to accommodate the Alpha North Coal Project. The Australian Prime Minister, Tony Abbott, appears to show little interest in the environment, having abolished the Climate Commission and slashed a number of clean energy initiatives.
The industrialisation of the Great Barrier Reef is just the latest in a string of developments in areas previously considered immune to human influence. For instance, plans are underway to build a highway through the Serengeti National Park, while oil drilling is approved in the heart of Yasuni National Park, the most biologically diverse spot on Earth.
The average combined land and ocean surface temperature for January–December 2013 was tied as the fourth warmest such period on record, at 0.62°C (1.12°F) above the 20th century average.
The latest summary of global temperature released by the National Oceanic and Atmospheric Administration (NOAA) concludes that warmer-than-average temperatures affected the vast majority of the globe during 2013. Record warmth was observed across much of southern and western Australia, southwestern Ethiopia, eastern Tanzania, parts of central Asia around Kazakhstan and Uzbekistan, a large section of the southwestern Pacific Ocean, along with regions of the Arctic, central Pacific, and central Indian Oceans.
Temperatures were cooler-than-average across the central United States – a region that saw record warmth in 2012 – along with small sections of the eastern Pacific Ocean and the Southern Ocean off the tip of South America. No record coldest regions were observed for the January–December 2013 period, as shown in the map below.
Globally, 2010 remains the hottest year recorded by NOAA at 0.66°C (1.2°F) above the 20th century average, with 2005 and 1998 in second and third place, respectively. Including 2013, all 13 years of the 21st century (2001-2013) rank among the 15 warmest in the 134-year observational record. Viewed over a longer timescale, the trend is even more obvious. Last year's high temperatures occurred even without El Niño, suggesting that a new record may soon be reached and casting doubt on recent claims of a "pause" in warming.
In fact, the evidence for climate change (a term used since at least 1955) and humanity's contribution to it has become stronger than ever. Study after study confirms that human industrial activity is clearly and by far the dominant factor driving the recent changes in our atmosphere.
We have the world's most powerful supercomputers, making trillions of calculations per second for months on end, running state-of-the-art simulations with fantastic levels of detail. Contrary to what some would claim, these models have proven remarkably successful, correctly predicting:
• That our land, atmosphere and oceans would warm.
• That the troposphere would warm and the stratosphere would cool.
• That nighttime average temperatures would increase more than daytime average temperatures.
• That winter average temperatures would increase more than summer average temperatures.
• That polar amplification would lead to greater temperature increases nearer the poles.
• That the Arctic would warm faster than the Antarctic.
• The magnitude (0.3 K) and duration (two years) of the cooling from the Mt. Pinatubo eruption.
• The amount of water vapour feedback due to ENSO.
• The response of southern ocean winds to the ozone hole.
• The expansion of the Hadley cells.
• The poleward movement of storm tracks.
• The rising of the tropopause and the effective radiating altitude.
• The clear sky super greenhouse effect from increased water vapour in the tropics.
• The near constancy of relative humidity on global average.
• That coastal upwelling of ocean water would increase.
• A retrodiction of Last Glacial Maximum sea surface temperatures that initially seemed inconsistent with the paleo evidence – until better paleo evidence subsequently showed the models had been right.
And yet, even without these computer models, there is clear evidence of climate change and our influence on it. Decades of peer-reviewed studies in the world's top scientific journals have confirmed this reality; just as they confirmed the reality of evolution, our planet's geologic history, the germ theory of disease, links between smoking and cancer, depletion of the ozone layer by CFCs, along with countless other biological, chemical and physical processes. The science can never be perfect and there will always be gaps, but today no scientific body of national or international standing disputes the fundamental points.
Given all of the above, the risks of inaction – and the obvious benefits of clean technology – how can people be so eager to embrace fossil fuels, so confident in their scepticism, and willing to take such a gamble on their children's future? Even the conservative U.S. military now takes the issue seriously and is preparing for the impacts. If climate scientists are in it for the money, they're doing it wrong.
Global warming is the biggest story of our time, a result of our explosive growth in population and technology. It will define the 21st century and possibly many centuries to come. Ignoring the evidence and casually dismissing what decades of peer-reviewed science have told us would be a mistake of truly monumental proportions.
A quarter of the world's cartilaginous fish – namely sharks and rays – face extinction within a few decades, according to the first study to systematically and globally assess their fate.
The International Union for Conservation of Nature's (IUCN's) Shark Specialist Group (SSG), co-chaired by Nick Dulvy, conducted the study, which was published in the journal eLife this week.
Previous studies have documented local overfishing of some populations of sharks and rays. This new survey is the first to assess their status throughout coastal seas and oceans. It reveals that globally, one-quarter (249) of 1,041 known shark, ray and chimaera species fall under threatened categories on the IUCN Red List.
"We now know that many species of sharks and rays – not just the charismatic white sharks – face extinction across the ice-free seas of the world," says Dulvy. "There are no real sanctuaries for sharks where they are safe from overfishing."
Over two decades, the authors applied the IUCN's Red List categories and criteria to the 1,041 species at 17 workshops involving more than 300 experts. They incorporated all available information on distribution, catch, abundance, population trends, habitat use, life histories, threats and conservation measures.
Sharks and rays are at substantially higher risk of extinction than many other animals and have the lowest percentage of species considered safe. Using the IUCN Red List, the authors classified 107 species of rays (including skates) and 74 species of sharks as threatened. Just 23 percent of species were labeled Least Concern.
Major hotspots for shark and ray depletion identified in the study were the Indo-Pacific (particularly the Gulf of Thailand), the Red Sea and the Mediterranean Sea.
"In the most peril are the largest species of rays and sharks, especially those living in relatively shallow water that is accessible to fisheries. The combined effects of overexploitation – especially for the lucrative shark fin soup market – and habitat degradation are most severe for the 90 species found in freshwater.
"A whole bunch of wildly charismatic species is at risk. Rays, including the majestic manta and devil rays, are generally worse off than sharks. Unless binding commitments to protect these fish are made now, there is a real risk that our grandchildren won't see sharks and rays in the wild."
Losing these fish will be like losing whole chapters of our evolutionary history, says Dulvy. "They are the only living representatives of the first lineage to have jaws, brains, placentas and the modern immune system of vertebrates."
The potential loss of the largest species is frightening for many reasons, he adds. "The biggest species tend to have the greatest predatory role. The loss of top or apex predators cascades throughout marine ecosystems."
The IUCN SSG is calling on governments to safeguard sharks, rays and chimaeras through a variety of measures, including the following: prohibition on catching the most threatened species, science-based fisheries quotas, protection of key habitats and improved enforcement.
Sharks' fin on the menu of a restaurant in Singapore. Credit: ProjectManhattan
Concerned by his "all of the above" energy strategy, a group of environmentalists this week sent a joint letter to President Barack Obama, calling on him to expand clean energy. This follows a similar effort last year by business leaders, philanthropists and election campaign supporters. The letter is reproduced here in full.
American Rivers | Clean Water Action | Defenders of Wildlife | Earthjustice
Energy Action Coalition | Environment America | Environmental Defense Fund
Friends of the Earth | League of Conservation Voters | National Audubon Society |
National Wildlife Federation | Native American Rights Fund
Natural Resources Defense Council | Oceana | Physicians for Social Responsibility |
Population Connection | Sierra Club | Voices for Progress
President Barack Obama
The White House
1600 Pennsylvania Ave NW
Washington, DC 20500
Dear Mr. President,
We applaud the actions you have taken to reduce economy-wide carbon pollution and your commitment last June "to take bold action to reduce carbon pollution" and "lead the world in a coordinated assault on climate change." We look forward to continuing to work with you to achieve these goals.
In that speech, you referenced that in the past you had put forward an "all of the above" energy strategy, yet noted that we cannot just drill our way out of our energy and climate challenge. We believe that continued reliance on an "all of the above" energy strategy would be fundamentally at odds with your goal of cutting carbon pollution and would undermine our nation's capacity to respond to the threat of climate disruption. With record-high atmospheric carbon concentrations and the rising threat of extreme heat, drought, wildfires and super storms, America's energy policies must reduce our dependence on fossil fuels, not simply reduce our dependence on foreign oil.
We understand that the U.S. cannot immediately end its use of fossil fuels and we also appreciate the advantages of being more energy independent. But an "all of the above" approach that places virtually no limits on whether, when, where or how fossil fuels are extracted ignores the impacts of carbon-intense fuels and is wrong for America's future. America requires an ambitious energy vision that reduces consumption of these fuels in order to meet the scale of the climate crisis.
An "all of the above" strategy is a compromise that future generations can't afford. It fails to prioritize clean energy and solutions that have already begun to replace fossil fuels, revitalize American industry, and save Americans money. It increases environmental injustice while it locks in the extraction of fossil fuels that will inevitably lead to a catastrophic climate future. It threatens our health, our homes, our most sensitive public lands, our oceans and our most precious wild places. Such a policy accelerates development of fuel sources that can negate the important progress you've already made on lowering U.S. carbon pollution, and it undermines U.S. credibility in the international community.
Mr. President, we were very heartened by your commitment that the climate impacts of the proposed Keystone XL pipeline would be "absolutely critical" to the decision and that it would be contrary to the "national interest" to approve a project that would "significantly exacerbate the problem of carbon pollution." We believe that a climate impact lens should be applied to all decisions regarding new fossil fuel development, and urge that a "carbon-reducing clean energy" strategy rather than an "all of the above" strategy become the operative paradigm for your administration's energy decisions.
In the coming months your administration will be making key decisions regarding fossil fuel development -- including the Keystone XL pipeline, fracking on public lands, and drilling in the Arctic ocean -- that will either set us on a path to achieve the clean energy future we all envision or will significantly exacerbate the problem of carbon pollution. We urge you to make climate impacts and emission increases critical considerations in each of these decisions.
Mr. President, we applaud you for your commitment to tackle the climate crisis and to build an economy powered by energy that is clean, safe, secure, and sustainable.
Wm. Robert Irvin
President and CEO
Clean Water Action
Jamie Rappaport Clark
President and CEO
Defenders of Wildlife
Trip Van Noppen
Energy Action Coalition
Environmental Defense Fund
Friends of the Earth
League of Conservation Voters
President and CEO
National Audubon Society
Larry J. Schweiger
President & CEO
National Wildlife Federation
Native American Rights Fund
Natural Resources Defense Council
Chief Executive Officer
Catherine Thomasson, MD
Physicians for Social Responsibility
A report published this week concludes that the lion is facing extinction across the entire West African region. The West African lion once ranged continuously from Senegal to Nigeria, but the new paper reveals there are now only an estimated 250 adult lions restricted to four isolated and severely imperiled populations. Only one of those populations contains more than 50 lions.
West African male lion. Credit: Jonas Van de Voorde (CC BY-SA 3.0)
Led by Dr. Philipp Henschel of conservation group Panthera, and co-authored by an international team from West Africa, the UK, Canada and the USA, this survey appears in the journal PLOS ONE. The report's sobering results represent a massive effort – taking six years and covering 11 countries where lions were presumed to exist in the last two decades. This new, highly detailed information builds on an earlier continent-wide review of lion status produced by Duke University to which Dr. Henschel also contributed. Both surveys were funded by National Geographic's Big Cats Initiative (BCI).
"When we set out in 2006 to survey all the lions of West Africa, the best reports suggested they still survived in 21 protected areas," explains Henschel. "We surveyed all of them, representing the best remaining lion habitat in West Africa. Our results came as a complete shock. All but a few of the areas we surveyed were basically 'paper parks', having neither management budgets nor patrol staff, and had lost all their lions and other iconic large mammals."
The team discovered that West African lions now survive in only five countries: Senegal, Nigeria, and a single trans-frontier population spanning the shared borders of Benin, Niger and Burkina Faso. They are genetically distinct from the better-known lions of famous game parks in East and southern Africa. Recent molecular research shows they are closely related to the extinct "Barbary Lions" that once roamed North Africa, as well as to the last Asiatic lions surviving in India.
"West African lions have unique genetic sequences not found in any other lions, including in zoos or captivity," explained Dr. Christine Breitenmoser, co-chair of the IUCN/SSC Cat Specialist Group, which determines the conservation status of wild cats around the world. "If we lose the lion in West Africa, we will lose a unique, locally adapted population found nowhere else. It makes their conservation even more urgent."
Lions have disappeared across Africa as human populations and their livestock herds have expanded, competing for land with lions and other wildlife. Wild savannas are converted for agriculture and cattle, the lion's natural prey is hunted out and lions are killed by pastoralists fearing the loss of their herds.
National Geographic explorer and BCI co-founder Dereck Joubert commented: "Every survey we do is inaccurate because as soon as you complete it, it is already out of date; the declines are so rapid. It is a terribly sad state of affairs when you can very accurately count the lions in an area because there are so few of them. This is critical work that again confirms that we are underestimating the rate of decline of lion populations and that the situation requires a global emergency intervention."
Today, fewer than 35,000 lions remain in Africa in about 25% of the species' original range. In West Africa, the lion now survives in less than 50,000 square kilometres – smaller than half the size of New York State – and only 1% of its original historic range in the region.
Panthera's President, Dr. Luke Hunter: "Lions have undergone a catastrophic collapse in West Africa. The countries that have managed to retain them are struggling with pervasive poverty and little funding for conservation. To save the lion – and many other critically endangered mammals including unique populations of cheetahs, African wild dogs and elephants – will require a massive commitment of resources from the international community."
Ford Motor Company has announced the C-MAX Solar Energi Concept, a first-of-its-kind Sun-powered car with potential to deliver the best of what a plug-in hybrid offers – without depending on the electric grid for fuel.
Instead of powering its battery from an electrical outlet, the C-MAX Solar Energi harnesses power from the Sun by using a special concentrator that acts like a magnifying glass – directing intense rays to panels on the vehicle roof.
The result is a concept vehicle that takes a day’s worth of sunlight to deliver the same performance as the conventional C-MAX Energi plug-in hybrid, which draws its power from the electric grid. The Ford C-MAX Energi delivers the best miles-per-gallon equivalent (MPGe) rating in its class, with 108 MPGe city and 92 MPGe highway, for a combined average of 100 MPGe. By using renewable power, it reduces the annual greenhouse gas emissions a typical owner would produce by four metric tons.
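The combined figure can be reproduced from the city and highway numbers. Because fuel economy is a rate, the combined value is a weighted harmonic mean rather than a simple average – a minimal sketch, assuming the EPA's standard 55% city / 45% highway weighting applies to these MPGe ratings:

```python
# Back-of-envelope check of the combined MPGe figure. The 55/45
# city/highway split is the standard EPA weighting and is assumed here;
# MPGe values combine as a weighted harmonic mean because they are rates.

def combined_mpge(city: float, highway: float,
                  city_weight: float = 0.55) -> float:
    """Weighted harmonic mean of city and highway MPGe."""
    highway_weight = 1.0 - city_weight
    return 1.0 / (city_weight / city + highway_weight / highway)

print(round(combined_mpge(108, 92)))  # ≈ 100, matching the quoted average
```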
“Ford C-MAX Solar Energi Concept shines a new light on electric transportation and renewable energy,” said Mike Tinskey, Ford global director of vehicle electrification and infrastructure. “As an innovation leader, we want to further the public dialog about the art of the possible in moving the world toward a cleaner future.”
C-MAX Solar Energi Concept, which will be shown at the 2014 Consumer Electronics Show (CES) in Las Vegas, is a collaborative project of Ford, SunPower Corp and the Georgia Institute of Technology.
Strong electrified vehicle sales
The C-MAX Solar Energi Concept debuts as Ford caps a record year of electrified vehicle sales. The company expects to sell 85,000 hybrids, plug-in hybrids and all-electric vehicles for 2013 – the first full year its six new electrified vehicles were available in dealer showrooms.
Ford sold more plug-in vehicles in October and November than both Toyota and Tesla, and it outsold Toyota through the first 11 months of 2013. Plug-in hybrids continue to grow in sales as more customers discover the benefits of using electricity to extend their driving range.
Breakthrough clean technology
SunPower, which has been Ford’s solar technology partner since 2011, is providing high-efficiency solar cells for the roof of this concept car. Because of the extended time it takes to absorb enough energy to fully charge, Ford turned to the Georgia Institute of Technology for a way to amplify sunlight, to make a solar-powered hybrid feasible for daily use.
Researchers developed an off-vehicle solar concentrator (pictured below) with a special Fresnel lens to direct sunlight to the solar cells while boosting the impact of sunlight by a factor of eight. A Fresnel lens is a compact lens originally developed for use in lighthouses. Similar in concept to a magnifying glass, this patent-pending system tracks the Sun as it moves from east to west, drawing enough power each day to equal a four-hour battery charge (8 kilowatt-hours).
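A rough sketch shows why the factor-of-eight concentration matters. The panel rating and daily sun hours below are purely illustrative assumptions, not Ford figures; only the 8x boost and the roughly 8 kWh daily target come from the description above:

```python
# Feasibility sketch of the concentrator's effect. panel_kw and
# peak_sun_hours are hypothetical illustrative values (assumed, not from
# Ford); the 8x concentration factor comes from the article.

CONCENTRATION = 8          # boost from the Fresnel-lens concentrator
panel_kw = 0.3             # hypothetical roof panel rating, kW (assumed)
peak_sun_hours = 3.4       # hypothetical usable sun per day (assumed)

daily_kwh = panel_kw * CONCENTRATION * peak_sun_hours
print(f"~{daily_kwh:.1f} kWh/day with the concentrator")    # ~8.2 kWh
print(f"~{panel_kw * peak_sun_hours:.1f} kWh/day without")  # ~1.0 kWh
```

Under these assumptions the concentrator turns a panel that would need a week of sun into one that meets the daily target.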
With a full charge, the C-MAX Solar Energi Concept will achieve the same range as a conventional C-MAX Energi hybrid – up to 620 miles, including 21 electric-only miles. Additionally, the vehicle still has a charge port, and can be charged by connecting to a station via cord and plug, so that drivers retain the option to power up via the grid, if desired.
After the C-MAX Solar Energi Concept is shown at CES, Ford and Georgia Tech will begin testing the vehicle in numerous real-world scenarios. The outcome of those tests will help to determine if the concept is feasible as a production car.
By tapping renewable solar energy with a rooftop solar panel system, the C-MAX Solar Energi Concept is not dependent on the traditional electric grid for its battery power. Research by Ford suggests the Sun could power up to 75 percent of all trips made by an average driver in a solar hybrid car. This could be especially important in places where the electric grid is underdeveloped, unreliable or expensive to use.
The vehicle also reinforces MyEnergi Lifestyle, a concept revealed by Ford and several partners at 2013 CES. MyEnergi Lifestyle uses math, science and computer modelling to help homeowners understand how they can take advantage of energy-efficient home appliances, solar power systems and plug-in hybrid vehicles to significantly reduce monthly expenses while also reducing their overall carbon footprint.
The positive environmental impact from Ford C-MAX Solar Energi could be significant. It would reduce yearly CO2 and other greenhouse gas emissions from the average U.S. car owner by as much as four metric tons – the equivalent of what a U.S. house produces in four months.
If all light-duty vehicles in the United States were to adopt Ford C-MAX Solar Energi Concept technology, annual greenhouse gas emissions could be reduced by approximately 1 billion metric tons.
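The fleet-wide figure is consistent with simple arithmetic on the per-vehicle saving – a quick check, assuming a US light-duty fleet of roughly 250 million vehicles (a fleet size implied by the article's own numbers, not stated in it):

```python
# Sanity check on the fleet-wide claim: 4 metric tons saved per vehicle
# per year, scaled across the US light-duty fleet. The ~250 million
# fleet size is an assumption, chosen to match the quoted total.

savings_per_vehicle_t = 4
fleet_size = 250_000_000  # approximate US light-duty fleet (assumed)

total_t = savings_per_vehicle_t * fleet_size
print(f"{total_t:,} tonnes/yr")  # 1,000,000,000 – the quoted ~1 billion
```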
Global average temperatures will rise at least 4°C by 2100 and potentially more than 8°C by 2200 if carbon dioxide emissions are not reduced, according to new research that shows our climate is more sensitive to CO2 than most previous estimates.
This research could solve one of the great unknowns of climate sensitivity, the role of cloud formation and whether it will have a positive or negative effect on global warming.
Professor Steven Sherwood, from the University of New South Wales: "Our research has shown climate models indicating a low temperature response to a doubling of carbon dioxide from preindustrial times are not reproducing the correct processes that lead to cloud formation."
"When the processes are correct in the climate models, the level of climate sensitivity is far higher. Previously, estimates of the sensitivity of global temperature to a doubling of carbon dioxide ranged from 1.5°C to 5°C. This new research takes away the lower end of climate sensitivity estimates, meaning that global average temperatures will increase by 3°C to 5°C with a doubling of carbon dioxide."
The key to this narrower but much higher estimate can be found in the observations around the role of water vapour in cloud formation. Observations show that when water vapour is taken up by the atmosphere through evaporation the updraughts often rise up to 15 km to form heavy rains, but can also rise just a few km before returning to the surface without forming such rains. In addition, where updraughts rise this smaller distance, they reduce total cloud cover because they pull more vapour away from the higher cloud forming regions than when only the deep ones are present.
Climate models showing a low temperature response to carbon dioxide do not include enough of this lower-level process. They instead simulate nearly all updraughts rising to 15 km. These deeper updraughts alone do not have the same effect, leading to increased reflection of sunlight and reduced sensitivity of the global climate to atmospheric carbon dioxide. However, real-world observations show this simulated behaviour is wrong.
When the processes are correct in the climate model, this produces cycles that take water vapour to a wider range of heights in the atmosphere, causing fewer clouds to form in a warmer climate. This increases the amount of sunlight and heat entering the atmosphere and increases the sensitivity of our climate to carbon dioxide or any other perturbation.
When water vapour processes are correctly represented, the sensitivity of the climate to a doubling of carbon dioxide – which will occur in the next 50 years – means we can expect a temperature increase of at least 3°C and more likely 4°C by 2100.
"Climate sceptics like to criticise climate models for getting things wrong, and we are the first to admit they are not perfect, but what we're finding is that the mistakes are being made by those models which predict less warming, not those that predict more," said Professor Sherwood. "Rises in global average temperatures of this magnitude will have profound impacts on the world and the economies of many countries if we don't urgently start to curb our emissions."
Climate change has not been strongly influenced by variations in heat from the sun, according to researchers from the University of Edinburgh's School of GeoSciences.
These findings cast doubt on the view that lengthy periods of warm and cold weather in the past might have been caused by periodic fluctuations in solar activity.
Research examining the causes of climate change in the northern hemisphere over the past 1000 years has shown that until 1800, the main driver of periodic changes in climate was volcanic eruptions. These tend to reduce the amount of sunlight reaching the Earth – causing cooler, drier weather. Since 1900, greenhouse gases have been the primary cause of climate change.
The findings show that periods of low sun activity should not be expected to have a large impact on global temperatures and are expected to improve scientists' ability to predict future climate.
Scientists at the University of Edinburgh carried out the study using records of past temperatures constructed with data from tree rings and other historical sources. They compared this data record with computer-based models of past climate, featuring both significant and minor changes in the sun.
Their model of weak changes in the sun gave the best correlation with temperature records, indicating that solar activity has had a minimal impact on temperature over the past millennium.
Dr Andrew Schurer, of the University of Edinburgh's School of GeoSciences, said: "Until now, the influence of the sun on past climate has been poorly understood. We hope that our new discoveries will help improve our understanding of how temperatures have changed over the past few centuries, and improve predictions for how they might develop in future."
The study, published in Nature GeoScience, was supported by the Natural Environment Research Council.
Transport for London (TfL) and operator Go-Ahead London have begun a trial of the first 100% electric buses in the UK capital.
Two electric buses, built by Chinese manufacturer BYD Auto, were handed over in a ceremony involving the Mayor of London’s Environment spokesman, operator Go-Ahead and BYD. The buses will operate on two central London routes – numbers 507 and 521 – the first in the city to be serviced by fully electric, emissions-free buses.
The 12 metre BYD ebus has already been tested worldwide in major global cities, including in Europe: Paris, Bremen, Bonn, Madrid, Barcelona, Salzburg, Warsaw, Amsterdam, Brussels and Budapest. These trials have demonstrated a range that comfortably exceeds 250 kilometres (156 miles) on a single charge in real world urban conditions – sufficient to operate for a full day without the need to recharge. This performance has also been proven in China, where 220 ebuses have been running in the southern city of Shenzhen since January 2011, covering a total of 13 million miles (21 million km).
While the vehicles offer environmental and health benefits, they also provide major cost savings. Energy consumption is around 130 kWh/100 km in urban conditions. The battery takes 4-5 hours to recharge from fully depleted, at a cost of only £19.44 (US$32.14) using off-peak electricity (the buses will be recharged at night). This represents a fuel cost saving of up to 75% compared to a diesel bus.
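The quoted figures hang together on a back-of-envelope basis: 130 kWh per 100 km over a 250 km range implies a battery of roughly 325 kWh and an off-peak price of about 6p/kWh. A sketch derived purely from the article's numbers, not from BYD specifications:

```python
# Back-of-envelope on the quoted trial figures: 130 kWh per 100 km,
# a ~250 km single-charge range, a £19.44 full recharge and an
# "up to 75%" saving versus diesel. All derived values are estimates.

energy_per_km = 130 / 100          # kWh/km in urban conditions
range_km = 250                     # single-charge range from the trials
recharge_cost = 19.44              # GBP, off-peak, from fully depleted

battery_kwh = energy_per_km * range_km
price_per_kwh = recharge_cost / battery_kwh
electric_cost_per_km = recharge_cost / range_km
diesel_cost_per_km = electric_cost_per_km / (1 - 0.75)  # if saving is 75%

print(f"implied battery: {battery_kwh:.0f} kWh")                # ~325 kWh
print(f"implied off-peak price: {price_per_kwh * 100:.1f}p/kWh")  # ~6.0p
print(f"electric: £{electric_cost_per_km:.3f}/km "
      f"vs diesel: ~£{diesel_cost_per_km:.2f}/km")
```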
These trials will help TfL develop plans for greater use of electric buses in the future, supporting the Mayor's vision for an Ultra Low Emission Zone. In addition to the pair of buses in this test run, another six electric buses will be introduced into the fleet during early 2014.
London has recently come under fire over its air quality. Nitrogen dioxide (NO2) exceeds the recommended EU levels by over 50% in some areas, with the East End having the worst traffic pollution in Britain. This is a particular problem for the oldest and youngest members of society, who are more vulnerable to respiratory and heart diseases. It is estimated that more than 4,000 people die from air pollution each year in London.
Isbrand Ho, Managing Director of BYD Europe: “We are convinced that widespread adoption of the BYD ebus could have a dramatic effect on lowering pollution levels in major cities, so this development in London – one of the world’s top cities – is of tremendous importance. We look forward to a long and positive relationship with TfL and Go-Ahead."
Hybrid buses (i.e. combining electric and diesel motors) already feature heavily in London, with 600 now operating – Europe's largest hybrid bus fleet. By 2016, that number will almost triple to more than 1,700 – representing 20 per cent of London's total bus fleet, which carries more than six million passengers each weekday.
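Those two numbers also imply the size of London's overall fleet – a quick internal-consistency check using only the figures quoted above:

```python
# If ~1,700 hybrids will represent 20% of London's buses by 2016, the
# total fleet size follows by division – a consistency check on the
# article's own numbers, using no outside data.

hybrids_2016 = 1_700
share_of_fleet = 0.20

total_fleet = hybrids_2016 / share_of_fleet
print(f"implied total fleet: ~{total_fleet:,.0f} buses")  # ~8,500
```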
Hydrogen fuel cell-powered buses have also begun appearing. In December 2010, they came into service on route RV1 between Covent Garden and Tower Gateway. During summer 2013, the final bus was added to that service – making it the first 100% hydrogen-powered bus route in London.
Hybrid, pure electric, and hydrogen vehicles are still rare on a global basis. However, they are expected to expand rapidly in the next few decades.
Although the number of African elephants killed for their tusks declined slightly last year after worldwide recognition of this wildlife crime epidemic, the rates remain unacceptably high, conservation group WWF says. Data released by the UN shows that an estimated 22,000 elephants were slaughtered by poachers across Africa during 2012, down from the previous year's record of at least 25,000.
Central Africa remains the hardest hit, with poaching rates twice as high as the continental average, according to analysis conducted on behalf of the 179 members of the Convention on International Trade in Endangered Species. The region has lost nearly two-thirds of its elephant population over the past decade, leaving little time to reverse the decline.
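Losing nearly two-thirds of a population over a decade corresponds to a compound annual decline of roughly 10% – a rough translation of the article's figure into a yearly rate, assuming the decline was steady:

```python
# "Nearly two-thirds lost over the past decade" expressed as a compound
# annual decline rate. Assumes a constant year-on-year rate, which is a
# simplification of how poaching losses actually occur.

remaining_fraction = 1 / 3   # roughly one-third left after 10 years
years = 10

annual_factor = remaining_fraction ** (1 / years)
annual_decline = 1 - annual_factor
print(f"~{annual_decline:.0%} decline per year")  # ~10% per year
```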
"High level commitments to action against poaching and smuggling are beginning to have an impact, but Central Africa's endangered forest elephants remain in peril," said Lamine Sebogo, WWF's African Elephant Programme Manager.
The number of large ivory seizures increased last year – signalling better detection, but also indicating a continued involvement by organised criminal groups. Projections for 2013 are even graver; already this year over 40 tonnes of tusks have been confiscated while in transit.
Analysts from TRAFFIC, a joint programme of WWF and IUCN, have also found that smuggling routes are shifting as enforcement is bolstered in some locations. Although global shipping patterns are changing to exploit weaker systems, China remains the top destination for illegal ivory, TRAFFIC found.
"Wildlife crime is a serious global security issue and participation by all countries is required to stop it. Improvements are needed in regulation, enforcement, transparency, resourcing and transnational collaboration," Sebogo said.
Governments met last week at back-to-back summits in Botswana and Paris to agree emergency activities to protect elephants from poaching and trafficking, and to discuss the peace and security implications of this transnational crime. WWF has urged nations to adopt the Marrakech Declaration – a ten point action plan to combat illicit wildlife trafficking launched by the African Development Bank and WWF in May.
Next February, the UK government will host a global summit on illegal trade in wildlife. David Cameron will attend the summit in London, along with heads of government and other high level representatives from as many as 50 nations. Countries invited include those where poaching is threatening the survival of wildlife, and the biggest markets for illegal wildlife products, including Vietnam and China. WWF hopes the participants will agree to strong action to tackle this destructive trade, and implement those measures afterwards.
A newly-discovered greenhouse gas, perfluorotributylamine, has over 7,000 times the heat-trapping ability of CO2 over a 100-year period.
Scientists from the University of Toronto have discovered a novel chemical lurking in the atmosphere that appears to be a long-lived greenhouse gas (LLGHG). The chemical – perfluorotributylamine (PFTBA) – is among the most radiatively efficient chemicals found to date.
Radiative efficiency describes how effectively a molecule can affect climate. This value is then multiplied by its atmospheric concentration to determine the total climate impact.
PFTBA has been in use since the mid-20th century for various applications in electrical equipment and is currently used in thermally and chemically stable liquids marketed for electronic testing and as heat transfer agents. It does not occur naturally; it is entirely human-made. No known processes destroy or remove PFTBA in the lower atmosphere, so it has a very long lifetime – possibly up to 500 years – and is only destroyed in the upper atmosphere.
"Global warming potential is a metric used to compare the cumulative effects of different greenhouse gases on climate over a specified time period," said Cora Young who was part of the research team, along with Angela Hong and their supervisor, Scott Mabury. Time is incorporated in the global warming potential metric as different compounds stay in the atmosphere for different lengths of time, which determines how long-lasting the climate impacts are.
Carbon dioxide (CO2) is used as the baseline for comparison, since it is the most important greenhouse gas responsible for human-induced climate change. "PFTBA is extremely long-lived in the atmosphere and has very high radiative efficiency; the result of this is a very high global warming potential. Calculated over a 100-year timeframe, a single molecule of PFTBA has the equivalent climate impact as 7,100 molecules of CO2," said Hong. "There are no policies that control its production, use, or emission. It is not being regulated by any type of climate policy."
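The global warming potential metric described above converts an emission into a CO2-equivalent by simple multiplication. A minimal sketch using the per-molecule comparison as the article states it (the emission amount below is purely illustrative):

```python
# Illustration of the GWP metric as quoted in the article: over a
# 100-year horizon, one molecule of PFTBA has the climate impact of
# 7,100 molecules of CO2. The input amount is hypothetical.

GWP_100_PER_MOLECULE = 7_100

def co2_equivalent(pftba_molecules: float) -> float:
    """CO2-equivalent impact, in molecules, over a 100-year horizon."""
    return pftba_molecules * GWP_100_PER_MOLECULE

print(co2_equivalent(1e6))  # 1 million PFTBA molecules ≈ 7.1e9 of CO2
```

As Dr Shindell's comment suggests, the metric scales linearly with the amount emitted – a huge per-molecule potency only matters if concentrations grow.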
Dr Drew Shindell, a climatologist at NASA: "This is a warning to us that this gas could have a very, very large impact on climate change – if there were a lot of it. Since there is not a lot of it now, we don't have to worry about it at present, but we have to make sure it doesn't grow and become a very large contributor to global warming."
Researchers at the University of Southampton have identified regions beneath the oceans where igneous rocks of the upper ocean crust could safely store huge volumes of carbon dioxide.
The burning of fossil fuels such as coal, oil, and natural gas has led to dramatically increased levels of CO2 in our planet's atmosphere. These industrial emissions reached 36 billion tonnes annually in 2013 – over 100 times greater than natural CO2 output from all of the world's volcanoes, according to the US Geological Survey. The overwhelming majority of climate scientists agree this is causing climate change and ocean acidification. Although technologies are now being developed to capture CO2 from power stations and other sources, this will only avoid further warming if that CO2 is then safely locked away from the atmosphere for centuries.
Chiara Marieni, PhD, from the National Oceanography Centre in Southampton, investigated the physical properties of CO2 to develop global maps of the ocean floor and estimate where CO2 can be safely stored.
At high pressures and low temperatures, like those in deep oceans, CO2 occurs as a liquid that is denser than seawater. Estimating temperatures in the upper ocean crust, Chiara and her colleagues identified regions where it may be possible to stably store large volumes of CO2 in the basalts. These fractured rocks have high proportions of open space and over time may also react with the CO2 so it becomes locked into solid calcium carbonate – permanently preventing its release back into the oceans or atmosphere. As a precaution, Chiara refined her locations to areas that have the additional protection of thick blankets of impermeable sediments to prevent gas escape.
The team identified five potential regions off the coasts of Australia, Japan, Siberia, South Africa and Bermuda, ranging in size from half a million square kilometres to almost four million square kilometres.
"We found regions that have the potential to store decades to hundreds of years of industrial carbon dioxide emissions," said Chiara, "although the largest regions are far off-shore. However, further work is needed in these regions to accurately measure local sediment conditions and sample the basalt beneath before this potential can be confirmed."
Her work, published this week in Geophysical Research Letters, shows that previous studies, which concentrated on the effects of pressure to liquefy CO2 – but ignored temperature – pointed to the wrong locations, where high temperatures mean the CO2 will have a low density, and thus be more likely to escape.
A new tidal energy device can operate cost-effectively in deep waters with low-velocity currents. A full-scale demonstration of this innovative technology is now planned for 2015.
Founded in 2007, Minesto is a marine energy company based in Sweden and Ireland. Their patented technology aims to provide green electricity production from tidal and ocean currents. Known as "Deep Green", it is based on a system of underwater kites, consisting of a wing and turbine, attached by a tether to a fixed point on the ocean bed. As water flows over the hydrodynamic wing, a lift force is generated which allows the device to move smoothly through the water and for the turbine to rotate – hence generating electricity.
The wing is designed to generate high loads, so it must combine a stiff, lightweight structure with sufficient fatigue and material properties, and it includes watertight compartments to ensure a lifetime of 20 years. It contains a buoyancy system, batteries and pressure sensors.
The tether is mainly a force-bearing element designed to take the high loads created by the wing, but it also accommodates power cables from the generator and a signal cable to the control system.
Minesto claims that Deep Green is the only marine power plant in the world that will operate cost-effectively in areas with low-velocity currents – as opposed to other technologies, which are restricted to tidal "hot spot" locations. Since it can operate economically in deep waters at velocities below 2.5 m/s, the number of potentially suitable sites is huge.
Other advantages of Deep Green include:
– A robust anchorage system, since no tower is needed
– Low maintenance cost, since only attachment and detachment has to be done offshore
– Minimal visual and environmental impact; Deep Green is always at 20 metres or more below the water surface
– Predictability, since tidal currents are extremely regular
– Small size and low weight: less than 7 tons per 500 kW unit
– Gearless turbine-generator system
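The low-velocity advantage follows from basic hydrokinetics: the power in a flow scales with the cube of its speed, so a kite that sweeps its turbine through the water faster than the ambient current sees a much higher relative speed. A minimal sketch of the standard power formula – the efficiency coefficient, swept area and speeds below are illustrative assumptions, not Minesto figures:

```python
# Power available to a hydrokinetic turbine: P = 0.5 * rho * A * v^3 * Cp
# rho: seawater density, A: swept area, v: flow speed, Cp: efficiency.
# All numbers below are illustrative assumptions, not Minesto specifications.

RHO_SEAWATER = 1025.0  # kg/m^3

def turbine_power_w(area_m2: float, speed_m_s: float, cp: float = 0.35) -> float:
    return 0.5 * RHO_SEAWATER * area_m2 * speed_m_s ** 3 * cp

# Because power scales with v^3, moving the turbine through the water
# faster than the ambient current multiplies the available power:
fixed = turbine_power_w(area_m2=1.0, speed_m_s=1.5)   # stationary turbine
kite = turbine_power_w(area_m2=1.0, speed_m_s=10.0)   # kite's relative speed
print(round(fixed), round(kite))  # 605 179375
```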
Vinnova, the Swedish Innovation Agency, yesterday awarded Minesto a grant for a pre-study of its planned ocean trials. Miniature versions of Deep Green have already undergone 1:4-scale trials at Strangford Lough, Northern Ireland, and the next step is to develop a full-size prototype and test it in the ocean. The project funded by Vinnova will enable Minesto to assess the feasibility and budget in preparation for these full-scale trials in 2015.
Anders Jansson, CEO and Co-Founder: "Ocean currents are the hidden treasure of renewable energy sources. With their almost continuous water flows they carry large amounts of renewable energy over the globe, and with a high load factor compared to weather-dependent sources like wind or solar power. This resource is predictable and feasible for providing base grid power, and has minimal environmental impact."
"The challenge has been that the currents are too slow and the sites are too deep for most available marine power plants. Deep Green solves that problem. Minesto's technology will contribute to making countries like the USA, Japan and Taiwan carbon neutral and independent energy producers, instead of hugely dependent on fossil-based and imported energy."
"Just to take one example: Taiwan claims that 50 per cent of its energy could be supplied by the ocean currents along its coast if a viable technology were found, and we believe that Deep Green is that technology. Today, Taiwan imports 98 per cent of its energy, which is a threat to the country's economy."
Even if carbon dioxide emissions came to a sudden halt, the carbon dioxide already in Earth's atmosphere would continue to warm our planet for hundreds of years, according to Princeton University-led research published in the journal Nature Climate Change. The study suggests that it might take a lot less carbon than previously thought to reach the global temperature scientists deem unsafe.
A graph from the paper, showing how global temperature drops after emissions cease, but rises again after 100 years (red line). The blue line, which represents a global model that does not account for a gradual decline in ocean heat uptake, shows a slow decline in temperature over the same time. Source: Frölicher et al. (2013)
The researchers simulated an Earth on which, after 1,800 billion tons of carbon entered the atmosphere, all emissions suddenly stopped. Scientists commonly use the scenario of emissions screeching to a stop to gauge the heat-trapping staying power of carbon dioxide. In this simulated shutoff, the carbon itself faded steadily with 40 percent absorbed by Earth's oceans and landmasses within 20 years, 60 percent within 100 years and 80 percent within 1,000 years.
By itself, this decrease of atmospheric carbon dioxide should lead to cooling. But the heat trapped by the carbon dioxide took a divergent track. After a century of cooling, the Earth warmed by 0.37ºC (0.66ºF) during the next 400 years as the ocean absorbed less and less heat. While the resulting temperature spike seems slight, a little heat goes a long way here. For context, Earth has warmed by only 0.85ºC (1.5ºF) since pre-industrial times.
The Intergovernmental Panel on Climate Change (IPCC) estimates that global temperatures a mere 2ºC (3.6ºF) higher than pre-industrial levels would dangerously interfere with the climate system. Avoiding that point would mean keeping cumulative carbon dioxide emissions below 1,000 billion tons of carbon – about half of which has already been put into the atmosphere since the dawn of industry.
The lingering warming effect the researchers found, however, suggests that the 2-degree point may be reached with much less carbon, said first author Thomas Frölicher, who conducted the work as a postdoctoral researcher in Princeton's Program in Atmospheric and Oceanic Sciences.
"If our results are correct, the total carbon emissions required to stay below 2 degrees of warming would have to be three-quarters of previous estimates, only 750 billion tons instead of 1,000 billion tons of carbon," he said. "Thus, limiting the warming to 2 degrees would require keeping future cumulative emissions below 250 billion tons – only half of the already emitted amount of 500 billion tons.”
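The arithmetic behind Frölicher's figures, using only the numbers quoted above, can be laid out explicitly:

```python
# The carbon budget arithmetic from the quote above, in billions of tonnes
# of carbon (GtC). All figures come directly from the article.

ALREADY_EMITTED = 500          # emitted since the dawn of industry

previous_budget = 1000         # earlier estimate of the total 2-degree budget
revised_budget = 750           # three-quarters of the previous estimate

previous_remaining = previous_budget - ALREADY_EMITTED
revised_remaining = revised_budget - ALREADY_EMITTED

print(previous_remaining, revised_remaining)  # 500 250
```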
The researchers' work contradicts the scientific consensus that global temperature would remain constant or decline if emissions were suddenly cut to zero. But previous research did not account for a gradual reduction in the oceans' ability to absorb heat from the atmosphere, especially in the polar oceans, Frölicher said. Although CO2 steadily dissipates, the oceans gradually take up less heat from the atmosphere; eventually, this residual heat offsets the cooling caused by the dwindling amount of carbon dioxide.
Frölicher and his co-authors showed that the change in ocean heat uptake in the polar regions has a larger effect on global average temperature than a change in low-latitude oceans, a mechanism known as "ocean-heat uptake efficacy." This mechanism was first explored in a 2010 paper by Frölicher's co-author, Michael Winton.
"The regional uptake of heat plays a central role," Frölicher explained. "Previous models have not really represented that very well."
"Scientists have thought that the temperature stays constant or declines once emissions stop, but now we show that the possibility of a temperature increase cannot be excluded," he continued. "This is illustrative of how difficult it may be to reverse climate change — we stop the emissions, but still get an increase in the global mean temperature."
Planetary Resources, Inc. was co-founded in 2010 by Peter Diamandis and Eric C. Anderson. This new startup hopes to address one of the paramount problems faced on Earth: resource scarcity. It will achieve this by developing a robotic asteroid-mining industry, made viable in part by reduced fuel and launch costs. As this video explains, prospecting and mining asteroids could drive economic growth into the Solar System, where potentially trillions of dollars' worth of metals and minerals lie. Planetary Resources has already signed an agreement with Virgin Galactic for payload services. In early 2014, they plan to launch "Arkyd-3", a testbed for the larger Arkyd-100 spacecraft that will hunt for asteroids.
Using a vast database of high-resolution satellite imagery, researchers at the University of Maryland have produced the most detailed ever map of forest loss and gain between 2000 and 2012. This has been made freely available on Google Earth.
A total of 654,000 cloud-free images were obtained from Landsat 7 – an orbiting satellite managed and operated by the U.S. Geological Survey. Each of the 143 billion pixels on the map covers an area roughly 30 metres on a side, providing researchers with enough detail to determine whether an area had lost or gained tree cover. It was estimated that 888,000 sq mi (2.3 million sq km) of forest were lost, while around 309,000 sq mi (0.8 million sq km) regrew – a net loss of 579,000 sq mi (1.5 million sq km). This resulted from a combination of logging, wildfires, windstorms and insect pests.
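The figures above are internally consistent, as a quick check shows. The pixel count and per-pixel size come from the article; everything else is simple arithmetic:

```python
# Quick sanity check of the map figures quoted above.

PIXELS = 143e9      # pixels in the map
PIXEL_SIDE_M = 30   # each pixel is roughly 30 m on a side

# Total mapped area in millions of square kilometres -- close to Earth's
# total land area, as expected for a global land-cover map:
mapped_area_mkm2 = PIXELS * PIXEL_SIDE_M ** 2 / 1e12
print(round(mapped_area_mkm2, 1))  # 128.7

# Net forest change: loss minus regrowth, in millions of square kilometres:
net_loss_mkm2 = round(2.3 - 0.8, 1)
print(net_loss_mkm2)  # 1.5 (about 579,000 sq mi)
```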
Key to the project was collaboration with team members from Google Earth Engine, who reproduced in the Google Cloud the models developed at the University of Maryland for processing and characterising Landsat data. Prior to this study, country-to-country comparisons of forestry data were not possible at this level of accuracy, because different countries define forests differently, making global comparisons with existing inventories unreliable.
Professor of Geographical Sciences, Matthew Hansen, who led the study: "When you put together datasets that employ different methods and definitions, it's hard to synthesise. But with Landsat, as a polar-orbiting instrument that takes the same quality pictures everywhere, we can apply the same algorithm to forests in the Amazon, in the Congo, in Indonesia, and so on. It's a huge improvement in our global monitoring capabilities."
"Losses or gains in forest cover shape many important aspects of an ecosystem – including climate regulation, carbon storage, biodiversity and water supplies, but until now there has not been a way to get detailed, accurate, satellite-based and readily available data on forest cover change from local to global scales."
Jeff Masek, Landsat project scientist at NASA: "Since the first Landsat satellite launched 41 years ago, scientists have been improving their land cover analysis as computers have become more powerful. Projects like Hansen's took a big leap forward once USGS made the data freely available on the Internet in 2008."
"This is the first time somebody has been able to do a wall-to-wall, global Landsat analysis of all the world's forests – where they're being cleared, where they're regrowing, and where they're subject to natural disturbances," he added, noting that the maps could be routinely updated to help in carbon accounting and other studies of land cover change.
Subtropical forests were found to have the highest rates of change, largely due to intensive forestry land uses. The rate of disturbance for North American subtropical forests, located in the Southeast United States, was found to be four times that of South American rainforests. More than 31 percent of U.S. southeastern forest cover was either lost or regrown. Landowners in this area harvest trees for timber and quickly plant replacements, treating them like crops, which makes it a highly intensive region for tree loss and gain.
Brazil's recent efforts to reduce its deforestation were offset by increasing forest loss in other tropical regions including Angola, Bolivia, Indonesia, Malaysia, Paraguay and Zambia. The country with the largest relative increase in forest loss was Indonesia, more than doubling its annual figure to almost 20,000 sq km (7,722 sq mi). Other countries with high rates of deforestation included Russia and Canada.
To view the forest cover maps in Google Earth Engine, visit the following link: