Energy & the Environment

 
     
 

28th September 2014

Humanity on track for worst-case emissions scenario

Worldwide emissions of carbon dioxide continue to outpace reduction measures, putting the world on course for a worst-case scenario later in the 21st century.

 

[Chart: global carbon emission trends to 2100]

 

This week, the United Nations hosted the Climate Summit 2014 in New York City. Prior to the conference, an estimated 400,000+ people took part in the People's Climate March, the largest ever protest of its kind. The summit opening featured an impassioned speech from actor Leonardo DiCaprio — pleading for world leaders to address the looming crisis and stating: "You can make history, or you will be vilified by it."

Despite rapid growth in clean tech, emissions continue to trend at the high end of scenarios, eroding the chances of keeping warming below the recommended limit of 2°C. In 2013, the total output of CO2 from fossil fuel combustion and cement production grew by 2.3 per cent to a record high of 36.1 billion tonnes. This year, emissions are set to rise by a further 2.5 per cent, leaving them 65 per cent above the 1990 level. Globally, August 2014 was the hottest August on record, according to data from NOAA – following the hottest May and June earlier in the year. As greenhouse gases continue to build in the atmosphere and oceans, the world appears to be heading for a genuine catastrophe unless immediate, large-scale action is taken.

In its yearly analysis of trends in global carbon emissions, the Global Carbon Project (GCP) has published three peer-reviewed articles highlighting a number of recent developments. Among the findings: China's emissions per head of population have surpassed the EU's for the first time, reaching 7.2 tonnes per person compared with the EU's 6.8 tonnes. India is also forecast to overtake Europe's CO2 output by 2019.

“China continues to reshape the global distribution of emissions, and as politics impedes significant progress in the US and other key countries, observers increasingly look to China to provide a breakthrough in climate negotiations”, says Glen Peters, a co-author of the studies.

On current trends, the remaining "carbon budget" – the total amount that can still be emitted before global warming surpasses 2°C – will be used up in around 30 years (one generation). Staying within this quota implies that over two-thirds of proven fossil fuel reserves, amounting to nearly a trillion tonnes, will have to remain in the ground.

“Globally, emissions would need sustained and unprecedented reductions of around 7%/year for a likely chance to stay within the quota”, says Peters. “Furthermore,” he adds, “because of differentiated capabilities, some countries would need even higher rates of emissions reductions. These rates have not been seen in any individual country outside of severe economic crises.”
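As a rough illustration of that arithmetic, the sketch below counts the years until a remaining carbon budget is exhausted. The budget figure of roughly 1,200 billion tonnes of CO2 is an assumption in the spirit of Global Carbon Project estimates, not a number taken from the studies themselves:

```python
# Back-of-the-envelope check of the "used up in around 30 years" claim.
# ASSUMPTION: remaining 2 degrees C budget of ~1,200 Gt CO2 (approximate GCP-style figure).
EMISSIONS_2014 = 36.1 * 1.025   # Gt CO2/year: the 2013 figure plus this year's 2.5% rise
BUDGET = 1200.0                 # Gt CO2 remaining (assumed)

def years_until_exhausted(annual_growth):
    """Count the years until cumulative emissions exceed the budget."""
    emissions, cumulative, years = EMISSIONS_2014, 0.0, 0
    while cumulative < BUDGET:
        cumulative += emissions
        emissions *= 1 + annual_growth
        years += 1
    return years

print(years_until_exhausted(0.0))    # flat emissions: ~33 years
print(years_until_exhausted(0.025))  # continued 2.5%/yr growth: ~25 years
```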

 

 

The ability to keep temperatures below 2°C depends on three things: uncertainties in the climate system, when deep and sustained mitigation starts, and rapid development of new technologies.

“Most scenarios consistent with 2°C used in the IPCC Fifth Assessment Report largely depend on carbon capture and storage, both from fossil-fuel combustion and, particularly, bioenergy,” says Robbie Andrew from the Centre for International Climate and Environmental Research (CICERO) in Oslo, Norway.

But the development and deployment of CCS technologies has not lived up to expectations.

“Today’s emission-reduction targets need to incorporate the risk that society is unable to commercially develop and rapidly deploy a technology that is so far largely unproven at the required scale”, says Peters. “If carbon capture and storage technologies are not realised, it may not be possible to keep the temperature increase below 2°C.”

There were some positive outcomes at the New York summit this week. A pledge was made by governments, multinational companies and campaigners to halve the rate of deforestation by 2020 and halt it completely by 2030, alongside restoring 1.35 million sq miles (3.5 million sq km) of degraded land, an area the size of India. It is estimated that this could save between 4.5 and 8.8 billion tonnes of carbon emissions per year by 2030 – the equivalent of taking all of the world’s cars off the road. More than 70 countries and 1,000 companies endorsed the idea of mechanisms to reflect the true costs of emissions and other forms of pollution. The Rockefeller family also announced a divestment of some $50bn (£31bn) in fossil fuel assets.

Next year's climate summit, hosted in Paris, will be seen as crucial to making progress. According to the organising committee, the 2015 conference will attempt to achieve – for the first time in over 20 years of UN negotiations – a legally binding and universal agreement on carbon emissions, from all the nations of the world. Implementation will follow in 2020 if successful.

 

24th September 2014

1000-fold increase in next-generation battery capacity by 2023

New chemistries offering higher energy density and lower prices will lead to exponential growth in the worldwide capacity of next-generation advanced batteries.

 

[Chart: exponential growth in next-generation advanced battery capacity to 2023]

 

A report from Navigant Research analyses the global market for next-generation advanced batteries, focusing on the current leading battery chemistry – lithium ion – and the energy storage device types that might eventually replace it. While lithium ion (Li-ion) batteries offer many advantages over traditional battery technologies, research and development of new battery chemistries that, in many ways, surpass Li-ion is advancing rapidly and is expected to have a major impact on the battery industry in the coming years. These new chemistries are anticipated to enable even more applications for batteries, thus increasing the overall size of the market. According to Navigant, total worldwide capacity of advanced batteries is expected to grow from 30.4 megawatt-hours (MWh) in 2014 to more than 28,000 MWh in 2023.
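The scale of that forecast is easier to grasp as a growth rate. A quick check of the implied compound annual growth rate, using only the two endpoints quoted above:

```python
# Implied compound annual growth rate behind Navigant's forecast.
# Figures from the article: 30.4 MWh (2014) -> 28,000 MWh (2023).
start, end, years = 30.4, 28_000.0, 2023 - 2014

cagr = (end / start) ** (1 / years) - 1
print(f"Growth factor: {end / start:,.0f}x over {years} years")  # ~921x, i.e. nearly 1000-fold
print(f"Implied CAGR: {cagr:.0%}")                               # ~113%/yr, roughly doubling every year
```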

“The limitations to Li-ion, including input costs, safety issues, and materials scarcity, could leave it vulnerable to new chemistries that solve some or all of those problems,” says Sam Jaffe, principal research analyst with Navigant Research. “Although most of the chemistries explored in this report are only at laboratory-scale production levels today, they could reshape the market for advanced batteries in the next 10 years.”

New emerging battery types include ultracapacitors, lithium sulfur, magnesium ion, solid electrolyte, next-generation flow, and metal-air. Their advent is occurring alongside an enormous growth in the world’s appetite for advanced energy storage devices. Increasingly, this will include electric vehicles and localised home energy storage, e.g. from rooftop solar.

The report, “Next-Generation Advanced Batteries”, offers detailed analysis of the market issues and battery landscape associated with next-generation advanced batteries. Global market forecasts for energy capacity, power capacity, and revenue, segmented by region, application, and chemistry, extend through 2023. The report also examines the key imminent and emerging technologies related to advanced batteries and provides in-depth profiles of the major industry players. An Executive Summary is available as a free download.

 

23rd September 2014

Liberia to halt all deforestation by 2020

In return for development aid, Liberia will become the first African nation to completely stop cutting down its trees.

 

[Image: forests in Liberia. Credit: US Agency for International Development]

 

At the UN Climate Summit in New York today, officials announced a $150m agreement that will see Norway paying Liberia to completely stop its deforestation by 2020. This deal prevents new logging contracts, gives more power to forest-dependent communities to manage their forests, and increases the protected forest areas, with more monitoring and policing. There will be independent verification that trees remain standing.

The small West African nation is currently dealing with an Ebola outbreak. This new partnership between the Norwegian and Liberian governments will help to repair the country's economy and place it on a more sustainable path to poverty reduction and environmental protection. Although Liberia's forests are smaller than those of some other nations, the country is nevertheless a biodiversity hotspot – home to endangered chimpanzees, forest elephants and leopards. It is estimated that one third of Liberia's 4.3 million people live in these forests, with many more reliant on them.

"We hope Liberia will be able to cut emissions and reduce poverty at the same time," said Jens Frolich Holte, an adviser to the Norwegian government, speaking to the BBC. "We have funded efforts in Indonesia and Brazil, but I think this is the first time we have entered a deal on a country level."

 

[Map: Liberia's forest regions in West Africa]

 

"Today’s announcement by Liberia and Norway is momentous,” said Patrick Alley, director of Global Witness. “For decades, Liberia’s forests have been more of a curse than a blessing. Timber revenues funded Charles Taylor’s regime during Liberia’s brutal civil war. Since then the experiment to generate economic development through industrial-scale logging has failed, with logging companies routinely logging illegally, skirting taxes, and causing huge damage to forests and forest communities. The proposed shift towards community management and conservation could be a profound reversal of that failed model.”

“Over the past two years, the Liberian Government has taken steps to improve governance and is now showing real commitment to helping communities, not companies, benefit from the forest,” he added. “With today’s pledges and Norway’s help, we are hopeful that Liberia will continue down this path, although ultimately the proof of this deal will be in its implementation.”

 

13th September 2014

The coffee genome is sequenced

The coffee genome has been published, with more than 25,000 genes identified. This reveals that coffee plants make caffeine using a different set of genes from those found in tea, cacao and other such plants. The new findings could help to improve coffee production in the future.

 


 

Researchers have published the genome of Coffea canephora, a plant which accounts for about 30 percent of the world's coffee production. By comparing the sequences and positions of genes in coffee, tea and cacao (chocolate) plants, they have shown how enzymes involved in producing caffeine likely evolved independently in each of these three organisms. In other words, coffee did not inherit caffeine-linked genes from a shared common ancestor – but instead developed the genes on its own.

Compared to several other plant species – including the grape and tomato – coffee has larger families of genes that relate to the production of alkaloid and flavonoid compounds, which contribute to qualities such as aroma and bitterness. Coffee also has an expanded collection of N-methyltransferases, enzymes that are involved in making caffeine.

Upon taking a closer look, the researchers found that coffee's caffeine enzymes are more closely related to other genes within the coffee plant than to caffeine enzymes in tea and chocolate. This provides evidence that caffeine production emerged independently in coffee. If this trait had been inherited from a shared common ancestor, the enzymes would have been more similar between species.

There are several possible reasons why caffeine is so important in nature. The chemical may help to deter pests, and may stunt the growth of nearby competitors when caffeine-rich coffee leaves fall on the soil. It may also facilitate pollination: one recent paper showed that – like humans – certain insects can develop caffeine addiction, with bees visiting caffeine-producing plants often returning for another taste.

 


 

Worldwide, over 2.2 billion cups of coffee are consumed daily. It is the principal agricultural product of many tropical countries. According to estimates by the International Coffee Organisation, more than 8.7 million tons of coffee are produced each year from 110,000 sq km (42,500 sq mi) of land – an area equivalent in size to the U.S. state of Pennsylvania. Annual export revenue is $15.4 billion, and the sector employs 26 million people in 52 countries.

"Coffee is as important to everyday early risers as it is to the global economy. Accordingly, a genome sequence could be a significant step toward improving coffee," says Philippe Lashermes of the French Institute of Research for Development.

In addition to new and exotic flavours, these improvements may include better resistance to drought and disease. Leaf rust, for example, is currently affecting about half the plants in Central America, in the worst outbreak since 1976. Scientists could also engineer the plants to grow faster and increase their output of coffee beans. Such genetic enhancements may prove vital in the future – a study in 2012 estimated that climate change alone will lead to the extinction of wild Arabica coffee (Coffea arabica) by the 2080s.

 

 

8th September 2014

Blue whales in coastal California recover to historic numbers

Blue whales off the California coast have recovered to near historical population levels, now numbering about 2,200.

 


 

At 30 metres (98 ft) in length and reaching 190 tons or more in weight, the blue whale is the largest existing animal and the heaviest that ever existed – twice the weight of the largest known dinosaur. Blue whales were abundant in nearly all oceans on Earth until the early 20th century. For more than a hundred years they were hunted almost to extinction by whalers, until protected by the international community in 1966.

In coastal California waters, blue whale numbers have now rebounded to near historic levels, according to a study published in Marine Mammal Science – and while the number of these animals being struck by ships is higher than U.S. limits, such strikes do not immediately threaten that recovery. This is the only population of blue whales in the world known to have recovered from whaling.

"The recovery of California blue whales from whaling demonstrates the ability of blue whale populations to rebuild under careful management and conservation measures," says Cole Monnahan, from the University of Washington, lead author of the study.

California blue whales are at their most visible while feeding 20 to 30 miles off the California coast, but are actually found along the eastern side of the Pacific Ocean from the equator up into the Gulf of Alaska. Their numbers in this region are now estimated at 2,200 – which is likely 97 percent of the historical level, according to the model the researchers used.

 

[Map of coastal California. Credit: BrendelSignature (CC BY-SA 3.0)]

 

There are likely at least 11 blue whales struck by ships every year along the West Coast, which is above the "potential biological removal" of 3.1 whales per year allowed by the U.S. Marine Mammal Protection Act. However, the new findings show there could be an 11-fold increase in vessels before there is a 50 percent chance that the population will drop below what is considered "depleted" by regulators.

"Even accepting our results that the current level of ship strikes is not going to cause overall population declines, there is still going to be ongoing concern that we don't want these whales killed by ships," said co-author Trevor Branch, assistant professor of aquatic and fishery sciences.

According to the scientists, the slowdown in population growth noted in recent years is better explained by the population returning to its historical level than by ship strikes. With no other human-caused factor readily apparent, whale numbers appear to be reaching the limit of what their habitat can support – known as the carrying capacity.
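A toy logistic model makes the carrying-capacity argument concrete. The growth rate below is an assumption chosen for illustration (in the range used in large-whale assessments), not a parameter from the study, and the carrying capacity is back-derived from the "97 percent of historical" figure:

```python
# Illustrative logistic growth model with a constant ship-strike toll.
# ASSUMPTIONS: r = 0.074/yr is not from the study; K derived from "2,200 = ~97% of historical".
r = 0.074               # assumed maximum per-capita growth rate
K = 2200 / 0.97         # ~2,268 whales
strikes = 11            # ship-strike deaths per year, from the article

N = 2200.0              # current population estimate
for year in range(25):
    N += r * N * (1 - N / K) - strikes
print(round(N))  # drifts down only slightly, stabilising a few per cent below K
```

Near the carrying capacity, natural growth almost vanishes, so a small fixed toll nudges the equilibrium down without triggering a collapse – consistent with the finding that current strike levels do not immediately threaten the recovery.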

 

[Image: blue whales. Credit: J Gilpatrick/M Lynn/NOAA]

 

"We think the California population has reached the capacity of what the system can take as far as blue whales," said Branch.

"Our findings aren't meant to deprive California blue whales of protections that they need going forward," Monnahan said. "California blue whales are recovering because we took actions to stop catches and start monitoring. If we hadn't, the population might have been pushed to near extinction – an unfortunate fate suffered by other blue whale populations."

"It's a conservation success story," Monnahan concluded.

 

7th September 2014

The first coal plant in the U.S. to capture and store CO2 underground

This week, the Environmental Protection Agency (EPA) approved permits allowing the FutureGen Industrial Alliance to capture and store CO2 deep underground near Jacksonville, Illinois – the first project of its kind in the U.S.

 

[Image: the Meredosia Energy Centre. Credit: FutureGen]

 

Atmospheric levels of carbon dioxide (CO2) currently stand at nearly 400 ppm, the highest concentration for millions of years. This is largely a result of human industrial activity, the evidence for which is overwhelming. A new study published in Climate Risk Management finds there is a 99.999% certainty that humans are driving global warming, while a paper earlier this year revealed a similar figure of 99.9%. Claims of a "pause" in warming are refuted when accounting for missing heat data in the Arctic, and in any case, 93% of heat is found in the oceans rather than surface temperatures.

Worldwide, coal supplies about 30% of energy, accounting for 44% of CO2 emissions. In the United States, its use has declined in recent years, but still represents nearly a quarter of the nation's greenhouse gas emissions. Plans are underway to reduce carbon pollution from U.S. coal plants by 30% from 2005 levels by 2030 – through efficiency measures, shifting from coal to gas, investing in clean energy and making power plant upgrades.

Carbon capture and storage (CCS) will form part of this strategy. The trapping of CO2 in deep geological formations has been proposed, with studies having identified regions where centuries' worth of emissions could be safely sequestered. Industry forecasts indicate that CCS will achieve widespread adoption by the late 2020s, by which time sequestered coal-based electricity may cost less than unsequestered coal-based power today. The IPCC estimates that the economic potential of CCS could be up to 55% of the total carbon mitigation effort until the year 2100.

 

[Diagram: the oxy-combustion process being developed by FutureGen]

 

Since 2003, a non-profit group known as the FutureGen Industrial Alliance, in partnership with the U.S. Department of Energy, has been seeking to develop a first-of-its-kind, near-zero emissions coal plant. A new oxy-coal system using an innovative purification method has shown potential as a way of separating, compressing and storing over 90% of CO2 generated from the combustion process. After successful testing at 30 MW scale, the project was relaunched as FutureGen 2.0, with plans for commercial-scale validation at an existing 200 MW coal plant in Meredosia, Illinois. Retrofitting this old power station (seen in the photo above) would enable the capturing of 1.1 million tons of CO2 each year – equivalent to emissions from 232,000 cars.

The project has faced a number of setbacks and cost overruns. This week, however, the Environmental Protection Agency (EPA) finally approved permits allowing CO2 to be captured at Meredosia, transported along a 30 mi (48 km) pipeline and stored 4,000 ft (1.2 km) underground at a site in Jacksonville. Drilling of the four wells that will hold the liquefied gas could begin as soon as next month, according to the EPA's press release. The project will also include a visitor, research and training centre.

A total of 22 million metric tons of CO2 are expected to be captured over the 20 year life of the project. This appears minuscule when compared to the almost 5.5 billion tons of annual emissions in the U.S. – not to mention the 1,400 gigatons of global CO2 output since the Industrial Revolution, a legacy that will need addressing later this century. Nevertheless, FutureGen 2.0 represents an important milestone in CCS technology and a vital early step on the long road to decarbonisation. For more information, visit FutureGenAlliance.org.

 

[Map: the planned CO2 pipeline route to the Jacksonville storage site]

 

2nd September 2014

An office enriched with plants makes staff happier and boosts productivity by 15 per cent

Future working environments could benefit from adding more greenery, if they follow scientific advice from the University of Queensland.

 


 

An office enriched with plants makes staff happier and boosts productivity by 15 per cent, a University of Queensland researcher has found. The study is the first of its kind to assess long-term impacts of plants in office environments. Co-authored by Professor Alex Haslam from UQ’s School of Psychology, the study found that adding plants to an office also improved employee satisfaction and quality of life. A green office helps employees to be more physically, mentally and emotionally involved in their work.

“Office landscaping helps the workplace become a more enjoyable, comfortable and profitable place to be,” said Haslam. “It appears that in part this is because a green office communicates to employees that their employer cares about them and their welfare. Employees from previously ‘lean’ office environments experienced increased levels of happiness, resulting in a more effective workplace.”

The study was conducted in partnership with researchers from Cardiff University, the University of Exeter and the University of Groningen. The research examined the impact ‘lean’ versus ‘green’ office space has on employees from two large commercial offices in the UK and the Netherlands. A team monitored staff productivity levels over a two-month period, and employees were surveyed to determine perceptions of air quality, concentration and workplace satisfaction.

 


 

“Employees were more satisfied with their workplace and reported increased concentration levels and better perceived air quality in an office with plants,” Professor Haslam said. “The findings suggest that investing in landscaping an office will pay off through an increase in office workers’ quality of life and productivity.”

Professor Haslam also said the findings challenge modern business philosophies that suggest a lean office is a more productive one.

“The ‘lean’ philosophy has been influential across a wide range of organisational domains,” he said. “Modern offices and desks have been stripped back to create sparse spaces. Our findings question this widespread theory that less is more – sometimes less is just less.”

 

31st August 2014

New method automates plastic sorting for recycling

By measuring a plastic item's fluorescence half-life, it's possible to identify the exact kind of plastic it's made from.

 


 

A team of researchers at the University of Munich in Germany (LMU) has developed a new process that will greatly simplify the sorting of plastics in recycling plants. This method enables the automated identification of polymers, facilitating rapid separation of plastics for re-use.

The new technique, which is soon to be patented, involves exposing particles of plastic to a brief flash of light, causing the material to fluoresce. Photoelectric sensors then measure the intensity of the emitted light and the dynamics of its decay. Because the different polymers used in the manufacture of plastics display specific fluorescence lifetimes, the shape of the decay curve can be used to identify their chemical nature.
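In outline, the identification step could work as sketched below: fit an exponential to the measured decay, then match the recovered lifetime against a table of known materials. This is a minimal illustration of the principle, not LMU's patented process, and the lifetimes in the lookup table are placeholder values:

```python
import numpy as np

# Identify a polymer from its fluorescence decay curve.
# ASSUMPTION: illustrative lifetimes only; the half-life quoted in the
# article equals tau * ln(2) for an exponential decay I(t) = I0 * exp(-t/tau).
KNOWN_LIFETIMES_NS = {"PET": 4.0, "HDPE": 9.0, "PP": 14.0}

def fit_lifetime(t_ns, intensity):
    """Least-squares fit of I(t) = I0 * exp(-t/tau), via log-linearisation."""
    slope, _ = np.polyfit(t_ns, np.log(intensity), 1)
    return -1.0 / slope

def identify(t_ns, intensity):
    tau = fit_lifetime(t_ns, intensity)
    return min(KNOWN_LIFETIMES_NS, key=lambda p: abs(KNOWN_LIFETIMES_NS[p] - tau))

# Simulated sensor readout: a PP-like particle (tau = 14 ns) with 1% noise
t = np.linspace(0, 50, 200)
signal = np.exp(-t / 14.0) * (1 + 0.01 * np.random.randn(t.size))
print(identify(t, signal))  # -> PP
```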

“With this process, errors in measurement are practically ruled out; for any given material, one will always obtain the same value for the fluorescence half-life, just as in the case of radioactive decay,” says Prof. Heinz Langhals of the LMU's Department of Chemistry.

Even in its current prototype form, the machine can sort up to 1.5 tons (1.4 tonnes) of plastic per hour. According to the researchers, this figure meets the specifications required for application on an industrial scale.

A paper on the research is published in the journal Green and Sustainable Chemistry.

 

24th August 2014

New satellite data shows "unprecedented" ice loss from Greenland and West Antarctica

Ice loss from the Greenland and West Antarctic ice sheets has more than doubled in the last five years, based on extensive mapping by the European satellite CryoSat-2. This "unprecedented" rate of melting – around 500 cubic kilometres of ice per year – could mean future sea levels have been underestimated.

 


 

Researchers from the Alfred Wegener Institute in Germany have – for the first time – extensively mapped both Greenland’s and Antarctica’s ice sheets using the recently launched ESA satellite CryoSat-2. This new data shows that the ice crusts of these regions are declining at a rate never seen before.

Elevation changes were calculated from a high-precision altimeter, using 200 million data points for Antarctica and 14.3 million for Greenland. The loss of ice volume since 2009 was found to have doubled in Greenland and tripled in West Antarctica, with a combined thinning of 500 km3 (120 mi3) per year – equivalent to a 6 cm (2.4 in) layer of water covering the entire surface of the contiguous United States.
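That comparison is easy to verify. Assuming a contiguous-US land area of about 8.08 million km² (a standard figure, not one from the paper):

```python
# Spreading 500 km^3 of melt water evenly over the lower 48 states.
AREA_CONUS_KM2 = 8.08e6   # approximate area of the contiguous United States (assumed)
volume_km3 = 500.0        # annual ice loss from the CryoSat-2 study

depth_cm = volume_km3 / AREA_CONUS_KM2 * 1e5   # 1 km = 1e5 cm
print(f"{depth_cm:.1f} cm per year")           # ~6.2 cm, matching the article
```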

The areas where CryoSat-2 found the biggest elevation changes were Jakobshavn Glacier in West Greenland and Pine Island Glacier in West Antarctica. Since February 2014, scientists have known that the Jakobshavn Glacier is collapsing into the ocean at a record rate of 46 m/day. The Pine Island Glacier hit the headlines in July 2013, when a tabular iceberg the size of Hamburg was seen breaking off the tip of its ice shelf.

West Antarctica is rapidly losing ice volume, while East Antarctica is gaining – a point frequently raised by climate sceptics. However, the small increase in the east is nowhere near enough to compensate for the huge losses on the other side of the continent, as the maps below show.

 

[Maps of elevation changes in Greenland (left) and Antarctica (right). Red shows ice losses; blue shows ice gains.]

 

These latest findings, published in the 20th August issue of The Cryosphere, will add to concerns that future sea levels may be underestimated by the Intergovernmental Panel on Climate Change (IPCC). A survey by the Vision Prize – which provides impartial and independent polling of experts on important scientific issues – found that a majority of expert respondents believe future sea levels will be at the upper end of the IPCC's projections. New research in Nature Geoscience recently showed that Greenland is far more vulnerable to warming ocean waters than previously thought, with NASA glaciologist Eric Rignot stating that "the globe's ice sheets will contribute far more to sea level rise than current projections show." Indeed, according to NASA, the West Antarctic Ice Sheet has now begun an "irreversible" process of collapse that has "passed the point of no return".

It is also worth noting that the so-called warming pause can be explained by missing heat data in the polar regions – the Arctic, for example, has warmed roughly eight times faster than the rest of the planet. There is certainly no hiatus when looking at extreme high temperature records.

This has grave implications for the future. Low-lying regions such as Bangladesh, the Maldives and Kivalina will be among the first places to be affected, followed by Bangkok, then major Western cities by the 2050s and 2060s, resulting in massive geoengineering projects. Because of the delayed reaction from these interventions, the crisis will likely continue into the 2070s and possibly beyond. If society collapses and the mitigation efforts fail, sea levels could ultimately rise 66 metres (216 ft) by 7000 AD.

 


 

22nd August 2014

Tidal stream and wave power – slower than expected growth by 2020

Bloomberg New Energy Finance has revised down its forecasts for global tidal stream and wave power deployment in 2020 – by 11 percent and 72 percent respectively.

 

[Image: the Ness of Duncansby tidal project. Credit: ScottishRenewables.com]

 

Global installations of tidal stream and wave power are set to grow to 148MW and 21MW respectively by 2020, from almost nothing today, according to new research from Bloomberg New Energy Finance, but these are still trifling amounts in the context of the world’s power system.

The emergence of marine renewable energy technologies is taking longer than hoped, due to project setbacks, fatigue among venture capital investors, and the sheer difficulty of deploying devices in the harsh marine environment. This latest forecast represents a downward revision from the figures of 167MW for tidal stream and 74MW for wave that Bloomberg New Energy Finance published a year ago.

Tidal stream power involves using machines resembling underwater wind turbines to convert the energy of the tides into electricity. Wave power involves the use of buoys, snakes, flaps and other devices to capture the energy of the waves. Engineers and entrepreneurs have been working hard on both for the last two decades, spending hundreds of millions of dollars.

 

[Image: a wave energy device. Credit: AlphaGalileo Foundation]

 

Angus McCrone, senior analyst at Bloomberg New Energy Finance, said: “Governments in countries such as the UK, France, Australia and Canada have identified tidal and wave as large opportunities not just for clean power generation, but also for creating local jobs and building national technological expertise. That continues to be the case, and we will see further progress over the rest of this decade. But caution is necessary because taking devices from the small-scale demonstrator stage to the pre-commercial array stage is proving even more expensive and time-consuming than many companies – and their investors – expected.”

The last 12 months have seen a number of wave power companies fail or falter. Oceanlinx and Wavebob went out of business, Wavegen was folded back into parent company Voith, AWS Ocean Energy scaled back its activities, and Ocean Power Technologies has cancelled two of its main projects. Other wave firms such as Aquamarine, Carnegie, Pelamis and Seabased have pressed on with device and project development.

There have been clearer positives for tidal stream technology companies, with Andritz Hydro Hammerfest and Alstom/TGL both earning Renewable Obligation Certificates for electricity generated from devices at the European Marine Energy Centre in Orkney, Scotland, and Atlantis Resources raising £12m (US$20m) in an initial public offering on London’s AIM in February.

However, the amount of marine energy capacity installed and generating consistently for a period of years remains tiny. There is only the 1.2MW SeaGen tidal stream device owned by Siemens/Marine Current Turbines in Strangford Lough, Northern Ireland, and a few small pilot wave power plants.

Michael Liebreich, chairman of the advisory board at Bloomberg New Energy Finance, commented: “Tidal stream and wave power companies continue to face huge challenges. Although the potential is almost limitless, it’s a tough environment. It is possible to make equipment reliable, as the offshore oil and gas industry has shown, but it’s not cheap. And you have to put a huge amount of steel and concrete into the water, which is inherently expensive. It is still unclear whether this can be done at a cost competitive with offshore wind, let alone other clean energy generating technologies.”

 

19th August 2014

Earth Overshoot Day 2014

Today – 19th August – is the date when our ecological footprint exceeds our planet's budget for this year.

 


 

It has taken less than eight months for humanity to use up nature’s entire budget for the year and go into "ecological overshoot" – according to data from the Global Footprint Network (GFN), an international sustainability think tank with offices in North America, Europe and Asia.

Global Footprint Network monitors humanity’s demand on the planet (ecological footprint) against nature’s biocapacity, i.e. its ability to replenish the planet's resources and absorb waste, including CO2. Earth Overshoot Day marks the date when humanity's footprint in a given year exceeds what Earth can regenerate in that year. Since the year 2000, overshoot has grown, according to GFN’s calculations. Consequently, Earth Overshoot Day has moved from 1st October in 2000 to 19th August this year.
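The date itself follows from a simple proportion: overshoot day falls when the year-to-date footprint equals the planet's annual biocapacity. A sketch of that calculation, with the footprint-to-biocapacity ratio back-derived from the 19th August date (GFN's published "1.5 Earths" is a rounded figure):

```python
import datetime

# Earth Overshoot Day as a simple proportion of the year.
# ASSUMPTION: a ratio of ~1.58 Earths, back-derived from the 19th August date itself.
EARTHS_DEMANDED = 1.58

day_of_year = round(365 / EARTHS_DEMANDED)                                   # day 231
date = datetime.date(2014, 1, 1) + datetime.timedelta(days=day_of_year - 1)
print(date)  # 2014-08-19
```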

"Global overshoot is becoming a defining challenge of the 21st century. It is both an ecological and an economic problem," says Mathis Wackernagel, president of the GFN and co-creator of the resource accounting metric. "Countries with resource deficits and low income are exceptionally vulnerable. Even high-income countries that have had the financial advantage to shield themselves from the most direct impacts of resource dependence need to realise that a long-term solution requires addressing such dependencies before they turn into a significant economic stress."

In 1961, humanity used just three-quarters of the biocapacity Earth had available that year for generating food, fibre, timber, fish stock and absorbing greenhouse gases. Most countries had biocapacities larger than their own respective footprints. By the early 1970s, economic and demographic growth had increased humanity’s footprint beyond what the planet could renewably produce. We went into ecological overshoot.

 

[Chart: humanity's ecological footprint versus Earth's biocapacity]

 

Today, 86 percent of the world's population lives in countries that demand more from nature than their own ecosystems can renew. According to the GFN's calculations, it would take 1.5 Earths to produce the renewable resources necessary to support humanity’s current footprint. Future trends in population, energy, food and other resource consumption indicate this will rise to three planets by the 2050s, which could be physically unfeasible.

The costs of our ecological overspending are becoming more evident by the day. The "interest" we are paying on our mounting ecological debt – in the form of deforestation, freshwater scarcity, soil erosion, biodiversity loss and the build-up of CO2 in our atmosphere – also comes with mounting human and economic costs.

Governments who ignore resource limits in their decision-making put their long-term economic performance at risk. In times of persistent overshoot, countries running biocapacity deficits will find that reducing their resource dependence is aligned with their self-interest. Conversely, countries that are endowed with biocapacity reserves have an incentive to preserve these ecological assets that constitute a growing competitive advantage in a world of tightening ecological constraints.

 


 

More and more countries are taking action in a variety of ways. The Philippines is on track to adopt the GFN's Ecological Footprint at the national level – the first country in Southeast Asia to do so – via its National Land Use Act. This policy, the first of its kind in the Philippines, is designed to protect areas from haphazard development and plan for the country's use and management of its own physical resources. Legislators are seeking to integrate the Ecological Footprint metric into this national policy, putting resource limits at the centre of decision-making.

The United Arab Emirates (UAE), a high-income country, intends to significantly reduce its per capita Ecological Footprint – one of the world’s highest – starting with carbon emissions. Its Energy Efficiency Lighting Standard will result in only energy-efficient indoor-lighting products being made available throughout the territory before the end of this year.

Morocco wants to collaborate with the Global Footprint Network on a review of the nation’s 15-year strategy for sustainable development in agriculture – Plan Maroc Vert – through the lens of the Ecological Footprint. Specifically, Morocco is interested in comprehensively assessing how the plan contributes to the sustainability of the agriculture sector, as well as a society-wide transition towards sustainability.

Regardless of a nation’s specific circumstances, incorporating ecological risk into economic planning and development strategy is not just about foresight – it has become an urgent necessity.

 

13th August 2014

CO2 "sponge" could soak up pollution

A new polymer that could help to absorb man-made emissions from power plants has been announced by the American Chemical Society.

 


 

A sponge-like plastic that soaks up the greenhouse gas carbon dioxide (CO2) might ease our transition away from polluting fossil fuels and toward new energy sources, such as hydrogen. The material — a relative of the plastics used in food containers — could play a role in the U.S. government's plan to cut CO2 emissions 30 percent by 2030, and could also be integrated into power plant smokestacks in the future. A report on the new material is one of nearly 12,000 presentations at the 248th National Meeting & Exposition of the American Chemical Society (ACS), the world’s largest scientific society, taking place in San Francisco this week.

“The key point is that this polymer is stable, it’s cheap, and it adsorbs CO2 extremely well,” says Andrew Cooper, Ph.D. “It’s geared toward function in a real-world environment. In a future landscape where fuel-cell technology is used, this adsorbent could work toward zero-emission technology.”

Adsorbents are most commonly used to remove greenhouse gas pollutants from smokestacks at power plants where fossil fuels are burned. However, Cooper and his team intend this adsorbent — a microporous organic polymer — for a different application. The new material would become part of an emerging technology called integrated gasification combined cycle (IGCC), which can convert fossil fuel into hydrogen gas. Hydrogen holds great promise for use in fuel-cell cars and electricity generation, because it produces almost no pollution. IGCC is a bridging technology that is intended to jump-start the hydrogen economy, or the transition to hydrogen fuel, while still using the existing fossil-fuel infrastructure. But the IGCC process yields a mixture of hydrogen and CO2 gas, which must be separated.

Cooper, who is from the University of Liverpool, claims that the sponge works best under the high pressures intrinsic to the IGCC process. Just like a kitchen sponge swells when it takes on water, the adsorbent swells slightly when it soaks up CO2 in the tiny spaces between its molecules. When the pressure drops, the adsorbent deflates and releases the CO2, which can then be collected for storage or conversion into useful compounds.

The material — a brown, sand-like powder — is made by linking together many small carbon-based molecules into a network. Cooper explains that the idea to use this structure was inspired by polystyrene, a plastic used in styrofoam and other packaging material. Polystyrene can adsorb small amounts of CO2 by the same swelling action.

One advantage of using polymers is that they tend to be very stable. The material can even withstand being boiled in acid, proving it should tolerate the harsh conditions in power plants where CO2 adsorbents are needed. Other CO2 scrubbers — whether created from plastics or metals or in liquid form — do not always hold up well, he says. Another benefit of this new adsorbent is its ability to adsorb CO2 without also taking on water vapour, which can clog up other materials and make them less effective. Its low cost, reusability, and long lifetime also makes the sponge polymer attractive. In his report, Cooper also describes how it is relatively simple to embed the spongy polymers in the kinds of membranes already being evaluated to remove CO2 from power plant exhaust. Combining two types of scrubbers could make even better adsorbents, by harnessing the strengths of each.

 

8th August 2014

Air traffic growth will outpace carbon reduction efforts

Carbon reduction efforts by airlines will be outweighed by growth in air traffic, even if the most contentious mitigation measures are implemented, according to new research by the University of Southampton.

 


 

Even if proposed mitigation measures are agreed upon and put in place, air traffic growth rates are likely to outpace emission reductions, unless demand is substantially reduced.

"There is little doubt that increasing demand for air travel will continue for the foreseeable future," says Professor John Preston, travel expert and study co-author. "As a result, civil aviation is going to become an increasingly significant contributor to greenhouse gas emissions."

The authors of the new study – which is published in the journal Atmospheric Environment – have calculated that the ticket price increase necessary to drive down demand would value CO2 emissions at up to one hundred times the amount of current valuations.

"This would translate to a yearly 1.4 per cent increase on ticket prices, breaking the trend of increasing lower airfares," says co-author and researcher Matt Grote. "The price of domestic tickets has dropped by 1.3 per cent a year between 1979 and 2012, and international fares have fallen by 0.5 per cent per annum between 1990 and 2012."

However, the research suggests any move to suppress demand would be resisted by the airline industry and national governments. The researchers say a global regulator ‘with teeth’ is urgently needed to enforce CO2 emission reduction measures.

"Some mitigation measures can be left to the aviation sector to resolve," says Professor Ian Williams, Head of the Centre for Environmental Science at the University of Southampton. "For example, the industry will continue to seek improvements to fuel efficiency as this will reduce costs. However, other essential measures, such as securing international agreements, setting action plans, regulations and carbon standards will require political leadership at a global level."

The literature review conducted by the researchers suggests that the UN's International Civil Aviation Organisation (ICAO) "lacks the legal authority to force compliance and therefore is heavily reliant on voluntary cooperation and piecemeal agreements".

Current targets, set at the most recent ICAO Assembly Session last October, include a global average fuel-efficiency improvement of two per cent a year (up to 2050) and holding global net CO2 emissions for international aviation at 2020 levels. Global market-based measures (MBM) have yet to be agreed upon, while Boeing predicts that the number of aircraft in service will double between 2011 and 2031.

 

3rd August 2014

Tesla and Panasonic to build $5 billion "Gigafactory"

Tesla has reached an agreement with Panasonic to build a $5 billion "Gigafactory". This will produce more batteries than all other lithium-ion battery factories in the world combined, slashing costs by nearly one-third and boosting the adoption of electric vehicles.

 

[Rendering: the planned Tesla Gigafactory]

 

Tesla Motors and Panasonic had been in talks for several months over a massive new factory to produce electric car batteries. This week, they signed an agreement to build the $5 billion facility. Dubbed the "Gigafactory," its location is still unknown – but sites are being evaluated in Arizona, California, Nevada, New Mexico and Texas. Tesla will be responsible for the land, buildings and utilities, while Panasonic will handle the equipment, manufacturing and supply side, based on their mutual approval.

Ground-breaking is planned to begin later this year, and the first batteries are expected to roll off the assembly line in 2017. It is hoped that by 2020, the factory will be producing enough batteries for 500,000 cars each year – 35 GWh worth of cells and 50 GWh worth of packs. These will be used to power Tesla's Model S and Model X cars, along with a cheaper Model 3 sedan being introduced in 2017. The Model 3 is expected to be around $35,000 – half the cost of a Model S.
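Dividing the pack target by the vehicle target gives a sense of the pack sizes being planned for – a back-of-the-envelope figure, not one from the press release:

```python
# Implied average pack size from the 2020 Gigafactory targets above.
pack_output_gwh = 50.0    # GWh of battery packs per year by 2020
cars_per_year = 500_000

kwh_per_car = pack_output_gwh * 1e6 / cars_per_year
print(kwh_per_car)  # 100.0 kWh - above the 85 kWh flagship Model S pack of 2014
```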

According to the press release, cost reductions at the Gigafactory will be driven by economies of scale previously impossible in battery cell production. Further savings will be achieved by manufacturing cells that have been optimised for electric vehicle design – both in size and function – by co-locating suppliers on-site to eliminate packaging, transportation and duty costs and inventory carrying costs, and by manufacturing at a location with lower utility and operating expenses. As shown in the rendering above, localised solar and wind turbines will be used to power the facility.

Tesla co-founder and CEO, Elon Musk, says there will eventually be a need for "several more" of these Gigafactories. Other efforts by Tesla to boost electric cars have included its revolutionary supercharger network, offering free high-speed charges in less than an hour. There are now more than 100 of these stations operating in the United States, with many more planned, covering 98 percent of the population by the end of 2015. Networks are also being established in Europe and Asia. The company released its patents in June this year, to encourage the spread of its technology. Future historians will surely look back on Elon Musk favourably.

 

30th July 2014

Early signs that Earth is nearing a mass extinction

A new report from Stanford University warns that biodiversity is close to a tipping point that will lead to the next mass extinction.

 

[Chart: mass extinction events. Credit: Smith609, Wikipedia (CC BY-SA 3.0)]

 

Earth's current biodiversity – the product of 3.5 billion years of evolutionary trial and error – is very high when looking at the long history of life. But it may be reaching a tipping point. In a review of scientific literature and data analysis published in Science, an international team of scientists cautions that the loss and decline of animals is contributing to what appears to be the early days of the planet's next mass extinction event.

Since 1500, over 320 terrestrial vertebrates have gone extinct. Populations of the remaining species show a 25 percent average decline in abundance. The situation is similarly dire for invertebrate animal life. And while previous extinctions have been driven by natural planetary transformations or catastrophic asteroid strikes, the current die-off can be attributed to human activity – a situation that lead author Rodolfo Dirzo, professor of biology at Stanford University, calls an era of "Anthropocene defaunation."

Across vertebrates, up to 33 percent of all species are estimated to be globally threatened or endangered. Large animals – described as megafauna and including elephants, rhinoceroses, polar bears and countless other species worldwide – face the highest rates of decline, a trend that matches previous extinction events. Larger animals tend to have lower population growth rates and produce fewer offspring. They need larger habitat areas to maintain viable populations. Their size and meat mass make them easier and more attractive hunting targets for humans.

Although these species represent a relatively low percentage of the animals at risk, their loss would have trickle-down effects that could shake the stability of other species and, in some cases, even human health. For instance, previous experiments conducted in Kenya have isolated patches of land from megafauna such as zebras, giraffes and elephants, and observed how an ecosystem reacts to the removal of its largest species. Rather quickly, these areas become overwhelmed with rodents. Grass and shrubs increase and the rate of soil compaction decreases. Seeds and shelter become more easily available, and the risk of predation drops. Consequently, the number of rodents doubles – and so does the abundance of the disease-carrying ectoparasites that they harbour. If rats dominate ecosystems, they could evolve to giant sizes in the future, according to recent research.

"Where human density is high, you get high rates of defaunation, high incidence of rodents, and thus high levels of pathogens, which increases the risks of disease transmission," says Dirzo. "Who would have thought that just defaunation would have all these dramatic consequences? But it can be a vicious circle."

The scientists also detailed a troubling trend in invertebrate defaunation. Human population has doubled in the past 40 years; during the same period, the number of invertebrate animals – such as beetles, butterflies, spiders and worms – has decreased by 45 percent. As with larger animals, the loss is driven primarily by loss of habitat and global climate disruption, and could have trickle-up effects in our everyday lives. For instance, insects pollinate roughly 75 percent of the world's food crops, an estimated 10 percent of the economic value of the world's food supply. They also play critical roles in nutrient cycling and decomposing organic materials, which helps to ensure ecosystem productivity.

Dirzo says the solutions are complicated. Immediately reducing rates of habitat change and overexploitation would help, but these approaches need to be tailored to individual regions and situations. He hopes that raising awareness of the ongoing mass extinction – and not just of large, charismatic species – and its associated consequences will help spur change.

"We tend to think about extinction as loss of a species from the face of Earth, and that's very important, but there's a loss of critical ecosystem functioning in which animals play a central role that we need to pay attention to as well," he said. "Ironically, we have long considered that defaunation is a cryptic phenomenon, but I think we will end up with a situation that is non-cryptic because of the increasingly obvious consequences to the planet and to human wellbeing."

 


 

28th July 2014

Hottest June on record

Globally, June 2014 was the hottest June since records began in 1880. Experts predict that 2014 will be an El Niño year.

 

[Map: global temperature anomalies, June 2014]

 

According to NOAA scientists, the globally averaged temperature over land and ocean surfaces for June 2014 was the highest for June since records began in 1880. This follows the hottest May on record the previous month. It also marked the 38th consecutive June and 352nd consecutive month with a global temperature above the 20th century average. The last below-average global temperature for June was in 1976 and the last below-average global temperature for any month was February 1985.

Most of the world experienced warmer-than-average monthly temperatures, with record warmth across part of southeastern Greenland, parts of northern South America, areas in eastern and central Africa, and sections of southern and southeastern Asia. Drought conditions in the southwest U.S. continued to worsen, with Lake Mead dropping to its lowest levels ever – triggering fears of major water shortages within the next several years. Australia saw nationally-averaged rainfall 32 percent below average and in Western Australia precipitation was 72 percent below average.

Ocean surface temperatures for June were 0.64°C (1.15°F) above the 20th century average of 16.4°C (61.5°F), the highest for June on record and the highest departure from average for any month. Notably, large parts of the western equatorial and northeast Pacific Ocean and most of the Indian Ocean were record warm or much warmer than average for the month. Although neither El Niño nor La Niña conditions were present across the central and eastern equatorial Pacific Ocean during June 2014, ocean waters in that region continued to trend above average. NOAA's Climate Prediction Centre estimates there is about a 70 percent chance that El Niño conditions will develop during Northern Hemisphere summer 2014 and 80 percent chance it will develop during the fall and winter.

 


 

25th July 2014

Deep sea mining moves a step closer

With many of Earth's metals and minerals facing a supply crunch in the decades ahead, deep ocean mining could provide a way of unlocking major new resources. Amid growing commercial interest, the UN's International Seabed Authority has just issued seven exploration licences.

 

[Image: deep sea mining. Credit: Nautilus Minerals Inc.]

 

To build a fantastic utopian future of gleaming eco-cities, flying cars, robots and spaceships, we're going to need metal. A huge amount of it. Unfortunately, our planet is being mined at such a rapid pace that some of the most important elements face critical shortages in the coming decades. These include antimony (2022), silver (2029), lead (2031) and many others. To put the impact of our mining and other activities in perspective: on land, humans are now responsible for moving about ten times as much rock and earth as natural phenomena such as earthquakes, volcanoes and landslides. The UN predicts that on current trends, humanity's annual resource consumption will triple by 2050.

While substitution in the form of alternative metals could help, a longer term answer is needed. Asteroid mining could eventually provide an abundance from space – but a more immediate, technically viable and commercially attractive solution is likely to arise here on Earth. That's where deep sea mining comes in. Just as offshore oil and gas drilling was developed in response to fossil fuel scarcity on land, the same principle could be applied to unlock massive new metal reserves from the seabed. Oceans cover 72% of the Earth's surface, with vast unexplored areas that may hold a treasure trove of rare and precious ores. Further benefits would include:

• Curbing of China's monopoly on the industry. As of 2014, the country is sitting on nearly half the world's known reserves of rare earth metals and produces over 90% of the world's supply.

• Limited social disturbance. Seafloor production will not require the social dislocation and resulting impact on culture or disturbance of traditional lands common to many land-based operations.

• Little production infrastructure. As the deposits are located on the seafloor, production will be limited to a floating ship with little need for additional land-based infrastructure. The concentration of minerals is an order of magnitude higher than typical land-based deposits with a corresponding smaller footprint on the Earth's surface.

• Minimal overburden or stripping. The ore generally occurs directly on the seafloor and will not require large pre-strips or overburden removal.

• Improved worker safety. Operations will be mostly robotic and won't require human exposure to typically dangerous mining or "cutting face" activities. Only a hundred or so people will be employed on the production vessel, with a handful more included in the support logistics.

 

[Image: a robotic seafloor mining machine. Credit: Nautilus Minerals Inc.]

 

Interest in deep sea mining first emerged in the 1960s – but consistently low prices of mineral resources at the time halted any serious implementation. By the 2000s, the only resource being mined in bulk was diamonds, and even then, just a few hundred metres below the surface. In recent years, however, there has been renewed interest, due to a combination of rising demand and improvements in exploration technology.

The UN's International Seabed Authority (ISA) was set up to manage these operations and prevent them from descending into a free-for-all. Until 2011, only a handful of exploration permits had been issued – but since then, demand has surged. This week, seven new licences were issued to companies based in Brazil, Germany, India, Russia, Singapore and the UK. The number is expected to reach 26 by the end of 2014, covering a total area of seabed greater than 1.2 million sq km (463,000 sq mi).

Michael Lodge of the ISA told the BBC: "There's definitely growing interest. Most of the latest group are commercial companies so they're looking forward to exploitation in a reasonably short time – this move brings that closer."

So far, only licences for exploration have been issued, but full mining rights are likely to be granted over the next few years. The first commercial activity will take place off the coast of Papua New Guinea, where a Canadian company – Nautilus Minerals – plans to extract copper, gold and silver from hydrothermal vents. After 18 months of delays, this project was approved outside the ISA system – the site lies within Papua New Guinea's national waters, beyond the ISA's remit – and is expected to commence in 2016. Nautilus has been developing Seafloor Production Tools (SPTs), the first of which was completed in April. This huge robotic machine is known as the Bulk Cutter and weighs 310 tonnes when fully assembled. The SPTs have been designed to work at depths of 1 mile (1.6 km), but operations as far down as 2.5 miles (4 km) should be possible eventually.

As with any mining activity, concerns have been raised by scientists and conservationists regarding the environmental impact of these plans, but the ISA says it will continue to demand high levels of environmental assessment from its applicants. Looking ahead, analysts believe that deep sea mining could be widespread in many parts of the world by 2040.

 

 

 

 

 

16th July 2014

Southern Australia in permanent drought:
40% less rainfall by late 21st century

Scientists at the National Oceanic and Atmospheric Administration (NOAA) have developed a new high-resolution climate model, showing that southwestern Australia's long-term decline in fall and winter rainfall is caused by manmade greenhouse gas emissions and ozone depletion.

 

australia future rainfall map

 

"This new high-resolution climate model is able to simulate regional-scale precipitation with considerably improved accuracy compared to previous generation models," said Tom Delworth, a research scientist at NOAA's Geophysical Fluid Dynamics Laboratory in Princeton, N.J., who helped develop the new model and is co-author of the paper. "This model is a major step forward in our effort to improve the prediction of regional climate change, particularly involving water resources."

NOAA researchers conducted several climate simulations using this global climate model to study long-term changes in rainfall in various regions across the globe. One of the most striking signals of change emerged over Australia, where a long-term decline in fall and winter rainfall has been observed over parts of southern Australia. Simulating both natural and manmade climate drivers, scientists showed that the decline in rainfall is primarily a response to manmade increases in greenhouse gases, as well as a thinning of the ozone layer caused by manmade aerosol emissions. Several natural causes were tested with the model, including volcanic eruptions and changes in the sun's radiation. However, none of these natural drivers reproduced the long-term observed drying, indicating that this trend is due to human activity.

Southern Australia's decline in rainfall began around 1970 and has intensified over the last four decades. The model projects a continued decline in winter rainfall throughout the rest of the 21st century, with significant implications for regional water resources. The drying is most severe over the southwest, which is predicted to see a 40 per cent decline in average rainfall by the late 21st century.

"Predicting potential future changes in water resources, including drought, are an immense societal challenge," said Delworth. "This new climate model will help us more accurately and quickly provide resource planners with environmental intelligence at the regional level. The study of Australian drought helps to validate this new model, and thus builds confidence in this model for ongoing studies of North American drought."

The research is published this week in Nature Geoscience.

 

 

 

 

15th July 2014

World's first climate-controlled city planned for Dubai

Dubai is already known for its luxury tourist experience, super-tall skyscrapers and extravagant megaprojects. Now developers have announced it will host the world's first temperature-controlled city – incorporating the largest mall, largest domed park, cultural theatres and wellness resorts. Known as the "Mall of the World", this gigantic $7bn project will encompass 50 million square feet of floorspace, taking 10 years to construct.

Intended as a year-round destination, its capacity will be large enough to accommodate 180 million visitors each year in 100 hotels and serviced apartment buildings. Glass-roofed streets, modelled on New York's Broadway and London's Oxford Street, will stretch for 7 km (4.3 miles). These will be air-conditioned in summer as temperatures soar above 40°C, but the mall and its glass dome will be open to the elements during cooler winter months. Cars will be redundant in this "integrated pedestrian city."

 

mall of the world
Credit: Dubai Holding

 

"The project will follow the green and environmentally friendly guidelines of the Smart Dubai model," explained Ahmad bin Byat, the chief executive of Dubai Holding. "It will be built using state-of-the-art technology to reduce energy consumption and carbon footprint, ensuring high levels of environmental sustainability and operational efficiency."

In response to concerns about another real estate bubble, he insisted there was demand for such a project: "The way things are growing I think we are barely coping with the demand ... tourism is growing in Dubai," he said in an interview with Reuters. "This is a long-term project and we are betting strongly on Dubai."

Speaking at the launch of the mall, Sheikh Mohammed said: "The growth in family and retail tourism underpins the need to enhance Dubai's tourism infrastructure as soon as possible. This project complements our plans to transform Dubai into a cultural, tourist and economic hub for the 2 billion people living in the region around us – and we are determined to achieve our vision."

Mall of the World is one of several hi-tech, futuristic cities that could set the standard for eco-city designs in the coming decades. Others include China's car-free "Great City" (planned to be finished by 2020) and the Masdar City arcology (due in 2025).

 

 

 

 

 

14th July 2014

Largest ever study of its kind finds significant differences between organic and non-organic food

The largest ever study of its kind has found significant differences between organic food and conventionally-grown crops. Organic food contains up to 69% more antioxidants and significantly lower levels of toxic heavy metals.

 

organic and non organic food
Conventionally-grown potatoes on the left of the picture and organically grown potatoes on the right. Credit: Newcastle University

 

Analysing 343 studies into the differences between organic and conventional crops, an international team of experts led by Newcastle University, UK, found that a switch to eating organic fruit, vegetables and cereals – and food made from them – would provide additional antioxidants equivalent to eating between one and two extra portions of fruit and vegetables a day.

The study, published in the British Journal of Nutrition, also shows significantly lower levels of toxic heavy metals in organic crops. Cadmium – one of only three metal contaminants along with lead and mercury for which the European Commission has set maximum permitted contamination levels in food – was found to be almost 50% lower in organic crops than conventionally-grown ones.

Professor Carlo Leifert, who led the study, says: “This study demonstrates that choosing food produced according to organic standards can lead to increased intake of nutritionally desirable antioxidants and reduced exposure to toxic heavy metals. This constitutes an important addition to the information currently available to consumers which until now has been confusing and in many cases is conflicting.”

New methods used to analyse the data

This is the most extensive analysis of the nutrient content in organic vs conventionally-produced foods ever undertaken and is the result of a groundbreaking new systematic literature review and meta-analysis by the international team. 

The findings contradict those of a 2009 UK Food Standards Agency (FSA) commissioned study, which found there were no substantial differences or significant nutritional benefits from organic food. The FSA study based its conclusions on just 46 publications covering crops, meat and dairy, while the Newcastle-led meta-analysis is based on data from 343 peer-reviewed publications on the composition differences between organic and conventional crops.

“The main difference between the two studies is time,” explains Professor Leifert, who is Professor of Ecological Agriculture at Newcastle University. “Research in this area has been slow to take off the ground and we have far more data available to us now than five years ago.”

Dr Gavin Stewart, a Lecturer in Evidence Synthesis and the meta-analysis expert in the Newcastle team, added: “The much larger evidence base available in this synthesis allowed us to use more appropriate statistical methods to draw more definitive conclusions regarding the differences between organic and conventional crops.”

What the findings mean

The study, funded jointly by the European Framework 6 programme and the Sheepdrove Trust, found that concentrations of antioxidants such as polyphenolics were between 18% and 69% higher in organically-grown crops. Numerous studies have linked antioxidants to a reduced risk of chronic diseases, including cardiovascular and neurodegenerative diseases and certain cancers. Substantially lower concentrations of the toxic heavy metal cadmium were also detected in organic crops (on average 48% lower).

Nitrogen concentrations were also found to be significantly lower in organic crops: total nitrogen was 10% lower, nitrate 30% lower and nitrite 87% lower compared with conventional crops. The study also found that pesticide residues were four times more likely to be found in conventional crops than organic ones.
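
For a sense of how a meta-analysis arrives at a summary figure such as "on average 48% lower", the sketch below pools invented per-study percentage differences using inverse-variance weighting – a standard technique in evidence synthesis, though not necessarily the exact model the Newcastle team applied.

# Minimal meta-analysis sketch: pool per-study percentage differences
# (organic vs conventional) with inverse-variance weights. The study
# values are invented for illustration -- not the Newcastle data.

studies = [
    # (percentage difference, standard error) reported by each study
    (-52.0, 9.0),
    (-45.0, 12.0),
    (-40.0, 15.0),
]

weights = [1 / se ** 2 for _, se in studies]   # more precise studies count more
pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
print(f"pooled difference: {pooled:.1f}%")     # about -48% on these numbers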

 

tractor pesticides

 

Professor Charles Benbrook, one of the authors of the study and a leading scientist based at Washington State University, explains: “Our results are highly relevant and significant and will help both scientists and consumers sort through the often conflicting information currently available on the nutrient density of organic and conventional plant-based foods.”

Professor Leifert added: “The organic vs non-organic debate has rumbled on for decades now, but the evidence from this study is overwhelming – organic food is high in antioxidants and lower in toxic metals and pesticides.

“But this study should just be a starting point. We have shown without doubt there are composition differences between organic and conventional crops, now there is an urgent need to carry out well-controlled human dietary intervention and cohort studies specifically designed to identify and quantify the health impacts of switching to organic food.”

The authors of this study welcome the continued public and scientific debate on this important subject. The entire database generated and used for this analysis is freely available on the Newcastle University website for the benefit of other experts and interested members of the public.

 

 

 

 

13th July 2014

Interactive map shows city temperatures by 2100

A new interactive graphic and analysis released this week by research and journalism organisation Climate Central illustrates how much hotter summers will be in 1,001 U.S. cities by 2100, if current emissions trends continue, and shows which present-day cities those future summers will most feel like.

 

 

"Summer temperatures in most American cities are going to feel like summers now in Texas and Florida — very, very hot," comments Alyson Kenward, lead researcher of the analysis, which looked at projected changes in summer (June-July-August) high temperatures. On average, those temperatures will be 3.9 to 5.6°C (7-10°F) hotter, with some cities as much as 6.7°C (12°F) hotter by the end of the century.

Among the most striking examples featured in the interactive are:

• Boston, where average summer high temperatures will likely be more than 5.6°C (10°F) hotter than they are now, making it feel as steamy as North Miami Beach is today.

• Saint Paul, Minnesota, where summer highs are expected to rise by an average of 6.7°C (12°F), putting it on par with Mesquite, Texas.

• Memphis, where summer high temperatures could average a sizzling 37.8°C (100°F), typical of Laredo, Texas.

• Las Vegas, with summer highs projected to average a scorching 43.9°C (111°F), like summers today in Riyadh, Saudi Arabia.

• Phoenix, where summer high temperatures would average a sweltering 45.6°C (114°F), which will feel like Kuwait City.

This analysis only accounts for daytime summer heat — the hottest temperatures of the day, averaged over June, July and August — and doesn't incorporate humidity or dewpoint, both of which contribute to how uncomfortable summer heat can feel. Other impacts the map does not include are rising sea levels and a likely increase in storms and severe weather events.

Recent articles by Fox News and the Daily Telegraph claimed that scientists have been "tampering" with U.S. temperature data. For those who care about real science (as opposed to conspiracy theories), Skeptical Science has published a thorough debunking.

 

 

 

 

10th July 2014

Bee foraging chronically impaired by pesticide exposure

A study co-authored by a University of Guelph scientist, in which bumblebees were fitted with tiny radio frequency tags, shows that long-term exposure to a neonicotinoid pesticide hampers bees' ability to forage for pollen.

 

bumblebees with rfid tags
Bees fitted with RFID tags. Credit: Richard Gill

 

The research by Nigel Raine, a professor in Guelph’s School of Environmental Sciences, and Richard Gill of Imperial College London is published in the British Ecological Society’s journal Functional Ecology. The study shows how long-term pesticide exposure affects individual bees’ day-to-day behaviour, including pollen collection and which flowers the worker bees chose to visit.

“Bees have to learn many things about their environment, including how to collect pollen from flowers,” says Raine, who holds the Rebanks Family Chair in Pollinator Conservation. “Exposure to this neonicotinoid pesticide seems to prevent bees from being able to learn these essential skills.”

The researchers monitored bee activity using radio frequency identification (RFID) tags – seen in the photograph above – similar to those used by courier firms to track parcels. They tracked when individual bees left and returned to the colony, how much pollen they collected and from which flowers.

The bees from untreated colonies got better at collecting pollen as they learned to forage. However, bees exposed to neonicotinoid insecticides became less successful over time at collecting pollen. Neonicotinoid-treated colonies even sent out more foragers to try to compensate for lack of pollen from individual bees. Besides collecting less pollen, said Raine, “flower preferences of neonicotinoid-exposed bees were different to those of foraging bees from untreated colonies.”
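
To make the method concrete, here is a hypothetical sketch of the kind of per-trip records an RFID reader at a colony entrance produces, and how foraging success could be summarised by treatment group and week. Field names and values are invented – this is not the authors' actual data pipeline.

# Hypothetical per-trip records from an RFID reader at the colony
# entrance, summarised by treatment group and week. Invented data.

from dataclasses import dataclass
from statistics import mean

@dataclass
class Trip:
    bee_id: str
    treatment: str          # "control" or "neonicotinoid"
    week: int               # week of the four-week experiment
    pollen_load_mg: float   # pollen brought back on this trip

def mean_load(trips, treatment, week):
    """Mean pollen load per trip for one group in a given week."""
    loads = [t.pollen_load_mg for t in trips
             if t.treatment == treatment and t.week == week]
    return mean(loads) if loads else None

trips = [
    Trip("bee-01", "control", 1, 8.2),
    Trip("bee-01", "control", 4, 11.5),        # untreated bees improve...
    Trip("bee-07", "neonicotinoid", 1, 7.9),
    Trip("bee-07", "neonicotinoid", 4, 5.1),   # ...treated bees decline
]
print(mean_load(trips, "control", 4))          # 11.5
print(mean_load(trips, "neonicotinoid", 4))    # 5.1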

Raine and Gill studied the effects of two pesticides – imidacloprid, one of three neonicotinoid pesticides currently banned by the European Commission for use on crops attractive to bees, and a pyrethroid (lambda-cyhalothrin) – used both alone and together, on the behaviour of individual bumblebees from 40 colonies over four weeks.

“Although pesticide exposure has been implicated as a possible cause for bee decline, until now we had limited understanding of the risk these chemicals pose, especially how it affects natural foraging behaviour,” Raine said.

Neonicotinoids make up about 30 per cent of the global pesticide market. Plants grown from neonicotinoid-treated seed have the pesticide in all their tissues, including the nectar and pollen.

“If pesticides are affecting the normal behaviour of individual bees, this could have serious knock-on consequences for the growth and survival of colonies,” explained Raine.

He suggests reform of pesticide regulations, including adding bumblebees and solitary bees to risk assessments that currently cover only honeybees.

“Bumblebees may be much more sensitive to pesticide impacts as their colonies contain a few hundred workers at most, compared to tens of thousands in a honeybee colony,” he added.

 

 

 

 
     
       
     
   