The U.S. government has announced plans to expand two major marine sanctuaries off the coast of Northern California – home to a vast array of sea life.
Cordell Bank and Gulf of the Farallones national marine sanctuaries will both increase dramatically in size after a final ruling by the National Oceanic and Atmospheric Administration (NOAA). The expansion will help to protect the region's marine and coastal habitats, biological resources and rare ecological features.
Cordell Bank National Marine Sanctuary, located 42 miles north of San Francisco, will expand from 529 square miles to 1,286 square miles. Gulf of the Farallones National Marine Sanctuary will expand from 1,282 square miles to 3,295 square miles of ocean and coastal waters.
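As a quick sanity check on these figures (an illustrative calculation, not part of NOAA's announcement), each sanctuary grows to roughly two and a half times its original area:

```python
# Checking the expansion figures quoted above (areas in square miles).
cordell_before, cordell_after = 529, 1286
farallones_before, farallones_after = 1282, 3295

cordell_factor = cordell_after / cordell_before           # how many times larger Cordell Bank becomes
farallones_factor = farallones_after / farallones_before  # same for Gulf of the Farallones
combined_after = cordell_after + cordell_before * 0 + farallones_after  # total protected area after expansion

print(round(cordell_factor, 2), round(farallones_factor, 2), combined_after)
```

Together, the two expanded sanctuaries cover 4,581 square miles of ocean and coastal waters.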
"We are thrilled to announce the expansion of two of our sanctuaries in California," said Holly Bamford, NOAA's deputy administrator. "It's important to conserve these special places that encourage partnerships in science, education, technology, management and community."
The expansion is based on years of public comment and research by NOAA and its scientific partners that identified the nutrient-rich upwelling zone originating off Point Arena and flowing south into the original sanctuaries as one of the most productive in North America.
Cordell Bank and Gulf of the Farallones national marine sanctuaries represent globally significant, extraordinarily diverse, and productive marine ecosystems that encompass areas as varied as estuarine wetlands, rocky intertidal habitat, open ocean and shallow marine banks. They include areas of major upwelling where nutrients come to the surface and support a vast array of sea life – including 25 endangered or threatened species; 36 marine mammals such as blue, gray and humpback whales, harbour seals, elephant seals, Pacific white-sided dolphins, and one of the southernmost U.S. populations of Steller sea lions; over a quarter of a million breeding seabirds; and one of the most significant white shark populations on the planet.
"I'm glad the administration stepped up and used its authority as prior administrations, both Republican and Democrat, have done," said Jared Huffman, U.S. Representative for California's 2nd congressional district, in a San Francisco Chronicle interview. "The whole California coast has been in the crosshairs of oil and gas development for a long time."
"This expansion is the outcome of a tremendous collaborative effort by government, local communities, academia and elected officials to provide additional protection for critical marine resources," said Daniel J. Basta, director of the NOAA's Office of National Marine Sanctuaries. "It presents a bold vision for protecting the waters off the northern California coast for current and future generations."
The number of wild giant pandas has increased nearly 17% over the last decade, according to a new survey by the Chinese government.
New figures released by the Chinese government show that the global population of wild giant pandas has reached 1,864 – up from 1,596 when their numbers were last surveyed in 2003. A symbol of wildlife conservation, giant pandas are only found in China's Sichuan, Shaanxi and Gansu provinces.
“The rise in the population of wild giant pandas is a victory for conservation and definitely one to celebrate,” said Ginette Hemley, Senior Vice President of Wildlife Conservation, World Wildlife Fund (WWF).
“This increase in the population of wild giant pandas is a testament to the commitment made by the Chinese government for the last 30-plus years to wild panda conservation,” Hemley said. “WWF is grateful to have had the opportunity to partner with the Chinese government to contribute to panda conservation efforts.”
According to the Fourth National Giant Panda Survey, 1,246 wild giant pandas live within nature reserves, accounting for 66.8% of the total wild population and 53.8% of the total habitat area. There are currently 67 panda nature reserves in China – an increase of 27 since the last survey.
The report found the total area inhabited by wild giant pandas in China now equals 6,370,000 acres, an expansion of 11.8% since 2003.
Despite a positive trend in the number of wild giant pandas, the species still faces challenges. 46% of panda habitat and 33.2% of the population live outside of protected nature reserves. Habitat fragmentation – the separation of wildlife populations by physical barriers – is increasingly noticeable with about 12% of individuals facing higher risks to their survival.
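The survey's headline percentages are internally consistent; a quick check of the arithmetic (illustrative only, using the figures quoted above):

```python
# Cross-checking the panda survey figures quoted above.
pandas_2003, pandas_2014 = 1596, 1864  # wild population at the last two surveys
in_reserves = 1246                     # individuals living inside nature reserves

growth = (pandas_2014 - pandas_2003) / pandas_2003  # ≈ 0.168, i.e. "nearly 17%"
share_in_reserves = in_reserves / pandas_2014       # ≈ 0.668, the 66.8% figure
share_outside = 1 - share_in_reserves               # ≈ 0.332, the 33.2% outside reserves

print(f"{growth:.1%} growth; {share_in_reserves:.1%} inside reserves; {share_outside:.1%} outside")
```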
Though there appears to be a decline in traditional threats to pandas such as poaching, large-scale infrastructure projects like mining, hydro-power, and supporting roads and railroads are becoming more severe and were referenced in the survey for the first time.
WWF supports the government of China’s work by establishing panda nature reserves and a conservation network that integrates those reserves with forest farms and corridors of forest that allow pandas to find food and meet mates. The organisation’s work ensures the legal protection of a large percentage of panda habitat and an improvement in how conservation efforts are carried out. WWF was also involved in producing the survey.
Xiaohai Liu, Executive Program Director, WWF-China said, “The survey result demonstrates the effectiveness of nature reserves in boosting wild giant panda numbers.”
WWF's 2015-2025 giant panda conservation strategy sets the course for panda protection efforts over the next decade, and will focus on improving panda habitat in a manner that balances conservation with local sustainable development.
The Solar Impulse team have begun their epic journey around the world.
Solar Impulse – a long-range, solar-powered aircraft project – took off this morning from Abu Dhabi in the United Arab Emirates. It aims to become the first plane to fly around the world using only energy from the Sun.
On the plane's 35,000 km (21,747 miles) route, pilots Bertrand Piccard and Andre Borschberg will take turns in the cockpit as the aircraft makes its way eastwards from Abu Dhabi – stopping in cities including Muscat, Oman; Ahmedabad and Varanasi in India; Mandalay in Myanmar; Chongqing and Nanjing in China; and Hawaii, Phoenix and New York in the United States – before crossing the Atlantic on its way back to Abu Dhabi, where it is expected to arrive in mid-2015.
The most challenging leg of the journey will be a non-stop flight of five days and nights across the Pacific Ocean, from China to Hawaii. The plane, powered by 17,248 solar cells, will ascend to altitudes approaching 10,000 metres during the day, while fully charging its batteries to stay aloft throughout the night.
"When we speak of clean technologies for the world, it is not a dream, it is real," said Piccard, the Swiss aviation pioneer who was part of the first team to circle the earth in a balloon in 1999. His partner Borschberg sees "technology changing much faster than we could ever have imagined."
Two futuristic concept tires unveiled by Goodyear at this week’s Geneva International Motor Show could radically change the role of car tires in the future.
“BHO3” (left) and “Triple Tube” (right). Credit: Goodyear
Although the tires pictured here are only concept products at this stage, the technologies in their designs offer a glimpse of what practical innovations may be on the horizon.
The first concept – named “BHO3” – offers the possibility of charging the batteries of electric cars by transforming the heat generated by the rolling tire into useful electrical energy. The second concept – named “Triple Tube” – contains three tubes that adjust tire inflation pressure in response to changing road conditions, delivering new levels of performance and versatility.
“These concept tires reimagine the role that tires may play in the future,” said Joe Zekoski, Goodyear’s senior vice president and chief technical officer. “We envision a future in which our products become more integrated with the vehicle and the consumer, more environmentally friendly and more versatile.”
Additional details on the two concept tires:
The BHO3 tire generates electricity through the action of thermoelectric and piezoelectric materials that capture and transform the heat produced as the tire flexes while rolling under normal driving conditions. The materials used would optimise the tire’s electricity-generation capability as well as its rolling resistance.
As demand for electric cars grows, this technology has the potential to significantly contribute to the solution of future mobility challenges. This visionary tire technology could help to alleviate the vehicle-range anxiety motorists may have with electric cars.
The Triple Tube tire features three internal tubes, located beneath the tread near the inboard and outboard shoulders and at the centre of the tire. An internal pump moves air from the main air chamber into the three individual tubes, and the tire automatically adjusts to three different positions based on road conditions:
• The Eco/Safety position – with maximum inflation in all three tubes – offers reduced rolling resistance.
• The Sporty position – with reduced inflation within the inboard shoulder tube – gives drivers dry handling through an optimised contact patch.
• The Wet Traction position – with maximised inflation in the centre tube – provides high aquaplaning resistance through a raised tread in the centre of the tire.
Although these tires are future concepts, Zekoski says they represent an essential aspect of Goodyear’s innovation strategy, instilling a forward-looking, market-back mindset in the company’s research and development teams.
“It is more important than ever for us to stay firmly rooted in our market-back innovation process, which calls on us to focus on, and anticipate, the rapidly evolving needs of our customers,” said Zekoski.
Between 4.8 and 12.7 million metric tons of plastic entered the Earth's oceans in 2010 from people living within 50 km (31 mi) of a coastline, according to the results of a new study. Unless action is taken to improve recycling and waste disposal, the world faces an environmental crisis in the decades ahead.
Credit: Timothy Townsend
A new study published yesterday in the journal Science provides another stark reminder of the impact that humans and their global civilisation are having on the world's oceans. Researchers from the University of Georgia looked at data from 192 countries with a coastline (i.e. excluding land-locked nations). They found that during 2010, these nations produced a total of 2.5 billion metric tons of waste. Of that, slightly over 10% – about 275 million metric tons – was plastic. An estimated 4.8 to 12.7 million metric tons – 1.7% to 4.6% of that plastic – found its way into the oceans from the two billion people living within 50 km (31 mi) of a coast.
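The percentages quoted above can be reproduced directly from the study's headline figures (a quick illustrative check):

```python
# Reproducing the fractions quoted above (all masses in millions of metric tons).
total_waste = 2500                      # total waste from the 192 coastal countries, 2010
plastic = 275                           # the plastic portion of that waste
into_ocean_low, into_ocean_high = 4.8, 12.7  # estimated range entering the oceans

plastic_share = plastic / total_waste   # ≈ 0.11, i.e. "slightly over 10%"
low_frac = into_ocean_low / plastic     # ≈ 0.017 → the 1.7% lower bound
high_frac = into_ocean_high / plastic   # ≈ 0.046 → the 4.6% upper bound

print(f"{plastic_share:.0%} of waste is plastic; {low_frac:.1%}–{high_frac:.1%} of it reaches the ocean")
```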
The study's central estimate of eight million metric tons "is the equivalent to finding five grocery bags full of plastic on every foot of coastline," says Jenna Jambeck, assistant professor of environmental engineering and the paper's lead author.
"For the first time, we're estimating the amount of plastic that enters the oceans in a given year," said co-author Kara Lavender Law, research professor at the Massachusetts-based Sea Education Association. "Nobody has had a good sense of the size of that problem until now."
If trends continue, Jambeck forecasts that the cumulative input to the oceans will reach 155 million metric tons by 2025 – a year that is also predicted to see a crisis in landfill and urban waste. This problem is likely to become even worse in the century ahead, based on World Bank calculations: under a business-as-usual scenario, the world is not predicted to achieve "peak waste" until at least 2100, and these materials could remain in the biosphere until around 2600.
Unaltered stomach contents of a dead albatross, Midway Atoll National Wildlife Refuge in the Pacific. Photo by Chris Jordan (via U.S. Fish and Wildlife Service Headquarters) / CC BY 2.0
The plastic entering our oceans includes not only bags, bottles, six-pack rings and other packaging – but also microplastics, which are easy to swallow and therefore pose a danger to marine life of all sizes, from barnacles to whales. In addition, plastic pollution can serve as a carrier of invasive species and diseases, threatening native ecosystems. This ongoing disruption will contribute to many extinctions in the years and decades ahead.
Up to 245,000 tons of plastic may be floating at the surface, according to Jambeck and her team – a figure that corroborates a similar study published in December 2014, which put the number at 269,000 tons comprising a minimum of 5.2 trillion particles. Of particular concern is the North Pacific, where a swirling gyre of debris known as the Great Pacific garbage patch has accumulated, consisting mostly of plastic. Across this and four smaller gyres in other parts of the world, there is thought to be six times more plastic than zooplankton by dry weight.
These impacts are not limited to wildlife, but affect human society as well. Plastics and their accompanying toxic chemicals in the food chain are known to contribute to cancer, malformation and impaired reproductive ability. Fishing vessels and other boats are damaged by marine debris, while beaches require cleaning up. In total, plastic pollution causes over $13 billion in damages worldwide each year.
Credit: Lindsay Robinson/UGA
The main culprits in this global problem are the emerging nations of Asia and Africa, which have rapidly growing economies but have yet to implement sufficient regulations and waste management systems. By far the largest share of plastic waste originates from China, which contributes 3.5 million metric tons annually – compared to just 0.1 million for the US. After China, the next largest contributors are Indonesia (1.29m), the Philippines (0.75m), Vietnam (0.73m) and Sri Lanka (0.64m).
Plastic pollution in the ocean was first reported in the scientific literature in the early 1970s. In the 40 years since, there had been no rigorous estimates of the amount and origin of plastic debris making its way into the marine environment – until Jambeck's study. Part of the issue is that plastic is a relatively new problem coupled with a relatively new solution: plastic first appeared on the consumer market in the 1930s and '40s, while waste management didn't begin developing its current infrastructure in the US, Europe and parts of Asia until the mid-1970s.
"It is incredible how far we have come in environmental engineering, advancing recycling and waste management systems to protect human health and the environment, in a relatively short amount of time," says Jambeck. "However – these protections are unfortunately not available equally throughout the world."
One possible solution is the Ocean Cleanup Array designed by 21-year-old Boyan Slat. This concept would use a network of floating booms connected to processing stations that funnel plastic towards a central platform (attached to the booms) where a robotic system could sort plastic from plankton and store it for recycling. Following a year of research involving 100 volunteers and professionals, Slat's team announced the successful outcome of their feasibility study in June 2014. This is definitely a project to keep a close eye on.
Plant biologists report that drought tolerance in plants can be improved by engineering them to activate water-conserving processes in response to an agrochemical already in use – an approach that could be broadly applied to other parts of the same drought-response pathway and a range of other agrochemicals.
An experiment with non-modified (left) and modified plants (right), after water is withheld for 12 days, to simulate drought conditions. Photo credit: Sang-Youl Park.
Crops and other plants are constantly faced with adverse environmental conditions, such as rising temperatures (2014 was the warmest year on record) and lessening fresh water supplies, which lower yields and cost farmers billions of dollars annually.
Drought is a major environmental factor affecting plant growth and development. When plants encounter drought, they produce a stress hormone, abscisic acid (ABA), which inhibits plant growth and reduces water consumption. Specifically, the hormone binds to a receptor (a special protein) in the plant, like a hand fitting into a glove, switching it on and triggering beneficial changes – such as the closing of guard cells on leaves, called stomata, to reduce water loss – that help the plant survive.
While it is true that crops could be sprayed with ABA to assist their survival during drought, ABA is costly to make, rapidly inactivated inside plant cells and light-sensitive, and has therefore failed to find much direct use in agriculture. Several research groups are working to develop synthetic ABA mimics to modulate drought tolerance – but once discovered, these mimics are expected to face lengthy and costly development processes.
The agrochemical mandipropamid, however, is already widely used in agricultural production to control late blight of fruit and vegetable crops. Could drought-threatened crops be engineered to respond to mandipropamid as if it were ABA, and thus enhance their survival during drought?
Yes, according to a team of scientists, led by Sean Cutler at the University of California, Riverside.
The researchers worked with Arabidopsis, a model plant used widely in plant biology labs, and the tomato plant. In the lab, they used synthetic biological methods to develop a new version of these plants' abscisic acid receptors, engineered to be activated by mandipropamid instead of ABA. The researchers showed that when the reprogrammed plants were sprayed with mandipropamid, the plants effectively survived drought conditions by turning on the abscisic acid pathway, which closed the stomata on their leaves to prevent water loss.
This finding illustrates the power of synthetic biological approaches for manipulating crops and opens new doors for crop improvement that could benefit a growing world population.
"We successfully repurposed an agrochemical for a new application by genetically engineering a plant receptor – something that has not been done before," says Cutler, an associate professor of botany and plant sciences. "We anticipate that this strategy of reprogramming plant responses using synthetic biology will allow other agrochemicals to control other useful traits – such as disease resistance or growth rates, for example."
Cutler explained that discovering a new chemical and then having it evaluated and approved for use is an extremely involved and expensive process that can take many years.
"We have, in effect, circumvented this hurdle using synthetic biology – in essence, we took something that already works in the real world and reprogrammed the plant so that the chemical could control water use," he said.
The study results appear this week in the journal Nature.
A robot with advanced, non-invasive sensors and mobility could provide fast, accurate and objective data on the state of farms and vineyards.
French, German, Italian and Spanish universities and companies are developing an unmanned robot to assist with agriculture and wine production. Equipped with non-invasive advanced sensors and artificial intelligence systems, this machine will provide fast, reliable and objective information on the state of vineyards to grape growers – such as vegetative development, water status, production and grape composition.
The robot is part of the European project VineRobot, whose partners met recently at the Universitat Politècnica de València (UPV). The major advantages of this new technology are the large quantity of automatically obtained data – which any user can interpret easily, since it is presented on simple maps – and the wireless transmission of information from the smallholding.
"Robotics and precision agriculture provide producers with powerful tools in order to improve the competitiveness of their farms," says Javier Tardaguila, project manager and researcher at the University of La Rioja, Spain. "Robots like the one we are developing within this project will not substitute the vine grower, but will facilitate their work, sparing them the hardest tasks in the field. It has several advantages, including the ability to predict grape production or degree of ripeness, so that quality can be assessed immediately and without touching the fruit."
An additional benefit, explains Rovira, is the attractiveness of this new technology for young farmers, "as the high average age of farmers is a recurring matter of concern in industrialised countries."
During the project meeting held at the UPV, the researchers presented their first prototype, which they have been working on for a year. This includes a basic safety circuit with emergency switches and a bumper to stop the robot at any obstacle. The initial work has focused on two main areas: mobility in the field, improving the suspension and traction systems in order to climb up slopes with weeds; and the development of the various sensors.
The challenges for the next year are to give the robot enough autonomy to drive safely between vineyard rows using stereoscopic vision; to integrate a side camera providing information about the vegetative status of the plants and any visible bunches; and to couple the sensors to the robot.
The project will be completed in 2016, by which time a range of hi-tech machines are predicted to be appearing on farms, as this technology begins to enter the mainstream. In subsequent decades, the world faces a major challenge in terms of food and water production. Wine industries in particular will be severely affected by 2050, due to climate change. These fast, accurate and intelligent machines could go some way towards mitigating the impacts.
Long-term carbon sequestration is viewed as a way of mitigating climate change. It may be harder to achieve than previously thought, however, due to problems converting the gas to a solid state after injection underground, MIT reports.
Carbon sequestration promises to address human-made greenhouse-gas emissions by capturing carbon dioxide from the atmosphere and injecting it deep below the Earth’s surface, where it would permanently solidify into rock. The U.S. Environmental Protection Agency estimates that current sequestration technologies may eliminate up to 90 percent of carbon dioxide emissions from coal-fired power plants.
While such technologies may successfully remove greenhouse gases from the atmosphere, keeping them locked underground is another matter entirely. Researchers in the Department of Earth, Atmospheric and Planetary Sciences at MIT have found that once injected into the ground, less carbon dioxide is converted to rock than previously imagined. The team studied the chemical reactions between carbon dioxide and its surroundings once the gas is injected into the Earth – finding that as carbon dioxide works its way underground, only a small fraction turns to rock. The remaining gas stays in a more tenuous form.
“If it turns into rock, it’s stable and will remain there permanently,” says Yossi Cohen, a postdoctoral research associate. “However, if it stays in its gaseous or liquid phase, it remains mobile and it can possibly return back to the atmosphere.”
Current techniques aim to inject carbon dioxide into the subsurface some 7,000 feet below ground – a depth equivalent to five Empire State Buildings stacked one atop another. At such depths, carbon dioxide is stored in deep saline aquifers: large pockets of brine that can chemically react to solidify the carbon dioxide gas.
Cohen and Daniel Rothman, a professor of geophysics, sought to model the chemical reactions that occur after carbon dioxide is injected into a briny, rocky environment. When carbon dioxide is pumped into the ground, it rushes into open pockets within rock, displacing any existing fluid, such as brine. What remains are bubbles of carbon dioxide, along with carbon dioxide dissolved in water. The dissolved carbon dioxide takes the form of bicarbonate and carbonic acid, which create an acidic environment. To precipitate, or solidify into rock, carbon dioxide requires a basic environment, such as brine.
The researchers modelled the chemical reactions between two main regions:
• An acidic, low-pH region, with a high concentration of carbon dioxide
• A higher-pH region filled with brine, or salty water
As each carbonate species reacts differently when diffusing or flowing through water, the team characterised each reaction, then worked each one into a reactive diffusion model – a simulation of chemical reactions as carbon dioxide flows through a briny, rocky environment. When the team analysed the chemical reactions between regions rich in carbon dioxide and regions of brine, they found that the carbon dioxide solidifies – but only at the interface. The reaction essentially creates a solid wall at the point where carbon dioxide meets brine, keeping the bulk of the gas from reacting with the brine.
“This can basically close the channel, and no more material can move farther into the brine, because as soon as it touches the brine, it will become solid,” Cohen says. “The expectation was that most of the carbon dioxide would become solid mineral. Our work suggests that significantly less will precipitate.”
Cohen and Rothman point out that their theoretical predictions require experimental study to determine the magnitude of this effect.
“Experiments would help determine the kind of rock that would minimise this clogging phenomenon,” Cohen says. “There are many factors, such as the porosity and connectivity between pores in rocks, that will determine if and when carbon dioxide mineralises. Our study reveals new features of this problem that may help identify the optimal geologic formations for long-term sequestration.”
The Japan Meteorological Agency (JMA) has confirmed 2014 as the hottest year on record globally, surpassing the previous record of 1998.
What makes last year especially notable is that the new record occurred even without a significant El Niño – the phenomenon largely responsible for the enormous spike witnessed in 1998. Two other agencies – NASA, and the National Oceanic and Atmospheric Administration (NOAA) – are expected to make similar announcements as their data is released later this month. NASA and NOAA (which use different datasets to the JMA) both currently have 2010 tied with 2005 as the hottest year.
While much of the USA was unusually cold during 2014, almost everywhere else on land saw either warmer-than-average or record high temperatures. Particularly warm regions included Australia, Europe and Siberia. The Met Office announced yesterday that 2014 was the UK's hottest year on record. In Australia, heatwave temperatures exceeded 49°C (120°F) at the start of last year, with records broken across the continent for the second year running.
The overwhelming majority (99.9%) of published, peer-reviewed studies agree that human emissions of heat-trapping greenhouse gases such as CO2 are the main cause of recent warming. As of today, no scientific institute of national or international standing disputes this. The U.S. military is now deeply concerned about the geopolitical consequences of a warmer world. Meanwhile, the insurance industry has warned of mounting costs, with the number of climate-related disasters resulting in losses having tripled since 1980. A rise of 0.8°C (1.4°F) may sound like a small amount, but on a planetary scale it represents a huge quantity of energy – equivalent to four Hiroshima atomic bombs detonating every second.
As seen in the video below (released by NOAA last month), the atmospheric level of CO2 now stands at 400ppm and is forecast to hit 450ppm by 2030 – compared to around 280ppm prior to the Industrial Revolution. This rate of increase is unprecedented on a geologic timescale and is especially apparent in the chemistry of our oceans, which are now acidifying faster than at any time during the last 300 million years. World leaders are expected to agree a treaty on carbon emissions at the UN Climate Change Conference in Paris later this year – though judging by previous attempts, it is unlikely to be anything substantial.
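Those concentration figures imply a striking rate of increase. As a rough illustrative calculation (assuming the projected 50 ppm rise is spread over the 15 years from 2015 to 2030 – a simplification, since the real trajectory is not linear):

```python
# Rough rate of CO2 increase implied by the figures above (all in ppm).
preindustrial, current, projected = 280, 400, 450
years_to_projection = 2030 - 2015  # assumes a 2015 baseline for the forecast

rise_since_preindustrial = (current - preindustrial) / preindustrial  # ≈ 0.43, a 43% increase
implied_rate = (projected - current) / years_to_projection            # ≈ 3.3 ppm per year

print(f"{rise_since_preindustrial:.0%} above pre-industrial; ~{implied_rate:.1f} ppm/year to reach 450")
```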
Moscow State University has announced the creation of a DNA bank to store genetic samples from every living thing on Earth. This new facility, funded by the country's largest ever scientific grant, will be opened in 2018.
According to latest estimates, there are 8.7 million living species on Earth (excluding bacteria and other single-celled microorganisms). The total number may never be known for sure. New organisms are discovered while many others are disappearing on a regular basis – and most of those being lost are never scientifically documented. The first life, known as prokaryotes (simple cells), developed in ancient oceans 3.6 billion years ago. Since that time, it is believed that 99.9% of species have gone extinct. The rate of extinctions has accelerated dramatically in the last century, with experts now reporting a 1,000-fold increase compared to the natural "background" level seen in the fossil record. Up to half of presently existing plant and animal species may vanish by 2100, a disaster similar in scale to the K–Pg event that killed off the dinosaurs.
In order to preserve as much of what remains as possible, Russia intends to build a gigantic facility – 430 km² (166 mi²) in size – located at one of the central campuses of Moscow State University (MSU). Costing 1 billion rubles (US$18 million) and funded by the country's largest ever scientific grant, it will serve as a repository for millions of genetic samples. All of the university's departments will be involved in research and the gathering of materials when the project begins in 2018. Collaboration with other facilities, both in Russia and internationally, is also being considered.
"I call the project 'Noah's Ark.' It will involve the creation of a depository – a databank for the storing of every living thing on Earth, including not only living but disappearing and extinct organisms. This is the challenge we have set for ourselves," says Viktor Sadovnichy, MSU rector. "It will enable us to cryogenically freeze and store various cellular materials, which can then reproduce. It will also contain information systems. Not everything needs to be kept in a petri dish."
Given the sheer numbers involved, it could be many decades before samples are retrieved from a significant majority of organisms – and even longer before a global rewilding effort takes shape – but this Noah's Ark of DNA is an important step towards that eventual long-term goal. Other countries have attempted similar projects in recent years, such as Norway's Global Seed Vault and Britain's Frozen Ark. Perhaps in a few centuries, these same efforts will be conducted on alien planets.
If present rates of degradation continue, all of the world's topsoil could be lost by 2075, according to a senior UN official.
Topsoil is the layer of soil that contains the greatest concentration of nutrients, organic matter and microorganisms. It is vital for maintaining a healthy root base and plant growth, enabling farmers to till and produce their food crops. To generate only 3 cm (1.2 in) of topsoil requires between 500 and 1,000 years through natural processes.
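To put that formation rate in perspective (an illustrative calculation, not part of the UN statement), 3 cm per 500–1,000 years amounts to at most six hundredths of a millimetre of new topsoil per year:

```python
# Natural topsoil formation rate implied by the figures above.
depth_mm = 30                      # 3 cm of topsoil, expressed in millimetres
years_low, years_high = 500, 1000  # the quoted range of time required

rate_fast = depth_mm / years_low   # 0.06 mm/year at the optimistic end
rate_slow = depth_mm / years_high  # 0.03 mm/year at the pessimistic end

print(f"{rate_slow}–{rate_fast} mm of new topsoil per year")
```

Against losses driven by ploughing, chemicals and deforestation, replacement at that pace is effectively negligible on human timescales.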
Modern agricultural techniques – requiring the soil to be ploughed and replanted each year, together with heavy use of chemicals – have resulted in the gradual erosion of topsoil. Deforestation and global warming also play a role. Worldwide, approximately one-third of topsoil has been lost since the Industrial Revolution. This is already harming the livelihoods of a billion people. If trends continue, all of the world's topsoil will be gone within 60 years, according to a United Nations statement on World Soil Day.
"Soils are the basis of life," said Maria-Helena Semedo, deputy director general of natural resources at the Food and Agriculture Organization (FAO). "Ninety-five percent of our food comes from the soil."
With global population expected to surpass 9 billion by mid-century – along with huge increases in biofuel production – the amount of arable and productive land per person in 2050 will be only a quarter of the level in 1960, unless a radical transformation of agriculture occurs. In addition to providing food, soils play an essential role in the carbon cycle and water filtration. Soil destruction creates a vicious cycle, in which less carbon is stored, the world gets hotter, and the land is further degraded.
While the rate of degradation is not the same everywhere, "we are losing 30 soccer fields of soil every minute – mostly due to intensive farming," according to Volkert Engelsman, an activist with the International Federation of Organic Agriculture Movements, who spoke to the forum at the FAO's headquarters in Rome.
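For a rough sense of scale, the 30-soccer-fields-a-minute figure quoted above can be annualised with some quick arithmetic. This is a back-of-envelope sketch only; the pitch size is our own assumption (the FIFA-recommended 105 m × 68 m), not a figure from the FAO.

```python
# Annualising "30 soccer fields of soil every minute".
# Assumed pitch size: 105 m x 68 m (FIFA-recommended dimensions).
field_area_ha = 105 * 68 / 10_000     # ~0.71 hectares per pitch
fields_per_minute = 30
minutes_per_year = 60 * 24 * 365

lost_ha_per_year = fields_per_minute * field_area_ha * minutes_per_year
print(f"~{lost_ha_per_year / 1e6:.1f} million hectares of soil lost per year")
# prints "~11.3 million hectares of soil lost per year"
```

Even under this rough assumption, the quoted rate amounts to losing an area larger than Iceland every year.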
Organic farming can reduce toxic chemicals and carbon emissions, but requires more land. Vertical greenhouses can work for some crops, but their economic and technical feasibility have yet to be fully proven. In the 2030s, perennial wheat and corn could enable crops to be grown continuously for two or more years – offering a huge improvement over traditional annual crops. Hi-tech solutions might also emerge in the form of 3D-printed or laboratory-grown meat. In the more distant future, humans could upgrade their bodies to become partially or fully non-biological, drawing energy from sources other than food. Whatever solutions are eventually developed, this announcement from the UN is a sobering reminder of just how rapidly our world is changing.
A new solar cell efficiency record of 46% has been achieved by a French-German collaboration.
A new world record for the direct conversion of sunlight into electricity has been established. The multi-junction solar cell converts 46% of the solar light into electrical energy. It was developed in a joint collaboration between Soitec and CEA-Leti (France), together with the Fraunhofer Institute for Solar Energy Systems ISE (Germany).
Multi-junction cells are used in concentrator photovoltaic (CPV) systems to produce low-cost electricity in photovoltaic power plants, in regions with a large amount of direct solar radiation. It is the group's second world record in the last year – after the one previously announced in September 2013 – and clearly demonstrates the strong competitiveness of European photovoltaic research.
Multi-junction solar cells are based on a selection of III-V compound semiconductor materials. The world record cell is a four-junction cell, and each of its sub-cells converts precisely one quarter of the incoming photons in the wavelength range between 300 and 1750 nm into electricity. When applied in concentrator PV, a very small cell is used with a Fresnel lens, which concentrates the sunlight onto the cell. The new record efficiency was measured at a concentration of 508 suns and has been confirmed by the Japanese AIST (National Institute of Advanced Industrial Science and Technology), one of the leading centres for independent verification of solar cell performance under standard testing conditions.
A special challenge that had to be met by this cell was the exact distribution of photons among the four sub-cells. It was achieved by precise tuning of the composition and thicknesses of each layer inside the structure. “This is a major milestone for our French-German collaboration. We are extremely pleased to hear that our result of 46% efficiency has now been independently confirmed by AIST in Japan,” explains Dr. Frank Dimroth, project manager for the cell development at the German Fraunhofer Institute. “CPV is the most efficient solar technology today and suitable for all countries with high direct normal irradiance.”
Jocelyne Wasselin, Vice President at Soitec in France: “We are very proud of this new world record. It confirms we made the right technology choice when we decided to develop this four-junction solar cell and clearly indicates that we can demonstrate 50% efficiency in the near future. To produce this new generation of solar cells, we have already installed a line in France. It uses our bonding and layer-transfer technologies and already employs more than 25 engineers and technicians. I have no doubt that this successful cooperation with our French and German partners will drive further increase of CPV technology efficiency and competitiveness.”
Japanese engineering firm, Shimizu Corp, has announced plans for "Ocean Spiral", an underwater city that would form a nine mile (15 km) structure plunging down to the sea floor. Costing three trillion yen ($25 billion), it would feature residential, hotel and business zones at its top, with resource development facilities at its base to harvest rare earth metals and minerals. Electrical power could be generated by exploiting the wide differences in water temperature between the top and bottom of the ocean. Construction would be achieved with industrial-scale 3D printers using resin components instead of concrete. Shimizu believes the technology required for this project could be available by 2030. The company has been behind a number of previous futuristic concepts, including a "Luna Ring" of solar panels going around the Moon and a floating botanical city that could absorb CO2.
“We had this in Japan in the 1980s when the same corporations were proposing underground and ‘swimming’ cities and 1 kilometre-high towers as part of the rush to development during the height of the bubble economy," says Christian Dimmer, assistant professor in urban studies at the University of Tokyo. “It’s good that many creative minds are picking their brains as to how to deal with climate change, rising sea levels and the creation of resilient societies – but I hope we don’t forget to think about more open and democratic urban futures in which citizens can take an active role in their creation, rather than being mere passengers in a corporation’s sealed vision of utopia.”
Global warming will cause lightning strikes in the U.S. to increase 50% by 2100, according to a study by the University of California (UC).
New climate models predict a 50 percent increase in lightning strikes across the United States during this century as a result of warming temperatures associated with climate change.
Reporting in the peer-reviewed journal Science, UC Berkeley climate scientist David Romps and his colleagues look at predictions of precipitation and cloud buoyancy in 11 different climate models and conclude that their combined effect will generate more frequent electrical discharges to the ground.
“With warming, thunderstorms become more explosive,” says Romps, an assistant professor of earth and planetary science and a faculty scientist at Lawrence Berkeley National Laboratory. “This has to do with water vapour, which is the fuel for explosive deep convection in the atmosphere. Warming causes there to be more water vapour in the atmosphere – and if you have more fuel lying around, when you get ignition, it can go big time.”
More lightning strikes mean more human injuries; estimates of people struck each year range from hundreds to nearly a thousand, with many deaths. But another significant impact of increased lightning strikes would be more wildfires, since half of all fires – and often the hardest to fight – are ignited by lightning, Romps said. More lightning would also generate more nitrogen oxides in the atmosphere, exerting a strong control on atmospheric chemistry.
While some studies have shown changes in lightning associated with seasonal or year-to-year variations in temperature, there have been no reliable analyses to indicate what the future may hold. Romps and graduate student Jacob Seeley hypothesised that two atmospheric properties — precipitation and cloud buoyancy — together might be a predictor of lightning, and looked at observations during 2011 to see if there was a correlation.
“Lightning is caused by charge separation within clouds, and to maximise charge separation, you have to loft more water vapour and heavy ice particles into the atmosphere,” he said. “We already know that the faster the updrafts, the more lightning, and the more precipitation, the more lightning.”
Precipitation – the total amount of water hitting the ground in the form of rain, snow, hail or other forms – is basically a measure of how convective the atmosphere is, and convection generates lightning. The ascent speeds of those convective clouds are determined by a factor called CAPE — convective available potential energy — which is measured by balloon-borne instruments, called radiosondes, released around the United States twice a day.
“CAPE is a measure of how potentially explosive the atmosphere is – that is, how buoyant a parcel of air would be if you got it convecting, if you got it to punch through overlying air into the free troposphere,” Romps said. “We hypothesised that the product of precipitation and CAPE would predict lightning.”
Using U.S. Weather Service data on precipitation, radiosonde measurements of CAPE and lightning-strike counts from the National Lightning Detection Network at the University of Albany, State University of New York (UAlbany), they concluded that 77 percent of the variations in lightning strikes could be predicted from knowing just these two parameters.
“We were blown away by how incredibly well that worked to predict lightning strikes,” he said.
The intensity of lightning flashes averaged over the year in the lower 48 states during 2011. Data from NLDN.
They then looked at 11 different climate models that predict precipitation and CAPE through this century and are archived in the most recent Coupled Model Intercomparison Project (CMIP5). CMIP was established as a resource for climate scientists, providing a repository of output from global climate models that can be used for comparison and validation.
“With CMIP5, we now have for the first time the CAPE and precipitation data to calculate these time series,” Romps said.
On average, the models predicted an 11 percent increase in CAPE in the U.S. per degree Celsius rise in global average temperature by the end of the 21st century. Because the models predict little average precipitation increase nationwide over this period, the product of CAPE and precipitation gives about a 12 percent rise in cloud-to-ground lightning strikes per degree in the contiguous U.S., or a roughly 50 percent increase by 2100 if Earth sees the expected 4-degree Celsius increase (7 degrees Fahrenheit) in temperature.
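The arithmetic behind that headline figure can be sketched in a few lines. The CAPE and warming numbers are taken from the article; the small precipitation term is an assumption chosen to reproduce the quoted ~12 percent per degree, and the per-degree increases are treated as roughly additive, as the article's numbers imply.

```python
# Back-of-envelope version of the lightning scaling described above.
cape_increase_per_degC = 0.11      # models: ~11% more CAPE per degree C
precip_increase_per_degC = 0.01    # assumption: "little average precipitation increase"
warming_degC = 4.0                 # expected warming by 2100

# The study's proxy: lightning scales with the product of CAPE and precipitation.
per_degree = (1 + cape_increase_per_degC) * (1 + precip_increase_per_degC) - 1
total = per_degree * warming_degC  # roughly linear scaling over 4 degrees

print(f"~{per_degree:.0%} more strikes per degree C, ~{total:.0%} by 2100")
# prints "~12% more strikes per degree C, ~48% by 2100"
```

The ~48 percent result matches the "roughly 50 percent" increase reported by Romps and colleagues.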
Exactly why CAPE increases as the climate warms is still an area of active research, Romps said, though it is clear that it has to do with the fundamental physics of water. Warm air typically contains more water vapour than cold air; in fact, the amount of water vapour that air can “hold” increases exponentially with temperature. Since water vapour is the fuel for thunderstorms, lightning rates can depend very sensitively on temperature.
In the future, Romps plans to look at the distribution of lightning-strike increases around the U.S. and also explore what lightning data can tell climatologists about atmospheric convection.
A new global temperature record for October has been set, according to data from the Japan Meteorological Agency (JMA).
Globally, last month was the hottest October on record – by far – according to data just released by the Japan Meteorological Agency (JMA). This follows the hottest March–May, June, August and September, also recorded this year. Near-surface land and sea surface temperatures were 0.67°C (1.2°F) higher than the 20th century average. Despite oft-repeated claims of a "pause", it seems increasingly likely that 2014 is on course to be the all-time hottest year since the JMA began record-keeping in 1891. Data from the National Oceanic and Atmospheric Administration (NOAA) – the U.S. equivalent of Japan's agency – presents a similar trend, with October 2013 to September 2014 being the warmest 12-month period among all months since 1880. These records have occurred even without the latest El Niño, which has yet to begin, meaning that 2015 could be even hotter.
The Intergovernmental Panel on Climate Change (IPCC) has just released the final part of its Fifth Assessment Report. This further discusses the future impacts of climate change and – it is hoped – will pave the way for a global, legally binding treaty on carbon emissions at the UN Climate Change Conference in Paris during late 2015. This week in Beijing, Chinese President Xi Jinping met with Barack Obama to announce a "historic" agreement that would see U.S. emissions fall 26%-28% below 2005 levels by 2025, while China's would peak by 2030. By announcing these targets now, they hope to inject momentum into the global climate negotiations and inspire other countries to join in coming forward with ambitious actions as soon as possible, preferably before the first quarter of 2015. The two Presidents resolved to work closely together over the next year to address major impediments to reaching a successful treaty in Paris. UN climate chief, Christiana Figueres, said: "These two crucial countries have today announced important pathways towards a better and more secure future for humankind."
Unfortunately for Barack Obama, the U.S. midterm election was a disaster for the Democrats. They will now lose control of the Senate, for the first time since January 2007, with Republicans also increasing their majority in the House. The incoming Senate Majority Leader, Mitch McConnell, stated that his top priority is to "get the EPA reined in" and to dismantle the new emissions rules for coal power plants. In a related development, the controversial Keystone XL pipeline was approved by the House of Representatives yesterday in a 252-161 vote. This 875-mile (1,408 km) pipeline will carry tar sands oil from Alberta, Canada, to the U.S. state of Nebraska, where it joins pipes running down to Texas. While creating only 35 permanent jobs, it will transport 51 coal plants' worth of CO2 and do nothing to lower U.S. gas prices.
Meanwhile, the G20 summit now underway in Brisbane, Australia, has seen hundreds of people staging a "head in the sand" protest over the lack of discussions on climate change. Australian Prime Minister, Tony Abbott, recently declared that "coal is good for humanity" while opening a new coal plant and expressing his belief that "the trajectory should be up and up and up in the years and decades to come ... The future for coal is bright."
A new report from the Overseas Development Institute (ODI) and Oil Change International highlights the fact that G20 governments are now spending almost £56bn ($90bn) a year on finding new oil, gas and coal reserves. This is despite clear evidence that two-thirds of fossil fuels must be left in the ground to avoid tipping the world into a climate catastrophe. Phasing out these perverse subsidies may form a crucial part of the negotiations at the Paris conference in 2015.
The science of global warming is clearer than ever. Back in April, a report by McGill University concluded "with confidence levels greater than 99% and most likely greater than 99.9%" that recent warming is not caused by natural factors but is man-made. A new generation of supercomputers – able to crunch hundreds of terabytes' worth of data – has led to what one researcher calls "a golden age for high-resolution climate modelling" with accurate simulations of intense weather and climate events. These models will only get better in the years ahead. On current trends, it should be possible to achieve resolutions down to a square metre by 2030. And yet, even without these models or the IPCC, we know the problem is real.
A project creating the first solar-powered bicycle path will be officially opened in the Netherlands next week. If successful, it could be applied to 20% of the country's roads in the future.
Developed by the Netherlands' TNO research institute, SolaRoad is the first road in the world that converts sunlight into electricity. The pilot project of just a hundred metres will be used as a bike path and consists of concrete modules each measuring 2.5 by 3.5 metres. Solar cells are fitted in one travelling direction underneath an extremely strong top layer of glass with a dirt and abrasion-resistant coating about 1 cm thick.
There are no solar cells on the other side of the road; this lane is used instead to test various top layers. In time, energy generated from the road will be used for practical applications in street lighting, traffic systems, electric cars (which will drive on the surface) and households. This first section of SolaRoad is located in Krommenie, along the provincial road N203, next to the Texaco garage on the railway track side.
For a three-year period, various measurements will be taken and tests performed to enable SolaRoad to undergo further development. The tests must answer questions such as: How does it behave in practice? How much energy does it produce? What is it like to cycle over? In the run-up to the surface being laid, laboratory tests were conducted to ensure all safety and other requirements were met. The modules were found to successfully carry the weight of heavy vehicles such as tractors, though how they respond to longer term wear and tear remains to be seen.
A spokesperson for the project, Sten de Wit, claims that up to 20% of the Netherlands' 140,000 km (87,000 miles) of road could potentially be adapted. The pilot road will be officially opened on 12th November by Dutch Minister of Economic Affairs, Henk Kamp.
A similar concept – Solar Roadways – is being developed in the US, though its technical and financial viability seems to have come under a lot of criticism in the blogosphere and elsewhere. Perhaps this Dutch effort can prove to be more successful.
A new multi-scenario modelling of world human population concludes that even draconian fertility restrictions or a catastrophic mass mortality won't be enough to solve issues of global sustainability by 2100.
Published today in the Proceedings of the National Academy of Sciences of the USA, ecologists Professor Corey Bradshaw and Professor Barry Brook from the University of Adelaide's Environment Institute say that our "virtually locked-in" population growth means the world must focus on policies and technologies that reverse rising consumption of natural resources and enhance recycling, for more immediate sustainability gains.
Fertility reduction efforts, however, through increased family-planning assistance and education, should still be pursued, as this will lead to hundreds of millions fewer people to feed by mid-century.
"Global population has risen so fast over the past century that roughly 14% of all the human beings that have ever existed are still alive today. That's a sobering statistic," says Professor Bradshaw, Director of Ecological Modelling. "This is considered unsustainable for a range of reasons – not least being able to feed everyone, as well as the impact on the climate and environment.
"We examined various scenarios for global human population change to the year 2100 by adjusting fertility and mortality rates to determine the plausible range of population sizes at the end of this century. Even a worldwide one-child policy like China's, implemented over the coming century, or catastrophic mortality events like global conflict or a disease pandemic, would still likely result in 5-10 billion people by 2100."
The team constructed nine different scenarios for continuing population, ranging from "business as usual" through various fertility reductions, to highly unlikely broad-scale catastrophes resulting in billions of deaths.
"We were surprised that a five-year WWIII scenario – mimicking the same proportion of people killed in the First and Second World Wars combined – barely registered a blip on the human population trajectory this century," says Professor Barry Brook.
"Often when I give public lectures about policies to address global change, someone will claim that we are ignoring the 'elephant in the room' of human population size. Yet, as our models show clearly, while there needs to be more policy discussion on this issue, the current inexorable momentum of the global human population precludes any demographic 'quick fixes' to our sustainability problems.
"Our work reveals that effective family planning and reproduction education worldwide have great potential to constrain the size of the human population and alleviate pressure on resource availability over the longer term. Our great-great-great-great grandchildren might ultimately benefit from such planning, but people alive today will not.
"The corollary of these findings is that society's efforts towards sustainability would be directed more productively towards reducing our impact as much as possible through technological and social innovation."
A new method of producing solar cells could reduce the amount of silicon per unit area by 90 per cent compared to the current standard. With the high prices of pure silicon, this could help cut the cost of solar power.
Researchers at the Norwegian University of Science and Technology (NTNU) have pioneered a new approach to manufacturing solar cells that requires less silicon and can accommodate silicon 1,000 times less pure than is currently the standard. This breakthrough means that solar cells could be made much more cheaply than at present.
“We're using less expensive raw materials, and smaller amounts of them, we have fewer production steps and our total energy consumption is potentially lower,” explain PhD candidate Fredrik Martinsen and Professor Ursula Gibson, from NTNU's Department of Physics.
The researchers’ solar cells are composed of silicon fibres coated in glass. A silicon core is inserted into a glass tube about 30 mm in diameter. This is then heated so that the silicon melts and the glass softens. The tube is stretched out into a thin glass fibre filled with silicon. The process of heating and stretching makes the fibre up to 100 times thinner.
This is the widely accepted industrial method used to produce fibre optic cables. But the NTNU researchers – in collaboration with Clemson University in the USA – are the first to use silicon-core fibres made this way in solar cells. The active part of these solar cells is the silicon core, with a diameter of about 100 micrometres.
This production method also enabled them to solve another problem: traditional solar cells require very pure silicon. Manufacturing pure silicon wafers is laborious, energy intensive and expensive. Using their new process, it takes only one-third of the energy to manufacture solar cells compared to the traditional approach of producing silicon wafers.
“We can use relatively dirty silicon – and the purification occurs naturally as part of the process of melting and re-solidifying in fibre form. This means that you save energy, and several steps in production,” says Gibson.
These new solar cells are based on the vertical rod radial-junction design, a relatively new approach.
“The vertical rod design still isn’t common in commercial use. Currently, silicon rods are produced using advanced and expensive nano-techniques that are difficult to scale,” says Martinsen. “But we’re using a tried-and-true industrial bulk process, which can make production a lot cheaper.”
The power produced by these prototype cells is not yet up to commercial standards. The efficiency of modern solar cells is typically about 20 per cent, while NTNU's version has so far managed only 3.6 per cent. However, Martinsen claims their work has great potential for improvement – so this new production method is something we might see appearing in future decades, as nanotechnology continues to advance.
“These are the first solar cells produced this way, using impure silicon. So it isn’t surprising that the power output isn’t very high. It’s a little unfair to compare our method to conventional solar cells, which have had 40 years to fine-tune the entire production process. We’ve had a steep learning curve, but not all the steps of our process are fully developed yet. We’re the first to show that you can make solar cells this way. The results are published and the process is set in motion.”
Globally, 2014 is on track for the hottest year ever. September 2014 was the hottest September on record, after the hottest August, which was part of the hottest summer on record. The past 12 months — October 2013–September 2014 — were the warmest 12-month period among all months since records began in 1880.
The combined average temperature over global land and ocean surfaces for September 2014 was the highest on record for September, at 0.72°C (1.30°F) above the 20th century average of 15.0°C (59.0°F).
The global land surface temperature was 0.89°C (1.60°F) above the 20th century average of 12.0°C (53.6°F), the sixth highest for September on record. For the ocean, the September global sea surface temperature was 0.66°C (1.19°F) above the 20th century average of 16.2°C (61.1°F), the highest on record for September and also the highest on record for any month.
The combined global land and ocean average surface temperature for the January–September period (year-to-date) was 0.68°C (1.22°F) above the 20th century average of 14.1°C (57.5°F), tying with 1998 as the warmest such period on record.
Last month, Britain had its driest September since national records began in 1910, with just 20% of the average rainfall for the month. Besides breaking the record itself, this rainfall deficit is especially notable as the preceding eight-month period (January–August) was the wettest such period on record. Meanwhile, 30.6% of the contiguous USA was in drought, with conditions worsening in many regions. Nearly 100% of California and Nevada were in "moderate-to-exceptional" drought.
If 2014 maintains its current trend for the remainder of the year, it will be the warmest calendar year on record, says NOAA. The agency's findings are in strong agreement with both NASA and the JMA, who both reported a record warm September earlier this month too. It also seems quite likely that we'll see an El Niño event during the winter, which could send global temperature anomalies even higher.
The world’s first commercial-scale carbon capture and storage (CCS) process on a coal-fired power plant has been officially opened at Canada's Boundary Dam Power Station. This $1.4 billion project will cut CO2 emissions from the plant by 90% and sulphur dioxide emissions by 100%.
Electric utility company SaskPower’s new process involves retrofitting an old 110-megawatt (MW) coal-fired plant (that was first commissioned in 1959), adding solvent-based processors to strip away carbon dioxide, and then piping the CO2 to a nearby oil field. When fully optimised, it will capture up to a million tonnes of carbon dioxide annually, the equivalent of taking 250,000 cars off the road. The power unit equipped with CCS technology will continue to use coal to power approximately 100,000 homes and businesses in Saskatchewan, near the Canada-U.S. border. The captured CO2 will be used for enhanced oil recovery, with the remainder stored safely and permanently deep underground and continuously monitored.
The Canadian federal government paid $240 million towards the project. The launch was attended by more than 250 people from over 20 countries representing governments, industries and media. Attendees at the event toured the facility and learned how they can access SaskPower’s expertise and knowledge to develop their own CCS initiatives.
“This project is important because it is applicable to 95% of the world’s coal plants,” said Bill Boyd, Saskatchewan Minister of the Economy. “As nations develop emission regulations, they will come to us to see how we continue to provide affordable coal power to customers, but in an environmentally sustainable way.”
This follows news last month of a similar project being developed in Jacksonville, Illinois, USA. The Environmental Protection Agency (EPA) approved permits allowing the FutureGen Industrial Alliance to capture and store CO2 deep underground – the first project of its kind in the U.S.
“The opening of this new SaskPower plant reinforces the great innovation and development that can take place if you have strong investment and partnerships from the government and industry,” said U.S. Senator Heidi Heitkamp (D-ND). “From my more than a decade working at Dakota Gasification in North Dakota, and from visiting the construction of the SaskPower facility just over a year ago, I understand just how important it is that we look to the future in how we harness our energy. Coal is a key resource in both Canada and the U.S., and through the development of clean coal technology, we can create North American independence and energy security, while also reducing emissions. We need to develop more clean coal plants to make that possible, and in the U.S., we can learn from the steps Canada has taken to find a realistic path forward for coal.”
The economics of CCS are still a major issue, however. At present, SaskPower's project is expensive and depends on having a nearby source of coal alongside an additional revenue stream from the enhanced oil recovery. Environmentalists have also continued to express concerns.
“At the end of the day, many people are going to wonder why SaskPower is investing $1.4 billion in 'clean coal' technology instead of wind, solar or geothermal energy,” said Victor Lau, Saskatchewan Greens Leader. “Our party will be monitoring future developments of this project very carefully.”
This week, Lockheed Martin announced plans for a small-scale fusion power plant to be developed in as little as 10 years. A number of experts have expressed doubts over its viability.
If it ever became a reality, fusion power would be truly world-altering – a clean, safe and essentially limitless supply of energy allowing humanity's continued survival for centuries and millennia to come. The international project known as ITER is planned for operation in 2022 and its eventual successor may emerge in the 2040s. Widespread deployment of fusion is not expected until 2070.
U.S. defence giant Lockheed Martin hopes to accelerate progress in this area, by developing what it calls a compact fusion reactor (CFR). This would be around 10 times smaller than conventional tokamak designs, small enough to fit on the back of a truck and generating 100 megawatts (MW) of power. The company intends to build a prototype within five years – according to its press release – with commercial introduction five years after that. It has several patents pending for the work and is looking for partners in academia, industry and among government laboratories.
The main improvement over ITER would be the use of a superconducting torus to create a differently shaped magnetic field, able to contain plasma far better than previous configurations. These small reactors could be fitted in U.S. Navy warships and submarines while eliminating the need for other fuel types. They could power small cities of up to 100,000 people, allow planes to fly with unlimited range, or even be used in spacecraft to cut journey times to Mars from six months to a single month. Using a CFR, the cost of desalinated water could fall by 60 percent.
If this sounds too good to be true, it may well be. Although Lockheed has been successful in its magnetised ion confinement experiments, a number of significant challenges remain for a working prototype with plasma confinement – let alone a commercialised version.
"I think it's very overplayed," University of California nuclear engineering professor Dr. Edward Morse told The Register. "They are being very cagey about divulging details."
"Getting net energy from fusion is such a goddamn difficult undertaking," said University of Texas physicist Dr. Swadesh M. Mahajan, in an interview with Mother Jones. "We know of no materials that would be able to handle anywhere near that amount of heat."
"The nuclear engineering clearly fails to be cost effective," Tom Jarboe told Business Insider in an email.
For these reasons, it is perhaps best to wait for more news and developments before adding the CFR to our timeline. We will, of course, keep you updated on Lockheed's progress as it emerges. You can also discuss this project on our forum.
Generating electricity from onshore wind is cheaper than gas, coal and nuclear when externalities are stacked with the levelised cost of energy and subsidies, according to a new study ordered and endorsed by the European Commission.
A new report by the energy consultancy firm Ecofys has been analysed by the European Wind Energy Association (EWEA). Data in the report shows that onshore wind now has an approximate cost of €105 per megawatt hour (MWh), which is cheaper than gas (up to €164), nuclear (€133) and coal (between €162 and €233). Offshore wind comes in at €186 and solar PV at around €217 per MWh.
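To make the report's comparison concrete, here is a minimal sketch that ranks the technologies by the figures quoted above (€/MWh, externalities included). Representing coal's €162–233 range by its midpoint is an illustrative simplification, not the report's own methodology.

```python
# Approximate total costs per MWh from the Ecofys report, as quoted above.
# Coal's range is collapsed to its midpoint purely for illustration.
costs_eur_per_mwh = {
    "onshore wind": 105,
    "nuclear": 133,
    "gas": 164,                    # "up to" figure
    "offshore wind": 186,
    "solar PV": 217,
    "coal": (162 + 233) / 2,       # midpoint of the quoted range
}

# Rank from cheapest to most expensive.
for tech, cost in sorted(costs_eur_per_mwh.items(), key=lambda kv: kv[1]):
    print(f"{tech:>13}: ~\u20ac{cost:.0f}/MWh")
```

On these figures, onshore wind comes out cheapest, consistent with the EWEA's reading of the data.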
The total cost of energy production – which factors in externalities such as air quality, climate change and human toxicity, among others – shows that coal is more expensive than the highest retail electricity price in the EU. The report puts the external costs of the EU's energy mix in 2012 at between €150 billion and €310 billion (US$190–394 billion).
Justin Wilkes, deputy chief executive officer of the European Wind Energy Association, said: "This report highlights the true cost of Europe's dependence on fossil fuels. Renewables are regularly denigrated for being too expensive and a drain on the taxpayer. Not only does the Commission's report show the alarming cost of coal but it also presents onshore wind as both cheaper and more environmentally-friendly."
Onshore and offshore wind technologies also have room for significant cost reduction. Coal, on the other hand, is a fully mature technology and is unlikely to become any cheaper.
He added: "We are heavily subsidising the dirtiest form of electricity generation while proponents use coal's supposed affordability as a justification for its continued use. The irony is that coal is the most expensive form of energy in the European Union. This report shows that we should use the 2030 climate and energy package as a foundation for increasing the use of wind energy in Europe to improve our competitiveness, security and environment."
The University of Washington is developing a new fusion reactor design that could be one-tenth the cost of ITER – while producing five times the amount of energy.
Fusion energy sounds almost too good to be true – zero greenhouse gas emissions, no long-lived radioactive waste, and a nearly unlimited fuel supply. Perhaps the biggest roadblock to adopting fusion energy is that the economics haven't worked out. Fusion power designs aren't cheap enough to outperform systems that use fossil fuels such as coal and natural gas.
Engineers at the University of Washington (UW) hope to change that. They have designed a concept for a fusion reactor that, when scaled up to the size of a large electrical power plant, would rival costs for a new coal-fired plant with similar electrical output. The team will present its reactor design and cost-analysis findings on 17th October at the Fusion Energy Conference in St. Petersburg, Russia.
“Right now, this design has the greatest potential of producing economical fusion power of any current concept,” says Thomas Jarboe, a UW professor of aeronautics and astronautics and an adjunct professor in physics.
The reactor – called the dynomak – began as a class project taught by Jarboe two years ago. After the class had ended, Jarboe and doctoral student Derek Sutherland (who previously worked on a reactor design at MIT) continued to develop and refine the concept.
The design builds on existing technology and creates a magnetic field within a closed space to hold plasma in place long enough for fusion to occur, allowing the hot plasma to react and burn. The reactor itself would be largely self-sustaining, meaning it would continuously heat the plasma to maintain thermonuclear conditions. Heat generated from the reactor would heat up a coolant that is used to spin a turbine and generate electricity, similar to how a typical power reactor works.
“This is a much more elegant solution, because the medium in which you generate fusion is the medium in which you’re also driving all the current required to confine it,” Sutherland says.
There are several ways to create a magnetic field, which is crucial to keeping a fusion reactor going. The UW’s design is known as a spheromak – meaning it generates the majority of magnetic fields by driving electrical currents into the plasma itself. This reduces the amount of required materials and actually allows researchers to shrink the overall size of the reactor.
Other designs, such as the ITER experimental fusion reactor being built in France – due to be operational in 2022 – have to be much larger than UW’s because they rely on superconducting coils that circle around the outside of the device to provide a similar magnetic field. When compared with the fusion reactor concept in France, the UW’s is much less expensive – about one-tenth the cost of ITER – while producing five times the amount of energy.
The UW researchers estimated the cost of building a fusion reactor power plant using their design and compared it with that of building a coal power plant. They used a metric called "overnight capital cost", which includes all costs, particularly startup infrastructure fees. A fusion power plant producing one gigawatt (1 billion watts) of power would cost $2.7 billion, while a coal plant of the same output would cost $2.8 billion, according to their analysis.
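The two cost claims above reduce to simple arithmetic; this sketch just restates the figures quoted in the article (the one-tenth-cost, five-times-energy comparison with ITER, and the $2.7 billion versus $2.8 billion overnight costs per gigawatt) and is illustrative only.

```python
# As quoted above: the dynomak is projected at about one-tenth ITER's cost
# while producing five times the energy.
cost_ratio = 1 / 10                     # dynomak cost relative to ITER
energy_ratio = 5                        # dynomak output relative to ITER
advantage = energy_ratio / cost_ratio   # ~50x better cost per unit of energy

# Overnight capital cost per watt for a 1 GW plant, from the article's figures.
fusion_per_watt = 2.7e9 / 1e9           # $2.70/W for the fusion design
coal_per_watt = 2.8e9 / 1e9             # $2.80/W for coal

print(f"cost-per-energy advantage over ITER: ~{advantage:.0f}x")
print(f"fusion ${fusion_per_watt:.2f}/W vs coal ${coal_per_watt:.2f}/W")
```

The roughly 50-fold cost-per-energy advantage over ITER is what underpins the claim that the commercial unit "already looks economical".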
“If we do invest in this type of fusion, we could be rewarded because the commercial reactor unit already looks economical,” Sutherland said. “It’s very exciting.”
Right now, the UW’s concept is about one-tenth the size and power output of a final product, which is still years away. The researchers have successfully tested the prototype’s ability to sustain plasma efficiently, and as they further develop and expand the size of the device, they can ramp up to higher-temperature plasma and get significant fusion power output. The team has filed patents on the concept with the UW’s Centre for Commercialisation and plans to continue developing and scaling up its prototypes. The research was funded by the U.S. Department of Energy.
The evidence for global warming continues to pour in. A new study of ocean heat content shows that temperatures have been greatly underestimated in the Southern Hemisphere. As a result, the world's oceans are now absorbing between 24 and 58 per cent more energy than previously thought.
Like a fleet of miniature research vessels, more than 3,600 robotic floats provide data on upper layers of the world's ocean currents.
Scientists from Lawrence Livermore National Laboratory in California, using satellite observations and a large suite of climate models, have found that long-term ocean warming in the upper 700 metres of Southern Hemisphere oceans has been greatly underestimated.
"This underestimation is a result of poor sampling prior to the last decade, and limitations of the analysis methods that conservatively estimated temperature changes in data-sparse regions," said LLNL oceanographer Paul Durack, lead author of a paper in the 5th October issue of the journal Nature Climate Change.
Ocean heat storage is important because it accounts for over 90 percent of excess heat associated with global warming. The observed ocean and atmosphere warming is a result of continuing greenhouse gas emissions. The Southern Hemisphere oceans make up 60 percent of the world's oceans.
The researchers found that climate models simulating the relative increase in sea surface height between Northern and Southern hemispheres were consistent with highly accurate altimeter observations. However, the simulated upper-ocean warming in Northern and Southern hemispheres was inconsistent with observed estimates of ocean heat content change. These sea level and ocean heat content changes should have been consistent, suggesting that until recent improvements in observational data, Southern Hemisphere ocean heat content changes were underestimated.
Since 2004, automated profiling floats called Argo (pictured above) have been used to measure global ocean temperatures from the surface down to 2,000 m (6,560 ft). These 3,600 floats currently observing the global ocean provide systematic coverage of the Southern Hemisphere for the first time. Argo float data over the last decade, as well as earlier measurements, show that the ocean has been steadily warming, according to Durack.
"The Argo data is really critical," he said. "Estimates that we had until now have been pretty systematically underestimating the changes. Prior to 2004, research has been very limited by poor measurement coverage. Our results suggest that ocean warming has been underestimated by 24 to 58 percent. The conclusion that warming has been underestimated agrees with previous studies. However, it's the first time that scientists have tried to estimate how much heat we've missed."
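The 24–58 per cent figures translate into a straightforward correction factor on earlier estimates. The fractions below are from the study; the calculation itself is an illustrative sketch, not the authors' method.

```python
def corrected_warming(observed, underestimate_fraction):
    """Scale a prior warming estimate up by a stated underestimate.

    If earlier estimates missed a fraction f of the true change
    (i.e. the ocean absorbed f more energy than previously thought),
    the corrected value is observed * (1 + f).
    """
    return observed * (1 + underestimate_fraction)

# Bounds quoted in the study: 24% to 58% more heat than previously thought.
low = corrected_warming(1.0, 0.24)
high = corrected_warming(1.0, 0.58)
print(f"corrected range: {low:.2f}x to {high:.2f}x previous estimates")
```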
Given that most of the excess heat associated with global warming is in the oceans, this study has important implications for how scientists view the Earth's overall energy budget. Heat currently stored by the oceans will eventually be released, accelerating warming on land and triggering more extreme climate events.
"We continue to be stunned at how rapidly the ocean is warming," said Sarah Gille, a Scripps Institution of Oceanography professor who was not involved in the study. "Even if we stopped all greenhouse gas emissions today, we'd still have an ocean that is warmer than the ocean of 1950, and that heat commits us to a warmer climate. Extra heat means extra sea level rise, since warmer water is less dense, so a warmer ocean expands."
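The link between warmer water and sea level rise that Gille describes can be sketched with the standard linear thermosteric estimate. The expansion coefficient and layer warming below are assumed, typical values for illustration – they are not figures from the study.

```python
# Back-of-the-envelope thermosteric sea level rise from a warmed layer:
# rise = alpha * delta_T * depth (the standard linear approximation).
alpha = 2.0e-4        # /K, typical near-surface seawater expansion (assumed)
layer_depth_m = 700   # the upper-ocean layer discussed in the study
delta_t = 0.1         # K, assumed average warming of that layer

rise_m = alpha * delta_t * layer_depth_m
print(f"~{rise_m * 1000:.0f} mm of sea level rise from expansion alone")
```

Even a tenth of a degree of average warming across the upper 700 m yields rise on the order of centimetres, which is why "extra heat means extra sea level rise".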
"An important result of this paper is the demonstration that the oceans have continued to warm over the past decade, at a rate consistent with estimates of Earth’s net energy imbalance," says Prof. Steve Rintoul, from Australia’s Commonwealth Scientific and Industrial Research Organisation. "While the rate of increase in surface air temperatures slowed in the last 10 to 15 years, the heat stored by the planet, which is heavily dominated by the oceans, has steadily increased as greenhouse gases have continued to rise."
These new results are consistent with another new paper that appears in the same issue of Nature Climate Change. Co-author Felix Landerer of NASA's Jet Propulsion Laboratory, who contributed to both studies, says, "Our other new study on deep-ocean warming found that from 2005 to the present, Argo measurements recorded a continuing warming of the upper-ocean. Using the latest available observations, we're able to show that this upper-ocean warming and satellite measurements are consistent."
In related news, a report by Edinburgh's Heriot-Watt University – based on the work of 30 experts – finds that ocean acidification has increased by 26% since pre-industrial times. It is now causing nearly $1 trillion of damage to coral reefs each year, threatening the livelihoods of 400 million people.
This year marks another milestone for the Aral Sea – a once huge lake in Central Asia that has been shrinking rapidly since the 1960s. For the first time in modern history, its eastern basin has completely dried up.
These images, taken by NASA's flagship Terra satellite, show how the Aral Sea has changed in just 14 years. It is now apparent that its eastern basin has completely dried up. The transformation is especially stark when compared to the approximate shoreline location in 1960 (black outline).
"This is the first time the eastern basin has completely dried in modern times," says Philip Micklin, a geographer from Western Michigan University and expert on the Aral Sea. "And it is likely the first time it has completely dried in 600 years, since Medieval desiccation associated with diversion of Amu Darya to the Caspian Sea."
In the 1950s and 60s, the government of the former Soviet Union diverted the Amu Darya and the Syr Darya – the region's two major rivers – in order to irrigate farmland. This diversion began the lake's gradual retreat. By the year 2000, the lake had separated into the North (Small) Aral Sea in Kazakhstan and the South (Large) Aral Sea in Uzbekistan. The South Aral had further split into western and eastern lobes.
The rusting remains of abandoned boats in the Aral Sea, Kazakhstan.
The eastern lobe of the South Aral nearly dried in 2009, then saw a huge rebound in 2010. Water levels continued to fluctuate annually in alternately dry and wet years.
According to Micklin, the desiccation in 2014 occurred because there has been less rain and snow in the watershed that starts in the Pamir Mountains; this has greatly reduced water flow on the Amu Darya. In addition, huge amounts of river water continue to be withdrawn for irrigation. The Kok-Aral Dam across the Berg Strait – a channel that connects the northern Aral Sea with the southern part – played some role, but has not been a major factor this year, he said.
Formerly the world's fourth largest lake (pictured below in 1964), the Aral Sea is often described as the worst ecological disaster on the planet. With its eastern half now gone, what remains of the western half is expected to vanish by 2019.