Energy & the Environment

 
     
 

12th December 2014

The world has 60 years of topsoil left

If present rates of degradation continue, all of the world's topsoil could be lost by 2075, according to a senior UN official.

 


 

Topsoil is the layer of soil that contains the greatest concentration of nutrients, organic matter and microorganisms. It is vital for maintaining a healthy root base and plant growth, enabling farmers to till the land and produce their food crops. Generating just 3 cm (1.2 in) of topsoil through natural processes takes between 500 and 1,000 years.

Modern agricultural techniques – requiring the soil to be ploughed and replanted each year, together with heavy use of chemicals – have resulted in the gradual erosion of topsoil. Deforestation and global warming also play a role. Worldwide, approximately one-third of topsoil has been lost since the Industrial Revolution. This is already harming the livelihoods of a billion people. If trends continue, all of the world's topsoil will be gone within 60 years, according to a United Nations statement on World Soil Day.

"Soils are the basis of life," said Maria-Helena Semedo, deputy director general of natural resources at the Food and Agriculture Organisation (FAO). "Ninety-five percent of our food comes from the soil."

With global population expected to surpass 9 billion by mid-century – along with huge increases in biofuel production – the amount of arable and productive land per person in 2050 will be only a quarter of the level in 1960, unless a radical transformation of agriculture occurs. In addition to providing food, soils play an essential role in the carbon cycle and water filtration. Soil destruction creates a vicious cycle, in which less carbon is stored, the world gets hotter, and the land is further degraded.

While the rate of degradation is not the same everywhere, "we are losing 30 soccer fields of soil every minute – mostly due to intensive farming," according to Volkert Engelsman, an activist with the International Federation of Organic Agriculture Movements, who spoke to the forum at the FAO's headquarters in Rome.
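
As a rough back-of-the-envelope check on that figure, the short sketch below converts "30 soccer fields per minute" into annual losses. The pitch size (roughly 105 m x 68 m) and the conversion itself are illustrative assumptions, not figures from the FAO.

# Rough conversion of "30 soccer fields of soil lost every minute"
# into annual figures. The pitch area is an assumption: a typical
# full-size football pitch is roughly 105 m x 68 m, or ~7,140 m^2.

PITCH_AREA_M2 = 105 * 68          # ~7,140 m^2 per pitch (assumed)
FIELDS_PER_MINUTE = 30            # figure quoted by Volkert Engelsman

m2_per_year = FIELDS_PER_MINUTE * PITCH_AREA_M2 * 60 * 24 * 365
hectares_per_year = m2_per_year / 10_000
km2_per_year = m2_per_year / 1_000_000

print(f"Soil degraded per year: {hectares_per_year:,.0f} ha "
      f"({km2_per_year:,.0f} km^2)")
# ~11.3 million hectares (~113,000 km^2) per year under these assumptions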

Organic farming can reduce toxic chemicals and carbon emissions, but requires more land. Vertical greenhouses can work for some crops, but their economic and technical feasibility has yet to be fully proven. In the 2030s, perennial wheat and corn could enable crops to be grown continuously for two or more years – offering a huge improvement over traditional annual crops. Hi-tech solutions might also emerge in the form of 3D-printed or laboratory-grown meat. In the more distant future, humans could upgrade their bodies to become partially or fully non-biological, drawing energy from sources other than food. Whatever solutions are eventually developed, this announcement from the UN is a sobering reminder of just how rapidly our world is changing.

 

 

 

 

4th December 2014

New solar cell efficiency record

A new solar cell efficiency record of 46% has been achieved by a French-German collaboration.

 


 

A new world record for the direct conversion of sunlight into electricity has been established. The multi-junction solar cell converts 46% of the solar light into electrical energy. It was developed in a joint collaboration between Soitec and CEA-Leti (France), together with the Fraunhofer Institute for Solar Energy Systems ISE (Germany).

Multi-junction cells are used in concentrator photovoltaic (CPV) systems to produce low-cost electricity in photovoltaic power plants, in regions with a large amount of direct solar radiation. It is the group's second world record in just over a year – following the previous record announced in September 2013 – and clearly demonstrates the strong competitiveness of European photovoltaic research.

Multi-junction solar cells are based on a selection of III-V compound semiconductor materials. The world record cell is a four-junction cell, and each of its sub-cells converts precisely one quarter of the incoming photons in the wavelength range between 300 and 1750 nm into electricity. When applied in concentrator PV, a very small cell is used with a Fresnel lens, which concentrates the sunlight onto the cell. The new record efficiency was measured at a concentration of 508 suns and has been confirmed by the Japanese AIST (National Institute of Advanced Industrial Science and Technology), one of the leading centres for independent verification of solar cell performance under standard testing conditions.
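
To put the concentration figures in context, the sketch below estimates the electrical output per square centimetre of a cell operating at 46% efficiency under 508 suns, taking the standard 1,000 W/m² definition of "one sun". The cell area used at the end is a hypothetical example, not the size of the record device.

# Illustrative power-density estimate for a concentrator cell
# operating at 46% efficiency under 508 suns. "One sun" is taken as
# the standard 1,000 W/m^2; the cell area below is a hypothetical
# example, not the actual record cell.

ONE_SUN_W_PER_M2 = 1000.0
CONCENTRATION = 508          # suns
EFFICIENCY = 0.46
CELL_AREA_CM2 = 0.05         # hypothetical small concentrator cell

incident_w_per_m2 = ONE_SUN_W_PER_M2 * CONCENTRATION
electrical_w_per_cm2 = incident_w_per_m2 * EFFICIENCY / 10_000  # per cm^2

print(f"Incident flux:   {incident_w_per_m2 / 10_000:.1f} W/cm^2")
print(f"Electrical out:  {electrical_w_per_cm2:.1f} W/cm^2")
print(f"For a {CELL_AREA_CM2} cm^2 cell: "
      f"{electrical_w_per_cm2 * CELL_AREA_CM2:.2f} W")
# ~50.8 W/cm^2 incident and ~23.4 W/cm^2 electrical under these assumptions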

 

© Fraunhofer ISE/Photo Alexander Wekkeli

 

A special challenge that had to be met by this cell was the exact distribution of photons among the four sub-cells. It was achieved by precise tuning of the composition and thicknesses of each layer inside the structure. “This is a major milestone for our French-German collaboration. We are extremely pleased to hear that our result of 46% efficiency has now been independently confirmed by AIST in Japan,” explains Dr. Frank Dimroth, project manager for the cell development at the German Fraunhofer Institute. “CPV is the most efficient solar technology today and suitable for all countries with high direct normal irradiance.”

Jocelyne Wasselin, Vice President at Soitec in France, said: “We are very proud of this new world record. It confirms we made the right technology choice when we decided to develop this four-junction solar cell and clearly indicates that we can demonstrate 50% efficiency in the near future. To produce this new generation of solar cells, we have already installed a line in France. It uses our bonding and layer-transfer technologies and already employs more than 25 engineers and technicians. I have no doubt that this successful cooperation with our French and German partners will drive further increase of CPV technology efficiency and competitiveness.”

 

 

 

 

24th November 2014

Ocean Spiral – an underwater city

The Japanese engineering firm Shimizu Corp has announced plans for "Ocean Spiral", an underwater city that would form a nine-mile (15 km) structure plunging down to the sea floor. Costing three trillion yen ($25 billion), it would feature residential, hotel and business zones at its top, with resource development facilities at its base to harvest rare earth metals and minerals. Electrical power could be generated by exploiting the wide differences in water temperature between the top and bottom of the ocean. Construction would be achieved with industrial-scale 3D printers using resin components instead of concrete.

Shimizu believes the technology required for this project could be available by 2030. The company has been behind a number of previous futuristic concepts, including a "Luna Ring" of solar panels going around the Moon and a floating botanical city that could absorb CO2.
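
The power-generation idea mentioned here is essentially ocean thermal energy conversion (OTEC), whose theoretical ceiling is set by the temperature difference between surface and deep water. The sketch below computes that Carnot limit for assumed temperatures of 25 °C at the surface and 5 °C at depth; these values and the "practical" fraction are illustrative assumptions, not figures from Shimizu.

# Carnot efficiency limit for ocean thermal energy conversion (OTEC),
# the principle behind generating power from the temperature difference
# between surface and deep water. Temperatures are assumed for
# illustration; real sites and plants vary.

T_SURFACE_C = 25.0    # assumed warm surface water
T_DEEP_C = 5.0        # assumed cold deep water

t_hot = T_SURFACE_C + 273.15   # convert to kelvin
t_cold = T_DEEP_C + 273.15

carnot_limit = 1 - t_cold / t_hot
practical = carnot_limit * 0.4   # real OTEC plants reach only a fraction
                                 # of the Carnot limit (assumed 40% here)

print(f"Carnot limit:        {carnot_limit:.1%}")   # ~6.7%
print(f"Illustrative actual: {practical:.1%}")      # ~2.7%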

“We had this in Japan in the 1980s when the same corporations were proposing underground and ‘swimming’ cities and 1 kilometre-high towers as part of the rush to development during the height of the bubble economy," says Christian Dimmer, assistant professor in urban studies at Tokyo University. “It’s good that many creative minds are picking their brains as to how to deal with climate change, rising sea levels and the creation of resilient societies – but I hope we don’t forget to think about more open and democratic urban futures in which citizens can take an active role in their creation, rather than being mere passengers in a corporation’s sealed vision of utopia.”

For more information on the Ocean Spiral, see its press release.

 


 

 

 

 

19th November 2014

Lightning strikes will increase due to global warming

Global warming will cause lightning strikes in the U.S. to increase 50% by 2100, according to a study by the University of California, Berkeley.

 


 

New climate models predict a 50 percent increase in lightning strikes across the United States during this century as a result of warming temperatures associated with climate change.

Reporting in the peer-reviewed journal Science, UC Berkeley climate scientist David Romps and his colleagues look at predictions of precipitation and cloud buoyancy in 11 different climate models and conclude that their combined effect will generate more frequent electrical discharges to the ground.

“With warming, thunderstorms become more explosive,” says Romps, an assistant professor of earth and planetary science and a faculty scientist at Lawrence Berkeley National Laboratory. “This has to do with water vapour, which is the fuel for explosive deep convection in the atmosphere. Warming causes there to be more water vapour in the atmosphere – and if you have more fuel lying around, when you get ignition, it can go big time.”

More lightning strikes mean more human injuries; estimates of people struck each year range from hundreds to nearly a thousand, with many deaths. But another significant impact of increased lightning strikes would be more wildfires, since half of all fires – and often the hardest to fight – are ignited by lightning, Romps said. More lightning would also generate more nitrogen oxides in the atmosphere, exerting a strong control on atmospheric chemistry.

While some studies have shown changes in lightning associated with seasonal or year-to-year variations in temperature, there have been no reliable analyses to indicate what the future may hold. Romps and graduate student Jacob Seeley hypothesised that two atmospheric properties — precipitation and cloud buoyancy — together might be a predictor of lightning, and looked at observations during 2011 to see if there was a correlation.

 

 

 

“Lightning is caused by charge separation within clouds, and to maximise charge separation, you have to loft more water vapour and heavy ice particles into the atmosphere,” he said. “We already know that the faster the updrafts, the more lightning, and the more precipitation, the more lightning.”

Precipitation – the total amount of water hitting the ground in the form of rain, snow, hail or other forms – is basically a measure of how convective the atmosphere is, and convection generates lightning. The ascent speeds of those convective clouds are determined by a factor called CAPE — convective available potential energy — which is measured by balloon-borne instruments, called radiosondes, released around the United States twice a day.

“CAPE is a measure of how potentially explosive the atmosphere is – that is, how buoyant a parcel of air would be if you got it convecting, if you got it to punch through overlying air into the free troposphere,” Romps said. “We hypothesised that the product of precipitation and CAPE would predict lightning.”

Using U.S. Weather Service data on precipitation, radiosonde measurements of CAPE and lightning-strike counts from the National Lightning Detection Network at the University at Albany, State University of New York (UAlbany), they concluded that 77 percent of the variations in lightning strikes could be predicted from knowing just these two parameters.
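
A minimal sketch of the kind of proxy calculation described here is shown below: it multiplies precipitation by CAPE and measures how much of the variation in lightning counts the product explains. The arrays are randomly generated stand-ins purely to show the mechanics; the study's actual observations are not reproduced.

# Sketch of the precipitation x CAPE lightning proxy described by
# Romps and Seeley. The arrays below are synthetic stand-ins; the
# point is only to show how the proxy and its explained variance
# (R^2) would be computed from real station data.

import numpy as np

rng = np.random.default_rng(0)
n = 365  # e.g. daily values for one year

precip = rng.gamma(shape=2.0, scale=3.0, size=n)     # mm/day (synthetic)
cape = rng.gamma(shape=2.0, scale=500.0, size=n)     # J/kg  (synthetic)

proxy = precip * cape                                # the proposed predictor

# Synthetic "observed" lightning: proportional to the proxy plus noise.
lightning = 0.01 * proxy + rng.normal(0, 5, size=n)

r = np.corrcoef(proxy, lightning)[0, 1]
print(f"Variance explained (R^2): {r**2:.2f}")
# With real U.S. data, Romps and colleagues report that ~77% of the
# variation in lightning strikes can be predicted from this product.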

“We were blown away by how incredibly well that worked to predict lightning strikes,” he said.

 


The intensity of lightning flashes averaged over the year in the lower 48 states during 2011. Data from NLDN.

 

They then looked at 11 different climate models that predict precipitation and CAPE through this century and are archived in the most recent Coupled Model Intercomparison Project (CMIP5). CMIP was established as a resource for climate scientists, providing a repository of output from global climate models that can be used for comparison and validation.

“With CMIP5, we now have for the first time the CAPE and precipitation data to calculate these time series,” Romps said.

On average, the models predicted an 11 percent increase in CAPE in the U.S. per degree Celsius rise in global average temperature by the end of the 21st century. Because the models predict little average precipitation increase nationwide over this period, the product of CAPE and precipitation gives about a 12 percent rise in cloud-to-ground lightning strikes per degree in the contiguous U.S., or a roughly 50 percent increase by 2100 if Earth sees the expected 4-degree Celsius increase (7 degrees Fahrenheit) in temperature.
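
The headline "50 percent" follows from straightforward scaling of the per-degree sensitivity, as the sketch below shows; whether the per-degree increases are combined linearly or compounded makes only a modest difference at 4 °C.

# How a ~12% increase in lightning per degree Celsius scales to a
# ~50% increase for 4 degrees C of warming. Both a simple linear
# scaling and a compounded scaling are shown for comparison.

per_degree = 0.12     # fractional increase in lightning per deg C
warming_c = 4.0       # assumed global-mean warming by 2100

linear = per_degree * warming_c                 # ~48%
compounded = (1 + per_degree) ** warming_c - 1  # ~57%

print(f"Linear scaling:     +{linear:.0%}")
print(f"Compounded scaling: +{compounded:.0%}")
# Either way, the result is in the neighbourhood of the ~50% increase
# quoted for the contiguous U.S. by 2100.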

Exactly why CAPE increases as the climate warms is still an area of active research, Romps said, though it is clear that it has to do with the fundamental physics of water. Warm air typically contains more water vapour than cold air; in fact, the amount of water vapour that air can “hold” increases exponentially with temperature. Since water vapour is the fuel for thunderstorms, lightning rates can depend very sensitively on temperature.

In the future, Romps plans to look at the distribution of lightning-strike increases around the U.S. and also explore what lightning data can tell climatologists about atmospheric convection.

 

 

 

 

15th November 2014

Hottest October on record

A new global temperature record for October has been set, according to data from the Japan Meteorological Agency (JMA).

 


 

Globally, last month was the hottest October on record – by far – according to data just released by the Japan Meteorological Agency (JMA). This follows the hottest March–May, June, August and September, also recorded this year. Near-surface land and sea surface temperatures were 0.67°C (1.2°F) higher than the 20th century average. Despite oft-repeated claims of a "pause", it seems increasingly likely that 2014 is on course to be the all-time hottest year since the JMA began record-keeping in 1891. Data from the National Oceanic and Atmospheric Administration (NOAA) – the U.S. equivalent of Japan's agency – presents a similar trend, with October 2013 to September 2014 being the warmest 12-month period among all months since 1880. These records have occurred even without the latest El Niño, which has yet to begin, meaning that 2015 could be even hotter.

The Intergovernmental Panel on Climate Change (IPCC) has just released the final part of its Fifth Assessment Report. This further discusses the future impacts of climate change and – it is hoped – will pave the way for a global, legally binding treaty on carbon emissions at the UN Climate Change Conference in Paris during late 2015. This week in Beijing, Chinese President Xi Jinping met with Barack Obama to announce a "historic" agreement that would see U.S. emissions fall 26%-28% below 2005 levels by 2025, while China's would peak by 2030. By announcing these targets now, they hope to inject momentum into the global climate negotiations and inspire other countries to join in coming forward with ambitious actions as soon as possible, preferably before the first quarter of 2015. The two Presidents resolved to work closely together over the next year to address major impediments to reaching a successful treaty in Paris. UN climate chief, Christiana Figueres, said: "These two crucial countries have today announced important pathways towards a better and more secure future for humankind."

Unfortunately for Barack Obama, the U.S. midterm election was a disaster for the Democrats. They will now lose control of the Senate for the first time since January 2007, with Republicans also increasing their majority in the House. The incoming Senate Majority Leader, Mitch McConnell, stated that his top priority is to "get the EPA reined in" and to dismantle the new emissions rules for coal power plants. In a related development, the controversial Keystone XL pipeline was approved by the House of Representatives yesterday in a 252-161 vote. This 875-mile (1,408 km) pipeline will carry tar sands oil from Alberta, Canada, to the US state of Nebraska, where it joins pipes running down to Texas. While creating only 35 permanent jobs, it will transport 51 coal plants' worth of CO2 and do nothing to lower U.S. gas prices.

Meanwhile, the G20 summit now underway in Brisbane, Australia, has seen hundreds of people staging a "head in the sand" protest over the lack of discussions on climate change. Australian Prime Minister, Tony Abbott, recently declared that "coal is good for humanity" while opening a new coal plant and expressing his belief that "the trajectory should be up and up and up in the years and decades to come ... The future for coal is bright."

A new report from the Overseas Development Institute (ODI) and Oil Change International highlights the fact that G20 governments are now spending almost £56bn ($90bn) a year on finding new oil, gas and coal reserves. This is despite clear evidence that two-thirds of fossil fuels must be left in the ground to avoid tipping the world into a climate catastrophe. Phasing out these perverse subsidies may form a crucial part of the negotiations at the Paris conference in 2015.

The science of global warming is clearer than ever. Back in April, a report by McGill University concluded "with confidence levels greater than 99% and most likely greater than 99.9%" that recent warming is not caused by natural factors but is man-made. A new generation of supercomputers – able to crunch hundreds of terabytes' worth of data – has led to what one researcher calls "a golden age for high-resolution climate modelling" with accurate simulations of intense weather and climate events. These models will only get better in the years ahead. On current trends, it should be possible to achieve resolutions down to a square metre by 2030. And yet, even without these models or the IPCC, we know the problem is real.

 

 

 

 

7th November 2014

The world's first solar-powered road

A project creating the first solar-powered bicycle path will be officially opened in the Netherlands next week. If successful, it could be applied to 20% of the country's roads in the future.

 


 

Developed by the Netherlands' TNO research institute, SolaRoad is the first road in the world that converts sunlight into electricity. The pilot – a stretch of bike path just a hundred metres long – consists of concrete modules, each measuring 2.5 by 3.5 metres. In one travelling direction, solar cells are fitted beneath an extremely strong top layer of glass, roughly 1 cm thick, which is finished with a dirt- and abrasion-resistant coating.

The other travelling direction has no solar cells and is instead used to test various top layers. In time, energy generated from the road will be used for practical applications in street lighting, traffic systems, electric cars (which will drive on the surface) and households. This first section of SolaRoad is located in Krommenie, along the provincial road N203, next to the Texaco garage on the railway track side (see Google Street View).

For a three-year period, various measurements will be taken and tests performed to enable SolaRoad to undergo further development. The tests must answer questions such as: How does it behave in practice? How much energy does it produce? What is it like to cycle over? In the run-up to the surface being laid, laboratory tests were conducted to ensure all safety and other requirements were met. The modules were found to successfully carry the weight of heavy vehicles such as tractors, though how they respond to longer term wear and tear remains to be seen.
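
For a sense of scale, the sketch below estimates the active area of the 100-metre pilot and a ballpark annual yield. The geometry follows from the dimensions given above, but the active fraction, insolation, efficiency and derating figures are generic assumptions for illustration, not SolaRoad's own projections.

# Ballpark estimate of the SolaRoad pilot's annual output. Every
# number below is an assumption for illustration -- the path geometry
# is simplified, and the insolation, efficiency and derating values
# are generic, not SolaRoad's own figures.

path_length_m = 100.0
path_width_m = 3.5
active_fraction = 0.5          # cells fitted in one direction only (assumed)

active_area_m2 = path_length_m * path_width_m * active_fraction

insolation_kwh_m2_yr = 1000.0  # typical horizontal insolation, Netherlands
module_efficiency = 0.15       # assumed
derating = 0.7                 # flat orientation, dirt, thick glass (assumed)

yield_kwh_yr = (active_area_m2 * insolation_kwh_m2_yr
                * module_efficiency * derating)
households = yield_kwh_yr / 3500   # ~3,500 kWh/yr per household (assumed)

print(f"Active area:  {active_area_m2:.0f} m^2")
print(f"Annual yield: {yield_kwh_yr:,.0f} kWh")
print(f"Roughly enough for {households:.0f} households")
# ~175 m^2 and ~18,000 kWh/yr under these assumptions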

A spokesperson for the project, Sten de Wit, claims that up to 20% of the Netherlands' 140,000 km (87,000 miles) of road could potentially be adapted. The pilot road will be officially opened on 12th November by Dutch Minister of Economic Affairs, Henk Kamp.

A similar concept – Solar Roadways – is being developed in the US, though its technical and financial viability seems to have come under a lot of criticism in the blogosphere and elsewhere. Perhaps this Dutch effort can prove to be more successful.

 


 

 

 

 

28th October 2014

Reducing human population to a sustainable level could take centuries

A new multi-scenario modelling of world human population concludes that even draconian fertility restrictions or a catastrophic mass mortality won't be enough to solve issues of global sustainability by 2100.

 


 

Published today in the Proceedings of the National Academy of Sciences of the USA, ecologists Professor Corey Bradshaw and Professor Barry Brook from the University of Adelaide's Environment Institute say that our "virtually locked-in" population growth means the world must focus on policies and technologies that reverse rising consumption of natural resources and enhance recycling, for more immediate sustainability gains.

Fertility reduction efforts, however, through increased family-planning assistance and education, should still be pursued, as this will lead to hundreds of millions fewer people to feed by mid-century.

"Global population has risen so fast over the past century that roughly 14% of all the human beings that have ever existed are still alive today. That's a sobering statistic," says Professor Bradshaw, Director of Ecological Modelling. "This is considered unsustainable for a range of reasons – not least being able to feed everyone, as well as the impact on the climate and environment.

"We examined various scenarios for global human population change to the year 2100 by adjusting fertility and mortality rates to determine the plausible range of population sizes at the end of this century. Even a worldwide one-child policy like China's, implemented over the coming century, or catastrophic mortality events like global conflict or a disease pandemic, would still likely result in 5-10 billion people by 2100."

The team constructed nine different scenarios for continuing population, ranging from "business as usual" through various fertility reductions, to highly unlikely broad-scale catastrophes resulting in billions of deaths.

"We were surprised that a five-year WWIII scenario – mimicking the same proportion of people killed in the First and Second World Wars combined – barely registered a blip on the human population trajectory this century," says Professor Barry Brook.

"Often when I give public lectures about policies to address global change, someone will claim that we are ignoring the 'elephant in the room' of human population size. Yet, as our models show clearly, while there needs to be more policy discussion on this issue, the current inexorable momentum of the global human population precludes any demographic 'quick fixes' to our sustainability problems.

"Our work reveals that effective family planning and reproduction education worldwide have great potential to constrain the size of the human population and alleviate pressure on resource availability over the longer term. Our great-great-great-great grandchildren might ultimately benefit from such planning, but people alive today will not.

"The corollary of these findings is that society's efforts towards sustainability would be directed more productively towards reducing our impact as much as possible through technological and social innovation."

 

 

 

 

26th October 2014

Cheaper silicon means cheaper solar cells

A new method of producing solar cells could reduce the amount of silicon per unit area by 90 per cent compared to the current standard. With the high prices of pure silicon, this could help cut the cost of solar power.

 


 

Researchers at the Norwegian University of Science and Technology (NTNU) have pioneered a new approach to manufacturing solar cells that requires less silicon and can accommodate silicon 1,000 times less pure than is currently the standard. This breakthrough means that solar cells could be made much more cheaply than at present.

“We're using less expensive raw materials, and smaller amounts of them; we have fewer production steps; and our total energy consumption is potentially lower,” explain PhD candidate Fredrik Martinsen and Professor Ursula Gibson of NTNU's Department of Physics.

The researchers’ solar cells are composed of silicon fibres coated in glass. A silicon core is inserted into a glass tube about 30 mm in diameter. This is then heated so that the silicon melts and the glass softens. The tube is stretched out into a thin glass fibre filled with silicon. The process of heating and stretching makes the fibre up to 100 times thinner.

This is the widely accepted industrial method used to produce fibre optic cables. But the NTNU researchers – in collaboration with Clemson University in the USA – are the first to use silicon-core fibres made this way in solar cells. The active part of these solar cells is the silicon core, which has a diameter of about 100 micrometres.
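
The geometry here follows directly from the draw-down ratio: if the preform shrinks by a factor of about 100 in diameter, the silicon core shrinks by roughly the same factor. A minimal sketch of that scaling is below; the initial core diameter is an assumption chosen only so the numbers are self-consistent with the ~100 micrometre core quoted above.

# Draw-down scaling for the silicon-core fibre. During fibre drawing,
# the cross-section shrinks roughly uniformly, so the core diameter
# scales by the same factor as the preform. The initial core size is
# an assumption for illustration.

preform_diameter_mm = 30.0     # glass tube, as stated
draw_down_factor = 100.0       # "up to 100 times thinner"
core_diameter_mm = 10.0        # assumed silicon core inside the preform

fibre_diameter_um = preform_diameter_mm / draw_down_factor * 1000
fibre_core_um = core_diameter_mm / draw_down_factor * 1000

print(f"Fibre outer diameter: {fibre_diameter_um:.0f} um")  # ~300 um
print(f"Fibre core diameter:  {fibre_core_um:.0f} um")      # ~100 um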

 


 

This production method also enabled them to solve another problem: traditional solar cells require very pure silicon, and manufacturing pure silicon wafers is laborious, energy intensive and expensive. The new process requires only one-third of the energy needed to manufacture solar cells via the traditional approach of producing silicon wafers.

“We can use relatively dirty silicon – and the purification occurs naturally as part of the process of melting and re-solidifying in fibre form. This means that you save energy, and several steps in production,” says Gibson.

These new solar cells are based on the vertical rod radial-junction design, a relatively new approach.

“The vertical rod design still isn’t common in commercial use. Currently, silicon rods are produced using advanced and expensive nano-techniques that are difficult to scale,” says Martinsen. “But we’re using a tried-and-true industrial bulk process, which can make production a lot cheaper.”

The power produced by these prototype cells is not yet up to commercial standards. The efficiency of modern solar cells is typically about 20 per cent, while NTNU's version has only managed 3.6 per cent. However, Martinsen claims their work has great potential for improvement – so this new production method is something we might see appearing in future decades, as nanotechnology continues to advance.

“These are the first solar cells produced this way, using impure silicon. So it isn’t surprising that the power output isn’t very high. It’s a little unfair to compare our method to conventional solar cells, which have had 40 years to fine-tune the entire production process. We’ve had a steep learning curve, but not all the steps of our process are fully developed yet. We’re the first to show that you can make solar cells this way. The results are published and the process is set in motion.”

 

 

 

 

21st October 2014

2014 on track for hottest year ever

Globally, 2014 is on track for the hottest year ever. September 2014 was the hottest September on record, after the hottest August, which was part of the hottest summer on record. The past 12 months — October 2013–September 2014 — were the warmest 12-month period among all months since records began in 1880.

 


 

The National Oceanic and Atmospheric Administration (NOAA) has released its latest State of the Climate Report. Highlights include:

  • The combined average temperature over global land and ocean surfaces for September 2014 was the highest on record for September, at 0.72°C (1.30°F) above the 20th century average of 15.0°C (59.0°F).

  • The global land surface temperature was 0.89°C (1.60°F) above the 20th century average of 12.0°C (53.6°F), the sixth highest for September on record. For the ocean, the September global sea surface temperature was 0.66°C (1.19°F) above the 20th century average of 16.2°C (61.1°F), the highest on record for September and also the highest on record for any month.

  • The combined global land and ocean average surface temperature for the January–September period (year-to-date) was 0.68°C (1.22°F) above the 20th century average of 14.1°C (57.5°F), tying with 1998 as the warmest such period on record.

Last month, Britain had its driest September since national records began in 1910, with just 20% of the average rainfall for the month. Besides breaking the record itself, this rainfall deficit is especially notable as the preceding eight-month period (January–August) was the wettest such period on record. Meanwhile, 30.6% of the contiguous USA was in drought, with conditions worsening in many regions. Nearly 100% of California and Nevada were in "moderate-to-exceptional" drought.

If 2014 maintains its current trend for the remainder of the year, it will be the warmest calendar year on record, says NOAA. The agency's findings are in strong agreement with both NASA and the JMA, who both reported a record warm September earlier this month too. It also seems quite likely that we'll see an El Niño event during the winter, which could send global temperature anomalies even higher.

 


 

 

 

 

21st October 2014

World's first carbon-capture coal power plant

The world’s first commercial-scale carbon capture and storage (CCS) process on a coal-fired power plant has been officially opened at Canada's Boundary Dam Power Station. This $1.4 billion project will cut CO2 emissions from the plant by 90% and sulphur dioxide emissions by 100%.

 


 

Electric utility company SaskPower’s new process involves retrofitting an old 110-megawatt (MW) coal-fired plant, first commissioned in 1959, adding a solvent-based capture process to strip away carbon dioxide, and then piping the CO2 to a nearby oil field. When fully optimised, it will capture up to a million tonnes of carbon dioxide annually, the equivalent of taking 250,000 cars off the road. The power unit equipped with CCS technology will continue to use coal to power approximately 100,000 homes and businesses in Saskatchewan, near the Canada-U.S. border. The captured CO2 will be used for enhanced oil recovery, with the remainder stored safely and permanently deep underground and continuously monitored.
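
The two equivalences quoted here imply roughly 4 tonnes of CO2 per car per year and about 1 kW of capacity per home, as the sketch below confirms; these are simple back-calculations from the article's own figures, not SaskPower data.

# Back-calculating the implied per-car and per-home figures from the
# numbers quoted for the Boundary Dam CCS retrofit.

co2_captured_t_per_yr = 1_000_000   # "up to a million tonnes" annually
cars_equivalent = 250_000           # "taking 250,000 cars off the road"

plant_capacity_mw = 110             # retrofitted unit
homes_served = 100_000              # "approximately 100,000 homes and businesses"

t_co2_per_car = co2_captured_t_per_yr / cars_equivalent
kw_per_home = plant_capacity_mw * 1000 / homes_served

print(f"Implied emissions per car: {t_co2_per_car:.1f} t CO2/year")  # 4.0
print(f"Implied capacity per home: {kw_per_home:.1f} kW")            # 1.1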

The Canadian federal government paid $240 million towards the project. The launch was attended by more than 250 people from over 20 countries representing governments, industries and media. Attendees at the event toured the facility and learned how they can access SaskPower’s expertise and knowledge to develop their own CCS initiatives.

“This project is important because it is applicable to 95% of the world’s coal plants,” said Bill Boyd, Saskatchewan Minister of the Economy. “As nations develop emission regulations, they will come to us to see how we continue to provide affordable coal power to customers, but in an environmentally sustainable way.”

This follows news last month of a similar project being developed in Jacksonville, Illinois, USA. The Environmental Protection Agency (EPA) approved permits allowing the FutureGen Industrial Alliance to capture and store CO2 deep underground – the first project of its kind in the U.S.

“The opening of this new SaskPower plant reinforces the great innovation and development that can take place if you have strong investment and partnerships from the government and industry,” said U.S. Senator Heidi Heitkamp (D-ND). “From my more than a decade working at Dakota Gasification in North Dakota, and from visiting the construction of the SaskPower facility just over a year ago, I understand just how important it is that we look to the future in how we harness our energy. Coal is a key resource in both Canada and the U.S., and through the development of clean coal technology, we can create North American independence and energy security, while also reducing emissions. We need to develop more clean coal plants to make that possible, and in the U.S., we can learn from the steps Canada has taken to find a realistic path forward for coal.”

The economics of CCS are still a major issue, however. At present, SaskPower's project is expensive and depends on having a nearby source of coal alongside an additional revenue stream from the enhanced oil recovery. Environmentalists have also continued to express concerns.

“At the end of the day, many people are going to wonder why SaskPower is investing $1.4-billion in 'clean coal' technology instead of wind, solar or geothermal energy,” said Victor Lau, Saskatchewan Greens Leader. “Our party will be monitoring future developments of this project very carefully.”

 

 

 

 

18th October 2014

Lockheed Martin planning a compact fusion reactor within 10 years

This week, Lockheed Martin announced plans for a small-scale fusion power plant to be developed in as little as 10 years. A number of experts have expressed doubts over its viability.

 


 

If it ever became a reality, fusion power would be truly world-altering – a clean, safe and essentially limitless supply of energy allowing humanity's continued survival for centuries and millennia to come. The international project known as ITER is planned for operation in 2022 and its eventual successor may emerge in the 2040s. Widespread deployment of fusion is not expected until 2070.

U.S. defence giant Lockheed Martin hopes to accelerate progress in this area by developing what it calls a compact fusion reactor (CFR). This would be around 10 times smaller than conventional tokamak designs – small enough to fit on the back of a truck – while generating 100 megawatts (MW) of power. The company intends to build a prototype within five years, according to its press release, with commercial introduction five years after that. It has several patents pending for the work and is looking for partners in academia, industry and among government laboratories.

As illustrated above, the main improvement over ITER would be the use of a superconducting torus to create a differently shaped magnetic field, able to contain plasma far better than previous configurations. These small reactors could be fitted in U.S. Navy warships and submarines while eliminating the need for other fuel types. They could power small cities of up to 100,000 people, allow planes to fly with unlimited range, or even be used in spacecraft to cut journey times to Mars from six months to a single month. Using a CFR, the cost of desalinated water could fall by 60 percent.
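
Those claims imply about a kilowatt of generating capacity per person, which is at least in line with typical average electricity demand in developed countries; a trivial check is sketched below.

# Sanity check on the claim that a 100 MW compact fusion reactor
# could power a small city of up to 100,000 people.

reactor_output_mw = 100
city_population = 100_000

kw_per_person = reactor_output_mw * 1000 / city_population
print(f"Implied capacity per person: {kw_per_person:.1f} kW")  # 1.0 kW
# Average per-capita electricity demand in many developed countries is
# of the order of 1 kW, so the figure is internally consistent.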

 

 

If this sounds too good to be true, it may well be. Although Lockheed has been successful in its magnetised ion confinement experiments, a number of significant challenges remain for a working prototype with plasma confinement – let alone a commercialised version.

"I think it's very overplayed," University of California nuclear engineering professor Dr. Edward Morse told The Register. "They are being very cagey about divulging details."

"Getting net energy from fusion is such a goddamn difficult undertaking," said University of Texas physicist Dr. Swadesh M. Mahajan, in an interview with Mother Jones. "We know of no materials that would be able to handle anywhere near that amount of heat."

"The nuclear engineering clearly fails to be cost effective," Tom Jarboe told Business Insider in an email.

For these reasons, it is perhaps best to wait for more news and developments before adding the CFR to our timeline. We will, of course, keep you updated on Lockheed's progress as it emerges. You can also discuss this project on our forum.

 

 

 

 

14th October 2014

Onshore wind is cheaper than coal, gas and nuclear

Generating electricity from onshore wind is cheaper than gas, coal and nuclear when externalities and subsidies are stacked alongside the levelised cost of energy, according to a new study ordered and endorsed by the European Commission.

 


 

A new report by the energy consultancy firm Ecofys has been analysed by the European Wind Energy Association (EWEA). Data in the report shows that onshore wind now costs approximately €105 per megawatt hour (MWh), which is cheaper than gas (up to €164), nuclear (€133) and coal (between €162 and €233). Offshore wind comes in at €186, and solar PV costs around €217 per MWh.
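
A compact way to see the comparison is to lay the quoted figures out side by side, as in the sketch below; the per-MWh values are those cited above, with coal represented by the midpoint of its range.

# The per-MWh cost figures quoted from the Ecofys report, compared
# side by side. For coal, the midpoint of the quoted range is used.

costs_eur_per_mwh = {
    "Onshore wind":  105,
    "Nuclear":       133,
    "Gas":           164,            # "up to" figure
    "Offshore wind": 186,
    "Coal":          (162 + 233) / 2,
    "Solar PV":      217,
}

baseline = costs_eur_per_mwh["Onshore wind"]
for tech, cost in sorted(costs_eur_per_mwh.items(), key=lambda kv: kv[1]):
    premium = (cost / baseline - 1) * 100
    print(f"{tech:14s} EUR {cost:6.1f}/MWh  (+{premium:.0f}% vs onshore wind)")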

The total cost of energy production – which factors in externalities such as air quality, climate change and human toxicity, among others – shows that coal is more expensive than the highest retail electricity price in the EU. The report puts the external costs of the EU's energy mix in 2012 at between €150 and €310 billion (US$190-394 billion).

Justin Wilkes, deputy chief executive officer of the European Wind Energy Association, said: "This report highlights the true cost of Europe's dependence on fossil fuels. Renewables are regularly denigrated for being too expensive and a drain on the taxpayer. Not only does the Commission's report show the alarming cost of coal but it also presents onshore wind as both cheaper and more environmentally-friendly."

Onshore and offshore wind technologies also have room for significant cost reduction. Coal on the other hand is a fully mature technology and is unlikely to reduce costs any further.

He added: "We are heavily subsidising the dirtiest form of electricity generation while proponents use coal's supposed affordability as a justification for its continued use. The irony is that coal is the most expensive form of energy in the European Union. This report shows that we should use the 2030 climate and energy package as a foundation for increasing the use of wind energy in Europe to improve our competitiveness, security and environment."

 

 

 

 

9th October 2014

Fusion reactor concept could be cheaper than coal

The University of Washington is developing a new fusion reactor design that could be one-tenth the cost of ITER – while producing five times the amount of energy.

 

The HIT-SI3 experiment at the University of Washington.

 

Fusion energy sounds almost too good to be true – zero greenhouse gas emissions, no long-lived radioactive waste, and a nearly unlimited fuel supply. Perhaps the biggest roadblock to adopting fusion energy is that the economics haven't worked out. Fusion power designs aren't cheap enough to outperform systems that use fossil fuels such as coal and natural gas.

Engineers at the University of Washington (UW) hope to change that. They have designed a concept for a fusion reactor that, when scaled up to the size of a large electrical power plant, would rival costs for a new coal-fired plant with similar electrical output. The team will present its reactor design and cost-analysis findings on 17th October at the Fusion Energy Conference in St. Petersburg, Russia.

“Right now, this design has the greatest potential of producing economical fusion power of any current concept,” says Thomas Jarboe, a UW professor of aeronautics and astronautics and an adjunct professor in physics.

The reactor – called the dynomak – began as a class project taught by Jarboe two years ago. After the class had ended, Jarboe and doctoral student Derek Sutherland (who previously worked on a reactor design at MIT) continued to develop and refine the concept.

The design builds on existing technology and creates a magnetic field within a closed space to hold plasma in place long enough for fusion to occur, allowing the hot plasma to react and burn. The reactor itself would be largely self-sustaining, meaning it would continuously heat the plasma to maintain thermonuclear conditions. Heat generated from the reactor would heat up a coolant that is used to spin a turbine and generate electricity, similar to how a typical power reactor works.

“This is a much more elegant solution, because the medium in which you generate fusion is the medium in which you’re also driving all the current required to confine it,” Sutherland says.

 


 

There are several ways to create a magnetic field, which is crucial to keeping a fusion reactor going. The UW’s design is known as a spheromak – meaning it generates the majority of magnetic fields by driving electrical currents into the plasma itself. This reduces the amount of required materials and actually allows researchers to shrink the overall size of the reactor.

Other designs, such as the ITER experimental fusion reactor being built in France – due to be operational in 2022 – have to be much larger than UW’s because they rely on superconducting coils that circle around the outside of the device to provide a similar magnetic field. When compared with the fusion reactor concept in France, the UW’s is much less expensive – about one-tenth the cost of ITER – while producing five times the amount of energy.

The UW researchers factored the cost of building a fusion reactor power plant using their design and compared that with building a coal power plant. They used a metric called “overnight capital costs,” which includes all costs, particularly startup infrastructure fees. A fusion power plant producing a gigawatt (1 billion watts) of power would cost $2.7 billion, while a coal plant of the same output would cost $2.8 billion, according to their analysis.
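
Expressed per watt, the comparison comes out almost even, as the sketch below shows; note that overnight capital cost ignores fuel, financing and operating costs, so this is only one part of the full picture.

# Overnight capital cost per watt for the UW dynomak concept versus
# a coal plant of the same output, using the figures quoted above.

plant_output_w = 1e9                    # 1 gigawatt

overnight_cost = {
    "Dynomak fusion plant": 2.7e9,      # dollars
    "Coal-fired plant":     2.8e9,
}

for plant, cost in overnight_cost.items():
    print(f"{plant:22s} ${cost / plant_output_w:.2f} per watt")
# ~$2.70/W vs ~$2.80/W -- roughly comparable, before fuel, financing
# and operating costs are considered.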

“If we do invest in this type of fusion, we could be rewarded because the commercial reactor unit already looks economical,” Sutherland said. “It’s very exciting.”

Right now, the UW’s concept is about one-tenth the size and power output of a final product, which is still years away. The researchers have successfully tested the prototype’s ability to sustain plasma efficiently, and as they further develop and expand the size of the device, they can ramp up to higher-temperature plasma and get significant fusion power output. The team has filed patents on the concept with the UW’s Centre for Commercialisation and plans to continue developing and scaling up its prototypes. The research was funded by the U.S. Department of Energy.

 

 

 

 

8th October 2014

Ocean warming in Southern Hemisphere has been greatly underestimated

The evidence for global warming continues to pour in. A new study of ocean heat content shows that temperatures have been greatly underestimated in the Southern Hemisphere. As a result, the world's oceans have been absorbing between 24 and 58 per cent more heat than previously thought.

 

Like a fleet of miniature research vessels, more than 3,600 robotic floats provide data on upper layers of the world's ocean currents.

 

Scientists from Lawrence Livermore National Laboratory in California, using satellite observations and a large suite of climate models, have found that long-term ocean warming in the upper 700 metres of Southern Hemisphere oceans has been greatly underestimated.

"This underestimation is a result of poor sampling prior to the last decade, and limitations of the analysis methods that conservatively estimated temperature changes in data-sparse regions," said LLNL oceanographer Paul Durack, lead author of a paper in the 5th October issue of the journal Nature Climate Change.

Ocean heat storage is important because it accounts for over 90 percent of excess heat associated with global warming. The observed ocean and atmosphere warming is a result of continuing greenhouse gas emissions. The Southern Hemisphere oceans make up 60 percent of the world's oceans.
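
As a rough illustration of why the upper ocean dominates the planet's heat budget, the sketch below computes how much energy it takes to warm the top 700 m of the global ocean by a small amount, using standard approximate values for ocean area and seawater properties; the warming increment itself is an arbitrary illustrative figure, not a result from the study.

# Why upper-ocean warming dominates the heat budget: energy needed to
# warm the top 700 m of the global ocean by a small increment.
# Ocean area, density and heat capacity are standard approximate
# values; the 0.1 C warming is an arbitrary illustrative figure.

ocean_area_m2 = 3.6e14          # ~70% of Earth's surface
layer_depth_m = 700.0
density_kg_m3 = 1025.0          # seawater
heat_capacity_j_kg_k = 3990.0   # seawater specific heat
delta_t_k = 0.1                 # illustrative warming of the layer

volume_m3 = ocean_area_m2 * layer_depth_m
energy_j = volume_m3 * density_kg_m3 * heat_capacity_j_kg_k * delta_t_k

print(f"Energy to warm the 0-700 m layer by {delta_t_k} C: {energy_j:.1e} J")
# ~1e23 J, around a hundred zettajoules -- far more than is needed to
# warm the entire atmosphere by the same amount.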

The researchers found that climate models simulating the relative increase in sea surface height between Northern and Southern hemispheres were consistent with highly accurate altimeter observations. However, the simulated upper-ocean warming in Northern and Southern hemispheres was inconsistent with observed estimates of ocean heat content change. These sea level and ocean heat content changes should have been consistent, suggesting that until recent improvements in observational data, Southern Hemisphere ocean heat content changes were underestimated.

Since 2004, automated profiling floats called Argo (pictured above) have been used to measure global ocean temperatures from the surface down to 2,000 m (6,560 ft). These 3,600 floats currently observing the global ocean provide systematic coverage of the Southern Hemisphere for the first time. Argo float data over the last decade, as well as earlier measurements, show that the ocean has been steadily warming, according to Durack.

"The Argo data is really critical," he said. "Estimates that we had until now have been pretty systematically underestimating the changes. Prior to 2004, research has been very limited by poor measurement coverage. Our results suggest that ocean warming has been underestimated by 24 to 58 percent. The conclusion that warming has been underestimated agrees with previous studies. However, it's the first time that scientists have tried to estimate how much heat we've missed."

 


 

Given that most of the excess heat associated with global warming is in the oceans, this study has important implications for how scientists view the Earth's overall energy budget. Heat currently stored by the oceans will eventually be released, causing land temperatures to accelerate and triggering more extreme climate events.

"We continue to be stunned at how rapidly the ocean is warming," said Sarah Gille, a Scripps Institution of Oceanography professor who was not involved in the study. "Even if we stopped all greenhouse gas emissions today, we'd still have an ocean that is warmer than the ocean of 1950, and that heat commits us to a warmer climate. Extra heat means extra sea level rise, since warmer water is less dense, so a warmer ocean expands."

"An important result of this paper is the demonstration that the oceans have continued to warm over the past decade, at a rate consistent with estimates of Earth’s net energy imbalance," says Prof. Steve Rintoul, from Australia’s Commonwealth Scientific and Industrial Research Organisation. "While the rate of increase in surface air temperatures slowed in the last 10 to 15 years, the heat stored by the planet, which is heavily dominated by the oceans, has steadily increased as greenhouse gases have continued to rise."

These new results are consistent with another new paper that appears in the same issue of Nature Climate Change. Co-author Felix Landerer of NASA's Jet Propulsion Laboratory, who contributed to both studies, says, "Our other new study on deep-ocean warming found that from 2005 to the present, Argo measurements recorded a continuing warming of the upper-ocean. Using the latest available observations, we're able to show that this upper-ocean warming and satellite measurements are consistent."

In related news, a report by Edinburgh's Heriot-Watt University – based on the work of 30 experts – finds that ocean acidification has increased by 26% since pre-industrial times. The resulting damage to coral reefs could approach $1 trillion a year by the end of the century, threatening the livelihoods of 400 million people.

 

 

 

 

3rd October 2014

Eastern basin of the Aral Sea has completely dried up

This year marks another milestone for the Aral Sea — a once huge lake in Central Asia that has been shrinking rapidly since the 1960s. For the first time in modern history, its eastern basin has completely dried up.

 


 

These images, taken by NASA's flagship Terra satellite, show how the Aral Sea has changed in just 14 years. It is now apparent that its eastern basin has completely dried up. The transformation is especially stark when compared to the approximate shoreline location in 1960 (black outline).

"This is the first time the eastern basin has completely dried in modern times," says Philip Micklin, a geographer from Western Michigan University and expert on the Aral Sea. "And it is likely the first time it has completely dried in 600 years, since Medieval desiccation associated with diversion of Amu Darya to the Caspian Sea."

In the 1950s and 60s, the government of the former Soviet Union diverted the Amu Darya and the Syr Darya – the region's two major rivers – in order to irrigate farmland. This diversion began the lake's gradual retreat. By the year 2000, the lake had separated into the North (Small) Aral Sea in Kazakhstan and the South (Large) Aral Sea in Uzbekistan. The South Aral had further split into western and eastern lobes.

 

The rusting remains of abandoned boats in the Aral Sea, Kazakhstan.

 

The eastern lobe of the South Aral nearly dried in 2009, then saw a huge rebound in 2010. Water levels continued to fluctuate annually in alternately dry and wet years.

According to Micklin, the desiccation in 2014 occurred because there has been less rain and snow in the watershed that starts in the Pamir Mountains; this has greatly reduced water flow on the Amu Darya. In addition, huge amounts of river water continue to be withdrawn for irrigation. The Kok-Aral Dam across the Berg Strait – a channel that connects the northern Aral Sea with the southern part – played some role, but has not been a major factor this year, he said.

Formerly the world's fourth largest lake (pictured below in 1964), the Aral Sea is often described as the worst ecological disaster on the planet. With its eastern half now gone, what remains of the western half is expected to vanish by 2019.

 

Satellite view of the Aral Sea in 1964.

 

 

 

 

 
     
       
     
   