21st April 2016
Fossil fuels could be phased out worldwide in a decade
The worldwide reliance on burning fossil fuels to create energy could be phased out in a decade, according to an article published by a major energy think tank.
Professor Benjamin Sovacool, from the University of Sussex in the United Kingdom, believes that the next great energy revolution could take place in a fraction of the time of major changes in the past. But it would take a collaborative, interdisciplinary, multi-scalar effort to get there, he warns. And that effort must learn from the trials and tribulations of previous energy systems and technology transitions.
In a paper published by the peer-reviewed journal Energy Research & Social Science, he analyses energy transitions throughout history and argues that looking only at the past can paint an unnecessarily bleak picture. Moving from wood to coal in Europe, for example, took between 96 and 160 years, while electricity took 47 to 69 years to enter mainstream use.
But this time, the future could be different, he says – the scarcity of fossil fuel resources, the threat of climate change and vastly improved technological learning and innovation could greatly accelerate a global shift to a cleaner energy future.
The study highlights numerous examples of speedier transitions that are often overlooked by analysts. For example, Ontario completed a shift away from coal between 2003 and 2014; a major household energy programme in Indonesia took just three years to move two-thirds of the population from kerosene stoves to LPG stoves; and France's nuclear power programme saw supply rocket from four per cent of the electricity supply market in 1970 to 40 per cent in 1982. Each of these cases has in common strong government intervention along with shifts in consumer behaviour, often driven by incentives and pressure from stakeholders.
"The mainstream view of energy transitions as long, protracted affairs, often taking decades or centuries to occur, is not always supported by evidence," says Professor Sovacool. "Moving to a new, cleaner energy system would require significant shifts in technology, political regulations, tariffs and pricing regimes, and the behaviour of users and adopters.
"Left to evolve by itself – as it has largely been in the past – this can indeed take many decades. A lot of stars have to align all at once. But we have learnt a sufficient amount from previous transitions that I believe future transformations can happen much more rapidly."
To summarise, while the study suggests that the historical record can be instructive in shaping our understanding of macro and micro energy transitions, it need not be predictive.
While his forecast may seem outlandish and overly optimistic, there are signs that he might be correct. The worldwide installed capacities of solar and wind power, for example, continue to grow exponentially as costs plummet and efficiencies improve. Likewise, the number of electric cars in the world is rising exponentially and passed the one million mark in 2015. In Europe, the Netherlands is proposing a ban on all gas and diesel car sales by 2025, while UK firm Ecotricity is looking at "green gas", a carbon-neutral alternative to fracking. Meanwhile in the United States, Elon Musk last year unveiled a new battery system – the Powerwall – that could revolutionise clean energy storage for homes and businesses.
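As a rough illustration of what exponential growth in installed capacity implies, compound growth and doubling time can be sketched in a few lines. The 25% annual growth rate below is an assumption chosen for illustration, not a figure from the article:

```python
import math

def doubling_time(annual_growth_rate):
    """Years for a quantity to double at a constant compound growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

def years_to_reach(current, target, annual_growth_rate):
    """Years for `current` to grow to `target` at a constant compound rate."""
    return math.log(target / current) / math.log(1 + annual_growth_rate)

# Illustrative only: at a hypothetical 25% annual growth rate...
rate = 0.25
print(f"Doubling time: {doubling_time(rate):.1f} years")   # ~3.1 years
print(f"Tenfold growth: {years_to_reach(1, 10, rate):.1f} years")
```

At such rates, even a small share of the energy mix becomes a dominant one within a couple of decades, which is the arithmetic behind the "exponential" claim.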
• Follow us on Twitter
• Follow us on Facebook
10th April 2016
Self-driving trucks complete journey across Europe
Fleets of self-driving trucks this week completed the European Truck Platooning Challenge.
As part of the world's first cross-border initiative with smart trucks, six "platoons" of semi-automated trucks have completed their journeys from various cities across Europe, reaching their final destination at the Port of Rotterdam on 6th April.
They were participating in the European Truck Platooning Challenge organised by the Dutch government as part of its EU Presidency. The European Automobile Manufacturers' Association (ACEA) and commercial vehicle members – including Daimler, Scania and Volvo – are active partners of the initiative with each supplying a platoon. One set of trucks made by Scania travelled over 2,000 km and crossed four borders. Daimler made headlines in 2014 when the company demonstrated the world's first autonomous truck in Magdeburg, Germany, and in 2015 its Freightliner Inspiration Trucks gained a licence for road use in Nevada.
Truck platooning – which has the potential to make transport cleaner, safer and more efficient – is the linking of two or three trucks in a convoy. These vehicles follow each other at a fixed, close distance, by using connectivity technology and automated driving support systems.
According to research firm TNO, platooning cuts fuel use by 15%, prevents accidents caused by human error, and reduces congestion, lowering operating expenses significantly.
Two trucks covering 100,000 miles annually can save €6,000 ($6,840) on fuel by platooning, compared to driving on cruise control. Safety is greatly improved by technology such as Volvo's emergency braking system and Daimler's Highway Pilot Connect – systems with braking reaction times of under 0.1 seconds, compared to around 1.4 seconds for a human driver. A Wi-Fi connection between the trucks ensures synchronised braking and prevents sudden shock effects.
When operating in platoon mode, a convoy of three semi-autonomous trucks can travel much closer together – requiring only 80 metres of road space, from end to end. For comparison, if they were driven by humans, they would need to fill 185 metres of road. Congestion on roads will therefore be greatly reduced if more and more self-driving vehicles are deployed in the future, while pollution can also be lowered.
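The road-space and reaction-time figures above can be reproduced with simple arithmetic. The truck length and gap values below are assumptions chosen to be roughly consistent with the article's 80 m and 185 m figures, not data from the trial itself:

```python
def convoy_length(n_trucks, truck_len_m, gap_m):
    """End-to-end road space occupied by a convoy of n_trucks."""
    return n_trucks * truck_len_m + (n_trucks - 1) * gap_m

TRUCK_LEN = 18.75   # typical EU maximum articulated truck length (assumed)
PLATOON_GAP = 12.0  # close following distance under automated control (assumed)
HUMAN_GAP = 64.0    # safe following distance for a human driver (assumed)

platoon = convoy_length(3, TRUCK_LEN, PLATOON_GAP)  # ~80 m end to end
human = convoy_length(3, TRUCK_LEN, HUMAN_GAP)      # ~184 m end to end

# Extra distance a human-driven truck travels before braking at 80 km/h,
# given a 1.4 s human reaction versus 0.1 s for the automated system:
speed = 80 / 3.6                       # km/h converted to m/s
reaction_gap = speed * (1.4 - 0.1)     # ~29 m of additional travel

print(round(platoon), round(human), round(reaction_gap))
```

The reaction-time arithmetic also explains why the automated trucks can safely follow so much more closely than humans.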
Melanie Schulz, Dutch minister for Infrastructure and the Environment who spearheaded this initiative, commented: "The results of this first ever major try-out in Europe are promising. The hands-on experience gained here will be very useful in the informal European transport council on 14th April in Amsterdam. It will certainly help my colleagues and me discuss the adjustments needed to make self-driving transport a reality."
There are still a number of barriers standing in the way of truck platooning across Europe. These barriers are not of a technical nature as platooning technology exists already; rather they are caused by differences in legislation between the EU member states: "Harmonisation is needed if we want a wide-scale introduction of platooning," stated Harrie Schippers, President of DAF Trucks.
Sufficient demand is also crucial, to ensure the right level of market uptake. Following the Truck Platooning Challenge, there have been encouraging expressions of interest from the business community and the transport sector, including Unilever and major Dutch supermarkets. The testing phase is the most important next step. More and more national governments are offering industry the opportunity to test their latest vehicles and technologies, thereby also supporting efforts to increase public awareness, understanding and acceptance. However, this is also vital on a pan-European scale.
"It is precisely for this reason that we believe that the European Truck Platooning Challenge has been a huge success: it has fostered much-needed cooperation between all relevant stakeholders right across the EU, facilitating cross-border driving and encouraging compatibility on legal and technical issues," said Schippers. "We look forward to harvesting the learnings from this initiative so that, together, we can make truck platoons a common sight on Europe's roads in the future."
20th March 2016
An interview with climate scientist Paul Beckwith
Following the recent news that Arctic sea ice reached its lowest extent on record, we interviewed climate scientist Paul Beckwith, an expert on this subject and a member of the Arctic Methane Emergency Group (AMEG).
Future Timeline: Hi Paul. Thanks for agreeing to do this interview. First of all, could you tell us a bit about your background, how long you've been involved in climate science, and what areas of climatology you specialise in?
Paul Beckwith: Hello, and thank you. It is my pleasure to have this interview with you.
I am an Engineer with a Bachelor of Engineering Degree in Engineering Physics (often called Engineering Science) from McMaster University in Hamilton, Ontario, Canada. I finished at the top of my class and received many scholarships and awards during my studies. My CV can be found on my website PaulBeckwith.net under the About Me section.
I am a Physicist with a Master of Science Degree in Laser Physics. My research area was blowing molecules apart with high-powered CO2 lasers and measuring the chunks flying off with low-power tunable diode lasers. This involved the science of molecular spectroscopy in the infrared region. I worked in industry for many years, as a Product Line Manager for optical switching devices in high speed fibre optic communication systems, on high-powered excimer laser research and tunable laser research, and also on software quality assurance for various tech companies.
I have been interested in climate science my entire life. I decided to formally study it about six years ago, after becoming concerned by the lack of urgency shown by the public and scientists (literally everybody).
I am a part-time professor in the Laboratory for Paleoclimatology in the Geography Department at the University of Ottawa. I have taught many courses, including climatology, meteorology, oceanography and the geography of environmental issues. My PhD research examines abrupt climate system change in the past and present, to determine what will happen in the near future. I am very active in educating the public about the grave dangers that we face from abrupt climate change, primarily through videos, blogs and public talks. My research is self-funded, apart from my teaching, and I greatly welcome financial contributions via the Please Donate button on the main task bar of my website.
Future Timeline: It's clear that the Arctic is melting rapidly and this trend is likely to continue. When do you predict the Arctic will start to have ice-free conditions? At what point during the year will it disappear, and how long for? How will these conditions develop in future decades, and could we reach a point where the Arctic is free of ice all year round?
Paul Beckwith: I think that the Arctic will start to have ice-free conditions at the end of the melt season (Septembers) as early as 2020 or before (possibly even the summer of 2016). It is hard to predict a single year, since the loss of Arctic sea ice greatly depends on local Arctic wind and ocean conditions in the summer melt season. These local conditions determine how much ice is lost to export via the Fram Strait and Nares Strait, which makes a huge difference to ice loss amounts during the Northern summer period. When there is less than 1 million square km of sea ice left, we have essentially a "blue-ocean" event in the Arctic.
For the sake of argument, let us pick September 2020 for the first "blue-ocean" event in the Arctic. This would occur for about a month – call it the month of September. Within two or three years, it is highly likely that the duration of this "blue-ocean" state would be three months or so, thus occurring for August, September and October in 2023. Within an additional few years, say by 2025, it is highly likely that the "blue-ocean" event would be extended for another few additional months, and we'd have ice-free conditions from July through to and including November; namely for five months of the year. Then, within a decade or two from the initial 2020 event we can expect to have an ice-free "blue-ocean" Arctic year round; that would be some year between 2030 and 2040.
Of course, if the first "blue-ocean" event occurred in 2016, this timeline would be advanced accordingly.
Future Timeline: In recent years, there's been a lot of talk about methane eruptions in the Arctic and Siberia. How serious is this, in terms of its potential for adding to global warming? Can you give us some idea of the timescales involved? What's the level of certainty about these future effects?
Paul Beckwith: Once the Arctic is essentially ice-free for ever-increasing durations in the summer months, and then over the entire year, there are two enormous feedback risks that we face: methane and Greenland.
Methane is the mother of all risks. The Russians have measured large increases in emissions from the continental shelf seabed in the Eastern Siberian Arctic Shelf (ESAS). Over a few years, they observed methane bubbling up in vast numbers of plumes that grew from tens of metres in diameter to hundreds and even thousands of metres in the shallow regions of the ESAS. Global atmospheric levels of methane are rising rapidly: although they average about 1900 ppb, readings of over 3100 ppb have been recorded in the atmosphere over the Arctic. Since the Global Warming Potential (GWP) of methane versus carbon dioxide is 34x, 86x and close to 200x on timescales of 100 years, 20 years and a few years respectively, a large burst of methane could warm the planet by several degrees almost overnight.
Recently, we have passed about 405 ppm of CO2, with a record rise of 3.09 ppm in 2015 alone. When accounting for methane and other greenhouse gases and putting them into CO2-equivalent numbers, we are at about 490 ppm CO2 equivalent. We are literally playing with fire, and the outcome will not be pretty.
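The ~490 ppm CO2-equivalent figure can be approximated, though not exactly reproduced, using the simplified radiative forcing expressions published in the IPCC reports (Myhre et al., 1998). The sketch below includes only CO2 and methane, and ignores the CH4–N2O band overlap and all other greenhouse gases, so it understates the full CO2-equivalent total:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified CO2 radiative forcing (W/m^2) relative to pre-industrial."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def ch4_forcing(m_ppb, m0_ppb=722.0):
    """Simplified CH4 forcing (W/m^2), ignoring the N2O band overlap term."""
    return 0.036 * (math.sqrt(m_ppb) - math.sqrt(m0_ppb))

# Approximate 2016 concentrations from the interview: 405 ppm CO2, ~1900 ppb CH4
total = co2_forcing(405) + ch4_forcing(1900)

# Invert the CO2 expression to find the CO2 concentration that alone
# would produce the same total forcing:
co2e = 280.0 * math.exp(total / 5.35)
print(round(co2e))  # ~450 ppm from CO2 + CH4 alone; N2O and other gases push it higher
```

That CO2 and methane alone already yield roughly 450 ppm CO2-equivalent makes the ~490 ppm figure, which counts all greenhouse gases, look plausible.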
Greenland ice melt is the next enormous feedback risk. When we lose snow and ice in the Arctic – and the cascading feedbacks like albedo destruction kick in – and the methane comes out, then the enormous warming over Greenland and in the water around and under the Greenland ice will viciously destroy the ice there and greatly accelerate sea level rise. I refer people to my video from several years ago on the great risk of seeing 7 metres of global sea level rise by 2070 from Greenland and Antarctic melt.
The level of certainty over these future effects is close to 100% if we continue to be stupid and do nothing. If we are smart, we need a Manhattan Project/Marshall Plan-style emergency programme to: a) reach zero emissions as soon as possible, i.e. by 2030; b) cool the Arctic to keep the methane in place and restore jet stream stability; and c) remove CO2 from the atmosphere/ocean system and remove methane from the atmosphere. There is no other choice. I use the metaphor of a three-legged bar stool with legs a), b) and c) as above.
Future Timeline: What new satellites, monitoring stations, and other science projects are being planned for the future (if any)? How will these improve our knowledge of the Arctic and the various climatic processes in the region?
Paul Beckwith: NASA, the ESA and the Russians and Chinese are always launching new satellites with better high-tech sensors to gather more information on changes in the Earth System. We need a massive increase in scientific study in the Arctic to better quantify what is happening there. However, we know enough to see that if we do not deploy the three-legged bar stool approach immediately, then our chances of halting the ongoing abrupt climate change will vanish, and emissions from the Earth System will dwarf all cumulative anthropogenic emissions throughout human history. We need the US military budget of $700 to $800 billion per year to be applied to saving human civilisation from abrupt climate change.
Future Timeline: What can be done to save the Arctic and reverse the melting trend? How long would it take to restore the ice cover to, say, mid-20th century levels? Is this even possible with current technology?
Paul Beckwith: We must cool the Arctic as soon as possible using Solar Radiation Management (SRM) technologies. We can deploy very quickly if we treat this Arctic temperature amplification as an existential threat to humanity and put billions of dollars into deployment. It will take many years – perhaps a decade – to restore the ice cover, but we must start now. If we wait until we have "blue-ocean" events before we deploy, then restoring the ice will be much harder and perhaps even futile. Deployment is possible with current technology. I am specifically referring to Marine Cloud Brightening (MCB) methods, and I am working today with people on these technologies.
Solar Radiation Management (SRM). Credit: Hugh Hunt
Future Timeline: How does the melting in the Arctic compare to its southern polar opposite, the Antarctic?
Paul Beckwith: The Arctic is rapidly losing snow cover (mostly in the spring months) and sea ice cover, so the average albedo (reflectivity) of the region is rapidly decreasing. This feeds back into additional Arctic temperature amplification and further darkening and warming, until no snow and ice remain in the region. These vicious feedback cycles have not kicked in to the same extent in the Antarctic, where the ice cap is losing ice – and raising sea level – mostly because warming seawater is undercutting land ice that is grounded below sea level. Meanwhile, because the Arctic is warming so fast from increased solar radiation absorption (from darkening), less heat is transported there via the atmosphere and oceans, and the northern jet streams and ocean currents are slowing. More heat is therefore moving from the equator into the southern hemisphere, reaching Australian latitudes, increasing the temperature gradient to Antarctica and thus speeding up the jet streams there.
Future Timeline: Finally, what's your message to climate change deniers who reject the science and believe the whole thing is a giant hoax?
Paul Beckwith: Climate change deniers cannot be tolerated by society any longer. They are threatening the future of everybody on our planet.
14th March 2016
Viessmann launches the UK's first WiFi-enabled boiler
Leading international heating systems manufacturer Viessmann has announced it is introducing the first domestic heating boilers with WiFi and Internet connectivity.
Viessmann made the announcement at the EcoBuild exhibition and conference in London. The new technology will enable homeowners to control their heating and hot water settings from anywhere in the world with a mobile phone or Internet connection, simply by using the new, free Viessmann Vicare app.
The app – available for Apple and Android smartphones, as well as the iPad, iPod touch and Android tablets – allows homeowners to set the boiler’s daily programme and to adjust the boiler’s functions, improving heating comfort and convenience and saving unnecessary energy costs. The app will also remind the homeowner and the homeowner’s chosen registered gas engineer when the boiler’s annual service is due.
Unlike other apps for domestic boilers, which communicate only with thermostats, Viessmann’s is the first to connect with the boiler and to continually monitor its performance. If a technical fault should develop, the app will automatically inform the gas engineer, with a diagnosis of the problem and list of the parts needed for rectification.
All new Viessmann Vitodens 100 and 200 gas condensing boilers, which go on sale in September, can be WiFi enabled. All 100 and 200 range models installed since 2007 can also be WiFi enabled retrospectively, with a £60 control accessory. Internet connectivity is achieved via Viessmann’s Vitocom 100 system, which connects to the homeowner’s WiFi.
Viessmann’s marketing director, Darren McMahon commented: “We’re living in an increasingly connected world where we expect to have all information at our fingertips, and the health and servicing needs of our domestic heating and hot water boilers should be no different. Internet-enabled boilers are a big step forward in boiler development and Viessmann is proud to be the first manufacturer to make this available.
“We’ve given homeowners increased security and the potential to save money and reduce their carbon footprint, whilst increasing peace of mind.”
12th March 2016
New plastic-eating bacteria discovered
The first species of bacteria able to degrade polyethylene terephthalate (PET) has been described by Japanese researchers.
Ideonella sakaiensis (left) and the degraded remains of plastic (right). Credit: Kohei Oda / Yoshida et al.
The first species of bacteria able to degrade polyethylene terephthalate (PET) has been described by Japanese researchers. PET is a type of polymer used in plastic that is highly resistant to biodegradation. About 56 million tons of PET was produced worldwide in 2013 alone, and its accumulation in ecosystems around the globe is increasingly problematic.
To date, very few species of fungi – but no bacteria – have been found to break down PET. In their new study published yesterday, Yoshida et al. collected 250 samples of PET debris and screened for bacterial candidates that depend on PET film as a primary source of carbon for growth. A novel bacterium was identified, which could almost completely degrade a thin film of PET after six weeks of exposure at 30°C (86°F). They have named the organism Ideonella sakaiensis.
Further investigation identified an enzyme, ISF6_4831, which works with water to break down PET into an intermediate substance, which is then further broken down by a second enzyme, ISF6_0224. These two enzymes alone can break down PET into its simpler building blocks. Remarkably, they appear to be highly unique in their function compared to the closest related known enzymes of other bacteria, raising questions about how this plastic-eating bacterial species originated.
Rapid evolution, in as little as 70 years, may have occurred in response to the build-up of plastic in the environment, according to Enzo Palombo, professor of microbiology at Swinburne University in Australia: "If you put a bacteria in a situation where they've only got one food source to consume, over time they will adapt to do that," he said.
"This is the first rigorous study – it appears to be very carefully done – that I have seen that shows plastic being hydrolysed [broken down] by bacteria," said Dr Tracy Mincer, a researcher at the Woods Hole Oceanographic Institution, USA. "I think we are seeing how nature can surprise us and in the end the resiliency of nature itself."
However, the bacteria took longer to degrade highly crystallised PET, which is used in plastic bottles. That means the enzymes and processes would need refinement before they could be used in recycling, ocean clean-up, or other environmental regeneration projects.
"It's difficult to break down highly crystallised PET," said Prof Kenji Miyamoto from Keio University, one of the paper's co-authors. "Our research results are just the initiation for the application. We have to work on so many issues needed for various applications. It takes a long time."
Future studies would also need to determine if the bacteria can survive in salty seawater and fluctuating ocean temperatures. There might be additional species out there, which have already adapted for these conditions, and just need to be found, says Palombo: "I would not be surprised if samples of ocean plastics contained microbes that are happily growing on this material and could be isolated in the same manner," he said.
The study was published yesterday in Science.
9th March 2016
World's second largest rainforest at risk from lifting of logging moratorium
New licences could soon be issued to logging companies in the Democratic Republic of Congo (DRC), threatening to accelerate the rate of deforestation in the region.
A tropical rainforest more than twice the size of France is at risk of being cut down, following news from the Democratic Republic of Congo (DRC) that the government is planning to re-open its forest to new logging companies. This comes at a time when the governments of Norway, France, Germany, the UK, and the European Union are assessing whether to support a billion-dollar plan proposed by the DRC government to protect the country's 1.55 million square kilometres of forests.
A coalition of environmental and anti-corruption organisations is calling on the DRC to maintain its moratorium on the allocation of new logging licences, which has been in place since 2002.
"The large-scale logging of DRC's rainforest was and is a disaster," said Irène Betoko of Greenpeace Africa. "It not only harms the country's environment, but also fuels corruption and creates social and economic havoc. We call upon the DRC government to keep the present logging moratorium in place."
Lars Løvold of the Rainforest Foundation Norway (RFN) said: "At a time when the global community is working together to protect the world's last rainforests, a vital defence against climate change, the DRC government seems to be undermining the commitment to reducing emissions that it presented in Paris."
DRC Environment Minister, Robert Bogeza, outlining his priorities for 2016, stated that measures are being taken to lift the moratorium on the allocation of new logging licences, citing the financial benefits this could bring: "The moratorium on granting new forestry concessions, decreed in 2002 by Ministerial Decree and reaffirmed in 2005 by Presidential Decree, has caused a huge shortfall in revenues for our country. Measures are underway for the Government to lift it."
However, Joseph Bobia of Réseau Ressources Naturelles (RRN) said: "The argument that logging can significantly contribute to government revenues is completely unfounded. Around a tenth of the DRC's rainforest is already being logged. And yet, in 2014 the country obtained a pitiful US$8 million in fiscal revenues from the sector – the equivalent of about 12 cents for every Congolese person."
Simon Counsell of the Rainforest Foundation UK said: "The expansion of industrial logging in the Congo's rainforests is likely to have serious long-term negative impacts on the millions of people living in and depending on those forests. We urge the government of DRC to instead promote community-based forest protection and alternatives to logging that will help the country's population prosper."
Reducing Emissions from Deforestation and Forest Degradation (REDD) is an international effort under the UN climate treaties to combat carbon emissions by protecting the world's forests. The DRC's national strategy for REDD has been under negotiation for six years and will be submitted to international donor governments for approval this year.
The moratorium on the allocation of new logging titles was issued by Ministerial decree in 2002, in an attempt to regain control of the country's timber industry, which was riddled with illegal logging and corruption. The DRC accounts for a tenth of the world's remaining tropical rainforests. Many species, such as the bonobo and okapi, are only found in these ecosystems. Some 40 million people in the country rely on these forests for their livelihoods.
A civil society briefing is available to download here.
8th March 2016
Arctic sea ice hits record low
Last month, Arctic sea ice reached its lowest extent on record for a February.
The latest available satellite data shows that last month, the Arctic sea ice extent averaged 14.22 million square kilometres (5.48 million sq miles), the lowest on record for a February. The National Snow and Ice Data Center (NSIDC), which published the data, confirms it was the second month in a row that a satellite record low has been observed.
Surface temperatures in the region averaged more than 4 degrees Celsius (7.2 degrees Fahrenheit) above the 1951 to 1980 average. Air temperatures at the 925 hPa level were even higher: 6 to 8 degrees Celsius (11 to 14 degrees Fahrenheit) above the 1981 to 2010 average over the central Arctic Ocean near the pole. This follows 10 straight months of record-breaking high surface temperatures around the globe.
The loss of thicker, older ice has become an especially alarming trend in the Arctic. More than half of the older ice has disappeared since 1987, which is partly responsible for the sharp decline in overall ice. The younger ice is thinner and more brittle, and tends to melt faster. As warming continues, it could seriously affect the jet stream, while disappearing sea ice could trigger feedbacks as darker ocean water absorbs more heat than white, reflective ice.
Many gigatons of methane are trapped in permafrost and subsea clathrates, but the rapid melt is already causing outbursts in the region. Methane is a greenhouse gas with 72 times the heat-trapping potential of CO2 when measured over a 20-year period.
The current spike in global temperatures is being driven by the ongoing El Niño. However, regardless of this phenomenon, the signature of man-made warming is clearly evident in the background. Natural causes alone simply cannot account for the extra heat within the climate system. In a worrying sign of things to come, an alarming new milestone was reached on Thursday 3rd March. For a few brief hours, the average temperature in the northern hemisphere hit 2°C above the pre-industrial average, considered by most nations to be the "dangerous" limit for climate change. In terms of 12-month global averages, we are not expected to witness such an increase until the 2040s; but already half the world has experienced this temperature in 2016 – if only for a short time – which underscores just how rapidly our planet is changing.
In related news, the UAH V6 Global Temperature record is now reported to have surpassed its previous high. For many years, climate change deniers have been using this dataset (which is based on indirect satellite inferences, as opposed to ground-level thermometers) to claim that global warming stopped in 1998. However, this dataset is now also showing a clear upward trend in global average temperatures, as illustrated in the graph below.
From an environmental perspective, the U.S. presidential election this November will be of critical importance. All of the Republican candidates have vehemently denied the science of global warming and promised to undo the climate goals negotiated just last December in Paris, as well as threatening to curtail the Environmental Protection Agency (EPA). On the Democratic side, Clinton has been inconsistent and appeared to flip-flop on various issues. Only Bernie Sanders has presented a clear and unwavering message on solving climate change.
29th February 2016
Thinnest and lightest solar cells ever made
A solar cell so thin, flexible, and lightweight that it can be draped on a soap bubble has been demonstrated by MIT.
Credit: Joel Jean and Anna Osherov
Imagine solar cells so thin, flexible, and lightweight that they could be placed on almost any material or surface, including your hat, shirt, or smartphone, or even on a sheet of paper or a helium balloon.
Researchers at the Massachusetts Institute of Technology (MIT) have now demonstrated just such a technology: the thinnest, lightest solar cells ever produced. Though it may take years to develop into a commercial product, this laboratory proof-of-concept shows a new approach to making solar cells that could help power the next generation of portable electronic devices.
The new process is described in a paper by Professor Vladimir Bulovic, research scientist Annie Wang, and doctoral student Joel Jean, in the April 2016 edition of the journal Organic Electronics. The key to their technique is making the solar cell, the substrate that supports it, and a protective overcoating all in one process. The substrate is made in place and never needs to be handled, cleaned, or removed from the vacuum during fabrication, thus minimising exposure to dust or other contaminants that could degrade the cell's performance.
"The innovative step is the realisation that you can grow the substrate at the same time as you grow the device," Bulovic says.
In this initial proof-of-concept experiment, the team used a common flexible polymer called parylene as both the substrate and the overcoating, and an organic material called DBP as the primary light-absorbing layer. Parylene is a commercially available plastic coating used widely to protect implanted biomedical devices and printed circuit boards from environmental damage. The entire process occurs at room temperature, without using any solvents – unlike conventional solar cell manufacturing that requires high temperatures and harsh chemicals. In this case, both the substrate and the solar cell are "grown" using established vapour deposition techniques.
The team emphasises that these particular choices of materials were just examples, and that it is the in-line substrate manufacturing process that is the key innovation. Different materials could be used for the substrate and encapsulation layers, and different types of thin-film solar cell materials including quantum dots or perovskites could be substituted for the organic layers used in initial tests. But already, the team has achieved the thinnest and lightest complete solar cells ever made, they say. To demonstrate just how thin and lightweight the cells are, the researchers draped a working cell on top of a soap bubble, without popping the bubble.
The complete solar cells, including substrate and overcoating, are just two micrometres thick – about 1/50th the thickness of a human hair, and 1/1000th the thickness of equivalent cells on glass substrates – yet they convert light into electricity just as efficiently as their glass-based counterparts.
"We put our carrier in a vacuum system, then we deposit everything else on top of it, and then peel the whole thing off," explains Wang.
While they used a glass carrier for their solar cells, Jean says "it could be something else. You could use almost any material," since the processing occurs under such benign conditions. The substrate and solar cell could be deposited directly on fabric or paper, for example.
While the solar cell in this demonstration device is not especially efficient, its extremely low weight gives it a power-to-weight ratio among the highest ever achieved. That matters for applications in which weight is a vital factor – such as on spacecraft, aeroplanes, or high-altitude balloons. Whereas a typical silicon-based solar module, whose weight is dominated by a glass cover, may produce about 15 watts of power per kilogram, the new MIT cells have already demonstrated an output of 6 watts per gram – about 400 times higher.
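The 400-fold figure follows directly from the quoted numbers once the units are matched – a quick sanity check:

```python
# Power-to-weight figures quoted in the article
silicon_module = 15       # W/kg, typical glass-covered silicon module
mit_cell = 6 * 1000       # 6 W/g expressed in W/kg

ratio = mit_cell / silicon_module
print(ratio)              # 400.0 – "about 400 times higher"
```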
"It could be so light that you don't even know it's there, on your shirt or on your notebook," says Bulovic.
This is still early, laboratory-scale work, and developing a mass-produced, commercial version will take time. Yet while success in the short term may be uncertain, this work could open up dramatic new applications for solar power in the longer term. "We have a proof-of-concept that works," Bulovic says. The next question is, "How many miracles does it take to make it scalable? We think it's a lot of hard work ahead – but likely no miracles needed."
"This demonstration by the MIT team is almost an order of magnitude thinner and lighter than the previous record holder," says Max Shtein, a professor of materials science and engineering at the University of Michigan, who was not involved in this work. "It has tremendous implications for maximising power-to-weight (important for aerospace applications, for example), and for the ability to simply laminate photovoltaic cells onto existing structures."
"This is very high quality work," Shtein adds, with a "creative concept, careful experimental set-up, very well written paper, and lots of good contextual information." And, he says, "The overall recipe is simple enough that I could see scale-up as possible."
24th February 2016
Samsung predicts the world 100 years from now
Hyper-tall skyscrapers, underwater bubble cities, personal home "medi-pods" and civilian colonies on the Moon are all likely to be a reality in a hundred years' time, according to a report commissioned by Samsung.
A new study commissioned by Samsung paints a vivid picture of our future lives, suggesting the way we live, work and play will change beyond recognition over the course of the next century. The SmartThings Future Living Report was authored by a team of leading academics – including TV presenter and space scientist, Dr Maggie Aderin-Pocock, award-winning futurist architects and lecturers at the University of Westminster, Arthur Mamou-Mani and Toby Burgess, as well as pioneering urbanists Linda Aitken and Els Leclerq.
The report was released to promote SmartThings, a system that lets people make their homes smarter: at any time and from anywhere, it is possible to switch on lights, turn up the thermostat or unlock the back door, all via a simple app or automatically through daily routines – something that might have seemed like science fiction as little as ten years ago, but is a reality today.
The predictions for how we will live in the future have been brought to life via detailed animated renders, showing a futuristic London where high rise apartments dwarf the Shard, and drone transportation is ubiquitous.
Many of the predictions were influenced by environmental conditions, with growing populations leading to the development of structures better able to cope with space constraints and diminishing resources. As city space becomes ever more squeezed, we will burrow deeper and build higher with the creation of:
Super skyscrapers: carbon nanotubes and diamond nano-threads will help us create towering megastructures that dwarf today's skyscrapers
Earth-scrapers: just as we build up, we will also dig down – huge structures will tunnel 25 storeys deep, or more
Underwater cities: using the water itself to create breathable atmospheres and generating hydrogen fuel in the process
Personal flying drones replacing cars: we will travel through "skyways" with our own personal flying drones, some big enough to carry entire homes around the world for holidays
As technology develops, Samsung predicts:
3D printing of houses and furniture: we will be able to print exact replicas of large-scale structures like houses out of local, recyclable materials so that we really can have all the comforts of home while we are away
Flexible, smart walls and 3D-printed Michelin-starred meals: smart walls will mean you won't need to decorate your home – room surfaces will adapt to suit your mood. When it comes to entertaining, there will be no more botched recipes or pizza deliveries – instead, we will download dishes from famous chefs and tailor them to our personal needs. We will be able to 3D-print a banquet or a favourite cake in minutes
Virtual meetings: our working lives will be transformed with the use of holograms which will allow us to attend meetings virtually, without leaving the comfort of our homes
Home medi-pods: stepping into these will confirm if you really are ill, providing a digital diagnosis and supplying medicine or a remote surgeon if needed
Colonisation of space: first the Moon, then Mars, then far beyond into the galaxy
In addition to looking at how we will live in 100 years' time, the SmartThings team surveyed 2,000 British adults to pinpoint the predictions the nation thought were most likely to become a reality in the future. The survey shines a spotlight on public perception of the future, suggesting that building further into the sky and colonising the oceans are believed to be the biggest future trends as space and resources become scarce. The top ten predictions for future living are:
1. Virtual work meetings – the ability to work from anywhere and attend meetings remotely via avatars/holograms
2. Commercial flights into space
3. Virtual interior decoration to program your surroundings/LED walls that adapt surroundings to your mood
4. 3D printed houses/furniture/food – you can instantly download and print these things at home
5. At-home scanning capsules/pods that can diagnose health problems and administer medicines/treatments
6. Colonising other planets as we use up resources on Earth
7. AI becoming a normal part of daily life – taking over from humans in many industries
8. Giant skyscrapers that house entire cities, built with new super-strong materials
9. At-home hydroponic farms (that don't require soil) where you can grow your own food
10. Earth-scrapers – parts of cities becoming subterranean, due to space constraints and also to provide further shelter
Space Scientist Maggie Aderin-Pocock, who co-authored the report, commented: "Our lives today are almost unrecognisable from those a century ago. The Internet has revolutionised the way we communicate, learn and control our lives. Just 25 years ago, technology like SmartThings would have been inconceivable, yet today, developments like this let us monitor, control and secure our living spaces with the touch of a smartphone. Over the next century we will witness further seismic shifts in the way we live and interact with our surroundings – working on the SmartThings Future Living Report, with a panel of industry experts, has allowed me to explore what these could be.
"We are likely to see the emergence of towering megastructures, as well as sub-aquatic cities and transportation via advanced flying drones – some of which could be strong enough to transport entire houses on holiday."
James Monighan, UK Managing Director of Samsung SmartThings, comments: "The smart home revolution will have massively positive implications on how we live. Our homes are becoming smarter – they can now detect the presence of things like people, pets, smoke, humidity, lighting and moisture. And this is just the beginning.
"Just as the technology driving the Internet has spread to smartphones and smart homes, the smart home revolution is destined to spread to larger communities and countries. By simply turning lights and heating off when we don't use them, we can reduce emissions. By being able to better monitor and secure our homes, we can reduce crime. By better monitoring the habits of aging relatives, we help them to achieve greater independence and a higher quality of life."
The report has been published to coincide with the announcement that SmartThings will work with hundreds of products, from a wide range of brands – as well as working with all of Samsung's TVs, refrigerators, washing machines, ovens, and robot vacuum cleaners.
28th January 2016
Doomsday Clock stays at three minutes to midnight
The Bulletin of the Atomic Scientists Science and Security Board has announced that their closely monitored "Doomsday Clock" will remain at three minutes to midnight.
The Doomsday Clock is a symbolic clock face, representing a countdown to possible global catastrophe (e.g. nuclear war or climate change). It has been maintained since 1947 by the Science and Security Board of the Bulletin of the Atomic Scientists, which includes 18 Nobel Laureates. The closer they set the Clock to midnight, the closer the scientists believe the world is to global disaster.
The position of the Clock hands in the past has ranged from two minutes to midnight in 1953 (after the U.S. began testing hydrogen bombs, which was followed by Soviet tests shortly after), to 17 minutes to midnight in 1991 (when the Cold War ended and deep cuts were made to nuclear arsenals).
Last year, the Clock hands were moved from five to three minutes to midnight, with the Bulletin stating: "Unchecked climate change, global nuclear weapons modernisations, and outsized nuclear weapons arsenals pose extraordinary and undeniable threats to the continued existence of humanity, and world leaders have failed to act with the speed or on the scale required to protect citizens from potential catastrophe. These failures of political leadership endanger every person on Earth."
This week, it was announced that the Doomsday Clock will remain at three minutes to midnight, since recent progress in the Iran nuclear agreement and the Paris climate accord "constitute only small bright spots in a darker world situation full of potential for catastrophe."
The statement accompanying the Doomsday Clock decision opens with the following words: "Three minutes (to midnight) is too close. Far too close. We, the members of the Science and Security Board of the Bulletin of the Atomic Scientists, want to be clear about our decision not to move the hands of the Doomsday Clock in 2016: That decision is not good news, but an expression of dismay that world leaders continue to fail to focus their efforts and the world's attention on reducing the extreme danger posed by nuclear weapons and climate change. When we call these dangers existential, that is exactly what we mean: They threaten the very existence of civilization and therefore should be the first order of business for leaders who care about their constituents and their countries."
While recognising the important progress of the Iran nuclear deal and the Paris climate accord, the Bulletin cautions that these positive steps have been offset in large part by foreboding developments: "Even as the Iran agreement was hammered out, tensions between the United States and Russia rose to levels reminiscent of the worst periods of the Cold War. Conflict in Ukraine and Syria continued, accompanied by dangerous bluster and brinkmanship, with Turkey, a NATO member, shooting down a Russian warplane involved in Syria, the director of a state-run Russian news agency making statements about turning the United States to radioactive ash, and NATO and Russia repositioning military assets and conducting significant exercises with them. Washington and Moscow continue to adhere to most existing nuclear arms control agreements, but the United States, Russia, and other nuclear weapons countries are engaged in programs to modernise their nuclear arsenals, suggesting that they plan to keep and maintain the readiness of their nuclear weapons for decades, at least — despite their pledges, codified in the Nuclear Non-Proliferation Treaty, to pursue nuclear disarmament."
On the climate front, the Bulletin statement points out: "Promising though it may be, the Paris climate agreement came toward the end of Earth's warmest year on record, with the increase in global temperature over pre-industrial levels surpassing one degree Celsius."
Other, more positive climate developments cited in the statement include the Papal encyclical on climate change, the movement among investors toward divestment from fossil fuels, new advances in sustainable energy systems, and the election of more climate-friendly governments in Canada and Australia. However, the statement cautions that even these developments must be seen "against the steady backtracking of the United Kingdom's present government on climate policies and the continued intransigence of the Republican Party in the U.S., which stands alone in the world in failing to acknowledge even that human-caused climate change is a problem."
The Bulletin also reflects concerns about "the nuclear power vacuum" around the globe: "The international community has not developed coordinated plans to meet cost, safety, radioactive waste management, and proliferation challenges that large-scale nuclear expansion poses ... Because of such problems, in the United States and in other countries, nuclear power's attractiveness as an alternative to fossil fuels has decreased, despite the clear need for carbon-emissions-free energy in the age of climate change."
Rachel Bronson, executive director of the Bulletin, comments: "Last year, the Bulletin's Science and Security Board moved the Doomsday Clock forward to three minutes to midnight, noting: 'The probability of global catastrophe is very high, and the actions needed to reduce the risks of disaster must be taken very soon.' That probability has not been reduced. The Clock ticks. Global danger looms. Wise leaders should act — immediately."
Lawrence Krauss, chair of the Bulletin's Board of Sponsors, says: "Developments have been mixed since we moved the clock forward a year ago. In spite of some positive news, the major challenges the Bulletin laid out for governments then have not been addressed, even as the overall global challenges we need to face become more urgent. The clock reflects our estimate that the world is as close to the brink as it was in 1983 when US-Russian tensions were at their iciest in decades."
Sharon Squassoni, Bulletin Science and Security Board member, and a director of the Proliferation Prevention Program at the Centre for Strategic & International Studies (CSIS) in Washington, DC, said: "North Korea's recent nuclear test illustrates the very real danger of life in a proliferated world. Nuclear proliferation isn't a potential threat — we still have few controls over the kinds of capabilities that Iran succeeded in acquiring. In addition, regional tensions and conflict increase the risk of theft or use of these weapons."
Sivan Kartha, Bulletin Science and Security Board member, senior scientist and climate change expert, states: "The voluntary pledges made in Paris to limit greenhouse gas emissions are insufficient to the task of averting drastic climate change. These incremental steps must somehow evolve into the fundamental change in world energy systems needed if climate change is to ultimately be arrested."
So, what steps need to be taken?
The Bulletin statement accompanying the Doomsday Clock announcement identifies the following as the most urgently needed:
• Dramatically reduce proposed spending on nuclear weapons modernisation programs.
• Re-energise the disarmament process, with a focus on results.
• Engage North Korea to reduce nuclear risks.
• Follow up on the Paris accord with actions to sharply reduce greenhouse gas emissions and fulfil the global agreement to keep warming below 2°C.
• Deal now with the commercial nuclear waste problem.
• Create institutions specifically assigned to explore and address potentially catastrophic misuses of new technologies.
18th January 2016
Next ice age delayed by 100,000 years
Man-made carbon emissions have delayed the next ice age by as much as 100,000 years, according to researchers at the Potsdam Institute for Climate Impact Research.
Artist's impression of the northern hemisphere during the last Ice Age. By Ittiz (Own work) [CC BY-SA 3.0], via Wikimedia Commons.
Humanity has become a geological force that is able to suppress the beginning of the next ice age, according to a new study published in Nature. Scientists at the Potsdam Institute for Climate Impact Research in Germany found that the relationship between insolation (solar radiation hitting the Earth's surface) and the CO2 concentration in the atmosphere is the key criterion for explaining the last eight glacial cycles in Earth's history. At the same time, their results show that even moderate human interference with the planet's natural carbon balance might postpone the next glacial inception by 100,000 years.
Even without man-made climate change, the natural cycle of the current interglacial period (the Holocene) suggests that the earliest the next ice age could be expected to arrive is 50,000 years from now. However, the CO2 added by human activity since the Industrial Revolution is already sufficient to postpone the next ice age by a further 50,000 years, while anthropogenic emissions in the coming decades may add yet another 50,000 years.
"The bottom line is that we're basically skipping a whole glacial cycle, which is unprecedented," says lead author Andrey Ganopolski. "It's mind-boggling that humankind is able to interfere with a mechanism that shaped the world as we know it."
For the first time, says Ganopolski, scientists can explain the onset of the past eight ice ages by quantifying several key factors that preceded the formation of each glacial cycle: "Our results indicate a unique functional relationship between the summer insolation and atmospheric CO2 for the beginning of a large-scale ice-sheet growth which does not only explain the past, but also enables us to anticipate future periods when glacial inception might occur again."
Using an elaborate Earth system model – simulating atmosphere, ocean, ice sheets and global carbon cycle at the same time – the scientists analysed the effects of further human-made CO2-emissions on the ice volume in the Northern Hemisphere.
"Due to the extremely long life-time of anthropogenic CO2 in the atmosphere, past and future emissions have a significant impact on the timing of the next glacial inception," says co-author Ricarda Winkelmann. "Our analysis shows that even small additional carbon emissions will most likely affect the evolution of the Northern Hemisphere ice sheets over tens of thousands of years, and moderate future anthropogenic CO2-emissions of 1000 to 1500 Gigatons of Carbon are bound to postpone the next ice age by at least 100,000 years."
The search for what drives each glacial cycle remains one of the most fascinating questions of Earth system analysis, and especially paleoclimatology – the study of climate changes throughout the entire history of our planet. Usually, the beginning of a new ice age is marked by periods of very low solar radiation in summer, like current times. However, at present there is no evidence for a new ice age emerging: "This is the motivation for our study. Unravelling the mystery of the mechanisms driving past glacial cycles also facilitates our ability to predict the next glacial inception," Winkelmann says.
"Like no other force on the planet, ice ages have shaped the global environment and thereby determined the development of human civilisation. For instance, we owe our fertile soil to the last ice age that also carved out today's landscapes, leaving glaciers and rivers behind, forming fjords, moraines and lakes. However, today it is humankind with its emissions from burning fossil fuels that determines the future development of the planet," co-author and PIK-Director Hans Joachim Schellnhuber says. "This illustrates very clearly that we have long entered a new era, and that in the Anthropocene humanity itself has become a geological force. In fact, an epoch could be ushered in which might be dubbed the Deglacial."
31st December 2015
World's first in-office paper recycling system
Epson has announced "PaperLab", the world's first in-office papermaking system that turns waste paper into new sheets.
Epson Corporation has developed what it believes to be the world's first compact, in-office papermaking system capable of producing new paper from securely shredded waste paper, without the use of water. Epson plans to put the new "PaperLab" into commercial production in Japan in 2016, with sales in other regions to be decided at a later date. Businesses and government offices that install a PaperLab in a backyard area will be able to produce new paper of various sizes, thicknesses, and types – from office paper and business cards, to paper that is coloured and scented.
Paper is traditionally made from trees, a limited resource in a world of ever-increasing demand. As a leading company in the world of printing, Epson has been deeply involved with paper used for its printer products. With this in mind, the company set out to develop a new technology that would change the paper cycle. With PaperLab, Epson aims to give new value to paper and to improve recycling.
Office-based recycling process
Ordinarily, paper is recycled in an extensive process that typically involves transporting waste paper from the office to a papermaking (recycling) facility. With PaperLab, however, the recycling process can be shortened and localised within the office itself.
Secure destruction of confidential documents
Until now, enterprises have had to hire contractors to handle the disposal of confidential documents, or shred them themselves. With PaperLab, however, enterprises can quickly and safely dispose of documents onsite, instead of handing them over to a contractor. PaperLab breaks documents down into paper fibres, so information on them is completely destroyed within seconds.
High-speed production of various types of paper
PaperLab produces the first new sheet of paper about three minutes after being loaded with waste paper and having the Start button pressed. It can produce an A4 sheet every 4.3 seconds – around 14 sheets per minute, or 6,720 in an eight-hour work day, equivalent to roughly 13 reams of 500 sheets. Users can produce a variety of types of paper to meet their needs, from A4 and A3 office paper of various thicknesses to business cards, coloured paper and even scented paper.
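The quoted throughput figures are internally consistent – a quick check of the rates:

```python
seconds_per_sheet = 4.3
sheets_per_minute = 60 / seconds_per_sheet   # ~13.95, i.e. around 14 per minute
sheets_per_day = 14 * 60 * 8                 # 6720 sheets in an eight-hour day
reams_per_day = sheets_per_day / 500         # 13.44 – "roughly 13 reams"
print(sheets_per_day, reams_per_day)
```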
PaperLab makes paper without the use of water. Ordinarily, it takes about a cup of water to make a single A4 sheet of paper. Given that water is an increasingly precious global resource, Epson recognised the need for a dry process. In addition, recycling paper onsite in the office shrinks and simplifies the recycling loop. Users can expect to purchase less new paper, while carbon emissions are greatly reduced since there is no need for transportation, with the entire process being done locally onsite. Epson's new "dry" technology consists of three separate systems: fiberising, binding, and forming.
World consumption of paper has grown 400% since 1970, and is now approximately 400 million tons a year. North America, despite having only 7% of the world's population, uses 20% of all paper. Unsustainable logging by some businesses in the paper industry degrades forests, accelerates climate change and leads to wildlife loss. Such practices also affect people who depend directly on forests. By using PaperLab to convert used paper into new, Epson is hoping that offices of all types will fundamentally change the way they think about paper.
27th December 2015
Lions added to endangered species list
In response to the alarming decline of lion populations in the wild, the U.S. Fish and Wildlife Service has listed two lion subspecies as endangered and threatened. Without action to protect them, African lions could see their populations halved by 2035.
This week, the U.S. Fish and Wildlife Service (USFWS) announced it will list two lion subspecies under the Endangered Species Act (ESA). Panthera leo leo – located in India and western and central Africa – will be listed as endangered, while Panthera leo melanochaita – found in eastern and southern Africa – will be listed as threatened.
In the last 20 years, lion populations have declined by 43% due to a combination of habitat loss, loss of prey base, trophy hunting, poaching for skins and uses in Chinese traditional medicine, and retaliatory killing of lions by a growing human population. The killing of Cecil the lion in July of this year served to further highlight this issue. Coupled with inadequate financial and other resources for countries to effectively manage protected areas, the impact on lions in the wild has been substantial. Having once been present in south-eastern Europe and throughout much of the Middle East and India, the animals have now lost 85% of their historic range, as shown on the map below.
Their numbers could be halved again by 2035, according to a recent study in the journal PNAS: "Many lion populations are either now gone or expected to disappear within the next few decades, to the extent that the intensively managed populations in southern Africa may soon supersede the iconic savannah landscapes in East Africa as the most successful sites for lion conservation," the study said.
In 2011, the USFWS received a petition to list Panthera leo leo as endangered under the ESA. In 2014, the agency published a 12-month finding and agreed to list the subspecies as threatened with a special rule under section 4(d) of the ESA. Based on newly available scientific information on the genetics and taxonomy of lions, the agency assessed the status of the entire lion species and subsequently changed its earlier finding.
The new science showed that the western and central populations of African lion are more closely related genetically to the Asiatic lion than to other African lions. These lions are now considered the same subspecies, P. l. leo. There are only about 1,400 of these lions remaining: 900 in Africa and just 523 in India. Considering the size and distribution of the populations, the current trends and the severity of the threats, the agency has found that this subspecies now meets the definition of "endangered" under the ESA.
The other subspecies – Panthera leo melanochaita – likely numbers between 17,000-19,000 and is found across southern and eastern Africa. The agency determined that this subspecies is less vulnerable and is not currently in danger of extinction. However, although lion numbers in southern Africa are increasing overall, they are declining significantly in some regions, due to various ongoing threats. As a result, the agency finds this subspecies meets the definition of a "threatened" species under the ESA.
With an endangered listing, imports of P. l. leo will now be prohibited – except in certain rare cases, such as when it can be found that the import will enhance the survival of the species. To strengthen conservation measures for the threatened subspecies P. l. melanochaita, a new permitting mechanism will regulate the import of all P. l. melanochaita parts and products into the USA. This process will ensure that any imported specimens are legally obtained in range countries as part of a scientifically sound management program that benefits the subspecies in the wild. A third and final rule will enable the agency to support changes that strengthen the governance and accountability of conservation programs in other nations.
Protected areas are vital to the future survival of lions; and the building of corridors or funnelling mechanisms between protected areas is equally critical so that lions can be directed to other suitable habitat, away from potential conflict areas. It takes around $2,000 per square kilometre per year to properly protect these animals in Africa. Scientists from both the USA and the UK have, in recent years, begun collaborating to better understand how lions move across the African landscape and to model ways to conserve genetic diversity and populations across the continent.
“The lion is one of the planet’s most beloved species and an irreplaceable part of our shared global heritage,” said USFWS Director Dan Ashe. “If we want to ensure that healthy lion populations continue to roam the African savannas and forests of India, then it’s up to all of us – not just the people of Africa and India – to take action.”
21st December 2015
Global coal demand stalls after more than a decade of relentless growth
The latest annual report from the International Energy Agency (IEA) shows the coal market under intense pressure, reflecting Chinese economic restructuring and global environmental policies.
Following more than a decade of aggressive growth, global coal demand has stalled, the IEA has concluded in its annual coal market report. The report sharply lowers its five-year global coal demand growth forecast in reflection of economic restructuring in China, which represents around half of global coal consumption. Greater policy support for renewable energy and energy efficiency – the foundation of the COP21 agreement in Paris – is also expected to dent coal demand.
The IEA's Medium-Term Coal Market Report 2015 has slashed its five-year estimate of global coal demand growth by more than 500 million tonnes of coal equivalent (Mtce) in recognition of the tremendous pressures facing coal markets. This revision comes as official preliminary data indicate that a decline in Chinese coal demand occurred in 2014 and is set to accelerate in 2015. A decline in coal consumption in China for two consecutive years would be the first since 1982.
"The coal industry is facing huge pressures, and the main reason is China – but it is not the only reason," IEA Executive Director Fatih Birol said as he launched the report in Singapore at an event organised by the Energy Market Authority. "The economic transformation in China and environmental policies worldwide – including the recent climate agreement in Paris – will likely continue to constrain global coal demand."
Strong growth in coal use in India and Southeast Asia offsets declines in the EU and the US, but does not match the rise seen over the last decade in China.
Coal demand in China is sputtering as the Chinese economy gradually shifts to one based more on services, and less on energy-intensive industries. New Chinese hydro, nuclear, wind and solar are also significantly curtailing coal power generation, driven not only by energy security and climate concerns but also by efforts to reduce local pollution.
Given the strong rebalancing of China's economy, the report also presents an alternative scenario in which Chinese coal demand has already peaked. In this so-called "peak coal scenario", infrastructure and energy-intensive industries represent a lower share of Chinese GDP than in the report's base case, while services and high-tech manufacturing gain momentum. In the peak case, Chinese coal demand in 2020 is 9.8% below the level in 2013 and more than 300 Mtce below the base-case forecast of nearly 2950 Mtce in 2020. Meanwhile, global coal demand in the peak case drops to around 5500 Mtce in 2020 – falling 0.1% per year on average, compared with growth of 0.8% per year in the report's main forecast.
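These figures can be cross-checked with simple arithmetic. The sketch below treats "more than 300 Mtce below" as exactly 300 Mtce and the base case as exactly 2,950 Mtce – rounded approximations, not IEA source data:

```python
# Back-of-envelope check of the peak-coal scenario figures quoted above.
# Uses rounded values from the report summary, not IEA source data.

base_case_2020 = 2950                    # Mtce, Chinese demand, base case
gap = 300                                # Mtce, "more than 300 Mtce below"
peak_case_2020 = base_case_2020 - gap    # approximate peak-case 2020 demand

# Peak-case 2020 demand is 9.8% below the 2013 level,
# so the implied 2013 level of Chinese coal demand is:
implied_2013 = peak_case_2020 / (1 - 0.098)

print(f"Peak case 2020: ~{peak_case_2020} Mtce, "
      f"implied 2013 level: ~{implied_2013:.0f} Mtce")
```

The implied 2013 figure of roughly 2,900 Mtce is consistent with China accounting for around half of global consumption in the high-5,000s Mtce range.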
The report sees coal demand outside China modestly increasing through 2020 as the structural decline in Europe and the United States is more than offset by growth in India and Southeast Asia. The Indian government's push for universal energy access and an expansion of manufacturing will drive growth in electricity demand. In addition to India's ambitious renewable targets (175 GW of renewables by 2022, of which 100 GW are solar PV), coal will provide a significant share of additional power requirements – as much as 60% through 2020. Indeed, preliminary data show India overtaking China as the world's largest coal importer this year.
The region with the highest growth rate in coal use over the outlook period is Southeast Asia, where Indonesia, Vietnam, Malaysia and the Philippines, among others, plan to underpin their power generation with new coal power plants. Unfortunately, around half of the new coal-fired generation capacity under development in the region still uses inefficient subcritical technologies.
Slowing economic growth and energy consumption in China as well as the restriction of coal use in its coastal regions will impact seaborne trade, especially Indonesian exports. In the IEA report's forecast, Australia takes a growing share of seaborne coal trade.
The four largest exporters represent more than 80% of seaborne coal trade; India overtakes China to become the world's largest importer.
Prices remain at low levels. In December 2015, prices of imported coal in Europe fell below USD 50/tonne – levels not seen in a decade. Persistent oversupply and shrinking imports in China and elsewhere suggest prices will remain under pressure through 2020.
With the recent COP21 agreement in Paris calling for the global increase in temperatures to be limited to "well below" 2 degrees Celsius, the IEA reiterated that carbon capture and storage (CCS) technology will be essential for enabling future use of coal without large CO2 emissions.
"Governments and industry must increase their focus on this technology if they are serious about long-term climate goals," said Fatih Birol. "CCS is not just a coal technology. It is not a technology just for power generation. It is an emissions reduction technology that will need to be widely deployed to achieve our low-carbon future."
12th December 2015
Ford will invest $4.5 billion in electric vehicles by 2020
US car giant Ford has announced it will invest an additional $4.5 billion in electric vehicle technology by 2020, as well as changing how the company develops vehicle experiences for customers.
Ford Focus Electric, 2016 model.
Ford is adding 13 new electrified vehicles to its portfolio by 2020 – by which time, more than 40 per cent of the company’s global nameplates will come in electrified versions. This represents Ford’s largest-ever electrified vehicle investment in a five-year period.
On the way next year is a new Focus Electric, which features all-new DC fast-charge capability, delivering an 80 per cent charge in just 30 minutes and a projected 160-kilometre (100‑mile) range – an estimated two hours faster than today’s model.
The zero emissions Focus Electric is manufactured in an eco-conscious facility. Its production home, the Michigan Assembly Plant, has one of the largest solar energy generator systems in the state. The new version of the Focus Electric, which starts production in late 2016, will provide features including:
- SmartGauge with EcoGuide LCD Instrument Cluster, which offers a multitude of customisable displays that can help the driver see real-time electric vehicle power usage to help maximise efficiency. At the end of each trip, a screen provides the distance driven, miles gained through regenerative braking, energy consumed and comparative gasoline information achieved by driving electric.
- Brake Coach, another smart feature that coaches the driver in smooth braking, to maximise the energy captured through the Regenerative Braking System. The more energy a driver captures through braking, the more energy is returned to the battery.
- “MyFord” mobile app, allowing owners to control and maintain contact with the car remotely, get instant vehicle status information, monitor the car's state of charge and current range, get alerts when the vehicle has finished charging, precondition the car (to be heated or cooled to a desired temperature, by a selected time), locate the vehicle with GPS, remotely start the vehicle, and remotely lock and unlock the car doors.
- SYNC 3, an enhanced voice recognition communications and entertainment system.
- Fun-to-drive character, with agile steering and handling engineered into the vehicle to give drivers a more connected feel to the road.
- Eco-conscious materials, such as soy, bio-foam seat cushions and recycled fabrics.
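The energy at stake in the regenerative braking feature above can be illustrated with basic physics. The mass, speed and recovery efficiency in this sketch are assumptions chosen for the example, not Ford specifications:

```python
# Illustrative estimate of the energy recoverable by regenerative braking.
# All numbers below are assumptions for the example, not Ford figures.

mass_kg = 1650                  # assumed mass of a compact EV plus driver
speed_ms = 80 * 1000 / 3600     # braking from an assumed 80 km/h to a stop
recovery_efficiency = 0.6       # assumed fraction of kinetic energy recaptured

kinetic_energy_j = 0.5 * mass_kg * speed_ms ** 2
recovered_wh = kinetic_energy_j * recovery_efficiency / 3600   # J -> Wh

print(f"~{recovered_wh:.0f} Wh recovered per stop")
```

Smoother, earlier braking lets the motor rather than the friction brakes absorb more of this energy, which is what a display like Brake Coach encourages.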
Ford’s shift towards electrified vehicle technology is in response to the increasing global demand for cleaner, more efficient vehicles. Global electric car ownership is expected to surpass one million this year, with continued rapid growth predicted in the years ahead. Ford is also expanding its research and development programme in Europe and Asia, creating a “hub and spoke” system allowing its global team to further accelerate battery technology and take advantage of market-specific opportunities. In addition, Ford is reimagining how to set itself apart in the marketplace by focusing on the customer experience and not just the vehicle itself. The company is changing its product development process to support that shift.
“The challenge going forward isn’t who provides the most technology in a vehicle, but who best organises that technology in a way that most excites and delights people,” said Raj Nair, executive vice president, Product Development and chief technical officer. “By observing consumers, we can better understand which features and strengths users truly use and value and create even better experiences for them going forward.”
In addition to traditional market research, Ford is investing in social science-based research globally, observing how consumers interact with vehicles and gaining new insights into the cognitive, social, cultural, technological and economic nuances that affect product design.
“This new way of working brings together marketing, research, engineering and design in a new way to create meaningful user experiences, rather than individually developing technologies and features that need to be integrated into a final product,” Nair said. “We are using new insights from anthropologists, sociologists, economists, journalists and designers, along with traditional business techniques, to reimagine our product development process, create new experiences and make life better for millions of people.”
The global expansion of Ford’s electric vehicle research and development programme allows the company’s Electrified Powertrain Engineering teams to share common technologies and test batteries virtually, in real time, to develop new technology faster while reducing the need for costly prototypes.
Ford is expanding in Europe and China to accelerate battery technology development for new markets. By using an innovative hardware and software system known as HIL, or “Hardware-in-the-Loop”, the global team can test battery technology and control system hardware in a virtual environment to simulate how batteries and control modules would behave in different – often punishing – environments in any part of the world.
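In HIL testing, real control hardware is exercised against a simulated plant. A highly simplified, software-only sketch of the plant-model side might look like the following; every name, parameter and threshold here is invented for illustration and has no connection to Ford's actual system:

```python
# Minimal sketch of the simulated-plant side of a hardware-in-the-loop test:
# a crude battery thermal model is stepped through an ambient profile while a
# stand-in controller (in a real HIL rig, actual control hardware) limits
# charge current. All parameters are invented for illustration.

def controller_max_current(cell_temp_c):
    """Stand-in for the control module under test: derate charging when hot."""
    if cell_temp_c > 45:
        return 0.0      # stop charging entirely
    if cell_temp_c > 35:
        return 20.0     # derated charge current, amps
    return 60.0         # full charge current, amps

def simulate(ambient_profile_c, dt_min=1.0):
    """Step a crude battery thermal model through an ambient profile."""
    cell_temp = ambient_profile_c[0]
    log = []
    for ambient in ambient_profile_c:
        current = controller_max_current(cell_temp)
        heating = 0.002 * current ** 2           # I²R self-heating, °C/min
        cooling = 0.05 * (cell_temp - ambient)   # cooling toward ambient
        cell_temp += (heating - cooling) * dt_min
        log.append((cell_temp, current))
    return log

# A punishing "desert" profile: one hour at 40 °C ambient.
log = simulate([40.0] * 60)
```

The value of the approach is that the same punishing profile can be replayed identically against every candidate control module, without shipping prototypes to a desert.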
“Batteries are the life force of any electric vehicle – and we have been committed to growing our leadership in battery research and development for more than 15 years,” said Kevin Layden, director of Ford Electrification Programs.
11th December 2015
Fusion reactor begins testing in Germany
The first helium plasma test has been successfully conducted at the Wendelstein 7-X fusion device in northeastern Germany. Tests with hydrogen plasma will begin in 2016.
The first helium plasma was produced yesterday in the Wendelstein 7-X fusion device at the Max Planck Institute for Plasma Physics (IPP) in Greifswald, northeastern Germany. Following more than a year of technical preparations and tests, experimental operation has now commenced according to plan. Wendelstein 7-X, the world's largest stellarator-type fusion device, will investigate the suitability of this type of device for a commercial power station.
After nine years of construction work and over a million assembly hours, the Wendelstein 7-X was completed in April 2014. Operational preparations have been underway ever since. Each technical system was tested in turn: the vacuum in the vessels, the cooling system, the superconducting coils and the magnetic field they produce, the control system, as well as the heating devices and measuring instruments.
On 10th December 2015, the day had arrived: the operating team in the control room started up the magnetic field and initiated the computer-operated experiment control system. This fed around one milligram of helium gas into the evacuated plasma vessel and switched on the microwave heating for a short 1.3-megawatt pulse – and the first plasma was observed by the installed cameras and measuring devices.
“We’re starting with a plasma produced from the noble gas helium,” explains project leader, Professor Thomas Klinger: “We’re not changing over to the actual investigation object, a hydrogen plasma, until next year. This is because it’s easier to achieve the plasma state with helium. In addition, we can clean the surface of the plasma vessel with helium plasmas.”
The first plasma in the machine had a duration of one tenth of a second and achieved a temperature of around one million °C. “We’re very satisfied,” concludes Dr. Hans-Stephan Bosch, whose division is responsible for the operation. “Everything went according to plan.” The next task will be to extend the duration of the plasma discharges and to investigate the best method of producing and heating helium plasmas using microwaves. After a break for New Year, the confinement studies will continue in January, which will prepare the way for producing the first plasma from hydrogen.
The Wendelstein 7-X fusion device. Photo: IPP, Thorsten Bräuer
The Wendelstein 7-X is the largest fusion device of the "stellarator" type, a name alluding to the goal of harnessing the power source of the stars. It is planned to operate with up to 30 minutes of continuous plasma discharge, demonstrating an essential feature of a future power plant: continuous operation. By contrast, tokamaks such as ITER can only operate in pulses without auxiliary equipment.
The Wendelstein 7-X is based on a five field-period Helias configuration. Its main component is a toroidal arrangement of 50 non-planar and 20 planar superconducting magnetic coils, each around 3.5 m high, which produce a magnetic field that prevents the plasma from colliding with the reactor walls. The 50 non-planar coils shape the confining field, while the 20 planar coils are used for adjusting it. The device aims for a plasma temperature of 60 to 130 million K.
Stellarators were popular in the 1950s and 60s, but the much better results from tokamak designs led to them falling from favour in the 1970s. Wendelstein 7-X, however, aims to put the quality of the plasma equilibrium and confinement on a par with that of a tokamak for the very first time, potentially offering a new pathway to reliable fusion power.
Scheme of coil system (blue) and plasma (yellow) of the Wendelstein 7-X. A magnetic field line is highlighted in green on the plasma surface shown in yellow. Credit: Max Planck Institute for Plasma Physics [CC BY 3.0]
7th December 2015
Global human freshwater footprint has been underestimated
Dams and irrigation raise the global human freshwater footprint almost 20 percent higher than previously thought, according to new research by Stockholm University.
A new study by Stockholm University – published in the peer-reviewed journal Science – shows how dams and irrigation considerably raise the global human consumption of freshwater, by increasing evapotranspiration. This effect increases the loss of freshwater to the atmosphere, thereby reducing the water available for humans, societies and ecosystems on land.
"Small things that we do on the surface of the Earth can have large global effects. Previously, the global effects of local human activities such as dams had been underestimated. This study shows that, so far, the effects are even greater than those from atmospheric climate change," says Fernando Jaramillo, postdoc at the Department of Physical Geography at Stockholm University.
The researchers compiled and analysed over a century of data – from 1901 to 2008 – for climate, hydrology and water use in one hundred large hydrological basins spread across the world. Their results raise the previous estimate of the global human freshwater footprint by almost 20 percent. The increase in total freshwater loss from the landscape to the atmosphere is calculated to be 4,370 km³ per year. This is equivalent to two-thirds of the annual flow of the Amazon River, the world's largest river by discharge.
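The Amazon comparison can be checked with a unit conversion. The mean discharge figure used below – roughly 209,000 m³/s – is a commonly cited estimate, not a number taken from the study:

```python
# Cross-check: is 4,370 km³/yr about two-thirds of the Amazon's annual flow?
# The Amazon discharge value is an assumed, commonly cited estimate.

amazon_discharge_m3s = 209_000            # mean discharge, m³/s (assumed)
seconds_per_year = 365.25 * 24 * 3600

amazon_km3_per_year = amazon_discharge_m3s * seconds_per_year / 1e9  # m³ -> km³
ratio = 4370 / amazon_km3_per_year

print(f"Amazon ≈ {amazon_km3_per_year:.0f} km³/yr, ratio ≈ {ratio:.2f}")
```

The result – an annual Amazon flow of roughly 6,600 km³ – does indeed put 4,370 km³ at close to two-thirds.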
"The human-caused increase in this loss is like a huge river of freshwater from the landscape to the atmosphere. We have changed so much of the freshwater system without knowing it," says Gia Destouni, Professor at Stockholm University. "Our study shows that we have already passed a proposed planetary boundary for freshwater consumption. This is serious, regardless of whether we have crossed a real boundary or if the boundary has been underestimated."
As the global population continues to increase, the situation will worsen. By 2030, the world will need at least 30 percent more water than it did in 2012, according to estimates from the United Nations High-Level Panel on Global Sustainability.
26th November 2015
No substantive evidence for 'pause' in global warming
A review of scientific literature by Bristol University finds no substantive evidence of a "pause" or "hiatus" in global warming.
There is no substantive evidence for a 'pause' or 'hiatus' in global warming and the use of those terms is therefore inaccurate, new research from the University of Bristol has found.
The researchers, led by Professor Stephan Lewandowsky of Bristol's School of Experimental Psychology and the Cabot Institute, examined 40 peer-reviewed scientific articles published between 2009 and 2014 that specifically addressed the presumed 'hiatus' and found no consistent or agreed definition of such a 'hiatus', of when it began, or of how long it lasted.
The researchers then compared the distribution of decadal warming trends during the 'hiatus' – as defined by the same scientific articles – against other trends of equivalent length in the entire record of modern global warming. The analysis showed that all definitions of the 'hiatus' in the literature were found to be unexceptional in the context of other trends.
The researchers also found that, if sample size is small, the 'hiatus' will always appear to be present. For example, anyone making a claim for a 'hiatus' of 12 years or below (a claim made by a third of the articles studied) will find one, not because something new and different is happening, but because small sample sizes provide insufficient statistical power for the detection of trends.
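That statistical-power argument can be illustrated with a simple simulation (the trend and noise magnitudes below are rough, assumed values, not the study's data): generate an ongoing warming trend plus year-to-year noise, then ask how often windows of different lengths fail to show statistically significant warming.

```python
# Illustration of the low-power argument: with realistic year-to-year noise,
# short windows of an ongoing warming trend often fail a significance test.
# Trend and noise magnitudes are assumed for illustration only.
import math
import random

random.seed(42)
TREND = 0.017   # assumed warming trend, °C per year
NOISE = 0.10    # assumed std dev of annual fluctuations, °C

def significant_trend(n_years):
    """Simulate one window, fit an OLS slope, return True if t-stat > 2."""
    xs = list(range(n_years))
    ys = [TREND * x + random.gauss(0, NOISE) for x in xs]
    xbar, ybar = sum(xs) / n_years, sum(ys) / n_years
    sxx = sum((x - xbar) ** 2 for x in xs)
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    resid = [y - ybar - slope * (x - xbar) for x, y in zip(xs, ys)]
    se = math.sqrt(sum(r * r for r in resid) / (n_years - 2) / sxx)
    return slope / se > 2

trials = 2000
frac12 = sum(significant_trend(12) for _ in range(trials)) / trials
frac30 = sum(significant_trend(30) for _ in range(trials)) / trials
print(f"Significant warming in {frac12:.0%} of 12-year windows "
      f"vs {frac30:.0%} of 30-year windows")
```

In this toy setup the warming never stops, yet a large share of the 12-year windows fail the significance test while 30-year windows almost never do – a 'hiatus' manufactured purely by small sample size.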
"Our study raises the question: why has so much research been framed around the concept of a 'hiatus' when it does not exist?" said Lewandowsky. "The notion of a 'pause' or 'hiatus' demonstrably originated outside the scientific community, and it likely found entry into the scientific discourse because of the constant challenge by contrarian voices that are known to affect scientific communication and conduct."
Discussing climate change using the terms 'pause' or 'hiatus' creates hazards for the public and the scientific community, the study concludes.
Professor Lewandowsky added: "Scientists may argue that when they use the terms 'pause' or 'hiatus' they know – and their colleagues understand – that they do not mean to imply global warming has stopped.
"But while scientists might tacitly understand that global warming continues notwithstanding the alleged 'hiatus', or they may intend the 'pause' to refer to differences between observed temperatures and expectations from theory or models, the public is not privy to that tacit understanding.
"Therefore, scientists should avoid the use of 'pause' or 'hiatus' when referring to fluctuations of global mean surface temperature around the longer-term warming trend. There is no evidence for a pause in global warming."