Japanese factory operator SPREAD Co. has announced it will develop the world's first large-scale vegetable factory that is fully automated from seeding to harvest and capable of producing 30,000 heads of lettuce per day.
Credit: SPREAD Co.
SPREAD Co. was founded in 2006 and operates the world's largest vegetable factory using artificial lighting in Kameoka, Kyoto Prefecture. Four types of lettuce are currently produced, totalling 21,000 heads per day that are shipped to around 2,000 stores throughout the year.
As the company embarks on global expansion, it is now focussing on environmentally friendly measures to be featured in the construction of a major next-generation vegetable factory. This new facility will be a vertical farm with total automation of the cultivation process from start to finish. It will cut labour costs by 50 percent, while energy costs will be reduced by 30 percent per head of lettuce through the use of LED lighting specifically created for SPREAD, as well as the development of a unique air conditioning system. Up to 98 percent of water will be recycled onsite.
Thanks to indoor operations, this highly controlled environment will be unaffected by pests, temperature or weather conditions and will not require any chemical pesticides. Productivity per unit volume will be doubled in comparison to the company's existing factory in Kameoka, as a result of innovative efforts to save space in the cultivation area. Stacker machines will carry seedlings and hand them over to robots that will take care of transplanting them. Once fully grown, they will be harvested and delivered automatically to the packaging line.
The project will require up to 2 billion yen (US$16.7 million) of investment, which includes onsite R&D and testing facilities. The factory will have a total area of 4,400 square metres (47,400 sq ft) and be capable of producing 30,000 heads of lettuce per day. Construction is expected to start in spring 2016 with commercial operations beginning from summer 2017. The company is predicting annual sales of approximately 1 billion yen (US$8.4 million).
SPREAD Co. has plans for major expansion. It intends to increase the scale of production to 500,000 heads of lettuce per day within five years and will continue expanding its franchise both domestically and internationally.
U.S. physicists have achieved a breakthrough in fusion power by containing superheated hydrogen plasma for five milliseconds, far longer than any previous effort.
California-based Tri Alpha Energy reportedly held gas in a steady state at 10,000,000°C – only stopping when they ran out of fuel. Particle physicist and adviser to the secretive company, Burton Richter of Stanford University, comments: "They've succeeded finally in achieving a lifetime limited only by the power available to the system."
"Until you learn to control and tame [the hot gas], it's never going to work. In that regard, it's a big deal. They seem to have found a way to tame it," says Jaeyoung Park, head of rival fusion startup Energy/Matter Conversion Corporation in San Diego. "The next question is how well can you confine [heat in the gas]. I give them the benefit of the doubt. I want to watch them for the next 2 or 3 years."
Tri Alpha Energy's reactor is based on the field-reversed configuration (FRC), first observed in the laboratory in the late 1950s. For decades, research on FRCs was limited to plasma lasting a maximum of only 0.3 milliseconds. In recent experiments, Tri Alpha Energy extended this to two milliseconds – a huge increase. During their latest attempts, reported this week in the journal Science, angled particle beams at higher energies of 10 megawatts maintained stability for even longer: five milliseconds, without decaying.
The company's goal is to scale their technique up to longer times and higher temperatures (3 billion degrees Celsius), such that atomic nuclei will collide with enough force to fuse and release energy. Tri Alpha Energy intends to dismantle their current machine and build a more powerful version in 2016. Houyang Guo, Chief Experimental Strategist, during a recent physics seminar at the University of Wisconsin–Madison, revealed that confinement times of 100 milliseconds to one second might be possible in the near future. Ultimately, fusion reactors could supply humanity with a practically limitless supply of clean energy.
Two separate studies highlight the need for major policy changes to protect the world's forests over the next century and beyond.
Forests cover an area of four billion hectares (15 million square miles) or about 30 percent of the world's land area. They are the dominant terrestrial ecosystem of Earth, accounting for 75% of the biosphere's gross primary productivity and containing 80% of the world's plant biomass. Forests provide crucial "ecosystem services" that benefit humanity in various ways. These include the sequestering of carbon from the atmosphere, regulation of the water cycle, soil formation, nutrient recycling, biodiversity and gene pool conservation. They also serve an aesthetic function by offering scenic and landscape beauty. The mere presence of trees has been shown to improve both physical and mental health for people living near them, particularly in urban areas.
Unfortunately, the world is losing forests at an alarming rate. More than three-quarters of the remaining tropical forests have now been degraded by human actions, and this figure is likely to increase in the future. Research led by University College London (UCL) and published in the journal Science identifies a new and more dangerous phase of deforestation that is rapidly emerging.
According to the researchers, the first phase occurred when our ancestors moved into tropical forests, as hunter-gatherers. This was followed by a second phase around 6,000 years ago, with the emergence of tropical agriculture. Throughout this time, the overall health of forests was maintained. Today, however, we live in a third phase – characterised by much greater impacts, with distant decision-makers directing how land is used, including permanent intensive agriculture, often for soybeans or palm oil, frontier industrial logging for timber export, cross-continental species invasions, and the early impacts of climate change. The UCL researchers term this phase the era of "Global Integration", affecting even the most remote areas.
Lead author, tropical forest expert Dr Simon Lewis, comments as follows: "Earth has lost 100 million hectares of tropical forest over the last 30 years, mostly to agricultural developments. Few people think about how intertwined with tropical forests we all are. Many foodstuffs include palm oil which comes from once pristine Asian tropical forest, while remaining intact forests are buffering the rate of climate change by absorbing about a billion tonnes of carbon each year."
Current trends look set to intensify without major policy changes, as global food demand is projected to double, over 25 million kilometres of road are predicted to be built by 2050, and climate change intensifies, creating a new phase of human dominance of tropical forests. Having driven the world's highest deforestation rates in South East Asia, the palm oil industry is now gearing up to repeat this process across Africa.
Dr Lewis adds: "I fear a global simplification of the world's most complex forests. Deforestation, logging and road building all create fragmented patches of forest. However, as the climate rapidly changes, the plants and animals living in the rainforest will need to move to continue to live within their ecological tolerances. How will they move? This is a recipe for the mass extinction of tropical forest species this century."
"What is needed are unbroken areas of forest that link today's core tropical regions with forest areas about 4 degrees cooler – so as temperatures rise and rainfall patterns change, species have a better chance of surviving rapid 21st century climate change. We need to bring conservation in line with the reality of climate change," says Lewis.
In a separate paper, released by the Centre for Global Development (CGD) this week, researchers conclude that tropical forests will disappear faster than previously thought. Using the most sophisticated satellite imagery available from over 100 countries, CGD environmental economist Jonah Busch and research associate Jens Engelmann have projected a pattern of deforestation that will climb steadily through the 2020s and 2030s before accelerating around 2040.
Under a business-as-usual scenario, they find:
• By 2050, an area of tropical forest the size of India will have been cleared – 289 million hectares, or roughly one-third the size of the U.S.
• By 2050, deforestation will have burned through one-sixth of our remaining "carbon budget" – the amount of emissions we have left in order to keep the average global temperature rise below 2° Celsius.
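The area comparisons above can be sanity-checked with a couple of lines of arithmetic. Note that the country areas below are commonly cited figures introduced here for illustration, not numbers taken from the CGD report:

```python
# Sanity-check the land-area comparisons in the CGD projection.
# Country areas are commonly cited figures (assumed), in millions of hectares.
INDIA_MHA = 328.7   # India's total area, roughly 3.287 million km²
USA_MHA = 983.4     # United States' total area, roughly 9.834 million km²

cleared_mha = 289   # projected tropical forest loss by 2050

print(cleared_mha / INDIA_MHA)  # ≈ 0.88, i.e. close to "the size of India"
print(cleared_mha / USA_MHA)    # ≈ 0.29, "roughly one-third the size of the U.S."
```

Both comparisons hold up as rough but reasonable journalistic shorthand.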
Longer term, the outlook is even worse. If humanity continues its pattern of endless consumption, their research indicates that less than one-fifth of Asia and Latin America's forest cover may remain by 2200.
In their report, Busch and Engelmann list three possible solutions that could reverse this trend. The first is an international payment system from rich countries to poor tropical countries, to keep forests standing. This is already beginning to happen – last year, Norway agreed a deal with Liberia and promised to pay the small African nation $150m (£91.4m) in development aid, to stop deforestation by 2020.
The second is for developing countries to introduce carbon pricing. With a charge of $20-per-ton of carbon dioxide on deforestation, emissions would drop by more than 20 percent by 2020; a $50-per-ton price would cut emissions nearly in half by 2050.
Their third solution is restrictive policies on deforestation. If developing countries introduce tighter regulation on deforestation, combined with better satellite monitoring and law enforcement, this would have a drastic impact.
"Conserving tropical forests is a bargain," explains Busch. "Reducing emissions from tropical deforestation costs about a fifth as much as reducing emissions in the European Union."
"The Paris climate agreement needs to provide funding and other resources to stop tropical deforestation," says Engelmann. "A climate agreement without robust action on forests will simply not be enough."
The other paper, by UCL, suggests giving forest dwellers formal collective legal rights over their land, which previous studies have shown is one of the best ways of preserving forests. A study of 292 protected areas in Amazonia showed that indigenous reserves were the most effective at avoiding deforestation in high pressure areas. Ensuring that local people are given collective long-term rights over their lands would mean that benefits flowing from forest lands accrue to the local people. This can provide the beginnings of "development without destruction" programs, tackling poverty while maintaining forest cover. This, the authors have argued, provides human rights and conservation win-wins.
Dr Lewis adds: "With long-term certainty of tenure people can plan, maintaining forests while investing in improving agricultural productivity without expanding into forested lands. Forest dwellers won't be perfect managers of forests – but they won't look for a quick profit and then move on, as big businesses often do."
"The Paris climate change talks in December are doubly important for forests and forest communities. The levels of emission cuts will be a critical factor in determining how many tropical forest plants and animals go extinct over the coming decades and centuries. The agreements on reducing deforestation, including durable finance, will be pivotal. The final test will be whether some funds for adaptation will include land-use planning to retain forest connectivity as the climate rapidly changes."
A new satellite – the BIOMASS Earth Observation mission – will help to improve the global monitoring of forest cover when launched in 2019. This will feature a radar powerful enough to sense both the height and wood content of individual trees. In the Amazon rainforest alone, there are an estimated 390 billion individual trees.
The Sumatran rhinoceros has been declared extinct in the wild in Malaysia – leaving only nine in captivity and 100 or fewer individuals in neighbouring Indonesia.
Credit: Rasmus Gren Havmøller
Leading scientists and experts in the field of rhino conservation have stated in a new paper that it is safe to consider the Sumatran rhinoceros extinct in the wild in Malaysia. The survival of this species now depends on the 100 remaining individuals in the wild in neighbouring Indonesia and the nine rhinos held in captivity.
Despite intensive survey efforts, there have been no signs of the wild Sumatran rhinoceros (Dicerorhinus sumatrensis) in Malaysia since 2007, apart from two females that were captured for breeding purposes in 2011 and 2014. Scientists now consider the species extinct in the wild in Malaysia. The experts urge conservation efforts in Indonesia to pick up the pace.
The conclusions are published online in Oryx, the International Journal of Conservation, led by the Centre for Macroecology, Evolution and Climate at the University of Copenhagen. Partners include WWF, the International Rhino Foundation, and the International Union for Conservation of Nature (IUCN), which maintains the global Red List of Threatened Species.
Only four Sumatran rhinos have been born in captivity so far: three in the U.S. and one in the Sumatran Rhino Sanctuary in Indonesia, where this footage is from. Filmed by Rasmus Gren Havmøller.
“It is vital for the survival of the species that all remaining Sumatran rhinos are viewed as a metapopulation – meaning that all are managed in a single program across national and international borders, in order to maximise overall birth rate. This includes the individuals currently held in captivity”, says lead author and PhD student Rasmus Gren Havmøller from the Centre for Macroecology, Evolution and Climate.
The experts point to the creation of intensive management zones as a solution; areas with increased protection against poaching, where individual rhinos can be relocated to, in order to increase the number of potential and suitable mating partners.
As illustrated in the map below, the Sumatran rhino was once common in this part of the world – historically ranging across most of South-east Asia. Today, it is found only in a few small pockets of land. Here, fewer than 100 individuals are thought to live in three separate populations, one of which has seen a critical 70% decline in its distribution range over the last decade. This trend echoes how the population dropped from around 500 to extinction between 1980 and 2005 in Sumatra's largest protected area, the enormous 13,800 km² Kerinci Seblat National Park.
Apart from the wild populations, nine Sumatran rhinos are in captivity – one in Cincinnati Zoo in the U.S. (soon to be moved to Indonesia), three held at facilities in Sabah, Malaysia for attempts to produce embryos by in vitro fertilisation, and five in the Sumatran Rhino Sanctuary in Sumatra, Indonesia.
“The tiger in India was saved from extinction due to the direct intervention of Mrs. Gandhi, the then prime minister, who set up Project Tiger. A similar high level intervention by President Joko Widodo of Indonesia could help pull the Sumatran rhinos back from the brink”, says Christy Williams, co-author.
Widodo Ramono, co-author and Director of the Rhino Foundation of Indonesia (YABI) elaborates: “Serious effort by the government of Indonesia should be put to strengthen rhino protection by creating Intensive Protection Zone (IPZ), intensive survey of the current known habitats, habitat management, captive breeding, and mobilising national resources and support from related local governments and other stakeholders.”
The conservation strategy so far has included the ongoing development of Rhino Protection Units at sites with remaining viable breeding populations. While this has been achieved, the authors highlight a need for strengthening the units against poaching efforts, especially in northern Sumatra. With a high demand for rhino horns in the black markets of Asia, poaching continues to be a major threat to the species.
Finally, captive breeding was identified as a key action back in 2013 at the Sumatran Rhino Crisis Summit in Singapore, and agreed upon that same year by the Indonesian government, in the Bandar Lampung Declaration. However, the necessary reproductive technology may still take years to develop, during which time we may lose the Sumatran rhino in the wild, the authors conclude.
By studying the structure and temperature of butterfly wings, researchers have observed physical properties that could hugely improve the efficiency of solar energy.
The humble butterfly may hold the key to unlocking new techniques to make solar energy far cheaper and more efficient, pioneering new research has shown. Experts from the University of Exeter have studied new methods for generating photovoltaic (PV) energy – or ways to convert sunlight into power. They showed that by mimicking the v-shaped posture adopted by Cabbage White butterflies to heat up their flight muscles before take-off, the amount of energy produced by solar panels could increase by almost 50 per cent. Crucially, by replicating this 'wing-like' structure, the power-to-weight ratio of the overall solar energy structure is increased 17-fold, making it vastly more efficient.
Professor Tapas Mallick, lead author of the research said: "Biomimicry in engineering is not new. However, this truly multidisciplinary research shows pathways to develop low cost solar power that have not been done before."
Cabbage White butterflies are known to take flight before other butterflies on cloudy days – which limit how quickly the insects can use the energy from the Sun to heat their flight muscles. This ability is thought to be due to the v-shaped posturing, known as reflectance basking, they adopt on such days to maximise the concentration of solar energy onto their thorax, which allows for flight. Furthermore, specific sub-structures of the butterflies' wings allow the light from the Sun to be reflected most efficiently, ensuring the flight muscles are warmed to an optimal temperature as quickly as possible.
The scientists therefore investigated how to replicate the wings to develop a new, lightweight reflective material for use in solar energy production. They found that the optimal angle at which the butterfly holds its wings to raise its body temperature was around 17°, which increased the temperature by 7.3°C compared with wings held flat. They also showed that replicating the simple mono-layer of scale cells found in butterfly wings could vastly improve the power-to-weight ratios of future solar concentrators, making them significantly lighter and therefore more efficient.
Professor Richard ffrench-Constant, who conducts world-leading research into butterfly mimicry at the University of Exeter, said: "This proves that the lowly Cabbage White is not just a pest of your cabbages, but actually an insect that is an expert at harvesting solar energy."
The paper – White butterflies as solar photovoltaic concentrators – was published in the journal Scientific Reports and is available online.
Today, 13th August, is Earth Overshoot Day – the point at which our planet's ecological budget has been exhausted for the year. From now until the end of 2015, we are operating in "ecological overdraft".
In less than eight months, humanity has used up nature’s budget for the entire year, with carbon sequestration making up more than half of the demand on nature, according to data from Global Footprint Network, a sustainability think tank with offices in North America, Europe and Asia.
Global Footprint Network tracks humanity’s demand on the planet (Ecological Footprint) against nature’s ability to provide for this demand (biocapacity). Earth Overshoot Day marks the date when humanity’s annual demand on nature exceeds what Earth can regenerate in that year. Earth Overshoot Day has been gradually shifting forward over the years – from early October in 2000, to 13th August in 2015. Last year, it occurred on 19th August.
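The date arithmetic behind Earth Overshoot Day can be sketched with a simple approximation: the day of the year is roughly 365 × (biocapacity ÷ Ecological Footprint). This is an illustrative simplification, not Global Footprint Network's full resource-accounting method:

```python
from datetime import date, timedelta

def overshoot_day(planet_equivalents: float, year: int = 2015) -> date:
    """Estimate Earth Overshoot Day from humanity's demand expressed
    as planet-equivalents (Ecological Footprint / biocapacity)."""
    days_in_year = 365  # 2015 is not a leap year
    day_of_year = round(days_in_year / planet_equivalents)
    return date(year, 1, 1) + timedelta(days=day_of_year - 1)

def planet_equivalents(overshoot: date) -> float:
    """Invert the calculation: infer demand from an observed overshoot date."""
    return 365 / overshoot.timetuple().tm_yday

# The 13th August 2015 date implies humanity demands about 1.6 Earths
print(round(planet_equivalents(date(2015, 8, 13)), 2))  # 1.62

# A demand of ~2.85 Earths ("if everyone lived like the Danes") would
# pull the date back to early May, consistent with the article's 8th May
print(overshoot_day(2.85))  # 2015-05-08
```

A demand of exactly one planet, by this rough measure, would push Earth Overshoot Day to the very end of the year.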
The costs of this ecological overspending are becoming more evident by the day, in the form of deforestation, drought, fresh-water scarcity, soil erosion, biodiversity loss and the buildup of carbon dioxide in the atmosphere. The latter will significantly amplify the former, if current climate models are correct. Consequently, government decision-makers who factor these growing constraints in their policy making will stand a significantly better chance to set their nation’s long-term economic performance on a favourable track.
“Humanity’s carbon footprint alone more than doubled since the early 1970s, which is when the world went into ecological overshoot. It remains the fastest growing component of the widening gap between the Ecological Footprint and the planet’s biocapacity,” says Mathis Wackernagel, president of Global Footprint Network and the co-creator of the Ecological Footprint resource accounting metric. “The global agreement to phase out fossil fuels that is being discussed around the world ahead of the Climate Summit in Paris would significantly help curb the Ecological Footprint’s consistent growth and eventually shrink the Footprint.”
Credit: Global Footprint Network
The carbon footprint is inextricably linked to the other components of the Ecological Footprint — cropland, grazing land, forests and productive land built over with buildings and roads. All these demands compete for space. As more is being demanded for food and timber products, fewer productive areas are available to absorb carbon from fossil fuel. This means carbon emissions accumulate in the atmosphere rather than being fully absorbed.
The climate agreement expected at the United Nations Conference of Parties (COP) 21 this December will focus on keeping global warming within 2°C of pre-industrial levels. This shared goal will require nations to implement policies to completely phase out fossil fuels by 2070, per the recommendations of the U.N.’s Intergovernmental Panel on Climate Change (IPCC), directly impacting the Ecological Footprints of nations.
Assuming that global carbon emissions are reduced by at least 30 percent below today’s levels by 2030, in keeping with the IPCC’s suggested scenario, Earth Overshoot Day could be moved back on the calendar to 16th September 2030 (assuming the rest of the Footprint would continue to expand at the current rate), according to Global Footprint Network.
This is not impossible. In fact, Denmark has cut its emissions over the last two decades at this rate: Since the 1990s, it has reduced its carbon emissions by 33%. Had the world done the same (while not changing the rest of the Footprint), Earth Overshoot Day would be on 3rd October this year.
This is not to say that Denmark has already reached a sustainable Ecological Footprint. Humanity would require the resources of nearly three planets if everyone lived like the Danes, which would move Earth Overshoot Day to 8th May. By contrast, business as usual would mean using the resources equivalent to two planets by 2030, with Earth Overshoot Day moving up on the calendar to the end of June.
This projection assumes that biocapacity, population growth and consumption trends remain on their current trajectories. However, it is not clear whether a sustained level of overuse is possible without significantly damaging long-term biocapacity, with consequent impacts on consumption and population growth.
“We are encouraged by the recent developments on the front line of renewable energy, which have been accelerating worldwide, and by the increasing awareness of the finance industry that a low-carbon economy is the way of the future,” says Wackernagel. “Going forward, we cannot stress enough the vital importance of reducing the carbon footprint, as nations are slated to commit to in Paris. It is not just good for the world, but increasingly becoming an economic necessity for each nation. We all know that the climate depends on it, but that is not the full story: Sustainability requires that everyone live well, within the means of one planet. This can only be achieved by keeping our Ecological Footprint within our planet’s resource budget.”
A new comprehensive analysis of global glacier changes in the Journal of Glaciology concludes that melting rates are "unprecedented" and occurring faster than ever.
Glacier decline in the first decade of the 21st century is unprecedented in the era of direct observations. Glacier melt is a global phenomenon and will continue even without further climate change. That's according to the latest study by the World Glacier Monitoring Service, led by the University of Zurich, Switzerland.
The World Glacier Monitoring Service, domiciled at the University of Zurich, has compiled worldwide data on glacier changes for more than 120 years. Together with its National Correspondents in more than 30 countries, the international service just published a new comprehensive analysis of global glacier changes in the Journal of Glaciology. In this study, observations of the first decade of the 21st century (2001-2010) were compared to all available earlier data from in-situ, air-borne and satellite-borne observations, as well as reconstructions from pictorial and written sources.
"The observed glaciers currently lose between half a metre and one metre of ice thickness every year – this is two to three times more than the corresponding average of the 20th century", explains Michael Zemp, lead author of the study. "Exact measurements of this ice loss are reported from a few hundred glaciers only. However, these results are qualitatively confirmed from field and satellite-based observations for tens of thousands of glaciers around the world."
A huge glacier calving event featured in the documentary Chasing Ice.
According to the international author team, the current rate of glacier melt is without precedent at the global scale, at least for the time period observed and probably also for recorded history, as indicated by reconstructions from written and illustrated documents. In addition, the study shows that the long-term retreat of glacier tongues is a global phenomenon. Intermittent re-advance periods at regional and decadal scales are normally restricted to a subsample of glaciers and have come nowhere near the Little Ice Age maximum positions reached between the 16th and 19th centuries. For example, glacier tongues in Norway have retreated by several kilometres from their maximum extents in the 19th century; the intermittent re-advances of the 1990s were restricted to glaciers in the coastal area and to a few hundred metres.
In addition, the study indicates that the intense ice loss of the past two decades has resulted in a strong imbalance of glaciers in many regions throughout the world. "These glaciers will suffer further ice loss – even if climate remains stable", warns Michael Zemp.
In the coming decades, the loss of glaciers worldwide could have profound implications for many countries that rely on them for water. This is likely to be a particular problem in southeast Asia. For example, another recent study suggests that the glacier volume of the Mount Everest region will fall below 50% of its 2015 level by the middle of this century.
Luca Curci architects studio has presented its "Vertical City" concept, a project proposal for a modular city-building set in the water.
Italian architecture studio Luca Curci has presented "Vertical City" – a project proposal for a vertical city-building set in the water. The project combines sustainability with population density and aims to be a "zero-energy city-building".
The architects explain how they analysed the contemporary skyscraper and re-interpreted it as an open structure, with green areas on each level and more natural light and ventilation. This new interpretation would allow residents to enjoy a healthier lifestyle, in closer connection with natural elements, while strengthening their local community.
The building's design is based on a modular prefabricated structural element, repeatable both horizontally and vertically. The distinctive shape of this element creates a 3-D network that supports every floor. The structure is surrounded by a membrane of photovoltaic glass panels, which provide electricity to the whole building and make it energy independent, with any excess solar energy able to be exported to the mainland.
The city-building is completely perforated to permit the circulation of air and light on each level, hosting green areas and vertical gardens. Green zones are spread all over the tower, while meeting and social areas can enhance community life.
The city-building consists of 10 overlapping modular layers, reaching a height of 750 metres (2,460 ft). With a total volume of 3.8 million cubic metres, it can host up to 25,000 people, with green areas encompassing 200,000 square metres, including the public garden square at the top of the building. Each modular, repeatable layer has a diameter of 155 metres (508 ft) and 18 floors, with a mixture of homes, commercial services and other facilities for a large community. Residences vary in size and shape from floor to floor, and include apartments, duplexes and villas.
The building rests on the sea bottom, with a series of underwater floors that host parking and technical areas, facilities such as spas, meditation centres and gyms, and luxury hotel rooms with underwater views.
The Vertical City can be reached by water, land or air. The circular basement is equipped with external and internal docks and three naval entries: large boats dock at the external berths, while only smaller public or private boats may navigate the inner gulf. A connection with the mainland is provided by a semi-submersed bridge for pedestrians, cars and public electric transport, which links the land with the underwater basement. The tower also features a heliport, connected to the upper garden square via vertical linking installations.
The architects conclude that Vertical City is "a modular interpretation of the contemporary city – and possible future."
The latest global analysis of temperature data from NOAA shows that the first half of 2015 was the hottest such period on record, at 0.85°C (1.53°F) above the 20th century average, surpassing the previous record set in 2010 by 0.09°C (0.16°F).
The National Oceanic and Atmospheric Administration (NOAA) has released its latest Global Analysis of temperature and climate data. In addition to the warmest six months on record, a number of records were broken for individual months in the first half of 2015 – the Earth experienced its hottest ever February, March, May and June. These warm months, combined with the previous six months, make July 2014 to June 2015 the warmest 12-month period since records began 136 years ago.
Large areas of Earth's land surfaces witnessed higher than average temperatures in June. There was record warmth across the western United States, parts of northern South America, several regions in central to western Africa, central Asia around and to the east of the Caspian Sea, and parts of southeastern Asia. Western Greenland and parts of India and China were cooler than average, and northern Pakistan was much cooler than average.
For the oceans, the June global sea surface temperature was 0.74°C (1.33°F) above the 20th century average of 16.4°C (61.5°F), the highest for June on record, surpassing the previous record set last year by 0.06°C (0.11°F). This also tied with September 2014 as the highest monthly departure from average for any month for the globally-averaged sea surface temperature. Record warmth was observed across the northeastern and equatorial Pacific as well as parts of the equatorial and southern Indian Ocean, various regions of both the North and South Atlantic Ocean, and the Barents Sea to the northeast of Scandinavia. Only part of the North Atlantic between Greenland and the United Kingdom was much cooler than average.

2015 looks set to become the hottest year ever, thanks to the ongoing El Niño, which is clearly seen in the image below and has a strong (80%) chance of persisting into early spring 2016. For comparison, the famous "super El Niño" of 1997-1998 is shown on the left.
Nanowires have been used by Dutch researchers to boost solar fuel cell efficiency tenfold, while using 10,000 times less precious material.
Researchers at Eindhoven University of Technology (EUT) and the Foundation for Fundamental Research on Matter (FOM) in the Netherlands have demonstrated a highly promising prototype of a solar cell that generates fuel, rather than electricity. The material gallium phosphide enables their cell to produce the clean fuel hydrogen gas from liquid water. By processing the gallium phosphide using tiny nanowires, the yield is boosted by a factor of ten, while using 10,000 times less precious material.
Electricity produced by a solar cell can be used to drive chemical reactions. If this generates a fuel, the result is known as a solar fuel – a hugely promising replacement for polluting fuels. One possibility is to split liquid water using the electricity that is generated (electrolysis). Along with oxygen, this produces hydrogen gas that can be used as a clean fuel in the chemical industry, or combusted in fuel cells – in cars, for example – to drive engines.
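The underlying chemistry is the standard electrolysis of water, written out here for reference (with the acidic half-reactions):

```latex
% Overall reaction: electrical energy splits water into its elements
2\,\mathrm{H_2O} \longrightarrow 2\,\mathrm{H_2} + \mathrm{O_2}
% Cathode (reduction):  4\,\mathrm{H^+} + 4\,e^- \longrightarrow 2\,\mathrm{H_2}
% Anode (oxidation):    2\,\mathrm{H_2O} \longrightarrow \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^-
```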
Connecting an existing silicon solar cell to an electrolyser that splits the water may well be an efficient solution now, but it is very expensive. Many researchers are therefore trying to develop a single semiconductor material able both to convert sunlight into electrical charge and to split the water, all in one: a kind of "solar fuel cell". Researchers at EUT and FOM see their dream candidate in gallium phosphide (GaP), a compound of gallium and phosphorus that also serves as the basis for specific coloured LEDs.
GaP has good electrical properties, but it cannot easily absorb light when it consists of a large flat surface, as used in conventional solar cells. The researchers overcame this problem by making a grid of tiny GaP nanowires, each 500 nanometres long (a nanometre is a millionth of a millimetre) and just 90 nanometres thick. This design immediately boosted the yield of hydrogen to 2.9 percent – a tenfold improvement and a record for GaP cells, though still some way off the 15 percent achieved by silicon cells coupled to an electrolyser.
Research leader and EUT professor Erik Bakkers says it’s not simply about the yield, where there is still plenty of scope for improvement. He points out: “For the nanowires, we needed 10,000 times less precious GaP material than in cells with a flat surface. That makes these kinds of cells potentially a great deal cheaper. In addition, GaP is also able to extract oxygen from the water – so you then actually have a fuel cell in which you can temporarily store your solar energy. In short, for a solar fuels future we cannot ignore gallium phosphide any longer.”
2015 is turning out to be a significant year for research on mammoths. In March, DNA from an ancient specimen was spliced into that of an elephant and shown to be functional for the first time. In April, a team sequenced the entire genome of the extinct animal. Following those breakthroughs, it is now reported that scientists have completed the first detailed analysis of the genome, revealing extensive genetic changes that helped mammoths adapt to life during the Ice Age.
The research was published this week in the peer-reviewed journal Cell Reports. It concludes that mammoths possessed genes with striking differences to those found in elephants. These genes played roles in skin and hair development, fat metabolism, insulin signalling and numerous other traits for adaptation in extreme cold environments. Genes linked to physical traits such as skull shape, small ears and short tails were also identified. As a test of their function, a mammoth gene involved in temperature sensation was "resurrected" in the laboratory and its protein product characterised.
“This is by far the most comprehensive study to look at the genetic changes that make a woolly mammoth a woolly mammoth,” says Vincent Lynch, PhD, assistant professor of human genetics at the University of Chicago. “They are an excellent model to understand how morphological evolution works, because mammoths are so closely related to living elephants, which have none of the traits they had.”
Well-studied due to the abundance of skeletons, frozen carcasses and depictions in prehistoric art, these animals possessed long, coarse fur, a thick layer of subcutaneous fat, small ears and tails and a brown-fat deposit behind the neck which may have functioned similarly to a camel's hump. They last roamed the frigid tundra steppes of northern Asia, Europe and North America roughly 10,000 years ago.
Artist's impression of the northern hemisphere during the last Ice Age. By Ittiz (Own work) [CC BY-SA 3.0], via Wikimedia Commons.
Previous efforts to sequence preserved mammoth DNA were error-prone, or yielded insights into only a limited number of genes. Lynch and his team performed deep sequencing of two specimens to identify 1.4 million genetic variants unique to woolly mammoths. These are now known to have caused changes to the proteins produced by around 1,600 genes.
Of particular interest was a group of genes responsible for temperature sensation, which also play roles in hair growth and fat storage. The team used ancestral reconstruction techniques to “resurrect” the mammoth version of one of these genes, TRPV3. When transplanted into human cells in the lab, the mammoth TRPV3 gene produced a protein that was less responsive to heat than an ancestral elephant version of the gene. This result is supported by experiments with TRPV3 on mice, which prefer colder environments and have wavier hair than normal mice.
However, although the functions of these genes match well with the environment in which woolly mammoths were known to live, Lynch warns that it is not direct proof of their effects in live mammoths. Regulation of gene expression, for example, is extremely difficult to study through the genome alone.
“We can’t know with absolute certainty the effects of these genes unless someone resurrects a complete woolly mammoth, but we can try to infer by doing experiments in the laboratory,” he says. Lynch and his colleagues are now identifying candidates for other mammoth genes to functionally test, alongside planning experiments to study mammoth proteins in elephant cells.
High-quality sequencing and detailed analysis of genomes can serve as a blueprint for efforts to “de-extinct” the woolly mammoth, according to Lynch: “Eventually, we’ll be technically able to do it,” he states. “But the question is: if you’re technically able to do something, should you do it? I personally think no. Mammoths are extinct and the environment in which they lived has changed. There are many animals on the edge of extinction we should be helping instead.”
A biotech startup firm has come up with an ingenious use of 3D printing that could save the rhino from extinction.
San Francisco-based Pembient reports that it has managed to synthesise fake rhino horn that is virtually indistinguishable from the real thing. It even carries the same genetic fingerprint. The process involves a series of chemical reactions on synthetic keratin, which is mixed with rhino DNA to produce a dried powder used as the "ink" for the 3D printer.
The number of rhinos being killed in Africa has exploded in recent years, due to a combination of soaring demand and the industrial-scale killing methods of organised gangs. Several subspecies have already gone extinct, including the West African black rhino in 2006. On current trends, the five remaining subspecies will be extinct or very near extinction as early as 2025-2030.
The illegal wildlife trade, a $20bn black market, is the fourth largest after drug, arms, and human trafficking. Pembient intends to flood China with these fake horns at well below the current market price. This same 3D printing technique could be applied to other illegal animal products like elephant ivory, tiger bones and pangolin scales.
"We can meet the demand for horns at one-eighth the black-market price. We'll make money; the poaching syndicates won't," says the co-founder and CEO of Pembient, Matthew Markus. "We can produce a rhinoceros horn product that is actually more pure than what you can get from a wild animal. There are so many contaminants, pesticides, fallout from Fukushima. Rhino horn in the lab is as pure as that of a rhino of 2,000 years ago."
A prototype is shown in the picture below. Markus will be hosting an AMA (Ask Me Anything) on social media website Reddit, tomorrow from 1pm PT.
A prototype, 3D-printed rhino horn. Pembient will begin shipping these to Beijing later this year.
NASA has just released its latest update for GISTEMP – one of the most widely-cited datasets for measuring global temperatures. This shows that the first five months of this year were the hottest five-month period on record by a considerable margin. So far, 2015 has been 0.77°C (1.4°F) warmer than the 1951-1980 baseline, compared with the 0.68°C (1.2°F) set during 2014, the previous record year.
These record high temperatures have occurred even though the developing El Niño has yet to take full effect. Taking the pre-industrial temperature as the baseline (instead of 1951-1980) and projecting the current trend forward, the world is on course for a 1°C (1.8°F) rise by the early 2020s. One degree of warming might not sound like much, but the energy required to heat the entire surface and lower atmosphere of a planet is huge – equivalent to four Hiroshima atomic bombs detonating every second. That heat is being trapped by greenhouse gases, an effect demonstrated by simple laboratory experiments and theorised as far back as the mid-19th century.
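The "four Hiroshimas per second" comparison can be sanity-checked with a back-of-envelope calculation. Note that the planetary energy imbalance and bomb yield figures below are commonly cited approximations, not values taken from this article:

```python
# Back-of-envelope check of the "four Hiroshimas per second" comparison.
earth_surface_m2 = 5.1e14      # Earth's total surface area, square metres
imbalance_w_per_m2 = 0.5       # approximate net planetary heat gain per m^2
hiroshima_joules = 6.3e13      # ~15 kilotons of TNT

heating_rate = earth_surface_m2 * imbalance_w_per_m2   # joules per second
print(round(heating_rate / hiroshima_joules, 1))       # → 4.0
```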
If global warming is to be kept below 2°C this century, then over 80% of coal, 50% of gas and 30% of oil reserves are "unburnable", according to a recent study published in Nature. This means that drilling in the Arctic Circle should be prohibited, since it contains a large fraction of the world's undiscovered oil and gas. Despite this scientific conclusion and the long-term risks, a number of nations including Canada, Russia and the US are racing to claim the available resources.
GISTEMP is based on publicly available data from 6,300 meteorological stations around the world; from ship-based and satellite observations of sea surface temperatures; and from Antarctic research stations. These three data sets are combined and adjusted to account for breaks in station records, the effects of urban heating, and the distribution of stations across the landscape.
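As an illustration of the kind of spatial averaging this involves – a simplified sketch, not NASA's actual algorithm – gridded anomalies can be combined into a global mean by weighting each cell by the cosine of its latitude, since equal-angle cells cover less surface area towards the poles:

```python
import math

def global_mean_anomaly(grid):
    """Area-weighted global mean of gridded temperature anomalies.

    `grid` is a list of (latitude_deg, anomaly_C) pairs, one per
    equal-angle grid cell. Cells of equal angular size cover less
    surface area near the poles, so each is weighted by cos(latitude).
    """
    weights = [math.cos(math.radians(lat)) for lat, _ in grid]
    total = sum(w * a for w, (_, a) in zip(weights, grid))
    return total / sum(weights)

# A large polar anomaly counts for less than a modest tropical one:
cells = [(80.0, 2.0), (0.0, 0.5)]
print(round(global_mean_anomaly(cells), 3))  # → 0.722
```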
Engineers at Stanford University have developed a state-by-state plan to convert the USA to 100% clean, renewable energy by 2050.
At the G7 summit in Germany this week, world leaders agreed to phase out fossil fuels by 2100. However, some countries may be able to achieve this target earlier than others. Indeed, a new study led by Stanford University outlines how each of the 50 states in the USA could achieve such a transition by 2050.
Mark Z. Jacobson – professor of civil and environmental engineering at Stanford – and colleagues including U.C. Berkeley researcher Mark Delucchi, demonstrate 50 individual plans, calling for aggressive changes to both infrastructure and the ways America currently consumes energy. While it may sound like a radical idea, their research indicates that the conversion is technically and economically possible through the wide-scale implementation of existing technologies.
"The main barriers are social, political, and getting industries to change. One way to overcome the barriers is to inform people about what is possible," said Jacobson. "By showing that it's technologically and economically possible, this study could reduce the barriers to a large scale transformation."
Jacobson and his colleagues looked at future trends in energy use for residential, commercial, industrial and transportation sectors. Their research examined how the integration of zero-carbon, fully electric technology could affect energy savings in vehicles, homes and workplaces.
"When we did this across all 50 states, we saw a 39 percent reduction in total end-use power demand by the year 2050," Jacobson said. "About six percentage points of that is gained through efficiency improvements to infrastructure, but the bulk is the result of replacing current sources and uses of combustion energy with electricity."
Next, the team calculated the renewable energy resources available to each state by analysing sunlight exposure, wind maps and geothermal sources, and by determining whether offshore wind turbines were an option. Geothermal energy was available at a reasonable cost for only 13 states. Their plans call for virtually no new hydroelectric dams, but do account for energy gains from improving the efficiency of existing dams. The report lays out individual roadmaps for each state to achieve an 80 percent transition by 2030, and a full conversion by 2050.
Several states are already on their way. Washington state, for instance, could make the switch to full renewables relatively quickly, thanks to the fact that more than 70 percent of its current electricity comes from existing hydroelectric sources. Iowa and South Dakota are also well-positioned, as they already produce nearly 30 percent of their electricity from wind power. California already has a plan to be 60 percent electrified by renewables by 2030.
No more than 0.5 percent of any state's land would need to be covered in solar panels or wind turbines. The upfront cost of the changes would be significant, but wind and sunlight are free, so the overall cost spread over the long term would roughly equal the price of fossil fuel infrastructure, maintenance and production. The plan also addresses the issues of base load and intermittency (a criticism frequently levelled at renewables) by using a combination of storage systems and demand response, with support from non-variable energy sources such as hydro and geothermal, to fill temporary gaps in supply from wind or solar. All in all, this new grid would not only be reliable, but actually more reliable than today's grid.
"When you account for the health and climate costs – as well as the rising price of fossil fuels – wind, water and solar are half the cost of conventional systems," he continued. "A conversion of this scale would also create jobs, stabilise fuel prices, reduce pollution-related health problems and eliminate emissions from the United States. There is very little downside to a conversion, at least based on this science."
If the conversion is followed exactly as the plan outlines, the reduction in air pollution could prevent the deaths of the approximately 63,000 Americans who currently die from air pollution-related causes each year. It would also eliminate U.S. emissions of greenhouse gases produced from fossil fuel, which would otherwise cost the world $3.3 trillion a year by 2050.
The USA currently produces 15% of the world's carbon emissions. An even bigger emitter is China, of course – responsible for 29%. While the sheer size and growth of China may appear daunting, it is actually a world leader in terms of clean energy investment. Last year, a report from WWF-US indicated that China could make a similar transition to that illustrated here, with potentially 82% of its electricity generated from renewables by 2050.
In June 2014, Tesla released its patents in an effort to accelerate the development of electric vehicles (EVs). Following Tesla's lead, Ford has now taken similar action by opening its portfolio of EV technology patents to competitors. Last year, Ford filed more than 400 patent applications for EV technology amounting to over 20% of the company's 2,000 total applications.
“Innovation is our goal,” says Kevin Layden, the director of Ford Electrification Programs. “The way to provide the best technology is through constant development and progress. By sharing our research with other companies, we will accelerate the growth of electrified vehicle technology and deliver even better products to customers.”
Ford Motor Company is a leader in this area – offering six hybrid or fully electrified vehicles including Ford Focus Electric, Ford Fusion Hybrid, Ford Fusion Energi plug-in hybrid, Ford C-MAX Hybrid, Ford C-MAX Energi plug-in hybrid (including a solar-powered concept) and Lincoln MKZ Hybrid. In total, Ford has more than 650 electrified vehicle patents and 1,000 pending applications on electrified vehicle technologies.
Ford’s innovations have resulted in acclaimed electrified vehicles on the road today, but the company believes sharing its patented technologies will promote faster development of future inventions as all automakers look toward greater opportunities.
“As an industry, we need to collaborate while we continue to challenge each other,” says Layden. “By sharing ideas, companies can solve bigger challenges and help improve the industry.”
As part of Ford’s increased focus on new and innovative technologies, the automaker is set to hire an additional 200 electrified vehicle engineers this year as the team moves into a newly dedicated facility – Ford Engineering Laboratories – home to Henry Ford’s first labs in Dearborn.
Some of Ford’s electrified vehicle patents available for competitors include:
• Method and Apparatus for Battery Charge Balancing, patent No. US5764027: This patent covers passive cell balancing – discharging a cell through a resistor to lower its state of charge to match the other cells. This innovation extends battery run time and overall life. It is the first invention to enable battery balancing at any time, instead of only while charging, and it enables the use of lithium-ion batteries in electrified vehicles. It was invented long before lithium-ion battery-powered vehicles became commonplace – truly ahead of its time.
• Temperature Dependent Regenerative Brake System for Electric Vehicle, patent No. US6275763: This works to maximise the amount of energy recaptured in a hybrid vehicle through regenerative braking. By improving the interplay between normal friction brakes and regenerative braking during stopping at certain air temperatures, a driver is able to recapture more energy than previously possible, helping the motorist drive farther on a charge.
• Driving Behaviour Feedback Interface, patent No. US8880290: This patent provides a system and method for monitoring driver inputs such as braking and accelerating, and vehicle parameters including energy consumption to assess driving behaviour. The feedback can be used to coach future driving behaviour that may translate into better long-term driving habits and improve fuel economy. This technology has also enabled drivers of non-electrified vehicles, such as a Ford Focus, to develop better driving habits.
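The passive cell balancing idea behind the first patent can be sketched in a few lines – a minimal illustration of the general technique, not Ford's patented implementation: the controller switches on a bleed resistor for any cell sitting noticeably above the weakest cell in the pack.

```python
def balance_step(cell_voltages, threshold=0.01):
    """One control pass of passive cell balancing.

    Returns a bleed flag per cell: True means that cell's resistor is
    switched on, slowly discharging it towards the weakest cell's level.
    Real systems balance on state of charge rather than raw voltage, and
    (as the patent allows) can run at any time, not just while charging.
    """
    v_min = min(cell_voltages)
    return [v > v_min + threshold for v in cell_voltages]

pack = [3.95, 4.02, 3.96, 4.05]   # volts per cell (illustrative values)
print(balance_step(pack))         # → [False, True, False, True]
```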
The Australian Government's Bureau of Meteorology has confirmed that the tropical Pacific is in the early stages of an El Niño that is likely to persist in the coming months.
The tropical Pacific is now in the early stages of an El Niño. Based upon the model outlooks and current observations, the Bureau's ENSO Tracker has been raised to El Niño status.
El Niño–Southern Oscillation (ENSO) indicators have shown a steady trend towards El Niño levels since the start of the year. Sea surface temperatures in the tropical Pacific Ocean have exceeded El Niño thresholds for the past month, supported by warmer-than-average waters below the surface. Trade winds have remained consistently weaker than average since the start of the year, cloudiness at the Date Line has increased and the Southern Oscillation Index (SOI) has remained negative for several months. These indicators suggest the tropical Pacific Ocean and atmosphere have started to couple and reinforce each other, indicating El Niño is likely to persist in the coming months. Pacific Ocean temperatures are likely to remain above El Niño thresholds through the coming southern winter and at least into spring.
"This is a proper El Niño effect – not a weak one," David Jones, manager of climate monitoring and prediction, told reporters. "You know, there's always a little bit of doubt when it comes to intensity forecasts, but across the models as a whole we'd suggest that this will be quite a substantial El Niño event."
The last El Niño was observed during 2009–10. A very strong El Niño has not occurred since 1997–98. It was during 1998 that global average temperatures spiked to an unprecedented high. Since then, average temperatures have continued to rise, with 2014 being the hottest year on record even in the absence of significant El Niño conditions. According to Jones, this means there is a "significant probability" that 2015 will top 2014 as the hottest year globally. Seven of the ten warmest years occurred during El Niño years.
"The most obvious thing we know is that El Niño events tend to lead to drier winter and spring periods [in Australia]," Jones explained. "There is an increased risk of drought, which obviously isn’t good for people already in drought. Australian temperatures are already warming – and El Niño tends to give those temperatures a boost – so we’d expect winter, spring and even early summer to have well above average daytime temperatures."
Australia is among the regions most dramatically affected by the recurring weather phenomenon, but its effects are felt around the world. South America is hit by heavy rains and floods, while the USA experiences warmer winters. In Africa and parts of Asia, scorching temperatures can lead to rises in the price of commodities such as rice, corn and palm oil. Additional health and social impacts include the increased spread of diseases, especially those which are transmitted by mosquitoes. In Europe, the snowy UK winter of 2009–10 was thought to be an effect of El Niño.
In general, developing countries dependent upon agriculture and fishing, particularly those bordering the Pacific Ocean, are likely to be worst affected. Research by Columbia University suggests that ENSO may have had a role in 21% of all civil conflicts since 1950, with the risk of annual civil conflict doubling from 3% to 6% in countries affected by ENSO during El Niño years, relative to La Niña years.
During the last several decades, the frequency and intensity of El Niño events have increased. This is most likely linked to global warming and the increasing level of greenhouse gases in the atmosphere – although a longer period of observation is needed to confirm this. Scientists have theorised that permanent El Niño conditions may emerge when global average temperatures increase by 3°C (5.4°F).
Atmospheric CO2 remained above 400 parts per million (ppm) through March 2015, the first time it has been at this level for an entire month, according to the National Oceanic and Atmospheric Administration (NOAA). The current concentration of greenhouse gases is the highest it has been for millions of years.
“It was only a matter of time that we would average 400 parts per million globally,” says Pieter Tans, lead scientist of NOAA’s Global Greenhouse Gas Reference Network. “We first reported 400 ppm when all of our Arctic sites reached that value in the spring of 2012. In 2013, the record at NOAA’s Mauna Loa Observatory first crossed the 400 ppm threshold. Reaching 400 parts per million as a global average is a significant milestone.
“This marks the fact that humans burning fossil fuels have caused global carbon dioxide concentrations to rise more than 120 parts per million since pre-industrial times,” he adds. “Half of that rise has occurred since 1980.”
NOAA bases the global CO2 concentration on air samples taken from 40 sites around the world. NOAA and partner scientists collect air samples in flasks on cargo ship decks, on the shores of remote islands, and at other isolated locations. It takes some time after each month's end to compute this global average, because the samples are shipped for analysis to NOAA’s Earth System Research Laboratory in Boulder, Colorado.
“We choose to sample at these sites because the atmosphere itself serves to average out gas concentrations that are being affected by human and natural forces. At these remote sites, we get a better global average,” said Ed Dlugokencky, the NOAA scientist who manages the global network.
Patricio Eladio Rojas Ledezma, a meteorologist, collects air samples on Easter Island, Chile.
The last time atmospheric levels of carbon dioxide were at 400ppm was during the mid-Pliocene, over 3 million years ago. Back then, our ancestors had brains about as big as those of modern chimps. They had only recently developed stone tools and were roaming the savannahs of Africa while being hunted by sabre-toothed cats. Average global temperatures in the mid-Pliocene were up to 3°C hotter than today, exceeding 10°C in the polar regions, with sea levels around 25m (82ft) higher. Many species of plants and animals were living several hundred kilometres further north of where their nearest relatives exist today.
On a geological timescale, the present rate of change in atmospheric CO2 level is unprecedented. During the ancient past, a rise of 10ppm might have taken 1,000 years or more. Today, human activity is adding that much every five years, as we overwhelm nature's ability to absorb it. On current trends, the world is on track for a doubling of greenhouse gas levels in the second half of this century – potentially causing 4 to 6°C of warming. This would lead to a radically altered planet with grave consequences for humanity.
Dr. James Butler, director of NOAA’s Global Monitoring Division, explains that reversing the global CO2 level would be difficult because of its long lifetime: “Elimination of about 80 percent of fossil fuel emissions would essentially stop the rise in carbon dioxide in the atmosphere – but concentrations of carbon dioxide would not start decreasing until even further reductions are made and then it would only do so slowly.”
Tesla has revealed a new battery technology for homes and businesses, which provides a way to store energy from localised renewables and can function as a backup system during power outages.
A major barrier to the widespread adoption of clean energy has been the intermittent nature of wind and solar. The Sun doesn't always shine, and the wind doesn't always blow – making it difficult or impossible to harness these resources on a 24-hour basis.
Elon Musk, CEO of electric vehicle firm Tesla Motors, yesterday unveiled a revolutionary new technology that can solve these issues. The Powerwall, pictured above, is a rechargeable lithium-ion battery product, intended primarily for home use. It stores electricity generated from rooftop solar panels, which can then be used for domestic consumption, load shifting, or backup power.
With a constant supply of renewable energy at a local scale, the Powerwall offers complete independence from the utility grid, meaning that customers no longer have to worry about expensive bills incurred during peak hours. If a utility company experiences a major outage, the Powerwall can serve as the home power supply instead, which is especially useful in areas prone to storms or unreliable grids. It can also recharge electric vehicles more cheaply during off-peak night hours, while surplus power can be fed back to the grid when needed.
Tesla claims the Powerwall is fully automated, simple to install, and requires no maintenance. It is being marketed in two models: 10 kWh weekly cycle ($3,500) and 7 kWh daily cycle ($3,000) versions. Multiple batteries can be installed together for homes with greater energy needs; up to 90 kWh total for the 10 kWh battery and 63 kWh total for the 7 kWh battery. Both are rated for indoor and outdoor installation, and guaranteed for ten years.
The Powerwall begins shipping this summer. It will be sold to companies including SolarCity, which is running a pilot project in 500 California houses, using 10-kWh battery packs. Tesla is bullish about the prospects for batteries, electric vehicles and clean energy. The company is building a "gigafactory" to develop and expand these technologies at a large scale, with more factories to come in the future.
While the current price of the Powerwall may seem a little on the high side, analysts forecast a substantial decline in battery costs over the next decade and beyond, with a similar fall in solar panel costs. When combined with smart grids, the proliferation of this technology seems inevitable. As predicted on our future timeline, it is likely that home energy storage systems will be commonplace by 2030.
A much larger version, the Powerpack – described as an "infinitely scalable system" – will be made available for businesses and industrial applications. This will come in 100 kWh battery blocks, which can scale from 500 kWh up to 1 GWh and even higher. "Our goal here is to change the way the world uses energy at an extreme scale," says Musk. You can watch his full keynote presentation (which was powered by solar energy) in the video below.