Blog » Energy & the Environment

 
     
 

21st June 2015

3D-printed rhino horn could make poaching obsolete

A biotech startup firm has come up with an ingenious use of 3D printing that could save the rhino from extinction.

San Francisco-based Pembient reports that it has managed to synthesise fake rhino horn that is virtually indistinguishable from the real thing. It even carries the same genetic fingerprint. The process involves a series of chemical reactions on synthetic keratin, which is mixed with rhino DNA to produce a dried powder used as the "ink" for the 3D printer.

The number of rhinos being killed in Africa has exploded in recent years, driven by a combination of soaring demand and the industrial-scale killing methods of organised gangs. Several subspecies have already gone extinct, including the West African black rhino in 2006. On current trends, the remaining five species will be extinct, or very nearly so, as early as 2025-2030.

The illegal wildlife trade – a $20bn black market – is the fourth-largest illicit trade, after drugs, arms and human trafficking. Pembient intends to flood China with these fake horns at well below the current market price. The same 3D printing technique could be applied to other illegal animal products, such as elephant ivory, tiger bones and pangolin scales.

"We can meet the demand for horns at one-eighth the black-market price. We'll make money; the poaching syndicates won't," says the co-founder and CEO of Pembient, Matthew Markus. "We can produce a rhinoceros horn product that is actually more pure than what you can get from a wild animal. There are so many contaminants, pesticides, fallout from Fukushima. Rhino horn in the lab is as pure as that of a rhino of 2,000 years ago."

A prototype is shown in the picture below. Markus will be hosting an AMA (Ask Me Anything) on social media website Reddit, tomorrow from 1pm PT.

 

[Image: A prototype 3D-printed rhino horn. Pembient will begin shipping these to Beijing later this year.]

 

 

 

 

17th June 2015

NASA: 2015 is the hottest year on record (so far)

NASA has just released its latest update for GISTEMP – one of the most widely cited datasets for measuring global temperatures. It shows that the first five months of this year were the hottest five-month period on record, by a considerable margin. So far, 2015 has been 0.77°C (1.4°F) warmer than the 1951-1980 baseline, compared with the 0.68°C (1.2°F) anomaly recorded during 2014, the previous record year.

These record temperatures have occurred before the developing El Niño has even taken full effect. Taking the pre-industrial average as the baseline (instead of 1951-1980) and projecting the trend forward, the world is on course for a 1°C (1.8°F) rise by the early 2020s. One degree of warming might not sound like much, but the energy required to heat the entire surface and lower atmosphere of a planet is huge – equivalent to roughly four Hiroshima atomic bombs detonating every second. That heat is being trapped by greenhouse gases, an effect demonstrated by simple laboratory experiments and theorised as far back as the mid-19th century.
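That comparison is easy to sanity-check with a back-of-the-envelope calculation. The figures below are illustrative assumptions (a planetary energy imbalance of about 0.5 W/m² and a Hiroshima yield of roughly 6.3 × 10¹³ joules), not values from the GISTEMP release:

```python
import math

# Assumed, illustrative values - not taken from the GISTEMP release.
EARTH_RADIUS_M = 6.371e6              # mean radius of Earth
ENERGY_IMBALANCE_W_PER_M2 = 0.5       # assumed planetary energy imbalance
HIROSHIMA_JOULES = 6.3e13             # ~15 kilotons of TNT

surface_area_m2 = 4 * math.pi * EARTH_RADIUS_M ** 2               # ~5.1e14 m^2
heating_rate_watts = surface_area_m2 * ENERGY_IMBALANCE_W_PER_M2  # joules per second
bombs_per_second = heating_rate_watts / HIROSHIMA_JOULES

print(f"Extra heat accumulating: {heating_rate_watts:.2e} W")
print(f"Equivalent to ~{bombs_per_second:.1f} Hiroshima bombs per second")  # ~4
```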

If global warming is to be kept below 2°C this century, then over 80% of coal, 50% of gas and 30% of oil reserves are "unburnable", according to a recent study published in Nature. This means that drilling in the Arctic Circle should be prohibited, since it contains a large fraction of the world's undiscovered oil and gas. Despite this scientific conclusion and the long-term risks, a number of nations including Canada, Russia and the US are racing to claim the available resources.

GISTEMP is based on publicly available data from 6,300 meteorological stations around the world; from ship-based and satellite observations of sea surface temperatures; and from Antarctic research stations. These three data sets are combined and adjusted to account for breaks in station records, the effects of urban heating, and the distribution of stations across the landscape.

 


 

 

 

 

10th June 2015

A plan to convert the USA to 100% renewables by 2050

Engineers at Stanford University have developed a state-by-state plan to convert the USA to 100% clean, renewable energy by 2050.

 


 

At the G7 summit in Germany this week, world leaders agreed to phase out fossil fuels by 2100. However, some countries may be able to achieve this target earlier than others. Indeed, a new study led by Stanford University outlines how each of the 50 states in the USA could achieve such a transition by 2050.

Mark Z. Jacobson – professor of civil and environmental engineering at Stanford – and colleagues including U.C. Berkeley researcher Mark Delucchi, demonstrate 50 individual plans, calling for aggressive changes to both infrastructure and the ways America currently consumes energy. While it may sound like a radical idea, their research indicates that the conversion is technically and economically possible through the wide-scale implementation of existing technologies.

"The main barriers are social, political, and getting industries to change. One way to overcome the barriers is to inform people about what is possible," said Jacobson. "By showing that it's technologically and economically possible, this study could reduce the barriers to a large scale transformation."

Jacobson and his colleagues looked at future trends in energy use for residential, commercial, industrial and transportation sectors. Their research examined how the integration of zero-carbon, fully electric technology could affect energy savings in vehicles, homes and workplaces.

"When we did this across all 50 states, we saw a 39 percent reduction in total end-use power demand by the year 2050," Jacobson said. "About six percentage points of that is gained through efficiency improvements to infrastructure, but the bulk is the result of replacing current sources and uses of combustion energy with electricity."

Next, the team calculated the renewable energy resources available to each state by analysing sunlight exposure, wind maps and geothermal sources, and by determining whether offshore wind turbines were an option. Geothermal energy was available at a reasonable cost in only 13 states. Their plans call for virtually no new hydroelectric dams, but do account for energy gains from improving the efficiency of existing dams. The report lays out individual roadmaps for each state to achieve an 80 percent transition by 2030, and a full conversion by 2050.

Several states are already on their way. Washington state, for instance, could make the switch to full renewables relatively quickly, thanks to the fact that more than 70 percent of its current electricity comes from existing hydroelectric sources. Iowa and South Dakota are also well-positioned, as they already produce nearly 30 percent of their electricity from wind power. California already has a plan to be 60 percent electrified by renewables by 2030.

No more than 0.5 percent of any state's land would need to be covered with solar panels or wind turbines. The upfront cost of the changes would be significant, but wind and sunlight are free, so the overall cost spread over the long term would roughly equal the price of fossil fuel infrastructure, maintenance and production. The plan also addresses the issues of baseload and intermittency (a criticism frequently levelled at renewables) by using a combination of storage systems and demand response, with support from non-variable sources such as hydro and geothermal, to fill temporary gaps in supply from wind or solar. All in all, the proposed grid would be not just as reliable as today's grid, but more so.
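To make the balancing idea concrete, here is a toy hourly dispatch loop. It is purely illustrative – the numbers and rules are invented for the example, and it is not the model used in the Stanford study:

```python
# Toy hourly dispatch - illustrative only, not the Stanford study's model.
demand = [32, 30, 29, 35, 40, 42, 38, 33]        # GW, assumed hourly demand
wind_solar = [25, 38, 45, 30, 20, 28, 44, 36]    # GW, assumed variable output

battery_energy = 20.0     # GWh currently stored (assumed)
battery_capacity = 60.0   # GWh maximum storage (assumed)
battery_power = 15.0      # GW maximum charge/discharge rate (assumed)

for hour, (d, vre) in enumerate(zip(demand, wind_solar)):
    surplus = vre - d
    if surplus >= 0:
        # Store what we can; any remainder would be curtailed or exported.
        charge = min(surplus, battery_power, battery_capacity - battery_energy)
        battery_energy += charge
        dispatchable = 0.0
    else:
        shortfall = -surplus
        # Discharge the battery first...
        discharge = min(shortfall, battery_power, battery_energy)
        battery_energy -= discharge
        # ...then cover the rest with hydro/geothermal (demand response
        # would shave this further in a real system).
        dispatchable = shortfall - discharge
    print(f"hour {hour}: demand {d} GW, wind+solar {vre} GW, "
          f"dispatchable {dispatchable:.1f} GW, storage {battery_energy:.1f} GWh")
```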

"When you account for the health and climate costs – as well as the rising price of fossil fuels – wind, water and solar are half the cost of conventional systems," he continued. "A conversion of this scale would also create jobs, stabilise fuel prices, reduce pollution-related health problems and eliminate emissions from the United States. There is very little downside to a conversion, at least based on this science."

If the conversion is carried out as the plan outlines, the resulting drop in air pollution could prevent approximately 63,000 air pollution-related deaths in the U.S. each year. It would also eliminate U.S. emissions of greenhouse gases from fossil fuels, which would otherwise cost the world $3.3 trillion a year by 2050.

The study is published in the online edition of Energy & Environmental Science. An interactive map summarising the plans for each state is available at www.thesolutionsproject.org.

The USA currently produces 15% of the world's carbon emissions. An even bigger emitter is China, of course – responsible for 29%. While the sheer size and growth of China may appear daunting, it is actually a world leader in terms of clean energy investment. Last year, a report from WWF-US indicated that China could make a similar transition to that illustrated here, with potentially 82% of its electricity generated from renewables by 2050.

 

 

 

 

29th May 2015

Ford releases electric vehicle technology patents

In June 2014, Tesla released its patents in an effort to accelerate the development of electric vehicles (EVs). Following Tesla's lead, Ford has now taken similar action by opening its portfolio of EV technology patents to competitors. Last year, Ford filed more than 400 patent applications for EV technology amounting to over 20% of the company's 2,000 total applications.

“Innovation is our goal,” says Kevin Layden, the director of Ford Electrification Programs. “The way to provide the best technology is through constant development and progress. By sharing our research with other companies, we will accelerate the growth of electrified vehicle technology and deliver even better products to customers.”

Ford Motor Company is a leader in this area – offering six hybrid or fully electrified vehicles including Ford Focus Electric, Ford Fusion Hybrid, Ford Fusion Energi plug-in hybrid, Ford C-MAX Hybrid, Ford C-MAX Energi plug-in hybrid (including a solar-powered concept) and Lincoln MKZ Hybrid. In total, Ford has more than 650 electrified vehicle patents and 1,000 pending applications on electrified vehicle technologies.

Ford’s innovations have resulted in acclaimed electrified vehicles on the road today, but the company believes sharing its patented technologies will promote faster development of future inventions as all automakers look toward greater opportunities.

“As an industry, we need to collaborate while we continue to challenge each other,” says Layden. “By sharing ideas, companies can solve bigger challenges and help improve the industry.”

 


 

As part of Ford’s increased focus on new and innovative technologies, the automaker is set to hire an additional 200 electrified vehicle engineers this year as the team moves into a newly dedicated facility – Ford Engineering Laboratories – home to Henry Ford’s first labs in Dearborn.

Some of Ford’s electrified vehicle patents available for competitors include:

Method and Apparatus for Battery Charge Balancing, patent No. US5764027: This patent covers passive cell balancing: discharging a cell through a resistor to lower the state of charge to match other cells. This innovation extends battery run time and overall life. This is the first invention to enable battery balancing at any time, instead of only while charging, and it enables the use of lithium-ion batteries in electrified vehicles. It was invented long before lithium-ion battery-powered vehicles became commonplace – truly ahead of its time.
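As a rough illustration of the general passive-balancing idea (this is a generic sketch, not the method claimed in Ford's patent, and the cell values are made up):

```python
# Generic passive cell balancing sketch - illustrative, not Ford's patented method.
# Cells above the weakest cell's state of charge are bled through a resistor.
cells = {"cell_1": 0.93, "cell_2": 0.88, "cell_3": 0.91}  # state of charge (0-1)

BLEED_STEP = 0.01   # assumed SoC drop per balancing step
TOLERANCE = 0.005   # stop when all cells are within this band

def balance_step(soc):
    """Discharge every cell sitting above the lowest cell by more than TOLERANCE."""
    target = min(soc.values())
    for name, value in soc.items():
        if value - target > TOLERANCE:
            soc[name] = max(target, value - BLEED_STEP)
    return soc

# Repeat until the pack is balanced; in a vehicle this can run at any time,
# not just while charging.
while max(cells.values()) - min(cells.values()) > TOLERANCE:
    cells = balance_step(cells)

print(cells)
```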

Temperature Dependent Regenerative Brake System for Electric Vehicle, patent No. US6275763: This works to maximise the amount of energy recaptured in a hybrid vehicle through regenerative braking. By improving the interplay between normal friction brakes and regenerative braking during stopping at certain air temperatures, a driver is able to recapture more energy than previously possible, helping the motorist drive farther on a charge.
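The temperature-dependent blending idea can be sketched very simply. The thresholds and linear blending below are invented for the example; the patent's actual method will differ:

```python
# Illustrative regen/friction blending as a function of temperature.
# Thresholds and linear interpolation are assumptions for the example only.
def regen_fraction(air_temp_c, cold_limit_c=-10.0, warm_limit_c=10.0, max_fraction=0.8):
    """Fraction of braking torque handled by regenerative braking.

    A cold battery accepts less charge, so regen is scaled back below
    warm_limit_c and disabled below cold_limit_c; friction brakes do the rest.
    """
    if air_temp_c <= cold_limit_c:
        return 0.0
    if air_temp_c >= warm_limit_c:
        return max_fraction
    span = warm_limit_c - cold_limit_c
    return max_fraction * (air_temp_c - cold_limit_c) / span

for t in (-20, -5, 0, 5, 15):
    print(f"{t:>4}°C -> {regen_fraction(t):.0%} regenerative, "
          f"{1 - regen_fraction(t):.0%} friction")
```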

Driving Behaviour Feedback Interface, patent No. US8880290: This patent provides a system and method for monitoring driver inputs such as braking and accelerating, and vehicle parameters including energy consumption to assess driving behaviour. The feedback can be used to coach future driving behaviour that may translate into better long-term driving habits and improve fuel economy. This technology has also enabled drivers of non-electrified vehicles, such as a Ford Focus, to develop better driving habits.
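A minimal sketch of this kind of feedback scoring (the scoring rule and thresholds here are invented purely for illustration, not Ford's algorithm):

```python
# Illustrative driving-behaviour score from logged accelerator/brake inputs.
# Thresholds and weights are made-up values, not Ford's algorithm.
def driving_score(accel_events, brake_events, wh_per_km):
    """Return a 0-100 'smoothness' score: fewer harsh events and lower
    energy use per km give a higher score."""
    harsh = sum(1 for a in accel_events if a > 2.5)   # m/s^2, assumed threshold
    harsh += sum(1 for b in brake_events if b > 3.0)  # m/s^2, assumed threshold
    score = 100 - 5 * harsh - 0.1 * max(0, wh_per_km - 150)
    return max(0, min(100, score))

# Example trip: two harsh accelerations, one harsh brake, 180 Wh/km.
print(driving_score([1.2, 2.8, 3.1], [2.0, 3.4], 180))   # -> 82.0
```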

 


 

 

 

 

14th May 2015

"Substantial" El Niño predicted for 2015

The Australian Government's Bureau of Meteorology has confirmed that the tropical Pacific is in the early stages of an El Niño that is likely to persist in the coming months.

 


 

The tropical Pacific is now in the early stages of an El Niño. Based upon the model outlooks and current observations, the Bureau's ENSO Tracker has been raised to El Niño status.

El Niño–Southern Oscillation (ENSO) indicators have shown a steady trend towards El Niño levels since the start of the year. Sea surface temperatures in the tropical Pacific Ocean have exceeded El Niño thresholds for the past month, supported by warmer-than-average waters below the surface. Trade winds have remained consistently weaker than average since the start of the year, cloudiness at the Date Line has increased and the Southern Oscillation Index (SOI) has remained negative for several months. These indicators suggest the tropical Pacific Ocean and atmosphere have started to couple and reinforce each other, indicating El Niño is likely to persist in the coming months. Pacific Ocean temperatures are likely to remain above El Niño thresholds through the coming southern winter and at least into spring.
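A tracker of this kind can be imagined along the following lines. The thresholds and persistence rules below are simplified assumptions for illustration, not the Bureau's operational criteria:

```python
# Simplified ENSO status check - thresholds are illustrative assumptions,
# not the Bureau of Meteorology's operational criteria.
def enso_status(sst_anomalies_c, soi_values, months_required=3):
    """Classify recent conditions from monthly tropical Pacific SST anomalies (°C)
    and monthly Southern Oscillation Index values."""
    recent_sst = sst_anomalies_c[-months_required:]
    recent_soi = soi_values[-months_required:]

    warm_ocean = all(a >= 0.8 for a in recent_sst)      # assumed El Niño threshold
    cool_ocean = all(a <= -0.8 for a in recent_sst)
    negative_soi = all(s <= -7 for s in recent_soi)     # assumed coupling signal
    positive_soi = all(s >= 7 for s in recent_soi)

    if warm_ocean and negative_soi:
        return "El Niño"
    if cool_ocean and positive_soi:
        return "La Niña"
    return "Neutral / Watch"

# Example with made-up 2015-style values:
print(enso_status([0.5, 0.9, 1.0, 1.1], [-5, -8, -11, -12]))  # -> "El Niño"
```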

"This is a proper El Niño effect – not a weak one," David Jones, manager of climate monitoring and prediction, told reporters. "You know, there's always a little bit of doubt when it comes to intensity forecasts, but across the models as a whole we'd suggest that this will be quite a substantial El Niño event."

The last El Niño was observed during 2009–10. A very strong El Niño has not occurred since 1997–98. It was during 1998 that global average temperatures spiked to an unprecedented high. Since then, average temperatures have continued to rise, with 2014 being the hottest year on record even in the absence of significant El Niño conditions. According to Jones, this means there is a "significant probability" that 2015 will top 2014 as the hottest year globally. Seven of the ten warmest years occurred during El Niño years.

"The most obvious thing we know is that El Niño events tend to lead to drier winter and spring periods [in Australia]," Jones explained. "There is an increased risk of drought, which obviously isn’t good for people already in drought. Australian temperatures are already warming – and El Niño tends to give those temperatures a boost – so we’d expect winter, spring and even early summer to have well above average daytime temperatures."

Australia is among the regions most dramatically affected by the recurring weather phenomenon, but its effects are felt around the world. South America is hit by heavy rains and floods, while the USA experiences warmer winters. In Africa and parts of Asia, scorching temperatures can lead to rises in the price of commodities such as rice, corn and palm oil. Additional health and social impacts include the increased spread of diseases, especially those which are transmitted by mosquitoes. In Europe, the snowy UK winter of 2009–10 was thought to be an effect of El Niño.

In general, developing countries dependent upon agriculture and fishing, particularly those bordering the Pacific Ocean, are likely to be worst affected. Research by Columbia University suggests that ENSO may have had a role in 21% of all civil conflicts since 1950, with the risk of annual civil conflict doubling from 3% to 6% in countries affected by ENSO during El Niño years, relative to La Niña years.

During the last several decades, the frequency and intensity of El Niño events have increased. This is most likely linked to global warming and the increasing level of greenhouse gases in the atmosphere – although a longer period of observation is needed to confirm this. Scientists have theorised that permanent El Niño conditions may emerge when global average temperatures increase by 3°C (5.4°F).

 

 

 

 

 

 

8th May 2015

Monthly average CO2 is at 400ppm and rising

Atmospheric CO2 remained above 400 parts per million (ppm) through March 2015, the first time it has been at this level for an entire month, according to the National Oceanic and Atmospheric Administration (NOAA). The current concentration of greenhouse gases is the highest it has been for millions of years.

 


 

“It was only a matter of time that we would average 400 parts per million globally,” says Pieter Tans, lead scientist of NOAA’s Global Greenhouse Gas Reference Network. “We first reported 400 ppm when all of our Arctic sites reached that value in the spring of 2012. In 2013, the record at NOAA’s Mauna Loa Observatory first crossed the 400 ppm threshold. Reaching 400 parts per million as a global average is a significant milestone.

“This marks the fact that humans burning fossil fuels have caused global carbon dioxide concentrations to rise more than 120 parts per million since pre-industrial times,” he adds. “Half of that rise has occurred since 1980.”

NOAA bases the global CO2 concentration on air samples taken from 40 sites around the world. NOAA and partner scientists collect the samples in flasks on cargo ship decks, on the shores of remote islands, and at other isolated locations. It takes some time after each month's end to compute the global average, because samples must be shipped for analysis to NOAA's Earth System Research Laboratory in Boulder, Colorado.

“We choose to sample at these sites because the atmosphere itself serves to average out gas concentrations that are being affected by human and natural forces. At these remote sites, we get a better global average,” said Ed Dlugokencky, the NOAA scientist who manages the global network.

 

[Image: Patricio Eladio Rojas Ledezma, a meteorologist, collects air samples on Easter Island, Chile.]

 

The last time atmospheric levels of carbon dioxide were at 400ppm was during the mid-Pliocene, over 3 million years ago. Back then, our ancestors had brains about as big as those of modern chimps. They had only recently developed stone tools and were roaming the savannahs of Africa while being hunted by sabre-toothed cats. Average global temperatures in the mid-Pliocene were up to 3°C hotter than today (and more than 10°C hotter in the polar regions), with sea levels around 25m (82ft) higher. Many species of plants and animals lived several hundred kilometres further north than where their nearest relatives exist today.

On a geological timescale, the present rate of change in atmospheric CO2 is unprecedented. In the ancient past, a rise of 10ppm might have taken 1,000 years or more; today, human activity is adding that much every five years, as we overwhelm nature's ability to absorb it. On current trends, the world is on track for a doubling of greenhouse gas levels in the second half of this century – potentially causing 4 to 6°C of warming. This would lead to a radically altered planet, with grave consequences for humanity.
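A very rough projection illustrates that timescale. The growth rates and the 560 ppm "doubling" level below are assumptions chosen for the example (CO2-equivalent concentrations, which include other greenhouse gases, would reach a doubling sooner):

```python
# Rough projection of when CO2 passes double the pre-industrial level (~560 ppm).
# Growth-rate figures are illustrative assumptions only.
level_ppm = 400.0             # global average in 2015
year = 2015
growth_ppm_per_year = 2.1     # assumed current annual growth
acceleration = 0.02           # assumed yearly increase in that growth rate

while level_ppm < 560.0:
    level_ppm += growth_ppm_per_year
    growth_ppm_per_year += acceleration
    year += 1

print(f"~560 ppm reached around {year}")   # roughly the 2070s on these assumptions
```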

Dr. James Butler, director of NOAA’s Global Monitoring Division, explains that reversing the global CO2 level would be difficult because of its long lifetime: “Elimination of about 80 percent of fossil fuel emissions would essentially stop the rise in carbon dioxide in the atmosphere – but concentrations of carbon dioxide would not start decreasing until even further reductions are made and then it would only do so slowly.”

 

 

 

 

1st May 2015

Revolutionary new energy storage system announced by Tesla

Tesla has revealed a new battery technology for homes and businesses, which provides a way to store energy from localised renewables and can function as a backup system during power outages.

 


 

A major barrier to the widespread adoption of clean energy has been the intermittent nature of wind and solar. The Sun doesn't always shine, and the wind doesn't always blow – making it difficult or impossible to harness these resources on a 24-hour basis.

Elon Musk, CEO of electric vehicle firm Tesla Motors, yesterday unveiled a revolutionary new technology that can solve these issues. The Powerwall, pictured above, is a rechargeable lithium-ion battery product, intended primarily for home use. It stores electricity generated from rooftop solar panels, which can then be used for domestic consumption, load shifting, or backup power.

With a steady supply of locally generated renewable energy, the Powerwall can offer independence from the utility grid, meaning that customers no longer have to worry about expensive bills incurred during peak hours. If a utility company experiences a major outage, the Powerwall can serve as a backup home power supply – especially useful in areas prone to storms or unreliable grids. It can also charge electric vehicles more cheaply overnight, while surplus power can be fed back to the grid when needed.

 


 

Tesla claims the Powerwall is fully automated, simple to install, and requires no maintenance. It is being marketed in two models: 10 kWh weekly cycle ($3,500) and 7 kWh daily cycle ($3,000) versions. Multiple batteries can be installed together for homes with greater energy needs; up to 90 kWh total for the 10 kWh battery and 63 kWh total for the 7 kWh battery. Both are rated for indoor and outdoor installation, and guaranteed for ten years.
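As a quick sizing illustration (the household consumption figure below is an assumption for the example, not part of Tesla's announcement):

```python
import math

# Quick sizing example - the household figure is an assumption, not Tesla's.
DAILY_USE_KWH = 30.0     # assumed typical US household consumption per day
UNIT_KWH = 7.0           # daily-cycle Powerwall model
UNIT_PRICE_USD = 3000
MAX_UNITS = 9            # 9 x 7 kWh = 63 kWh, the maximum quoted for this model

units_needed = min(math.ceil(DAILY_USE_KWH / UNIT_KWH), MAX_UNITS)

print(f"{units_needed} units (~{units_needed * UNIT_KWH:.0f} kWh) for about "
      f"${units_needed * UNIT_PRICE_USD:,}, before installation and inverter costs")
# -> 5 units (~35 kWh) for about $15,000, before installation and inverter costs
```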

The Powerwall begins shipping this summer. It will be sold to companies including SolarCity, which is running a pilot project in 500 California houses, using 10-kWh battery packs. Tesla is bullish about the prospects for batteries, electric vehicles and clean energy. The company is building a "gigafactory" to develop and expand these technologies at a large scale, with more factories to come in the future.

While the current price of the Powerwall may seem a little on the high side, analysts forecast a substantial decline in battery costs over the next decade and beyond, with a similar fall in solar panel costs. Combined with smart grids, the proliferation of this technology seems inevitable. As predicted on our future timeline, home energy storage systems are likely to be commonplace by 2030.

A much larger version, the Powerpack – described as an "infinitely scalable system" – will be made available for businesses and industrial applications. It will come in 100 kWh battery blocks, which can be combined to scale from 500 kWh up to 1 GWh and even higher: "Our goal here is to change the way the world uses energy at an extreme scale," says Musk. You can watch his full keynote presentation (which was powered by solar energy) in the video below.

 

 

 
