Blog » Business & Politics

 
     
 

18th April 2014

World's first marijuana vending machines are unveiled in Colorado

In a sign of the changing times, marijuana is now publicly available from vending machines in Colorado. American Green, part of Tranzbyte Corporation, has begun distributing "Zazzz Machines" containing the drug. These utilise radio-frequency identification (RFID) tags to track the products, along with biometrics to verify a customer's age. They even accept Bitcoin, a new digital currency. The first machine was unveiled on 12th April and is located at the Herbal Elements store in Avon, Colorado. A recent Gallup poll showed a clear majority of Americans (58%) in favour of marijuana being made fully legal, with growing numbers admitting to having tried it. Colorado expects to collect nearly $100 million in tax revenue from recreational marijuana use this year – about 40% more than originally forecast.
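The checkout flow described above – RFID inventory tracking, a biometric age check, then payment – can be sketched roughly as follows. This is a hypothetical illustration only: all names (`VendingMachine`-style helpers, `verify_age`, `dispense`) are invented, and the real Zazzz firmware is not public.

```python
# Hypothetical sketch of the Zazzz-style purchase flow: dispense an
# RFID-tagged product only after an age check and successful payment.
# The year-based age check is deliberately crude (it ignores birthdays);
# a real machine would scan an ID and/or biometric data.
LEGAL_AGE = 21  # Colorado's minimum age for recreational purchase

def verify_age(dob_year: int, current_year: int = 2014) -> bool:
    """Stand-in for the machine's biometric/ID age verification."""
    return current_year - dob_year >= LEGAL_AGE

def dispense(rfid_tag: str, dob_year: int, payment_ok: bool) -> str:
    if not verify_age(dob_year):
        return "REFUSED: age verification failed"
    if not payment_ok:
        return "REFUSED: payment declined"
    # The RFID tag lets the operator log exactly which unit left the machine.
    return f"DISPENSED: item {rfid_tag} logged for inventory tracking"

print(dispense("TAG-0042", dob_year=1990, payment_ok=True))
```

The point of the ordering is that the machine never reaches the payment or dispensing steps unless the age gate passes first.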

 

 

 

 

 

31st March 2014

Japanese whaling program ruled illegal

In the landmark case of Australia v. Japan, the International Court of Justice (ICJ) in The Hague today ruled that Japan's JARPA II whaling program is not for scientific purposes and has ordered all permits to be revoked.

 

[Image: whale in the Southern Ocean]

 

A major victory was achieved by conservationists today as the ICJ announced its binding decision on Australia v. Japan, ruling by a vote of 12 to 4 that Japan's Antarctic whaling program is not scientific research as defined under International Whaling Commission regulations. The Court ordered that Japan "revoke any extant authorization, permit or licence to kill, take or treat whales in relation to JARPA II, and refrain from granting any further permits."

Like its predecessor, the JARPA II program takes place in the Southern Ocean. Starting in 2005 and continuing to the present day, its objectives have included monitoring the Antarctic ecosystem, modelling competition between whale species, recording changes in stock structure and improving the future management of whales. However, the program's research methodology and value have come under intense scrutiny, as it has been argued that non-lethal alternatives are possible and that Japan's research is commercial whaling in disguise.

"With today's ruling, the ICJ has taken a fair and just stance on the right side of history by protecting the whales of the Southern Ocean Whale Sanctuary and the vital marine ecosystem of Antarctica, a decision that impacts the international community and future generations," said Captain Alex Cornelissen of Sea Shepherd Global. "Though Japan's unrelenting harpoons have continued to drive many species of whales toward extinction, Sea Shepherd is hopeful that in the wake of the ICJ's ruling, it is whaling that will be driven into the pages of the history books."

"Despite the moratorium on commercial whaling, Japan has continued to claim the lives of thousands of the gentle giants of the sea in a place that should be their safe haven," said Sea Shepherd Founder, Captain Paul Watson. "Sea Shepherd and I, along with millions of concerned people around the world, certainly hope that Japan will honor this ruling by the international court and leave the whales in peace."

At a public meeting in Los Angeles in December 2013, the Ambassador from Japan to the U.S., Kenichiro Sasae, said that his country would abide by the ICJ ruling. Sea Shepherd Global – the only organisation to directly intervene against Japan's illegal whaling – has ships prepared to return to the Southern Ocean, should Japan choose to ignore this decision.

 

 

 

 

26th March 2014

Oculus VR acquired by Facebook

Oculus VR – the developer of a consumer virtual reality headset planned for 2015 – has been purchased by Facebook for $2 billion.

 

[Image: Oculus Rift VR headset]

 

Following the demonstration of a prototype at E3, the Oculus Rift made headlines in 2012, when a Kickstarter campaign was launched to further develop the product. Within four hours, it had achieved its target of $250,000 and in 36 hours, the campaign surpassed $1 million, eventually securing $2.4 million in funding. Prominent figures from the games industry publicly endorsed the project – including John Carmack, co-founder of Id Software; Gabe Newell, the co-founder of Valve; and Cliff Bleszinski, design director at Epic Games.

With an extremely wide field of view, high resolution display, and ultra-low latency head tracking, the Rift could provide a truly immersive experience, allowing gamers to step inside their favourite games and explore new worlds like never before. Previous attempts to create VR either lacked the features required for believable immersion, or were far too expensive ($20,000+) and reserved for the military or scientific community. The Rift promised to change all that, with a revolutionary new product designed to maximise immersion and comfort, at a price everyone could afford.

The first "dev kit" began shipping in March 2013, allowing developers to integrate the device into their games. A 1080p version was demonstrated in June 2013 and the Omni treadmill was announced in the same month, providing an even greater level of realism and interaction for players. At the Consumer Electronics Show in January 2014, an updated prototype was unveiled with a special low-persistence OLED display; its new motion tracking system could detect actions like crouching or leaning, to alleviate the sickness experienced by some users.

Last week, Oculus VR announced the much-anticipated second development kit (DK2), featuring several improvements over the first – including a higher-resolution low-persistence OLED display, higher refresh rate, head positional tracking, detachable cable and removal of the external control box.

Now a sudden and rather unexpected development has occurred: Facebook has purchased the company for $2 billion, according to a press release and a personal statement from its founder, Mark Zuckerberg. The social media giant intends to expand the device's potential beyond just gaming:

"We're going to make Oculus a platform for many other experiences. Imagine enjoying a courtside seat at a game, studying in a classroom of students and teachers all over the world or consulting with a doctor face-to-face just by putting on goggles in your home. This is really a new communication platform. By feeling truly present, you can share unbounded spaces and experiences with the people in your life. Imagine sharing not just moments with your friends online, but entire experiences and adventures."

"We are excited to work with Mark and the Facebook team to deliver the very best virtual reality platform in the world," said Brendan Iribe, co-founder and CEO of Oculus. "We believe virtual reality will be heavily defined by social experiences that connect people in magical, new ways. It is a transformative and disruptive technology that enables the world to experience the impossible, and it's only just the beginning."

Reactions in the last 24 hours have been negative, however. Facebook shares fell by 6% following the announcement, while the Kickstarter page has been flooded with angry comments. The Oculus founder, 21-year-old Palmer Luckey, attempted to explain his decision with a post on Reddit, but received a similarly hostile response. Markus Persson, creator of the popular game Minecraft, stated on Twitter: "We were in talks about maybe bringing a version of Minecraft to Oculus. I just cancelled that deal. Facebook creeps me out."

 

 

 

 

12th March 2014

Japan to cut bluefin tuna catch by 50% from 2015

Last year, a study found that Pacific bluefin tuna populations had fallen by 96% from their original levels, due to decades of overfishing. International cuts were agreed to limit juvenile fish catches, but Japan concluded that these measures did not go far enough. Japan's Fisheries Agency has now responded by announcing a 50% reduction in catches from 2015, while urging other nations to do the same.

More than 4.4 million tons of tuna are hauled from the world's oceans each year – much of it ending up as sushi, a cuisine that is becoming ever more popular. This level of depletion is unsustainable and could lead to the extinction of tuna in the near future unless urgent action is taken. About 60% of the global catch is taken from the Pacific region, mostly by so-called "distant water" fleets from as far afield as Europe. Other species, including skipjack, yellowfin and bigeye tuna, are also severely threatened.

Japan's announcement could be an important step towards restoring tuna populations worldwide. Another solution that offers some hope is the rapid growth of aquaculture, which may actually surpass wild catch harvests by 2026. In the longer term, it might be possible to eliminate fish kills entirely – for example, by using a combination of 3D printing and/or synthetic fish meat that is artificially "grown" via tissue engineering. If this were to happen, fish populations could eventually be restored to their pre-industrial levels. Scientists have already been making progress in this area, demonstrating the first laboratory-grown hamburger last year.

 

[Image: bluefin tuna]

 

 

 

 

6th February 2014

India to build the world's largest solar power plant

India's government has signed a deal with six companies to build a 4 gigawatt (GW) solar power plant – by far the world's largest.

 

[Image: solar power plant]

 

This facility – described by officials as an "ultra mega" project – will have a capacity equivalent to four nuclear reactors and double the nation's entire current solar capacity. It will be 10 times bigger than any plant of its kind in the world. Located west of Jaipur, close to Sambhar Lake, it will be constructed over seven years in two phases, with phase 1 comprising 1,000 MW and the remaining 3,000 MW following in phase 2.

In 2010, India launched a "solar mission" initiative, aiming to deliver 20 GW of solar capacity by 2022. This new project will be a significant step towards achieving that goal. The nation has an even more ambitious plan to reach 100 GW by 2030, enough to supply 200 million people.

With its high levels of sunlight, India is well-placed to exploit solar energy. Combined with plummeting installation costs and improving efficiency, solar is becoming a more attractive option with each passing year. Solar power now costs 7.50 rupees per kilowatt-hour (kWh), down from 17 rupees just three years ago. By comparison, natural gas is roughly 5.5 rupees per kWh, nuclear is 3 rupees per kWh and coal is 2.50 rupees per kWh. It won't be long before solar is able to match these cheaper forms of energy, and then things could get really interesting. Solar has the potential to be a highly disruptive technology – especially when combined with smart grids and local storage. Longer term, there is the possibility of using solar within continent-wide supergrids.
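The tariff figures quoted above can be put side by side to see how far solar has already fallen, and how much further it must fall to match each conventional source. This is a back-of-the-envelope sketch using only the article's numbers, not an official tariff model:

```python
# Per-kWh tariffs (rupees) as quoted in the article.
tariffs = {
    "solar (2014)": 7.50,
    "solar (2011)": 17.00,
    "natural gas": 5.50,
    "nuclear": 3.00,
    "coal": 2.50,
}

solar_now = tariffs["solar (2014)"]

# How far solar has already fallen in three years (~56%).
decline = 1 - solar_now / tariffs["solar (2011)"]
print(f"solar price drop since 2011: {decline:.0%}")

# Further fall still needed to reach parity with each conventional source.
for source in ("natural gas", "nuclear", "coal"):
    gap = 1 - tariffs[source] / solar_now
    print(f"{source}: solar must fall another {gap:.0%}")
```

On these figures, solar has more than halved in price in three years, while gas parity requires only a further ~27% drop – which is why the article expects the crossover to come soon.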

 

[Image: solar resource map of India. Credit: SolarGIS © 2014 GeoModel Solar]

 

 

 

 

2nd February 2014

Great Barrier Reef dumping plan approved

This week, the Australian government approved plans to dump five million cubic metres of sediment near the Great Barrier Reef, as part of an expansion to create the world's largest coal port.

 

[Image: the Great Barrier Reef, Australia]

 

Visible from outer space, the Great Barrier Reef is the world's largest coral reef system and the biggest single structure made by living organisms. It is home to a staggering diversity of marine life – including more than 1,500 fish species alongside birds, sea turtles, sea snakes, dolphins, whales, dugongs, and molluscs such as the giant clam; not to mention thousands of different plants like seagrasses and seaweeds. It has been labelled as one of the seven natural wonders of the world and was chosen as a World Heritage Site in 1981. Tourism is an important economic activity for the region, generating over $6 billion per year.

With its delicate ecosystem, the Great Barrier Reef is highly sensitive and vulnerable to sudden environmental changes. According to a recent study, it has lost more than half its coral cover since 1985, with two-thirds of that loss occurring since 1998. Among the human-caused threats are climate change, pollution, overfishing, shipping accidents and oil spills. Natural causes include tropical cyclones, disease and invasions by crown-of-thorns starfish. Much of the reef could be wiped out by the middle of this century, based on current trends.

Despite its already fragile state, the Great Barrier Reef now faces additional harm in the form of Abbot Point – a coal port being expanded to provide new export facilities from the Galilee Basin in Queensland. When shipments begin in 2016, it will become the largest port of its kind in the world. To allow ships into the port, a massive dredging project is needed, with a disposal site for the sludge located 16 miles (25 km) to the north-east. An investigation zone is being assessed for alternative locations, as shown below.

 

[Image: map of the dredge-spoil disposal area near the Great Barrier Reef]

 

The Great Barrier Reef Marine Park Authority (GBRMPA) notes that "the seafloor of the approved disposal area consists of sand, silt and clay and does not contain coral reefs or seagrass beds," claiming the operation will be "subject to strict environmental conditions."

Federal minister for the environment, Greg Hunt, said that water quality would actually improve in the region, due to conditions on the development that include programmes to support the health of the reef: "The conditions put in place for these projects will result in an improvement in water quality and strengthen the Australian government's approach to meeting the challenges confronting the reef into the future."

Scientists have been expressing a different opinion, however. A coral reef ecologist from the University of Queensland, Selina Ward, dismissed Hunt's remarks as "ridiculous" and explained that a huge amount of work had already gone into improving the water quality in recent years. To offset the damage arising from dredging operations of this size would take "unimaginable effort."

When sediment is dumped in this way, it can expand and travel outward, carried by ocean currents. The food chain is disrupted as seagrass and other plants die, in turn killing off the animal populations that rely on them. Coral is weakened as increased sediment clouds the water and reduces the amount of sunlight getting through, harming the algae that live symbiotically within it. Substantial quantities of carbon are stored beneath seagrass and can be released when it dies – these meadows are currently disappearing at a global rate of 1.5 percent per year, with almost 300 million tons of carbon added back into the environment annually as a result, according to Nature Geoscience. Another study concludes that seagrass is 35 times more efficient at absorbing carbon than rainforests.

There are further impacts to consider. Expanding the port will lead to a rise in ship traffic, increasing the chance of a collision with the reef or with other marine life. Humpback whale mothers and calves have been observed resting in the shallow waters around Abbot Point during migration. Green and flatback turtles on Abbot Beach will have their egg laying disrupted as they are confused by all the noise, lights and construction activity nearby. Directly behind the port itself is Caley Valley Wetland, home to several threatened bird species.

A group of 233 scientists had urged the Authority to reject the expansion, with a joint letter to chairman Russell Reichelt that stated: "The best available science makes it very clear that expansion of the port at Abbot Point will have detrimental effects on the Great Barrier Reef. Sediment from dredging can smother corals and seagrasses and expose them to poisons and elevated nutrients."

Last year, UNESCO had warned that the Great Barrier Reef might be placed on its list of World Heritage sites in danger unless action was taken to safeguard the region. A recent poll showed that 91 percent of Australians think protecting the Great Barrier Reef is the country's most important environmental issue – a number of huge petitions had been submitted prior to this week's decision.

The expansion of Abbot Point could be just the beginning, however. Several other massive dredging projects may emerge along the north-east coast, with Queensland's state government fast-tracking mega ports along the reef and dumping potentially 140 million tons of sediment by 2025, according to researchers based at James Cook University in Queensland. Abbot Point itself could be expanded further to accommodate the Alpha North Coal Project. The Australian Prime Minister, Tony Abbott, appears to show little interest in the environment, having abolished the Climate Commission and slashed a number of clean energy initiatives.

The industrialisation of the Great Barrier Reef is just the latest in a string of developments in areas previously considered immune to human influence. For instance, plans are underway to build a highway through the Serengeti National Park, while oil drilling is approved in the heart of Yasuni National Park, the most biologically diverse spot on Earth.

 

 

 

 

25th January 2014

Global warming continues: 2013 was fourth hottest year on record

The average combined land and ocean surface temperature for January–December 2013 was tied as the fourth warmest such period on record, at 0.62°C (1.12°F) above the 20th century average.

 

[Image: 2013 global temperature map]

 

The latest summary of global temperature released by the National Oceanic and Atmospheric Administration (NOAA) concludes that warmer-than-average temperatures affected the vast majority of the globe during 2013. Record warmth was observed across much of southern and western Australia, southwestern Ethiopia, eastern Tanzania, parts of central Asia around Kazakhstan and Uzbekistan, a large section of the southwestern Pacific Ocean, along with regions of the Arctic, central Pacific, and central Indian Oceans.

Temperatures were cooler-than-average across the central United States – a region that saw record warmth in 2012 – along with small sections of the eastern Pacific Ocean and the Southern Ocean off the tip of South America. No record coldest regions were observed for the January–December 2013 period, as shown in the map below.

Globally, 2010 remains the hottest year recorded by NOAA at 0.66°C (1.2°F) above the 20th century average, with 2005 and 1998 in second and third place, respectively. Including 2013, all 13 years of the 21st century (2001-2013) rank among the 15 warmest in the 134-year observational record. Viewed over a longer timescale, the trend is even more obvious. Last year's high temperatures occurred even without El Niño, suggesting that a new record may soon be reached and casting doubt on recent claims of a "pause" in warming.

 

[Image: map of 2013 global temperature anomalies]

 

Despite the overwhelming evidence, the near-unanimous agreement among climate experts, and the growing number of disasters affecting the world, much of the public still believes that a controversy exists in the scientific community, or that experts are distorting the truth. Gallup polls show that 40% of U.S. adults view global warming as exaggerated, with a similar number thinking natural causes are to blame.

In fact, the evidence for climate change (a term used since at least 1955) and humanity's contribution to it has become stronger than ever. Study after study confirms that human industrial activity is clearly and by far the dominant factor driving the recent changes in our atmosphere:

 

[Image: contributions to global warming by cause. Credit: SkepticalScience (CC BY 3.0)]

 

We have known since the 19th century that CO2 is a greenhouse gas – trapping heat in ways that can be demonstrated with even simple experiments. By analysing the ratio of carbon isotopes, we can easily determine what proportion is natural and what proportion is man-made. From this, we know that our carbon emissions have been absolutely colossal when measured on a geologic timescale, with changes now happening 10 times faster than any period since the Cretaceous–Paleogene extinction event of 65 million years ago. We know that the so-called Medieval warm period, while unusually warm in some regions like the North Atlantic, was much cooler than today on a global basis. We know that solar activity is not a cause of recent warming and new research indicates that climate sensitivity to CO2 input has been underestimated. We know from simple experiments that even a small increase in parts per million can have an obvious impact.

There is abundant evidence of current impacts in the form of shrinking glaciers (including those in Glacier National Park), larger and more damaging wildfires, ocean acidification and deoxygenation, loss of coral reefs, fish migrations, bark beetle and other pest movements, rising sea levels, coastal erosion, extremes in flooding and drought, along with more frequent heat waves. We know that Arctic sea ice is melting 50 years ahead of earlier forecasts and that ice loss in the region is far greater than the relatively small gain in the Antarctic. We know that vast areas of carbon-absorbing forests have been cut down over the centuries – and particularly during the last decade – to make way for our sprawling cities, their carbon-spouting automobiles and the thousands of planes in the skies overhead, all of which have appeared on this planet in the blink of an eye, geologically speaking. Any supposed "benefits" to plants from extra CO2 will be offset by the negative effects of drought, weeds and higher temperatures. Tens of millions of people around the world are already being affected by this panoply of converging impacts. Recent ventures into unconventional fossil fuels are the stuff of nightmares.

We have the world's most powerful supercomputers, making trillions of calculations per second for months on end, running state-of-the-art simulations with fantastic levels of detail. Contrary to what some would claim, these models have proven remarkably successful, correctly predicting:

• That our land, atmosphere and oceans would warm.
• That the troposphere would warm and the stratosphere would cool.
• That nighttime average temperatures would increase more than daytime average temperatures.
• That winter average temperatures would increase more than summer average temperatures.
• That polar amplification would lead to greater temperature increases nearer the poles.
• That the Arctic would warm faster than the Antarctic.
• The magnitude (0.3 K) and duration (two years) of the cooling from the Mt. Pinatubo eruption.
• The amount of water vapour feedback due to ENSO.
• The response of southern ocean winds to the ozone hole.
• The expansion of the Hadley cells.
• The poleward movement of storm tracks.
• The rising of the tropopause and the effective radiating altitude.
• The clear sky super greenhouse effect from increased water vapour in the tropics.
• The near constancy of relative humidity on global average.
• That coastal upwelling of ocean water would increase.
• A retrodiction of Last Glacial Maximum sea surface temperatures that initially seemed inconsistent with the paleo evidence – until better paleo evidence subsequently showed the models were right.

And yet, even without these computer models, there is clear evidence of climate change and our influence on it. Decades of peer-reviewed studies in the world's top scientific journals have confirmed this reality; just as they confirmed the reality of evolution, our planet's geologic history, the germ theory of disease, links between smoking and cancer, depletion of the ozone layer by CFCs, along with countless other biological, chemical and physical processes. The science can never be perfect and there will always be gaps, but today no scientific body of national or international standing disputes the fundamental points.

There are, of course, a small number of individual climate scientists who claim to be sceptical. In almost every case, however, they either have ties to fossil fuel interests, or their work has never been peer-reviewed and published in a respected journal. It is worth noting that individuals like Christopher Monckton are not climate scientists and are totally unqualified in the field. A recent documentary, The Great Global Warming Swindle, has been savaged by climatologists for its cherry-picking, inaccuracies and misleading claims. Many arguments continue to be made by sceptics (such as the 1970s cooling myth), but none stand up to scrutiny. The science behind climate change is robust and has withstood almost everything thrown at it – including the recent "Climategate" affair, with multiple independent inquiries finding no evidence of fraud or scientific misconduct.

Given all of the above, the risks of inaction – and the obvious benefits of clean technology – how can people be so eager to embrace fossil fuels, so confident in their scepticism, and willing to take such a gamble on their children's future? Even the conservative U.S. military now takes the issue seriously and is preparing for the impacts. If climate scientists are in it for the money, they're doing it wrong.

Global warming is the biggest story of our time, a result of our explosive growth in population and technology. It will define the 21st century and possibly many centuries to come. Ignoring the evidence and casually dismissing what decades of peer-reviewed science have told us would be a mistake of truly monumental proportions.

 

 

 

 

19th January 2014

Letter to Barack Obama (part 2)

Concerned by his "all of the above" energy strategy, a group of environmentalists this week sent a joint letter to President Barack Obama, calling on him to expand clean energy. This follows a similar effort last year by business leaders, philanthropists and election campaign supporters. The letter is reproduced here in full.

 

[Image: atmospheric carbon dioxide levels]

 


 

American Rivers | Clean Water Action | Defenders of Wildlife | Earthjustice

Energy Action Coalition | Environment America | Environmental Defense Fund

Friends of the Earth | League of Conservation Voters | National Audubon Society |
National Wildlife Federation | Native American Rights Fund

Natural Resources Defense Council | Oceana | Physicians for Social Responsibility |
Population Connection | Sierra Club | Voices for Progress

 

President Barack Obama
The White House
1600 Pennsylvania Ave NW
Washington, DC 20500

Dear Mr. President,

We applaud the actions you have taken to reduce economy-wide carbon pollution and your commitment last June "to take bold action to reduce carbon pollution" and "lead the world in a coordinated assault on climate change." We look forward to continuing to work with you to achieve these goals.

In that speech, you referenced that in the past you had put forward an "all of the above" energy strategy, yet noted that we cannot just drill our way out of our energy and climate challenge. We believe that continued reliance on an "all of the above" energy strategy would be fundamentally at odds with your goal of cutting carbon pollution and would undermine our nation's capacity to respond to the threat of climate disruption. With record-high atmospheric carbon concentrations and the rising threat of extreme heat, drought, wildfires and super storms, America's energy policies must reduce our dependence on fossil fuels, not simply reduce our dependence on foreign oil.

We understand that the U.S. cannot immediately end its use of fossil fuels and we also appreciate the advantages of being more energy independent. But an "all of the above" approach that places virtually no limits on whether, when, where or how fossil fuels are extracted ignores the impacts of carbon-intense fuels and is wrong for America's future. America requires an ambitious energy vision that reduces consumption of these fuels in order to meet the scale of the climate crisis.

An "all of the above" strategy is a compromise that future generations can't afford. It fails to prioritize clean energy and solutions that have already begun to replace fossil fuels, revitalize American industry, and save Americans money. It increases environmental injustice while it locks in the extraction of fossil fuels that will inevitably lead to a catastrophic climate future. It threatens our health, our homes, our most sensitive public lands, our oceans and our most precious wild places. Such a policy accelerates development of fuel sources that can negate the important progress you've already made on lowering U.S. carbon pollution, and it undermines U.S. credibility in the international community.

Mr. President, we were very heartened by your commitment that the climate impacts of the proposed Keystone XL pipeline would be "absolutely critical" to the decision and that it would be contrary to the "national interest" to approve a project that would "significantly exacerbate the problem of carbon pollution." We believe that a climate impact lens should be applied to all decisions regarding new fossil fuel development, and urge that a "carbon-reducing clean energy" strategy rather than an "all of the above" strategy become the operative paradigm for your administration's energy decisions.

In the coming months your administration will be making key decisions regarding fossil fuel development -- including the Keystone XL pipeline, fracking on public lands, and drilling in the Arctic ocean -- that will either set us on a path to achieve the clean energy future we all envision or will significantly exacerbate the problem of carbon pollution. We urge you to make climate impacts and emission increases critical considerations in each of these decisions.

Mr. President, we applaud you for your commitment to tackle the climate crisis and to build an economy powered by energy that is clean, safe, secure, and sustainable.

Sincerely,

 

Wm. Robert Irvin
President and CEO
American Rivers

Robert Wendelgass
President
Clean Water Action

Jamie Rappaport Clark
President and CEO
Defenders of Wildlife

Trip Van Noppen
President
Earthjustice

Maura Cowley
Executive Director
Energy Action Coalition

Margie Alt
Executive Director
Environment America

Fred Krupp
President
Environmental Defense Fund

Eric Pica
President
Friends of the Earth

Gene Karpinski
President
League of Conservation Voters

David Yarnold
President and CEO
National Audubon Society

Larry J. Schweiger
President & CEO
National Wildlife Federation

John Echohawk
Executive Director
Native American Rights Fund

Frances Beinecke
President
Natural Resources Defense Council

Andrew Sharpless
Chief Executive Officer
Oceana

Catherine Thomasson, MD
Executive Director
Physicians for Social Responsibility

John Seager
President
Population Connection

Michael Brune
Executive Director
Sierra Club

Sandy Newman
President
Voices for Progress

 

 

 

 

19th January 2014

NASA budget of $17.6 billion is approved

President Barack Obama has signed a budget that provides NASA with $17.6 billion for this year – fully funding both the heavy-lift Space Launch System and Orion capsule that will eventually take humans to Mars.

 

[Image: The Space Launch System (left) and Orion capsule (right).]

 

NASA's budget for 2014 was passed by Congress earlier this week and officially signed by the President on Friday. A total of $17.65 billion has been allocated to the space agency, which is slightly less than the $17.7 billion it had requested. However, some analysts had expected a figure as low as $16.1 billion, due to recent budget cuts and spending concerns arising from the sequester of 2013. For space enthusiasts, the final approved figure is therefore a welcome surprise.

Some highlights from the budget include:

• $1,918 million for the Space Launch System (SLS).

The SLS is a heavy launch vehicle intended to replace the Space Shuttle. It is designed to be upgraded over time with more powerful versions. Initially carrying payloads of 70 metric tons into orbit, the SLS will eventually be fitted with an upper "Earth Departure Stage" capable of lifting at least 130 metric tons. This will be 12 metric tons greater than the Apollo-era Saturn V, making it the largest and most powerful rocket ever built. It will take astronauts and hardware to asteroids, the Moon, Mars, and Lagrangian points in the Earth–Moon and Sun–Earth systems. A first unmanned test launch is planned for 2017, with NASA being allocated an extra $200 million to ensure this deadline is met. A manned flight around the Moon and possibly to an asteroid is expected to occur in 2021, with manned missions to Mars in the 2030s. The additional funding in this year's budget will "maintain critical forward momentum" on the program, according to legislators.

• $1,197 million for the Orion Multi-Purpose Crew Vehicle (MPCV).

Orion is a small capsule designed to transport up to six astronauts and cargo beyond Earth orbit. It will be integrated with and carried by the SLS rockets. A first unmanned test flight is scheduled for later this year, during which the capsule will fly higher than any spacecraft intended for human use has flown since 1973. Manned flights will commence in the 2020s.

• $5,151 million for science.

This includes $80 million for planning and development of a Europa mission. The next Discovery-class mission will be announced by May 2014, with selection of the mission(s) in September 2015. Meanwhile, NASA's flagship project and Hubble successor – the James Webb Space Telescope – remains funded and on track for launch in 2018. Among its primary objectives will be observing the era of reionization and the "first light" of the earliest stars formed after the Big Bang.
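Taken together, the three budget lines highlighted above account for just under half of NASA's total allocation – a quick sketch of the arithmetic, using only the figures quoted in this article:

```python
# Share of NASA's $17.65bn 2014 budget taken by the three highlighted items.
total = 17.65  # $bn
items = {"SLS": 1.918, "Orion MPCV": 1.197, "Science": 5.151}

share = sum(items.values()) / total
print(f"{share:.0%}")  # roughly 47%
```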

The remaining budget will go towards operational maintenance, space technology, aeronautics, grants, education and other services provided by NASA. Despite this week's good news, however, the longer term picture is less clear for NASA. As shown in the graph below, its budget as a percentage of the federal budget has been gradually declining and is now a mere fraction of its peak in the 1960s. It will be interesting to see how the private sector can influence the agency's strategy in the coming decades.

 

NASA budget as a percentage of federal budget

 

 

 

 

9th January 2014

IBM forms Watson Group to meet growing demand for cognitive innovations

Headquartered in New York City's "Silicon Alley", the new Watson Group formed by IBM will fuel innovative products and startups – introducing cloud solutions to accelerate research, visualise Big Data and enable analytics exploration.

 

 

IBM today announced it will establish the IBM Watson Group, a new business unit dedicated to the development and commercialisation of cloud-delivered cognitive innovations. The move signifies a strategic shift by IBM to accelerate into the marketplace a new class of software, services and apps that can "think", improve by learning, and discover answers and insights to complex questions from massive amounts of Big Data.

IBM will invest more than $1 billion into the Watson Group, focusing on research and development to bring cloud-delivered cognitive applications and services to market. This will include $100 million available for venture investments to support IBM's recently launched ecosystem of start-ups and businesses, which are building a new class of cognitive apps powered by Watson, in the IBM Watson Developers Cloud.

According to technology research firm Gartner, smart machines will be the most disruptive change ever brought about by information technology, and can make people more effective, empowering them to do "the impossible."

The IBM Watson Group will have a new headquarters at 51 Astor Place in New York City's "Silicon Alley" technology hub, leveraging the talents of 2,000 professionals, whose goal is to design, develop and accelerate the adoption of Watson cognitive technologies that transform industries and professions. The new group will tap subject matter experts from IBM's Research, Services, Software and Systems divisions, as well as industry experts who will identify markets that cognitive computing can disrupt and evolve, such as healthcare, financial services, retail, travel and telecommunications.

Nearly three years after its triumph on the TV show Jeopardy!, IBM has advanced Watson from a quiz game innovation into a commercial technology. Now delivered from the cloud and powering new consumer apps, Watson is 24 times faster and 90 percent smaller – IBM has shrunk Watson from the size of a master bedroom to three stacked pizza boxes.

Named after IBM founder Thomas J. Watson, the machine was developed in IBM's Research labs. Using natural language processing and analytics, Watson processes information in a manner akin to how people think, representing a major shift in the ability to quickly analyse, understand and respond to Big Data. Watson's ability to answer complex questions in natural language with speed, accuracy and confidence will transform decision making across a range of industries.

"Watson is one of the most significant innovations in IBM's 100 year history, and one that we want to share with the world," says IBM Senior Vice President Mike Rhodin (pictured below), who will lead the group. "These new cognitive computing innovations are designed to augment users’ knowledge – be it the researcher exploring genetic data to create new therapies, or a business executive who needs evidence-based insights to make a crucial decision."

 

mike rhodin IBM Watson

 

 

 

 

7th January 2014

Intel at CES 2014

At the Consumer Electronics Show (CES) in Las Vegas, Intel Corporation has been showing off its latest innovative technologies. These include an intelligent 3D camera system, a range of new wearable electronics, and a 22nm dual-core PC the size of an SD card.

 

intel edison 22nm dual core pc 2014 technology

 

Intel CEO Brian Krzanich has outlined a range of new products, initiatives and strategic relationships aimed at accelerating innovation across a range of mobile and wearable devices. He made the announcements during the pre-show keynote for the 2014 Consumer Electronics Show in Las Vegas, the biggest gathering of the tech industry in the USA.

Krzanich's keynote painted a vision of how the landscape of computing is being re-shaped – one in which security is too important not to be embedded in every device. The world is entering a new era of integrated computing, defined not by the device but by the integration of technology into people's lifestyles in ways that offer new utility and value. As examples, he highlighted several immersive and intuitive technologies that Intel will begin offering in 2014, such as Intel RealSense – hardware and software that will bring human senses to Intel-based devices. This will include 3D cameras that deliver more intelligent experiences – improving the way people learn, collaborate and are entertained.

The first Intel RealSense 3D camera features a best-in-class depth sensor and a full 1080p colour camera. It can detect finger-level movements, enabling highly accurate gesture recognition, and track facial features to interpret movement and emotion. It can also distinguish foregrounds from backgrounds to allow control, enhance interactive augmented reality (AR), scan items in three dimensions, and more.

This camera will be integrated into a growing spectrum of Intel-based devices, including 2-in-1s, tablets, Ultrabooks, notebooks, and all-in-one (AIO) designs. Systems with the new camera will be available beginning in the second half of 2014 from Acer, Asus, Dell, Fujitsu, HP, Lenovo and NEC.

To advance the computer's "hearing" sense, a new generation of speech recognition technology will be available on a variety of systems. This conversational personal assistant works with popular websites and applications. It comes with selectable personalities and allows for ongoing dialogue with Intel-based devices. People can simply tell it to play music, get answers, connect with friends and find content – all using natural language. The assistant can also check calendars, fetch maps and directions, find flights or book a dinner reservation. Because it works offline, people can control their devices, dictate notes and more without an Internet connection.

 

 

Krzanich then explained how Intel aims to accelerate wearable device innovation. A number of reference designs were highlighted including: smart earbuds providing biometric and fitness capabilities, a smart headset that is always ready and can integrate with existing personal assistant technologies, a smart wireless charging bowl, a smart baby onesie and a smart bottle warmer that will start warming milk when the onesie senses the baby is awake and hungry.

The smart earbuds (pictured below) provide full stereo audio and monitor heart rate, while applications on the user's phone keep track of running distance and calories burned. The product includes software to precision-tune workouts by automatically choosing music that matches the target heart rate profile. As an added bonus, the earbuds harvest power directly through the audio microphone jack, eliminating the need for a battery or separate power source.

 

intel smart earbuds 2014 technology

 

The Intel CEO announced collaborations to increase dialogue and cooperation between the fashion and technology industries, with the aim of bringing new smart wearable electronics to market. He also kicked off the Intel "Make it Wearable" challenge – a global effort aimed at accelerating creativity and innovation with technology. This effort will call upon the smartest and most creative minds to consider factors impacting the proliferation of wearable devices and ubiquitous computing, such as meaningful usages, aesthetics, battery life, security and privacy.

In addition to reference designs for wearable technology, Intel will offer a number of accessible, low-cost entry platforms aimed at lowering entry barriers for individuals and small companies, allowing them to create innovative web-connected wearables or other small form factor devices. Underscoring this point, Krzanich announced Intel Edison – a low-power, 22nm-based computer in an SD card form factor with built-in wireless abilities and support for multiple operating systems. From prototype to production, Intel Edison will enable rapid innovation and product development by a range of inventors, entrepreneurs and consumer product designers when available this summer.

 

intel edison 22nm dual core pc 2014 technology

 

"Wearables are not everywhere today, because they aren't yet solving real problems and they aren't yet integrated with our lifestyles," said Krzanich. "We're focused on addressing this engineering innovation challenge. Our goal is: if something computes and connects, it does it best with Intel inside."

Krzanich also discussed how Intel is addressing a critical issue for the industry as a whole: conflict minerals from the Democratic Republic of the Congo (DRC). Intel has reached a major milestone: the minerals used in microprocessor silicon and packages manufactured in its factories are now "conflict-free", as confirmed by third-party audits.

"Two years ago, I told several colleagues that we needed a hard goal, a commitment to reasonably conclude that the metals used in our microprocessors are conflict-free," Krzanich said. "We felt an obligation to implement changes in our supply chain to ensure that our business and our products were not inadvertently funding human atrocities in the Democratic Republic of the Congo. Even though we have reached this milestone, it is just a start. We will continue our audits and resolve issues that are found."

 

intel conflict minerals

 

 

 

 

4th January 2014

Ford unveils a solar-powered hybrid car

Ford Motor Company has announced the C-MAX Solar Energi Concept, a first-of-its-kind Sun-powered car with potential to deliver the best of what a plug-in hybrid offers – without depending on the electric grid for fuel.

 

 

Instead of powering its battery from an electrical outlet, the C-MAX Solar Energi harnesses power from the Sun by using a special concentrator that acts like a magnifying glass – directing intense rays to panels on the vehicle roof.

The result is a concept vehicle that takes a day's worth of sunlight to deliver the same performance as the conventional C-MAX Energi plug-in hybrid, which draws its power from the electric grid. The C-MAX Energi has the best miles-per-gallon-equivalent (MPGe) rating in its class: 108 MPGe city and 92 MPGe highway, for a combined 100 MPGe. By using renewable power, it reduces the annual greenhouse gas emissions a typical owner would produce by four metric tons.
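For context, the combined figure is not a simple average of the city and highway numbers: EPA combined economy weights city and highway 55/45 on a fuel-per-mile (harmonic) basis. A minimal sketch, assuming the standard 55/45 weighting:

```python
def combined_mpge(city: float, highway: float, city_weight: float = 0.55) -> float:
    """EPA-style combined economy: harmonic 55/45 city/highway weighting."""
    return 1 / (city_weight / city + (1 - city_weight) / highway)

print(round(combined_mpge(108, 92)))  # 100 -- matching the quoted combined MPGe
```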

“Ford C-MAX Solar Energi Concept shines a new light on electric transportation and renewable energy,” said Mike Tinskey, Ford global director of vehicle electrification and infrastructure. “As an innovation leader, we want to further the public dialog about the art of the possible in moving the world toward a cleaner future.”

C-MAX Solar Energi Concept, which will be shown at the 2014 Consumer Electronics Show (CES) in Las Vegas, is a collaborative project of Ford, SunPower Corp and the Georgia Institute of Technology.

 

ford solar powered car 2014 technology

 

Strong electrified vehicle sales

The C-MAX Solar Energi Concept debuts as Ford caps a record year of electrified vehicle sales. The company expects to sell 85,000 hybrids, plug-in hybrids and all-electric vehicles for 2013 – the first full year its six new electrified vehicles were available in dealer showrooms.

Ford sold more plug-in vehicles in October and November than both Toyota and Tesla, and it outsold Toyota through the first 11 months of 2013. Plug-in hybrids continue to grow in sales as more customers discover the benefits of using electricity to extend their driving range.

Breakthrough clean technology

SunPower, which has been Ford’s solar technology partner since 2011, is providing high-efficiency solar cells for the roof of this concept car. Because of the extended time it takes to absorb enough energy to fully charge, Ford turned to the Georgia Institute of Technology for a way to amplify sunlight, to make a solar-powered hybrid feasible for daily use.

Researchers developed an off-vehicle solar concentrator (pictured below) with a special Fresnel lens that directs sunlight onto the solar cells, boosting the impact of sunlight by a factor of eight. A Fresnel lens is a compact lens originally developed for use in lighthouses. Similar in concept to a magnifying glass, this patent-pending system tracks the Sun as it moves from east to west, drawing enough power each day to equal a four-hour battery charge (around 8 kilowatt-hours).
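As a rough plausibility check – reading the four-hour charge as roughly 8 kWh of energy, and assuming (hypothetically, this is not a Ford figure) around 300 W of rooftop cells boosted eightfold by the concentrator:

```python
panel_watts = 300   # assumed rooftop cell rating -- illustrative, not from Ford
concentration = 8   # Fresnel lens boost factor quoted in the article
charge_wh = 8000    # the four-hour charge, read as ~8 kWh of energy

effective_watts = panel_watts * concentration
hours = charge_wh / effective_watts
print(f"{hours:.1f} h of peak, tracked sun")  # about 3.3 h
```

Under those assumptions, a few hours of well-tracked sunlight would indeed be enough to match a grid charge, which is consistent with the "day's worth of sunlight" framing above.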

 

ford solar power car technology 2014

 

With a full charge, the C-MAX Solar Energi Concept will achieve the same range as a conventional C-MAX Energi hybrid – up to 620 miles, including 21 electric-only miles. Additionally, the vehicle still has a charge port, and can be charged by connecting to a station via cord and plug, so that drivers retain the option to power up via the grid, if desired. 

After the C-MAX Solar Energi Concept is shown at CES, Ford and Georgia Tech will begin testing the vehicle in numerous real-world scenarios. The outcome of those tests will help to determine if the concept is feasible as a production car.  

Off-the-grid car

By tapping renewable solar energy with a rooftop solar panel system, the C-MAX Solar Energi Concept is not dependent on the traditional electric grid for its battery power. Research by Ford suggests the Sun could power up to 75 percent of all trips made by an average driver in a solar hybrid car. This could be especially important in places where the electric grid is underdeveloped, unreliable or expensive to use.

The vehicle also reinforces MyEnergi Lifestyle, a concept revealed by Ford and several partners at 2013 CES. MyEnergi Lifestyle uses math, science and computer modelling to help homeowners understand how they can take advantage of energy-efficient home appliances, solar power systems and plug-in hybrid vehicles to significantly reduce monthly expenses while also reducing their overall carbon footprint.

The positive environmental impact from Ford C-MAX Solar Energi could be significant. It would reduce yearly CO2 and other greenhouse gas emissions from the average U.S. car owner by as much as four metric tons – the equivalent of what a U.S. house produces in four months.

If all light-duty vehicles in the United States were to adopt Ford C-MAX Solar Energi Concept technology, annual greenhouse gas emissions could be reduced by approximately 1 billion metric tons.

 

ford solar power car technology 2014

 

 

 

 

28th December 2013

The decline of the bank robber

Better technology and improved security measures have cut robberies in British banks by over 90 per cent in less than two decades, new figures reveal.

 

bank robbery trend

 

Industry-wide figures show that there were just 66 such robberies in 2011 – down from 847 in 1992.
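The headline figure checks out against those two data points:

```python
robberies_1992, robberies_2011 = 847, 66
decline = 1 - robberies_2011 / robberies_1992
print(f"{decline:.1%}")  # 92.2% -- comfortably "over 90 per cent"
```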

Experts say the decline in branch robberies is due to:

  • Advances in branch security, with more CCTV cameras, screens to protect staff and even specialist fog to disperse criminals.

  • Staff now have access to smaller amounts of cash due to time-delay safes, teller recycling machines and growing use of automated teller machines (ATMs) in branches.

  • Closer co-operation between banks, police, post offices and other victims of armed robbery, sharing intelligence about the gangs involved.

  • Improved training, ensuring staff are prepared to handle robberies.

DNA spray is another common deterrent – robbers are coated with a unique, traceable material that is extremely difficult to wash off skin and can prove that a suspect was at the premises of a robbery.

“Banks are working hard to confine armed robberies to the world of TV dramas," said Anthony Browne, CEO of the British Bankers' Association. "Being caught up in a bank job is a terrifying ordeal for staff and customers that can scar lives for decades. It’s great to see the number of these crimes has fallen sharply in recent years."

"Anyone trying to rob a bank now faces much better CCTV, protective screens that can rise in less than a second and even special fog designed to disperse criminals. Banks will continue to work closely with each other, post offices and the police to make such raids a thing of the past.”

A similar trend has been experienced in the US, according to FBI figures for 2012. Such crimes may be even more difficult (if not impossible) in the future as we move towards a cashless society. Physical currency may disappear entirely.

 

 

 

 

2nd December 2013

"Amazon Prime Air" will use drones for 30 minute delivery

Online retailer Amazon has revealed a new rapid delivery method that will use unmanned aerial vehicles to send packages to customers within 30 minutes. Assuming the Federal Aviation Administration (FAA) approves it, this futuristic service – "Amazon Prime Air" – could be introduced by 2015. Read more at the company's press release.

 

 

 

 

 

23rd November 2013

Driving economic growth into the Solar System

Planetary Resources, Inc. was co-founded in 2010 by Peter Diamandis and Eric C. Anderson. This new startup hopes to address one of the paramount problems faced on Earth: resource scarcity. It aims to achieve this by developing a robotic asteroid mining industry, made economically viable by falling launch and fuel costs. As this video explains, prospecting and mining asteroids could drive economic growth into the Solar System, where potentially trillions of dollars' worth of metals and minerals lie. Planetary Resources has already signed an agreement with Virgin Galactic for payload services. In early 2014, they plan to launch "Arkyd-3", a testbed for the larger Arkyd-100 spacecraft that will hunt for asteroids.

 

 

 

 

 

 

10th November 2013

Bulletproof nanotechnology suit goes on sale

Luxury bespoke tailoring house, Garrison Bespoke, has launched the first fashion-forward bulletproof suit with a live ammo field-testing event in Toronto, Canada.

 

 

 

Michael Nguyen, co-owner of Garrison Bespoke: "After receiving requests from high-profile clients who travel to dangerous places for work, we set out to develop a lightweight, fashion-forward bulletproof suit as a more discreet and stylish alternative to wearing a bulky vest underneath."

The Garrison Bespoke bulletproof suit is made with a patented carbon nanotube material originally developed to protect the US 19th Special Forces Group in Iraq. The material is thinner, more flexible and 50 percent lighter than Kevlar, which is traditionally used for bulletproof gear. The suit also protects against stabbing – the carbon nanotubes harden on impact to prevent a knife from penetrating.

The live ammo field-testing event was held in the Ajax Rod and Gun Club, Ontario. It demonstrated the suit's ability to shield against 9mm bullets. Nguyen claims the suit can block .45 bullets as well. Garrison Bespoke's latest collection – Town & Country – features a range of new clothing, all of which can be made bulletproof by request, with prices starting from $20,000.

 

bulletproof nanotechnology suit

 

 

 

 

9th November 2013

Fossil fuels receive $500 billion a year in government subsidies

Just two years ahead of a crucial UN climate change summit, many of the world's richest countries continue to pour finance into fossil fuel subsidies. Average spending now runs at $112 per adult, according to a new report from the Overseas Development Institute.

 

fossil fuel pollution

 

The report, "Time to Change the Game", notes that fossil fuel subsidies cost half a trillion dollars globally every year. These subsidies create "perverse incentives" favouring investment in carbon-intensive energy. Its author, Shelagh Whitley, calls for bold action to phase out these subsidies in the G20 by 2020, and globally by 2025, with rich countries making the deepest and earliest cuts.

"The rules of the game are currently biased in favour of fossil fuels," says Whitley. "The status quo encourages energy companies to continue burning high-carbon fossil fuels and offers no incentive to change. We’re throwing money at policies that are only going to make the problem worse in the long run by locking us into dangerous climate change."

The report accuses governments of "shooting themselves in both feet" by failing to put a proper price on carbon and instead incentivising the use of high-carbon technologies. Fossil fuel subsidies include measures to reduce fossil fuel prices paid by consumers, as well as tax breaks for oil and gas companies.

According to the ODI report:

  • The average subsidy provided by rich governments for every tonne of carbon is $7. This is the same as the current cost of carbon in the EU carbon trading system – meaning the carbon price may as well not exist.

  • Domestic subsidies in rich countries outstrip international climate finance provided to help address climate change in developing countries by a ratio of 7:1.

  • In some countries – India, Pakistan and Bangladesh – fossil fuel subsidies are more than double the level of spending on health services.

  • In countries such as Egypt, Pakistan, Morocco and Bangladesh, fossil fuel subsidies outweigh the national fiscal deficit.

These facts add to research from the International Energy Agency (IEA), which revealed that subsidies to fossil fuels are six times higher than those for renewable energy – despite the popular perception that it is too expensive to go green. The Organisation for Economic Cooperation and Development (OECD) also recently stated that coal – the most polluting fuel of all – is subject to the lowest levels of taxation.

The report builds on earlier ODI research, showing that climate change will hit the developing world the hardest of all, posing a threat to poverty eradication.

 

carbon emissions 2020

 

 

 

 

5th November 2013

Annual installations of wind power in Latin America will double by 2022

Although wind plant construction across Latin America is modest compared to more established markets like South and East Asia, North America, and Europe, the region's wind power industry is taking off at a rapid pace.

Latin America has become the hottest growth market for the wind energy industry, at a time when growth rates in other markets are flat due to a variety of policy and macroeconomic challenges. According to a report from Navigant Research, annual wind power installations across Latin America will roughly double, in terms of capacity, over the next 10 years – growing from nearly 2.2 gigawatts (GW) in 2013 to 4.3 GW by 2022.

"Latin America is expected to account for at least 5.5 percent of the world's new wind power installations in 2013," says Feng Zhao, research director with Navigant Research. "With the strong political support of most governments and rapid economic growth fueling rising electricity demand, wind markets in the region are expected to exhibit double-digit compound annual growth rates through the next 10 years."

The latest 2013 wind power contract auction round in Brazil resulted in prices stabilising at higher rates and the signing of 1,505 megawatts (MW) of new wind capacity, according to the report. This suggests that the unsustainable downward trend in prices bottomed out last year, and the Brazilian market appears poised for strong growth. This will provide a foundation for wind energy growth across Latin America, the study concludes. An Executive Summary is available for free download.
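Roughly doubling from 2.2 GW in 2013 to 4.3 GW in 2022 implies a compound annual growth rate of just under 8% – a quick sketch:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two capacity figures."""
    return (end / start) ** (1 / years) - 1

# Navigant's projection: 2.2 GW (2013) -> 4.3 GW (2022), nine years later.
print(f"{cagr(2.2, 4.3, 9):.1%}")  # about 7.7% per year
```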

 

wind power latin america 2020 2022

 

 

 

 

4th November 2013

Carbon budget to 2100 will be used by 2034 according to PwC analysis

According to analysis by PricewaterhouseCoopers (PwC), the world is on track to blow the 2°C carbon budget, estimated by the Intergovernmental Panel on Climate Change (IPCC) for the next 89 years, within 21 years.

 

carbon budget 2034 2050 2100

 

This puts the world on a path consistent with potential global warming of about 4°C by 2100, the most extreme scenario presented in the recent IPCC 5th Assessment Report on climate science.

The results, from the 5th annual PwC Low Carbon Economy Index, examine the rate at which energy-related carbon emitted per unit of GDP (carbon intensity) must fall in order to limit global warming to 2°C.

The report warns that this level of warming "will have serious and far reaching implications." Planning cycles for major business and infrastructure investments now need to factor this into decision making.

It finds that policies and low carbon technologies have failed to break the link between growth and carbon emissions in the global economy. The world's energy mix remains dominated by fossil fuels:

  • Reductions in carbon intensity globally have averaged 0.7% per year over the past five years – a fraction of the 6% reductions now required every year to 2100

  • The G7 averaged a 2.3% reduction while the E7 – which includes much of the manufacturing base of the global economy – managed only 0.4%

  • US, Australia and Indonesia achieved significant reductions in carbon intensity in 2012, but no country has sustained major reductions over several years

  • While the fracking revolution has helped lower emissions in the US, cheaper coal contributed to higher coal usage elsewhere, for example in the EU, raising concerns that decarbonisation in one country can just shift emissions elsewhere.

If the world continues at current rates of decarbonisation, the carbon budget outlined by the IPCC for the period 2012 to 2100 would be spent in less than a quarter of that time, and be used up by 2034. Emissions over and above that budget would be increasing the chances of dangerous climate change, with average warming of surface temperature projected to be beyond 2°C.
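The arithmetic behind "less than a quarter of that time" is straightforward:

```python
budget_span = 2100 - 2012 + 1   # 89 years covered by the IPCC carbon budget
years_to_spend = 2034 - 2012    # 22 years at current decarbonisation rates

print(f"{years_to_spend / budget_span:.0%}")  # prints 25%, i.e. just under a quarter
```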

Jonathan Grant, director of PwC Sustainability & Climate Change: “G20 countries are still consuming fossil fuels like there’s no tomorrow. Despite rapid growth in renewables, they still remain a small part of the energy mix and are overwhelmed by the increase in the use of coal. The results raise real questions about the viability of our vast fossil fuel reserves and the way we power our economy. The 2 degrees carbon budget is simply not big enough to cope with the unmitigated exploitation of these reserves.”

 

coal mining

 

Energy efficiency progress was one bright spot in the analysis: 92% of the small reduction in carbon intensity achieved last year was down to improvements in energy efficiency, with the remaining 8% coming from a shift towards a cleaner energy mix. Italy, the UK and Turkey rank as the most energy-efficient economies in the G20, consuming the least energy for every $m of GDP generated. But the report warns that there is a limit to how far energy use per unit of GDP can be cut.

Five years ago, the global decarbonisation target was 3.5% per annum; now the challenge has nearly doubled to 6%. This is more than eight times the current rate of decarbonisation – a level never achieved before, let alone sustained for decades. To achieve what the IPCC deems a 'safe' amount of CO2 in the atmosphere and limit the extreme effects of climate change would require halving carbon intensity within the next ten years, and reducing it to one-tenth of today's levels by 2050. By 2100, the global energy system would need to be virtually zero-carbon.
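Those targets can be checked as constant annual reduction rates. Halving carbon intensity in ten years works out at about 6.7% per year, in line with the ~6% figure; reaching one-tenth of today's level by 2050 (assuming a 2013 start, so 37 years) implies a similar rate:

```python
# Solve (1 - r)**10 = 0.5 for the constant annual reduction rate r.
r = 1 - 0.5 ** (1 / 10)
print(f"{r:.1%}")  # 6.7% per year

# Reducing intensity to one-tenth of today's level over 37 years (2013-2050):
r_2050 = 1 - 0.1 ** (1 / 37)
print(f"{r_2050:.1%}")  # about 6.0% per year
```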

Jonathan Grant: “Our analysis assumes long term moderate economic growth in emerging economies, and slow steady growth in developed economies. But, failing to tackle climate change is unlikely to result in such a benign scenario of steady growth. Something’s got to give, and potentially soon. This has implications for a raft of investments in carbon intensive technologies that are currently being planned and executed today.”

Leo Johnson, partner, PwC sustainability & climate change said: “What we have yet to see is the quartet of CCS, nuclear, biofuels and energy efficiency decoupling growth from carbon. We've gone over the carbon cliff. It's time to figure out the steps that are going to get us back. We've also got to question now whether our assumptions of long term growth are reasonable and compatible with a future where we fail to limit climate change.”

 

 

 

 

31st October 2013

The Climate Change Vulnerability Index 2014

The sixth annual release of Maplecroft's Climate Change and Environmental Risk Atlas reveals that 31% of global economic output will be based in countries facing 'high' or 'extreme' risks from climate change by 2025 – a 50% increase on current levels, and more than double the level recorded when the company began researching the issue in 2008.

 

climate change vulnerability index map 2014

 

Global business exposed to climate change on multiple levels

Future estimates of the overall cost of climate change on the economy include a wide spectrum of opinions. What cannot be disputed is that the ‘high’ and ‘extreme risk’ countries in Maplecroft’s CCVI include emerging and developing markets, whose importance to the world economy is ever increasing. By 2025, China’s GDP is estimated to treble from current levels to $28 trillion, while India’s is forecast to rise to $5 trillion – totalling nearly 23% of global economic output between them.
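A quick consistency check on those projections: $28 trillion plus $5 trillion amounting to "nearly 23%" of global output implies a projected 2025 world GDP in the region of $143 trillion.

```python
china, india = 28.0, 5.0   # projected 2025 GDP, $ trillion
combined_share = 0.23      # "nearly 23%" of global economic output

world_gdp = (china + india) / combined_share
print(f"${world_gdp:.0f} trillion")  # about $143 trillion
```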

India’s economic exposure to the impacts of extreme climate-related events was recently highlighted by Cyclone Phailin. The storm caused an estimated US$4.15 billion of damage to the agriculture and power sectors alone in the state of Odisha, which is also India’s most important mining region. Up to 1 million tons of rice were destroyed, while key infrastructure – including roads, ports, railways and telecommunications – was severely damaged, causing major disruption to company operations and the supply chains of industrial mineral users.

“With global brands investing heavily in vulnerable growth markets to take advantage of the spending power of rising middle class populations, we are seeing increased business exposure to extreme climate-related events on multiple levels, including their operations, supply chains and consumer base,” states James Allan, Head of Environment at Maplecroft. “Cyclone Phailin demonstrates the critical need for business to monitor the changing frequency and intensity of climate-related events – especially where infrastructure and logistics are weak.”

According to Maplecroft, the ability of highly vulnerable countries to manage the direct impact of extreme events on infrastructure will be a significant factor in mitigating the economic impacts of climate change and may present opportunities for investment. Adaptive measures, such as building flood defences and greater infrastructure resiliency, will, however, require the sustained commitment of governments.

 

cyclone phailin
Cyclone Phailin, 11 October 2013

 

Most at-risk cities: Dhaka, Mumbai, Manila, Kolkata and Bangkok – lowest risk in London and Paris

With commercial activity and the middle classes predominantly based in urban centres, Maplecroft has also calculated the risks to the world’s largest cities to pinpoint where the economic exposure will be highest over the next 30 years.

According to the CCVI’s sub-national calculations, of the 50 cities studied, five present an ‘extreme risk’ – Dhaka in Bangladesh; Mumbai and Kolkata in India; Manila in the Philippines; and Thailand’s Bangkok – while only London and Paris were classified as ‘low risk.’ Shenzhen and the Pearl River Delta – which encompasses the cities of Guangzhou, Dongguan and Foshan, and makes up China’s manufacturing heartland – are among the most exposed to physical risks from extreme climate-related events.

 

West Africa sees greatest increase in risk

Meanwhile, the regions facing the greatest increase in risk are West Africa and the Sahel. Maplecroft’s Exposure Index incorporates recently released UN IPCC climate projections for the period up to 2040, and identifies regions that are projected to undergo a significant shift in key climate parameters in that timeframe. Over this period, a projected warming of 2°C will combine with substantial changes in rainfall and humidity, heavily affecting communities and businesses located in West Africa and the Sahel.

The region’s increased risk is reflected by the inclusion of West Africa’s largest economy, Nigeria, as the world’s sixth most at-risk country in the CCVI. Nigeria’s oil sector is particularly exposed to the impacts of climate-related events. Flooding between July and November 2012 resulted in an estimated loss of 500,000 barrels per day in oil production, equivalent to over one-fifth of Nigeria’s output. The oil-rich Niger Delta is especially vulnerable, with rising sea levels already resulting in erosion and the loss of some oil wells in this area.

 

 

 

 

26th October 2013

A clear majority of Americans want marijuana to be legal

A new poll by Gallup has revealed that 58% of Americans now support the legalisation of marijuana. Only 39% are now against.

 

gallup poll marijuana

 

When Gallup first asked the question back in 1969, only 12% favoured legalisation of the drug. This figure had more than doubled by 1980. It then levelled off during the next two decades, before rising steadily again. In 2011, support reached a majority for the first time with 50% in favour and 46% against. The gap has now widened further – by a dramatic amount, in fact – as shown by the graph above.

Support for legalisation has risen across the political spectrum, but remains weakest among Republican voters, with only 35% in favour. For Democrats, the figure is 65%, up from 61% in 2012. The largest increase, by far, has occurred among independent voters, whose support has jumped by 12 percentage points – from 50% in 2012 to 62% now.

 

voters

 

Perhaps unsurprisingly, age makes a difference too, with younger people more likely to be in favour. Among those aged between 18 and 29, over two-thirds believe marijuana should be legal. This decreases to 45% for those 65 and older.

 

age ranges

 

In the last year, recreational use of pot became legal in two states – Colorado and Washington. Over 20 states allow marijuana use for medical purposes. A sizable percentage of Americans (38%) this year admitted to having tried the drug, which may be a contributing factor to its greater acceptance.

 

how many americans have tried pot

 

If these trends continue, and with Generation X playing a greater role in politics, there may come a tipping point in the not-too-distant future when marijuana is legal across many more states and – eventually – the entire nation. Taxing and regulating the drug could be financially beneficial, while freeing up police time, allowing officers to concentrate on more serious crimes.

 

marijuana usa map
Credit: Lokal_Profil (CC BY-SA 2.5)

 

 

 

 

26th October 2013

By 2017, New York City aims to have replaced all of its street lights with LEDs

Mayor Bloomberg and Transportation Commissioner Sadik-Khan have announced that all 250,000 street lights in New York City will be replaced with energy-efficient LEDs by 2017 – reducing power consumption and maintenance costs, while also lowering carbon emissions.

 

LED street lights

 

This will be the largest LED retrofit in the country. It is expected to deliver at least $14 million in savings annually, and will be another step towards the goal of reducing citywide greenhouse gas emissions by 30 percent by 2030.

Mayor Bloomberg: “With a quarter-million street lights in our City, upgrading to more energy efficient lights is a large and necessary feat. It will save taxpayers millions of dollars, move us closer to achieving our ambitious sustainability goals and help us to continue reducing City government’s day-to-day costs and improving its operations.”

“Using LEDs for street lighting is more than just a bright idea," said Commissioner Sadik-Khan. "It’s a necessity for sustainable cities to operate more efficiently while also delivering clearer, better quality light for New Yorkers. From our parks to our bridges and to our streets and highways, these 250,000 lights will brighten New York City’s streetscapes for generations to come.”

In 2009, the Department of Transportation (DOT) partnered with the Climate Group, while the U.S. Department of Energy (DOE) conducted studies to collect data on the performance of LED fixtures, as part of a global study to quantify the benefits for cities with LED lights. The tests measured factors such as illumination, colour and energy consumption. Replacement trials in Central Park and on Manhattan's FDR Drive showed energy savings of up to 83%.

These LEDs have also been found to minimise light pollution, improve public safety and make outdoor areas feel more welcoming at night. The bulbs can last up to 20 years before needing replacement, compared to standard high-pressure sodium lights currently on streets, which last for only six years. LEDs don't contain mercury or lead, and therefore won't release toxic gases if damaged. Yet another benefit is that LEDs are less attractive to nocturnal insects, which are instead drawn to ultraviolet, blue and green lights emitted by conventional light sources.

On a global scale, lighting accounts for nearly 6% of CO2 emissions – about 1,900 million tons each year, equivalent to the emissions from 70% of the world’s passenger vehicles. A considerable dent in our carbon footprint could be achieved simply by switching to more energy-efficient lamps.
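As a rough sanity check, the two figures quoted above – 1,900 million tons and a roughly 6% share – together imply a global emissions total. The sketch below derives it (the implied total is an inference, not a figure stated in the article):

```python
# Sanity check on the lighting figures quoted above: 1,900 million tons
# of CO2 from lighting is said to be nearly 6% of global emissions,
# which implies a global total of roughly 32 billion tons per year.

lighting_mt = 1_900     # million tons of CO2 from lighting, per year
share = 0.06            # lighting's approximate share of global emissions

implied_global_mt = lighting_mt / share
print(f"Implied global CO2 total: {implied_global_mt / 1000:.0f} billion tons/year")
# → Implied global CO2 total: 32 billion tons/year
```

That implied total is broadly consistent with published estimates of global CO2 emissions for the early 2010s, so the two quoted numbers hang together.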

This announcement comes as more and more cities have begun adopting LED lights. Last year, for example, Los Angeles completed its own massive project, retrofitting 141,000 street lights with LED bulbs. The LED lighting market is expected to grow rapidly from 2015 onwards – accounting for 64 percent of general lighting by 2020, with prices falling by 80%.

 

 

 

 

17th October 2013

Nearly 30 million people are living in slavery, according to latest estimate

Slavery was banned globally by the UN's Universal Declaration of Human Rights in 1948. However, it continues to be a significant problem today. A new Global Slavery Index – the first of its kind – estimates that 29.8 million people are living as slaves in 2013.

 

slave hands tied

 

There is evidence of slavery that predates written records. It is thought to have first proliferated after the invention of agriculture during the Neolithic Revolution about 11,000 years ago. In today's world, there are various types of slavery. These include human trafficking, forced labour, forced marriage, debt bondage and child soldiers. In all cases, the victims are treated as property to be bought and sold, exploited and denied the freedom which others take for granted.

The Walk Free Foundation is a new global charity based in Perth, Australia, with a stated aim of ending modern slavery within a generation. It hopes to achieve this by mobilising a global activist movement, generating high quality research, enlisting businesses and raising unprecedented levels of capital, to drive change in those countries and industries bearing the greatest responsibility for slavery.

As part of this effort, Walk Free has just released the Global Slavery Index 2013 which it claims is the most accurate and comprehensive assessment of slavery that has ever been published. This survey has statistics for 162 countries, based on detailed consultations with an international panel of experts, think tanks, and academic institutions. It ranks each country using a combined measure of three factors — prevalence of modern slavery by population, the number of child marriages and the amount of human trafficking in and out of a country. The ranking is heavily weighted to reflect the first factor, prevalence.
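The index's exact weighting scheme is not described here, but the general construction – a combined measure dominated by prevalence – can be sketched. In the example below, the three factors come from the article, while the specific weights and the sample figures are purely hypothetical, for illustration only:

```python
# A minimal sketch of a prevalence-weighted combined ranking along the
# lines described above. Weights and sample values are hypothetical.

WEIGHTS = {"prevalence": 0.7, "child_marriage": 0.15, "trafficking": 0.15}

def combined_score(prevalence, child_marriage, trafficking):
    """Weighted index score; each input is assumed normalised to a 0-1 scale."""
    return (WEIGHTS["prevalence"] * prevalence
            + WEIGHTS["child_marriage"] * child_marriage
            + WEIGHTS["trafficking"] * trafficking)

# Illustrative normalised inputs (not real index data):
countries = {
    "Country A": combined_score(0.9, 0.4, 0.3),   # high prevalence
    "Country B": combined_score(0.3, 0.8, 0.7),   # high on the other factors
}
ranked = sorted(countries, key=countries.get, reverse=True)
print(ranked)  # prevalence dominates, so Country A ranks first
```

Because prevalence carries most of the weight, a country with a high enslaved share of its population ranks above one that scores worse on the other two factors, which mirrors how Mauritania tops the published index.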

In terms of the percentage of its population that is enslaved, Mauritania in West Africa is ranked first. It has an estimated 150,000 slaves – about 4.47% of a total population of just 3,359,000. The Caribbean nation of Haiti is in second place (2.16%), with Pakistan ranked third (1.23%).

In terms of absolute numbers (as opposed to percentages), India has by far the most slaves. As shown in this graph, there are nearly 14 million, almost half of the worldwide total. China follows, with an estimated 2.9 million enslaved, and Pakistan comes in third with slightly over two million. The top 10 countries together account for more than 22 million of the 29.8 million people enslaved. If the total number of slaves today were represented by a country, it would be equivalent to the entire population of Malaysia. If represented by a U.S. state, it would be second only to California in population.

 

slave numbers by country

 

Iceland, Ireland and the United Kingdom are tied in last place, with a ranking of 160. This does not mean these countries are slavery-free. On the contrary, an estimated 4,200–4,600 people are in modern slavery in the UK alone.

Nick Grono, CEO of the Walk Free Foundation, says in a press release: "It would be comforting to think that slavery is a relic of history, but it remains a scar on humanity on every continent. This is the first slavery index, but it can already shape national and global efforts to root out modern slavery across the world. We now know that just ten countries are home to over three-quarters of those trapped in modern slavery. These nations must be the focus of global efforts."

Professor Kevin Bales, lead researcher on the Index: "Most governments don't dig deeply into slavery for a lot of bad reasons. There are exceptions, but many governments don't want to know about people who can't vote, who are hidden away, and are likely to be illegal anyway. The laws are in place, but the tools and resources and the political will are lacking. And since hidden slaves can't be counted it is easy to pretend they don't exist. The Index aims to change that."

The report is endorsed by individuals including former Secretary of State Hillary Clinton, former Prime Ministers Tony Blair, Gordon Brown and Julia Gillard; and leading philanthropists, Bill Gates, Sir Richard Branson and Mo Ibrahim, as well as academics, business leaders, and policy makers. It can be viewed at GlobalSlaveryIndex.org.

 

 

 

 

 

15th October 2013

US shutdown means one sad tale after another for scientists

By Alexis Webb, National Institute for Medical Research

 

capitol building washington

 

As a researcher funded by the US National Science Foundation (NSF), I've known since October 1 that I will not be receiving my monthly fellowship. This has meant that my work – investigating genes' role in vertebrate development – has been put on hold.

Like many other scientists faced with the US government shutdown, I was not told how to prepare for this, or if any contingency plans would be offered. Because of furloughs, officials at the NSF are unreachable for help. National parks, museums and research institutes are all shut, with scientists and staff sent home. This means suspension of many critical aspects of research.

Now I have to continue my work without financial support, living and working in London with no salary until funding is restored. Because I'm in the UK, my host laboratory does not rely on US grant money to operate, and I continue to have access to equipment and supplies. Other researchers have not been as lucky: 73% of employees at the National Institutes of Health (NIH) in Bethesda are unable to remain at the bench during the shutdown, stalling advances in biomedical research. The minimal staff allowed to stay are able to keep crucial materials like cell lines alive, but worries linger about what will happen when the limited supplies run out.

As the budget impasse enters its third week, the list of programs and facilities affected is growing. Researchers in health sciences worldwide have probably noticed the effects on critical tools such as PubMed, an online database of more than 23 million journal articles, currently being operated with minimal staff. Geologists and meteorologists must make do with limited resources, as only systems with a direct impact on health and safety remain accessible. If the shutdown isn't dealt with before October 18, astronomers will be unable to use telescopes at the National Optical Astronomy Observatory in Arizona to scan the skies.

Researchers submitting grants for funding find themselves in a holding pattern, uploading their proposals without acknowledgement. Alec Davidson, associate professor of neurobiology at Morehouse School of Medicine, said he is "not sure what will happen with the October 16 deadline for the R21 grant (a smaller, shorter-term grant than the standard R01)" and has received "no response from program officers" at the NIH.

 

scientist with microscope

 

The grant review process is also suspended until further notice. Those who have already completed early stages of submission are seeing their assessment and scoring dates being postponed. Even researchers who secured funding before October 1 are not immune to the effects of the shutdown and are unable to activate their grants to receive funds. Again, those who could address these concerns have been furloughed and are inaccessible.

Outside the lab, the world of science publishing is also beginning to feel the negative effects of the shutdown. Peer-reviewed journals rely on volunteer scientists to evaluate the quality of submitted research. Journal editors now find that they are unable to reach their reviewers, as non-essential researchers are not allowed to reply to emails or phone calls. With manuscripts unable to be reviewed or revised, there is a concern that science journals will face publishing delays. For the time being, it seems that there are enough papers and funded researchers that the system won't grind to a halt just yet.

The shutdown is having a real impact on scientists' lives. The entire US Antarctic research program was cancelled because of it. The story of chemical oceanographer Jamie Collins was widely covered in the media last week. A third-year graduate student, he had travelled from Boston to the Palmer Research Station in Antarctica, only to be told not to leave his ship. The field expedition he had been preparing for over the summer was cancelled because the NSF cannot support the research stations without a budget. He will be returning to Boston without collecting necessary data for his PhD project.

Graduate students like Collins, and postdoctoral researchers like me – the so-called workhorses of academic science – rely on government granting bodies to support our training. Our future in science often depends on the quality of research we generate during this critical time. If we are unable to perform experiments and collect the data we need to publish our work, how can we be expected to advance our careers? Talented young researchers are taking note of the value the US Congress places on science by failing to fund the government, and may look elsewhere. We can only hope the disheartening message of "Science – Closed Until Further Notice" will change before it's too late.

 

science

 

Alexis Webb does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations. This story is republished courtesy of The Conversation (under Creative Commons-Attribution/No derivatives).

 

 

 

 

9th October 2013

Delayed aging is better investment than cancer or heart disease research

Following the recent announcement from Google that the company's next startup, Calico, will tackle the science of aging, a new study concludes that research to delay aging and the infirmities of old age would have better social and economic benefits than advances in individual fatal diseases like cancer or heart disease.

 

anti aging pill

 

With even modest gains in our scientific understanding of how to slow the aging process, an additional 5 percent of adults in the U.S. over the age of 65 would be healthy rather than disabled in each year from 2030 to 2060, according to the study, published in the October issue of Health Affairs.

Put another way, an investment in delayed aging would mean 11.7 million more healthy adults over the age of 65 in 2060. The analysis, from top scientists at Harvard University, the University of Southern California, Columbia University, the University of Illinois at Chicago and other institutions, assumes research investment leading to a 1.25 percent reduction in the likelihood of age-related diseases. In contrast to treatments for fatal diseases, slowing aging would have no health returns initially, but would have significant benefits over the long term.

In the United States, the number of people aged 65 and over is expected to more than double in the next 50 years, from 43 million in 2010 to 106 million in 2060. About 28 percent of the current population over 65 is disabled.

 

americans aged over 65 graph 2010 2060

 

"In the last half-century, major life expectancy gains were driven by finding ways to reduce mortality from fatal diseases," said lead author Dana Goldman, holder of the Leonard D. Schaeffer Director's Chair at the USC Schaeffer Center for Health Policy and Economics. "But now disabled life expectancy is rising faster than total life expectancy, leaving the number of years that one can expect to live in good health unchanged or diminished. If we can age more slowly, we can delay the onset and progression of many disabling diseases simultaneously."

The study showed significantly lower and declining returns for continuing the current research "disease model," which seeks to treat fatal diseases independently, rather than tackling the shared, underlying cause of frailty and disability: aging itself.

Lowering the incidence of cancer by 25 percent in the next few decades — in line with the most favorable historical trends — would barely improve population health over not doing anything at all, the analysis showed. The same is true of heart disease, the leading cause of death worldwide: About the same number of older adults would be alive but disabled in 2060 whether we do nothing or continue to combat cancer and heart disease individually. The findings are in line with earlier research showing that curing cancer completely would only increase life expectancy by about three years.

"Even a marginal success in slowing aging is going to have a huge impact on health and quality of life. This is a fundamentally new approach to public health that would attack the underlying risk factors for all fatal and disabling diseases," said corresponding author S. Jay Olshansky of the School of Public Health at the University of Illinois-Chicago. "We need to begin the research now. We don't know which mechanisms are going to work to actually delay aging, and there are probably a variety of ways this could be accomplished, but we need to decide now that this is worth pursuing."

 

stages of aging

 

Several lines of scientific inquiry have already shown how we might age more slowly, including studies of the genetics of "centenarians" and other long-lived people. Slowing the signs of biological aging has also been achieved in animal models, using pharmaceuticals or interventions such as caloric restriction.

But until now, no assessment has been made of the costs and health returns on developing therapies for delayed aging.

"We would be affecting every generation," Olshansky said. "This study is a benchmark in the world of public health."

The study showed that, with major advances in cancer treatment or heart disease, a 51-year-old can expect to live about one more year. A modest improvement in delaying aging would double this to two additional years — and those years are much more likely to be spent in good health.

The increase in healthy years of life would have an economic benefit of approximately $7.1 trillion over the next five decades, the researchers find. Their analysis did not account for the potential cognitive benefits for older adults with research in delayed aging.

However, the results of the study also showed that improving the population of healthy, older adults will not lower overall health care spending. With research advances in delayed aging, more people would be alive past the age of 65, which means significantly higher outlays for Medicare and Medicaid despite less per-person spending on medical costs.

"Shifting the focus of medical investment to delayed aging instead of targeting diseases individually would lead to significant gains in physical health and social engagement," Goldman said. "We see extremely large population health benefits, and the benefits will extend to future generations. There are major fiscal challenges, but these are manageable with reasonable policy changes, and the economic value of such a shift is too large to ignore."

 

 

 

 

8th October 2013

First commercial-scale carbon capture and mineralisation plant in the U.S. begins construction

Skyonic Corporation has hosted a groundbreaking event at its Capitol SkyMine plant in San Antonio. The plant, which is the first of its kind in the United States, is expected to capture 300,000 tons of CO2 annually.

 

carbon capture and mineralisation plant

 

Although minuscule when compared to the nation's total CO2 output (which currently stands at 5.5 billion tons annually), Skyonic's new plant is a vital first step on the long road to decarbonisation. Once fully operational in 2014, it will capture around 75,000 tons directly, with a further 225,000 tons offset by the production of green products. The plant is expected to turn a profit within three years from the sale of products including sodium bicarbonate, hydrochloric acid (HCl) and bleach.

"The beginning of construction is a major milestone on the road to commercialisation," said Greg Hale, the President of Capitol Aggregates. "When Skyonic began operating its demonstration plant at Capitol Aggregates several years ago, we were excited about the prospect of producing more sustainable cement. Now that the project has reached a commercial scale, we couldn't be happier to have such a revolutionary process at our cement factory."

Skyonic has operated a demonstration-scale plant at the Capitol Aggregates site since 2010, with on-going support from the San Antonio community. The commercial-scale Capitol SkyMine plant will employ roughly 35 people and is expected to create more than 200 jobs through its construction and on-going operations.

"I applaud the Zachry Corporation and Skyonic for setting the standard with the first commercial carbon capture plant of its kind," Mayor Julián Castro said. "This project is another example of how San Antonio is becoming a leader in combining green technology and job creation."

Skyonic's electrolytic carbon capture technology – SkyMine – will selectively capture CO2, acid gases and heavy metals from the flue gas of the Capitol Aggregates Cement Plant, to which the Capitol SkyMine plant will be retrofitted. The captured pollutants will be mineralised into products which are stored, transported and sold as safe, stable solids – eliminating many of the costs and concerns associated with other capture technologies. The sodium bicarbonate, as well as the hydrochloric acid and bleach that is produced, can be sold at a profit. By producing valuable products using low-cost chemical inputs and operating at energy-efficient conditions, SkyMine captures CO2 at a substantially lower operating cost than other technologies, allowing industrial emitters to turn a profit from reduced emissions.

"Industrial manufacturing is a cornerstone of the global economy and we're doing our part in making the process more lucrative for industries and cleaner for the environment," said Joe Jones, founder and CEO of Skyonic. "Our partners and investors have played an important role in getting to this commercialisation stage – and we're all looking forward to starting construction and making our first plant a stand out success."

 

 

 

 

5th October 2013

World's most fuel-efficient car makes U.S. debut

The Volkswagen XL1 – the most aerodynamic and fuel-efficient car ever built – made its U.S. debut this week at the 23rd Annual Society of Environmental Journalists (SEJ) Conference held in Chattanooga, Tennessee.

 

XL1

 

The XL1 offers a European combined fuel consumption rating of 261 miles per gallon (more than 200 mpg estimated on the U.S. cycle). By way of comparison, the U.S. average for new passenger vehicles is currently around 32 mpg and is forecast to reach 54 mpg by 2025. The XL1 can travel 32 miles in all-electric mode as a zero-emissions car, has a top speed of 99 mph, and accelerates from 0 to 62 mph in 12.7 seconds.

“The XL1 offers a glimpse into Volkswagen’s present and future eco-mobility capabilities, and highlights the ultimate successes of ‘Thinking Blue’,” said Oliver Schmidt, General Manager of the Engineering and Environmental Office (EEO), Volkswagen Group of America. “Volkswagen is proud to debut this ultra-fuel-efficient vehicle before the Society of Environmental Journalists, a group that shares in our commitment to environmental stewardship.”

The XL1 follows pure sports-car design principles: light weight (1,753 pounds), exceptional aerodynamics (Cd 0.19) and a low centre of gravity. This super-efficient vehicle can cruise at a constant 62 mph while using just 8.4 PS (6.2 kW) of power. In all-electric mode, it requires less than 0.1 kWh to cover more than a kilometre.

The car emits just 21 g/km of CO2, thanks to its high-tech lightweight design, aerodynamic efficiency, and a plug-in hybrid system consisting of a 48 PS (35 kW) two-cylinder TDI engine, a 27-hp electric motor, a seven-speed DSG dual-clutch automatic transmission and a lithium-ion battery.

 

Click to enlarge

XL1 schematic

 

Conceptually, the XL1 is the third evolutionary stage of Volkswagen’s 1-litre car strategy. When the new millennium was ushered in, Professor Ferdinand Piëch – now Chairman of the Supervisory Board of Volkswagen – formulated the visionary goal of putting into production a practical car with combined fuel consumption of one litre per 100 km (235 mpg). In the two-seat XL1, this vision has become a reality.
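The "1-litre car" goal and the XL1's 261 mpg rating are two points on the same conversion between litres per 100 km and US miles per gallon. A small sketch of that conversion (the 0.9 l/100 km input is an assumption: it is the figure usually quoted alongside the XL1's 261 mpg European rating):

```python
# Conversion between the European fuel-consumption measure (litres per
# 100 km) and US miles per gallon, using exact unit definitions.

KM_PER_MILE = 1.609344
LITRES_PER_US_GALLON = 3.785411784

def l_per_100km_to_mpg(litres_per_100km):
    """US miles per gallon for a given consumption in l/100 km."""
    km_per_litre = 100 / litres_per_100km
    miles_per_litre = km_per_litre / KM_PER_MILE
    return miles_per_litre * LITRES_PER_US_GALLON

print(f"{l_per_100km_to_mpg(1.0):.0f} mpg")   # the 1 l/100 km goal → 235 mpg
print(f"{l_per_100km_to_mpg(0.9):.0f} mpg")   # assumed 0.9 l/100 km → 261 mpg
```

Both outputs match the figures quoted in the article, confirming that the "one litre per 100 km" vision and the 261 mpg rating describe essentially the same achievement.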

Despite the tremendous efficiency of the car, its engineers and designers successfully came up with a body design which delivers more everyday utility than the two previous prototypes. In the L1, shown in 2002 and 2009, driver and passenger sat in a "tandem" arrangement for optimal aerodynamics; in the XL1, the two occupants sit slightly offset, side by side, almost like a conventional vehicle.

The XL1 is 153.1 inches long, 65.6 inches wide, and just 45.4 inches tall. By comparison, a Volkswagen Polo is slightly longer (156.3 in) and wider (66.2 in), but is significantly taller (57.6 in). Even a purebred sports car like today’s Porsche Boxster is 5.1 inches taller. The XL1 will look spectacular going down the highway – a car of the future, built for today.

This technology comes at a price, of course. The XL1 will be sold at $146,000 with only 250 being produced.

 


 

 

 

 

19th September 2013

Nearly half of US jobs could be at risk of computerisation within 20 years

A study by the Oxford Martin School shows that nearly half of US jobs could be at risk of computerisation within 20 years. Transport, logistics and office roles are most likely to come under threat.

 

automation

 

The new study, a collaboration between Dr Carl Benedikt Frey (Oxford Martin School) and Dr Michael A. Osborne (Department of Engineering Science, University of Oxford), found that jobs in transportation, logistics, as well as office and administrative support, are at "high risk" of automation. More surprisingly, occupations within the service industry are also highly susceptible, despite recent job growth in this sector.

"We identified several key bottlenecks currently preventing occupations being automated," says Dr. Osborne. "As big data helps to overcome these obstacles, a great number of jobs will be put at risk."

The study examined over 700 detailed occupation types, noting the types of tasks workers perform and the skills required. By weighting these factors, as well as the engineering obstacles currently preventing computerisation, the researchers assessed the degree to which these occupations may be automated in the coming decades.
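The study itself fits a statistical classifier over detailed occupational data, but the underlying weighting idea can be illustrated with a toy calculation. In the sketch below, the occupations, ratings and weights are invented for the example and are not the study's figures; the only borrowed concept is that reliance on the three "bottleneck" skill areas (perception and manipulation, creative intelligence, social intelligence) protects an occupation from automation:

```python
# Toy illustration of the bottleneck-weighting idea: each occupation is
# rated 0-1 on how much it relies on each bottleneck skill area, and
# heavier reliance lowers the estimated automation risk.
# All numbers here are invented for the example.

WEIGHTS = {"manual": 0.3, "creative": 0.4, "social": 0.3}

occupations = {
    "telemarketer":       {"manual": 0.1, "creative": 0.1, "social": 0.3},
    "truck driver":       {"manual": 0.6, "creative": 0.1, "social": 0.2},
    "research scientist": {"manual": 0.2, "creative": 0.9, "social": 0.6},
}

def automation_risk(skills):
    """Higher reliance on bottleneck skills -> lower estimated risk."""
    protection = sum(WEIGHTS[k] * v for k, v in skills.items())
    return round(1.0 - protection, 3)

for job, skills in occupations.items():
    print(f"{job}: {automation_risk(skills)}")
```

Under this toy weighting, the rule-bound occupation scores a far higher risk than the creative one, which is the qualitative pattern the study reports.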

 

automation graph

 

"Our findings imply that as technology races ahead, low-skilled workers will move to tasks that are not susceptible to computerisation – i.e., tasks that require creative and social intelligence," the paper states. "For workers to win the race, however, they will have to acquire creative and social skills."

"While computerisation has been historically confined to routine tasks involving explicit rule-based activities, algorithms for big data are now rapidly entering domains reliant upon pattern recognition and can readily substitute for labour in a wide range of non-routine cognitive tasks. In addition, advanced robots are gaining enhanced senses and dexterity, allowing them to perform a broader scope of manual tasks. This is likely to change the nature of work across industries and occupations."

The low susceptibility of engineering and science occupations to computerisation, on the other hand, is largely due to the high degree of creative intelligence they require. However, even these occupations could be taken over by computers in the longer term.

Dr Frey said the United Kingdom is expected to face a similar challenge to the US. "While our analysis was based on detailed datasets relating to US occupations, the implications are likely to extend to employment in the UK and other developed countries," he said.

Full version of the paper:
http://www.futuretech.ox.ac.uk/files/The_Future_of_Employment_OMS_Working_Paper_1.pdf

 

automation graph

 

 

 

 

18th September 2013

Google launches new company with aim to defeat aging

Google today announced Calico, a new company that will focus on health and well-being – in particular the challenge of aging and associated diseases. Arthur D. Levinson, Chairman and former CEO of Genentech and Chairman of Apple, will be Chief Executive Officer and a founding investor.

 

google

 

Announcing this new investment, Larry Page, Google CEO, said: “Illness and aging affect all our families. With longer term, moonshot thinking around healthcare and biotechnology, I believe we can improve millions of lives. It’s impossible to imagine anyone better than Art — one of the leading scientists, entrepreneurs and CEOs of our generation — to take this new venture forward.”

“I’ve devoted much of my life to science and technology, with the goal of improving human health," Levinson commented. "Larry’s focus on outsized improvements has inspired me, and I’m tremendously excited about what’s next.”

Art Levinson will remain Chairman of Genentech and a director of Hoffmann-La Roche, as well as Chairman of Apple. Commenting on Art’s new role, Franz Humer, Chairman of Hoffmann-La Roche, said: “Art’s track record at Genentech has been exemplary, and we see an interesting potential for our companies to work together going forward. We’re delighted he’ll stay on our board.”

Tim Cook, Chief Executive Officer of Apple, said: “For too many of our friends and family, life has been cut short or the quality of their life is too often lacking. Art is one of the crazy ones who thinks it doesn’t have to be this way. There is no one better suited to lead this mission and I am excited to see the results.”

 

 

 

 

16th September 2013

Europe's largest tidal energy project given go-ahead

MeyGen has been awarded consent by the Scottish Government for an 86 megawatt (MW) tidal energy project, following the completion of the statutory approval process with the regulator Marine Scotland.

 

meygen tidal energy

 

The project is located in the Inner Sound of the Pentland Firth off the north coast of Caithness, home to one of Europe’s greatest tidal resources. It is the largest tidal stream energy project to be awarded consent in Europe and constitutes the first phase of a site that could yield almost 400MW by 2020.

MeyGen plans to build an initial demonstration array of up to six turbines, with construction starting in early 2014 and turbines commissioned in 2015. This initial array will provide valuable environmental data for the subsequent phases and the wider tidal energy industry. When fully operational, the complete project could generate enough electricity to power around 40% of homes in the Scottish Highlands.
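The "homes powered" figure can be sanity-checked with a back-of-envelope calculation. The capacity factor and per-home consumption below are illustrative assumptions, not MeyGen's own numbers:

```python
# Rough check: how many homes could an 86 MW tidal array supply?
# Assumptions (illustrative, not from MeyGen): ~35% capacity factor for
# tidal stream turbines and average UK household use of ~4,000 kWh/year.

capacity_mw = 86
capacity_factor = 0.35
kwh_per_home_per_year = 4000

# Average output in kW, times hours in a year, gives annual energy in kWh.
annual_output_kwh = capacity_mw * 1000 * capacity_factor * 8760
homes_powered = annual_output_kwh / kwh_per_home_per_year
print(f"~{homes_powered:,.0f} homes")
```

Under these assumptions the array supplies on the order of 65,000 homes a year, which is broadly consistent with a sizeable fraction of Highland households.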

 

scotland map

 

Ed Rollings, Environment & Consents Manager of MeyGen, commented: “The award of this consent is the culmination of over four years of environmental work and extensive consultation with stakeholders and the local community.

“The Pentland Firth and Orkney Waters region is an internationally important area for wildlife and we are committed to continuing research with interested parties to ensure that the exploitation of this clean, predictable and sustainable energy resource is done so in a manner that does not have a detrimental effect on the species and habitats in the area.”

Fergus Ewing, the Scottish energy minister, added: "We must tackle climate change. We need to reduce our reliance on fossil fuels through better and more efficient uses of energy. Marine energy – a homegrown technology with huge potential – is part of the solution."

Another huge marine project – the world's largest wave farm – was recently approved by the Scottish government and will be constructed on the other side of the country. First Minister Alex Salmond has set the ambitious goal of generating 100% of electricity from renewables by 2020.

Seawater is 832 times denser than air, so a 5-knot ocean current delivers about as much power per square metre as a wind of more than 50 mph. Ocean currents therefore have extremely high energy density, and capturing a given amount of power requires much smaller devices than wind turbines. Since oceans cover 70% of Earth’s surface, ocean energy (including wave power, tidal current power and ocean thermal energy conversion) is a vast untapped resource, estimated at between 2,000 and 4,000 TWh per year. The potential of marine energy is being explored by a number of other countries, including the USA, which last year granted a licence for the nation's first wave power station.
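How water's density advantage translates into usable power can be checked with the standard kinetic power density formula, ½ρv³, using typical density values (seawater ≈ 1025 kg/m³, air ≈ 1.225 kg/m³):

```python
# Compare the power available per square metre of a slow ocean current
# with that of wind, using the kinetic power density 0.5 * rho * v^3.

RHO_SEAWATER = 1025.0   # kg/m^3, typical value
RHO_AIR = 1.225         # kg/m^3, at sea level

def power_density(rho, v):
    """Kinetic power per unit area (W/m^2) of a fluid moving at v m/s."""
    return 0.5 * rho * v**3

KNOTS_TO_MS = 0.5144
current_w_per_m2 = power_density(RHO_SEAWATER, 5 * KNOTS_TO_MS)

# Wind speed delivering the same power density: v_air = v_water * (rho_w/rho_a)^(1/3)
equivalent_wind_ms = 5 * KNOTS_TO_MS * (RHO_SEAWATER / RHO_AIR) ** (1 / 3)
equivalent_wind_mph = equivalent_wind_ms * 2.237

print(f"5-knot current: {current_w_per_m2:,.0f} W/m^2")
print(f"equivalent wind: {equivalent_wind_mph:.0f} mph")
```

A 5-knot current works out to several kilowatts per square metre, matched only by a gale-force wind in the mid-50s mph range, which is why tidal turbines can be far smaller than wind turbines of the same rating.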

 

turbine schematic

 

 

 

 
     
       
     
   