Blog » January 2014

 
     
 

31st January 2014

A simple new way to induce pluripotency: acid

Japanese researchers have converted adult cells from mice into stem cells by exposing them to acid. This could pave the way for routine use of stem cells in medicine with a technique that is cheaper, faster and more efficient than before.

 

 

An unusual reprogramming phenomenon by which the fate of somatic cells can be drastically altered through changes to the external environment is described in two new articles.

Postnatal somatic cells committed to a specific lineage are shown to be converted into a pluripotent state (capable of differentiating into nearly all cell types) when exposed to an environmental stress, in this case short exposure to low pH. This reprogramming process does not need nuclear manipulation or the introduction of transcription factors – thought to be necessary to induce pluripotency – so the work may have important implications for regenerative medicine.

Reprogramming in response to environmental stress has been observed in plants, whereby mature cells can become immature cells capable of forming a whole new plant structure, including roots and stalks. Whether animal cells have a similar potential has been a challenging question, but one that Haruko Obokata and co-authors have addressed. They demonstrate that mammalian somatic cells can be reprogrammed when stressed by low-pH conditions, and have named this phenomenon Stimulus-Triggered Acquisition of Pluripotency (STAP).

So-called STAP cells have some characteristics that resemble embryonic stem cells, but the STAP cells only have a limited capacity for self-renewal. In a second paper, Obokata and colleagues investigate the nature of STAP cells and suggest that they represent a unique state of pluripotency. The researchers also demonstrate that under pluripotent stem-cell culture conditions, STAP cells can be transformed into robustly self-renewing stem cells, similar to embryonic stem cells.

Together, these findings reveal that cells in the body have the potential to become pluripotent and provide new insights into the diverse cellular states.

Professor Chris Mason, an expert in regenerative medicine at University College London, said: "If it works in man, this could be the game changer that ultimately makes a wide range of cell therapies available using the patient's own cells as starting material – the age of personalised medicine would have finally arrived.

"Who would have thought that to reprogram adult cells to an embryonic stem cell-like (pluripotent) state just required a small amount of acid for less than half an hour? An incredible discovery."

 

STAP cells generated an entire fetal body. Credit: Haruko Obokata.

 


 

 

30th January 2014

Leech can survive being frozen for long periods

Japanese researchers have discovered a species of leech able to survive for 24 hours at -196°C (-321°F) and for 32 months at -90°C (-130°F). This finding could provide insights into cryopreservation for humans. It also broadens the range of environments in which life might survive elsewhere in the universe.

 

Credit: Dai Suzuki et al / PLOS ONE

 

Pictured above is Ozobranchus jantseanus, a parasitic leech that feeds on the blood of freshwater turtles in East Asia. These creatures typically reach up to 15 mm (0.6") in length, attaching themselves externally and relying on their hosts for all of their life stages, from egg to adult.

A team from Tokyo University of Marine Science and Technology – working on a separate project – had pulled a frozen, dead turtle out of storage, when they noticed several of these parasites clinging to the animal. The leeches appeared to be still alive, wriggling as they began to warm up. The researchers knew they had found something highly unusual, as it was rare for any organism to withstand such low temperatures. A new study was undertaken to investigate this further and to determine the full extent of their cold tolerance.

Prior to the experiment, only two other animals were known to survive in liquid nitrogen – a microscopic invertebrate known as the water bear (tardigrade) and the larvae of a drosophilid fly – with maximum recorded submersion times of 15 minutes and one hour, respectively.

 


 

Cryoexposure tests were conducted on seven different species of leech, at -90°C and -196°C, for a period of 24 hours. Ozobranchus jantseanus was the only one to survive both temperatures. In fact, it could withstand multiple freezes and repeated freeze-thaw cycles, without needing time to acclimatise.

An even more amazing discovery was to follow, however. Every single leech from this species placed at -90°C (-130°F) survived at least nine months, with some lasting in storage for 32 months, or more than 2.5 years. This is colder than the average temperature on Mars.
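The temperature figures quoted above are easy to sanity-check with a simple unit conversion. A minimal sketch – the Mars comparison uses an approximate literature value of about -60°C for the planet's mean surface temperature, which is not a figure from the study:

```python
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

# Figures from the study
print(round(c_to_f(-196), 1))   # -320.8, i.e. the -321 F quoted above
print(round(c_to_f(-90), 1))    # -130.0

# Mean surface temperature of Mars is roughly -60 C (approximate literature value)
print(c_to_f(-90) < c_to_f(-60))   # True: the storage test was colder than average Mars
```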

At present, it is unknown how these creatures are managing to cope with such an extreme environment. "It is likely that this cryotolerant ability has arisen in response to some as yet unclarified adaptation," the researchers state. "It is hoped that these findings will contribute to the development of new cryopreservation methods that do not require additives, and also to the resuscitation of organisms that have been frozen underground in permafrost areas, on Antarctica, and possibly on other planets."

Their work is published in PLOS ONE.

 


 

 

27th January 2014

World's first multi-material, full-colour 3D printer

Stratasys Ltd. has announced the launch of its ground-breaking Objet500 Connex3, the first and only machine to combine colours with multi-material 3D printing.

 


 

A game-changer for product design, engineering and manufacturing processes, the Objet500 Connex3 features a unique triple-jetting technology. This combines droplets of three base materials to produce parts with virtually unlimited combinations of rigid, flexible, and transparent colour materials as well as colour digital materials – all in a single print run.

This ability to achieve the characteristics of an assembled part without assembly or painting is a significant time-saver, helping manufacturers to validate designs and make decisions earlier before committing to manufacturing, and bringing products to market 50% faster.

"Stratasys' goal is to help our customers revolutionise their design and manufacturing processes," says Stratasys CEO, David Reis. "I believe our new Objet500 Connex3 Colour Multi-material 3D Printer will transform the way our customers design, engineer and manufacture new products. In general and with Connex technology in particular, we will continue to push the envelope of what's possible in a 3D world."

 


 

 

Similar to a 2D inkjet printer, three colour materials – VeroCyan, VeroMagenta and VeroYellow – are combined to produce hundreds of vivid colours. These colour materials join Stratasys' extensive range of PolyJet photopolymer materials including digital materials, rigid, rubber-like, transparent, and high temperature materials to simulate both standard and higher temperature engineering plastics.
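As with a 2D inkjet, intermediate colours come from subtractive mixing of the cyan, magenta and yellow bases. The idea can be sketched with the textbook RGB–CMY relationship – an illustrative simplification, not Stratasys' actual colour pipeline:

```python
def rgb_to_cmy(r, g, b):
    """Subtractive mixing: each CMY channel absorbs its complementary RGB
    channel, so the required 'ink' fractions are 1 - RGB (on a 0..1 scale)."""
    return (1 - r, 1 - g, 1 - b)

def cmy_to_rgb(c, m, y):
    """Inverse of the above: the colour that remains after absorption."""
    return (1 - c, 1 - m, 1 - y)

# Target: orange (RGB 1.0, 0.5, 0.0) -> no cyan, half magenta, full yellow
c, m, y = rgb_to_cmy(1.0, 0.5, 0.0)
print(c, m, y)                               # 0.0 0.5 1.0
assert cmy_to_rgb(c, m, y) == (1.0, 0.5, 0.0)
```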

The Objet500 Connex3 also features six palettes for new rubber-like Tango colours, ranging from opaque to transparent in various Shore hardness values, to address markets such as automotive, consumer and sporting goods, and fashion.

Stratasys VP of product marketing and sales operations, Igal Zeitun, said: "As the first true multi-purpose 3D printer, we believe the Objet500 Connex3 is in a league of its own – enabling you to dream up a product in the morning, and hold it in your hands by the afternoon, with the exact intended colour, material properties and surface finish."

Duncan Wood, publisher of 3D printing magazine TCT, told the BBC: "This is groundbreaking stuff. Being able to produce single products incorporating materials of different rigidity and colour has been the holy grail of 3D printing to date. This is industrial-grade technology that will afford designers a level of creativity they've never had before."

 

 


 

 

25th January 2014

Global warming continues: 2013 was fourth hottest year on record

The average combined land and ocean surface temperature for January–December 2013 was tied as the fourth warmest such period on record, at 0.62°C (1.12°F) above the 20th century average.

 


 

The latest summary of global temperature released by the National Oceanic and Atmospheric Administration (NOAA) concludes that warmer-than-average temperatures affected the vast majority of the globe during 2013. Record warmth was observed across much of southern and western Australia, southwestern Ethiopia, eastern Tanzania, parts of central Asia around Kazakhstan and Uzbekistan, a large section of the southwestern Pacific Ocean, along with regions of the Arctic, central Pacific, and central Indian Oceans.

Temperatures were cooler-than-average across the central United States – a region that saw record warmth in 2012 – along with small sections of the eastern Pacific Ocean and the Southern Ocean off the tip of South America. No record coldest regions were observed for the January–December 2013 period, as shown in the map below.

Globally, 2010 remains the hottest year recorded by NOAA at 0.66°C (1.2°F) above the 20th century average, with 2005 and 1998 in second and third place, respectively. Including 2013, all 13 years of the 21st century (2001-2013) rank among the 15 warmest in the 134-year observational record. Viewed over a longer timescale, the trend is even more obvious. Last year's high temperatures occurred even without El Niño, suggesting that a new record may soon be reached and casting doubt on recent claims of a "pause" in warming.

 

Map: global land and ocean temperature percentiles, January–December 2013.

 

Despite the overwhelming evidence, the near-unanimous agreement from climate experts, and growing number of disasters affecting the world, much of the public still believes that a controversy exists in the scientific community and/or experts are distorting the truth. Gallup polls show that 40% of U.S. adults view global warming as exaggerated, with a similar number thinking natural causes are to blame.

In fact, the evidence for climate change (a term used since at least 1955) and humanity's contribution to it has become stronger than ever. Study after study confirms that human industrial activity is clearly and by far the dominant factor driving the recent changes in our atmosphere:

 

Credit: SkepticalScience (CC BY 3.0)

 

We have known since the 19th century that CO2 is a greenhouse gas – trapping heat in ways that can be demonstrated with even simple experiments. By analysing the ratio of carbon isotopes, we can easily determine what proportion is natural and what proportion is man-made. From this, we know that our carbon emissions have been absolutely colossal when measured on a geologic timescale, with changes now happening 10 times faster than any period since the Cretaceous–Paleogene extinction event of 65 million years ago. We know that the so-called Medieval warm period, while unusually warm in some regions like the North Atlantic, was much cooler than today on a global basis. We know that solar activity is not a cause of recent warming and new research indicates that climate sensitivity to CO2 input has been underestimated. We know from simple experiments that even a small increase in parts per million can have an obvious impact.
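The carbon-isotope argument is, at heart, a two-source mixing calculation: fossil carbon is depleted in ¹³C relative to the atmosphere, so the shift in atmospheric δ¹³C reveals the fossil contribution. A rough sketch – the per-mil values below are ballpark textbook figures, not measurements from any one study, and the naive mass balance ignores exchange with the oceans and biosphere:

```python
def fossil_fraction(delta_obs, delta_natural, delta_fossil):
    """Two-source isotopic mass balance:
       delta_obs = f * delta_fossil + (1 - f) * delta_natural
    solved for f, the fraction of carbon attributable to the fossil source."""
    return (delta_obs - delta_natural) / (delta_fossil - delta_natural)

# Approximate per-mil delta-13C values (illustrative ballpark figures):
# pre-industrial air ~ -6.5, modern air ~ -8.5, fossil fuel carbon ~ -28
f = fossil_fraction(-8.5, -6.5, -28.0)
print(round(f, 3))   # ~0.093: roughly 9% of atmospheric carbon traced to fossil sources
```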

There is abundant evidence of current impacts in the form of shrinking glaciers (including the Glacier National Park), larger and more damaging wildfires, ocean acidification and deoxygenation, loss of coral reefs, fish migrations, bark beetle and other pest movements, rising sea levels, coastal erosion, extremes in flooding and drought, along with more frequent heat waves. We know that Arctic sea ice is melting 50 years ahead of earlier forecasts and that ice loss in the region is far greater than the relatively small gain in the Antarctic. We know that vast areas of carbon-absorbing forests have been cut down over the centuries and particularly during the last decade – in order to make way for our sprawling cities and their carbon-spouting automobiles – not to mention thousands of planes in the skies overhead – all of which have appeared on this planet in the blink of an eye, geologically speaking. Any supposed "benefits" to plants from extra CO2 will be offset by the negative effects from drought, weeds and higher temperatures. There are tens of millions of people around the world already being affected by this panoply of converging impacts. Recent ventures into unconventional fossil fuels are the stuff of nightmares.

We have the world's most powerful supercomputers, making trillions of calculations per second for months on end, running state-of-the-art simulations with fantastic levels of detail. Contrary to what some would claim, these models have proven remarkably successful, correctly predicting:

• That our land, atmosphere and oceans would warm.
• That the troposphere would warm and the stratosphere would cool.
• That nighttime average temperatures would increase more than daytime average temperatures.
• That winter average temperatures would increase more than summer average temperatures.
• That polar amplification would lead to greater temperature increases nearer the poles.
• That the Arctic would warm faster than the Antarctic.
• The magnitude (0.3 K) and duration (two years) of the cooling from the Mt. Pinatubo eruption.
• The amount of water vapour feedback due to ENSO.
• The response of southern ocean winds to the ozone hole.
• The expansion of the Hadley cells.
• The poleward movement of storm tracks.
• The rising of the tropopause and the effective radiating altitude.
• The clear sky super greenhouse effect from increased water vapour in the tropics.
• The near constancy of relative humidity on global average.
• That coastal upwelling of ocean water would increase.
• A retrodiction of Last Glacial Maximum sea surface temperatures that was initially inconsistent with the paleo evidence – until better paleo evidence subsequently showed the models had been right.
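A flavour of the physics inside such models can be captured with a well-known simplified expression: the direct radiative forcing from CO2 grows with the logarithm of its concentration, ΔF ≈ 5.35 ln(C/C₀) W/m². A sketch using that approximation – the climate sensitivity parameter applied at the end is an illustrative mid-range assumption, not a settled constant:

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Simplified radiative forcing from CO2 (logarithmic approximation),
    relative to a pre-industrial baseline of ~280 ppm. Returns W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Doubling CO2 gives the canonical ~3.7 W/m^2 of forcing
print(round(co2_forcing(560), 2))        # 3.71

# Equilibrium warming ~= lambda * forcing; lambda ~0.8 K per W/m^2 is an
# illustrative mid-range sensitivity assumption
lam = 0.8
print(round(lam * co2_forcing(560), 1))  # ~3.0 K per doubling
```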

And yet, even without these computer models, there is clear evidence of climate change and our influence on it. Decades of peer-reviewed studies in the world's top scientific journals have confirmed this reality; just as they confirmed the reality of evolution, our planet's geologic history, the germ theory of disease, links between smoking and cancer, depletion of the ozone layer by CFCs, along with countless other biological, chemical and physical processes. The science can never be perfect and there will always be gaps, but today no scientific body of national or international standing disputes the fundamental points.

There are, of course, a small number of individual climate scientists who claim to be sceptical. In almost every case, however, they either have ties to fossil fuel interests, or their work has never been peer-reviewed and published in a respected journal. It is worth noting that individuals like Christopher Monckton are not climate scientists and are unqualified in the field. A documentary, The Great Global Warming Swindle, has been savaged by climatologists for its cherry-picking, inaccuracies and misleading claims. Many arguments continue to be made by sceptics (such as the 1970s cooling myth), but none stand up to scrutiny. The science behind climate change is robust and has withstood almost everything thrown at it – including the recent "Climategate" affair, with multiple independent inquiries finding no evidence of fraud or scientific misconduct.

Given all of the above, the risks of inaction – and the obvious benefits of clean technology – how can people be so eager to embrace fossil fuels, so confident in their scepticism, and willing to take such a gamble on their children's future? Even the conservative U.S. military now takes the issue seriously and is preparing for the impacts. If climate scientists are in it for the money, they're doing it wrong.

Global warming is the biggest story of our time, a result of our explosive growth in population and technology. It will define the 21st century and possibly many centuries to come. Ignoring the evidence and casually dismissing what decades of peer-reviewed science have told us would be a mistake of truly monumental proportions.

 


 

 

24th January 2014

A quarter of all ray and shark species face imminent extinction

A quarter of the world's cartilaginous fish – namely sharks and rays – face extinction within a few decades, according to the first study to systematically and globally assess their fate.

 


 

The International Union for Conservation of Nature's (IUCN's) Shark Specialist Group (SSG), co-chaired by Nick Dulvy, conducted the study, which was published in the journal eLife this week.

Previous studies have documented local overfishing of some populations of sharks and rays. This new survey is the first to assess their status throughout coastal seas and oceans. It reveals that globally, one-quarter (249) of 1,041 known shark, ray and chimaera species fall under threatened categories on the IUCN Red List.

"We now know that many species of sharks and rays – not just the charismatic white sharks – face extinction across the ice-free seas of the world," says Dulvy. "There are no real sanctuaries for sharks where they are safe from overfishing."

Over two decades, the authors applied the IUCN's Red List categories and criteria to the 1,041 species at 17 workshops involving more than 300 experts. They incorporated all available information on distribution, catch, abundance, population trends, habitat use, life histories, threats and conservation measures.

Sharks and rays are at substantially higher risk of extinction than many other animals and have the lowest percentage of species considered safe. Using the IUCN Red List, the authors classified 107 species of rays (including skates) and 74 species of sharks as threatened. Just 23 percent of species were labeled as being Least Concern.

Major hotspots for shark and ray depletion identified in the study were the Indo-Pacific (particularly the Gulf of Thailand), the Red Sea and the Mediterranean Sea.

 


 

"In the most peril are the largest species of rays and sharks, especially those living in relatively shallow water that is accessible to fisheries," says Dulvy. "The combined effects of overexploitation – especially for the lucrative shark fin soup market – and habitat degradation are most severe for the 90 species found in freshwater.

"A whole bunch of wildly charismatic species is at risk. Rays, including the majestic manta and devil rays, are generally worse off than sharks. Unless binding commitments to protect these fish are made now, there is a real risk that our grandchildren won't see sharks and rays in the wild."

Losing these fish will be like losing whole chapters of our evolutionary history, says Dulvy. "They are the only living representatives of the first lineage to have jaws, brains, placentas and the modern immune system of vertebrates."

The potential loss of the largest species is frightening for many reasons, he adds. "The biggest species tend to have the greatest predatory role. The loss of top or apex predators cascades throughout marine ecosystems."

The IUCN SSG is calling on governments to safeguard sharks, rays and chimaeras through a variety of measures, including the following: prohibition on catching the most threatened species, science-based fisheries quotas, protection of key habitats and improved enforcement.

 

Sharks' fin on the menu of a restaurant in Singapore. Credit: ProjectManhattan

 


 

 

24th January 2014

New technique improves nano-scale images

When capturing images at the atomic scale, even tiny movements of the sample can result in skewed or distorted images – and those movements are virtually impossible to prevent. Now microscopy researchers at North Carolina State University have developed a new technique that accounts for that movement and eliminates the distortion from the finished product.

 


 

At issue are scanning transmission electron microscopes (TEMs), which can obtain images of a material's individual atoms. To take those images, scientists scan a probe across the sample – an area of less than 25 square nanometres. That scanning can take tens of seconds.

The sample rests on a support rod, and while the scanning takes place, the rod expands or contracts due to subtle changes in ambient temperature. The rod’s expansion or contraction is imperceptible to the naked eye, but because the sample area is measured in nanometres the rod’s movement causes the sample material to shift slightly. This so-called “drift” can cause the resulting scanning TEM images to be significantly distorted.

“But our approach effectively eliminates the effect of drift on scanning TEM images,” says Dr. James LeBeau, an assistant professor of materials science and engineering at NC State and senior author of a paper describing the work.

Researchers programmed the microscope to rotate the direction in which it scans the sample. For example, it might first take an image scanning from left to right, then take one scanning from top to bottom, then right to left, then bottom to top. Each scanning direction captures the distortion caused by drift from a different vantage point.

The researchers plug those images into a program they developed that measures the features in each image and uses that data to determine the precise direction and extent of drift within the sample. Once the drift is quantified, the images can be adjusted to remove the distortion caused by the drift. The resulting images accurately represent the actual structure of the sample, giving scientists new capabilities to understand bonding between atoms.
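The logic of the correction can be sketched numerically. In the toy model below, features drift at a constant (unknown) velocity while a raster scan visits each pixel in sequence. Because two scans with different fast axes reach the same feature at different times, the unknown true position cancels when the two apparent positions are differenced, and the drift velocity falls out of a least-squares fit. All parameters are illustrative, and this is a simplification of the idea behind the paper, not a reimplementation of its method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters, not taken from the paper
N, dt, px = 512, 1e-4, 0.02          # pixels per side, dwell time (s), pixel size (nm)
v_true = np.array([3e-3, -2e-3])     # drift velocity (nm/s); unknown in practice

feats = rng.uniform(1.0, 9.0, size=(20, 2))   # true feature positions (nm)

def visit_time(p, fast_axis):
    """Time at which a raster scan reaches position p = (x, y), in seconds.
    fast_axis=0: rows scanned left-to-right; fast_axis=1: columns top-to-bottom."""
    c, r = p[..., 0] / px, p[..., 1] / px     # fractional column/row indices
    slow, fast = (r, c) if fast_axis == 0 else (c, r)
    return (slow * N + fast) * dt

# Apparent positions: the sample has drifted by v*t when each pixel is visited
app0 = feats + v_true * visit_time(feats, 0)[:, None]
app1 = feats + v_true * visit_time(feats, 1)[:, None]

# Visit times are re-estimated from the apparent positions (truth is unknown)
t0 = visit_time(app0, 0)[:, None]
t1 = visit_time(app1, 1)[:, None]

# The unknown true position cancels in the difference between the two scans:
#   app0 - app1 ~= v * (t0 - t1), solved for v by least squares over features
dtau, dp = t0 - t1, app0 - app1
v_est = (dtau * dp).sum(axis=0) / (dtau ** 2).sum()

corrected = app0 - v_est * t0        # undo the drift in the first scan
print(v_est, np.abs(corrected - feats).max())
```

With the recovered velocity, every pixel can be shifted back by the drift accumulated at its own acquisition time, which is what removes the distortion without any reference material.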

“Historically, a major problem with drift has been that you need to have a reference material in any nanoscale image, so that you can tell how the image has been distorted,” LeBeau says. “This technique makes that unnecessary. That means we can now look at completely unknown samples and discover their crystalline structures – which is an important step in helping us control a material’s physical properties.”

The paper, “Revolving scanning transmission electron microscopy: correcting sample drift distortion without prior knowledge,” will be published in the March issue of Ultramicroscopy. Lead author of the paper is Dr. Xiahan Sang, a postdoctoral researcher at NC State. There is a patent pending on the technique.

 


 

 

24th January 2014

Plumes of water vapour detected from dwarf planet Ceres

Scientists using the far-infrared abilities of the Herschel space observatory have made the first definitive detection of water vapour on the largest and roundest object in the asteroid belt, Ceres. A space probe is due to arrive there in 2015.

 


 

Plumes of water vapour are thought to shoot up periodically from Ceres when portions of its icy surface warm slightly. Ceres is classified as a dwarf planet – a Solar System body larger than an asteroid, but smaller than a planet.

Herschel is a European Space Agency (ESA) mission with important NASA contributions.

"This is the first time water vapour has been unequivocally detected on Ceres or any other object in the asteroid belt and provides proof that Ceres has an icy surface and an atmosphere," said Michael Küppers of ESA in Spain, lead author of a paper in the journal Nature.

The results come at just the right time for NASA's Dawn mission, which is on its way to Ceres now after spending more than a year orbiting the large asteroid Vesta. Dawn is scheduled to arrive at Ceres in the spring of 2015, where it will take the closest ever look at its surface.

 




"We've got a spacecraft on the way to Ceres, so we don't have to wait long before getting more context on this intriguing result, right from the source itself," said Carol Raymond, deputy principal investigator at NASA's Jet Propulsion Laboratory in California. "Dawn will map the geology and chemistry of the surface in high resolution, revealing the processes that drive the outgassing activity."

For the last century, Ceres was known as the largest asteroid in our Solar System. But in 2006, the International Astronomical Union – the governing organisation responsible for naming planetary objects – reclassified Ceres as a dwarf planet because of its large size. It is roughly 590 miles (950 kilometres) in diameter. When it was first spotted in 1801, astronomers thought it was a planet orbiting between Mars and Jupiter. Later, other bodies with similar orbits were found, marking the discovery of our Solar System's main belt of asteroids.

Scientists believe Ceres contains rock in its interior with a thick mantle of ice that – if melted – would amount to more fresh water than is present on all of Earth. The materials making up Ceres likely date from the first few million years of our Solar System's existence and accumulated before the planets formed.
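The "more fresh water than Earth" claim can be checked on the back of an envelope. In the sketch below, the mass of Ceres and Earth's fresh water inventory are approximate literature values, and the ice mass fraction is an assumption (published estimates span roughly a fifth to a third):

```python
# All figures approximate; the ice fraction in particular is an assumption
M_CERES_KG = 9.4e20        # approximate mass of Ceres, kg
ICE_FRACTION = 0.25        # assumed mass fraction of the icy mantle (illustrative)

EARTH_FRESH_WATER_KM3 = 3.5e7   # approximate total fresh water on Earth, km^3
KG_PER_KM3 = 1e9 * 1000         # 1 km^3 = 1e9 m^3, at 1000 kg per m^3

ceres_ice_kg = M_CERES_KG * ICE_FRACTION
earth_fresh_kg = EARTH_FRESH_WATER_KM3 * KG_PER_KM3

print(ceres_ice_kg / earth_fresh_kg)   # several times Earth's fresh water
```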

 

Scale image of Earth, the Moon and Ceres.

 

Until now, ice had been theorised to exist on Ceres but had not been detected conclusively. It took Herschel's far-infrared vision to see, finally, a clear spectral signature of the water vapour. But Herschel did not see water vapour every time it looked. While the telescope spied water vapour four different times, on one occasion there was no signature.

Here is what scientists think is happening: when Ceres swings through the part of its orbit that is closer to the Sun, a portion of its icy surface becomes warm enough to cause water vapour to escape in plumes at a rate of about 6 kilograms (13 pounds) per second. When Ceres is in the colder part of its orbit, no water escapes.

The strength of the signal also varied over hours, weeks and months, because of the water vapour plumes rotating in and out of Herschel's views as the object spun on its axis. This enabled the scientists to localise the source of water to two darker spots on the surface of Ceres, previously seen by NASA's Hubble Space Telescope and ground-based telescopes. These dark spots might be more likely to outgas because dark material warms faster than light material. When the Dawn spacecraft arrives at Ceres, it will investigate these features.

The results are somewhat unexpected because comets, the icier cousins of asteroids, are known typically to sprout jets and plumes, while objects in the asteroid belt are not.

"The lines are becoming more and more blurred between comets and asteroids," said Seungwon Lee of JPL, who helped with the water vapour models along with Paul von Allmen, also of JPL. "We knew before about main belt asteroids that show comet-like activity, but this is the first detection of water vapour in an asteroid-like object."

 


 

 

20th January 2014

Comet-chasing Rosetta probe awakes from hibernation

This evening – at 18:17 GMT – the European Space Agency's Rosetta spacecraft awoke from hibernation mode in preparation for its encounter with a comet, 67P/Churyumov–Gerasimenko.

The probe was launched in March 2004 and performed several flybys – of Earth, Mars and two asteroids – before entering a low power state in June 2011, in order to conserve energy. It has now reawakened and successfully communicated with ESA teams back on Earth.

The spacecraft consists of two main elements: the orbiter, which features 12 instruments, and the "Philae" robotic lander with an additional nine instruments. The first images of 67P are expected in May, from 2 million km (1.25 million mi) away. Rendezvous with the comet occurs in August this year, with deployment of the lander in November. Because of the comet's extremely low gravity, harpoons will be fired to anchor the lander on touchdown, with its legs dampening the impact and ice screws providing additional grip.
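The need for an anchoring system follows from the comet's tiny escape velocity. A rough estimate – the mass and radius below are ballpark figures for a comet nucleus of this size, not measured values for 67P:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2

# Ballpark figures for a ~4 km comet nucleus (illustrative, not measured values)
mass = 1e13            # kg
radius = 2000.0        # m

# Escape velocity: v = sqrt(2GM/r)
v_esc = math.sqrt(2 * G * mass / radius)
print(round(v_esc, 2))   # under 1 m/s: a gentle push could send a lander back into space
```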

Once on the surface, Philae will conduct the most detailed study of a comet ever attempted. Measurements of the ice, nucleus and chemical compounds present could reveal new details about the Solar System's history – perhaps even the origin of life itself. Among its many instruments is a drill that will bore 23 cm below the surface. A camera will also take high-resolution images (2048 × 2048 pixels) of the descent and a panorama of the landing site.

Rosetta will be the first spacecraft to fly alongside a comet as it heads towards the inner Solar System and the first to examine at close range how a frozen comet is transformed by the Sun's warmth. The mission runs until December 2015.

 

 


 

 

19th January 2014

Letter to Barack Obama (part 2)

Concerned by his "all of the above" energy strategy, a group of environmentalists this week sent a joint letter to President Barack Obama, calling on him to expand clean energy. This follows a similar effort last year by business leaders, philanthropists and election campaign supporters. The letter is reproduced here in full.

 


 


 

American Rivers | Clean Water Action | Defenders of Wildlife | Earthjustice

Energy Action Coalition | Environment America | Environmental Defense Fund

Friends of the Earth | League of Conservation Voters | National Audubon Society |
National Wildlife Federation | Native American Rights Fund

Natural Resources Defense Council | Oceana | Physicians for Social Responsibility |
Population Connection | Sierra Club | Voices for Progress

 

President Barack Obama
The White House
1600 Pennsylvania Ave NW
Washington, DC 20500

Dear Mr. President,

We applaud the actions you have taken to reduce economy-wide carbon pollution and your commitment last June "to take bold action to reduce carbon pollution" and "lead the world in a coordinated assault on climate change." We look forward to continuing to work with you to achieve these goals.

In that speech, you referenced that in the past you had put forward an "all of the above" energy strategy, yet noted that we cannot just drill our way out of our energy and climate challenge. We believe that continued reliance on an "all of the above" energy strategy would be fundamentally at odds with your goal of cutting carbon pollution and would undermine our nation's capacity to respond to the threat of climate disruption. With record-high atmospheric carbon concentrations and the rising threat of extreme heat, drought, wildfires and super storms, America's energy policies must reduce our dependence on fossil fuels, not simply reduce our dependence on foreign oil.

We understand that the U.S. cannot immediately end its use of fossil fuels and we also appreciate the advantages of being more energy independent. But an "all of the above" approach that places virtually no limits on whether, when, where or how fossil fuels are extracted ignores the impacts of carbon-intense fuels and is wrong for America's future. America requires an ambitious energy vision that reduces consumption of these fuels in order to meet the scale of the climate crisis.

An "all of the above" strategy is a compromise that future generations can't afford. It fails to prioritize clean energy and solutions that have already begun to replace fossil fuels, revitalize American industry, and save Americans money. It increases environmental injustice while it locks in the extraction of fossil fuels that will inevitably lead to a catastrophic climate future. It threatens our health, our homes, our most sensitive public lands, our oceans and our most precious wild places. Such a policy accelerates development of fuel sources that can negate the important progress you've already made on lowering U.S. carbon pollution, and it undermines U.S. credibility in the international community.

Mr. President, we were very heartened by your commitment that the climate impacts of the proposed Keystone XL pipeline would be "absolutely critical" to the decision and that it would be contrary to the "national interest" to approve a project that would "significantly exacerbate the problem of carbon pollution." We believe that a climate impact lens should be applied to all decisions regarding new fossil fuel development, and urge that a "carbon-reducing clean energy" strategy rather than an "all of the above" strategy become the operative paradigm for your administration's energy decisions.

In the coming months your administration will be making key decisions regarding fossil fuel development -- including the Keystone XL pipeline, fracking on public lands, and drilling in the Arctic ocean -- that will either set us on a path to achieve the clean energy future we all envision or will significantly exacerbate the problem of carbon pollution. We urge you to make climate impacts and emission increases critical considerations in each of these decisions.

Mr. President, we applaud you for your commitment to tackle the climate crisis and to build an economy powered by energy that is clean, safe, secure, and sustainable.

Sincerely,

 

Wm. Robert Irvin
President and CEO
American Rivers

Robert Wendelgass
President
Clean Water Action

Jamie Rappaport Clark
President and CEO
Defenders of Wildlife

Trip Van Noppen
President
Earthjustice

Maura Cowley
Executive Director
Energy Action Coalition

Margie Alt
Executive Director
Environment America

Fred Krupp
President
Environmental Defense Fund

Eric Pica
President
Friends of the Earth

Gene Karpinski
President
League of Conservation Voters

David Yarnold
President and CEO
National Audubon Society

Larry J. Schweiger
President & CEO
National Wildlife Federation

John Echohawk
Executive Director
Native American Rights Fund

Frances Beinecke
President
Natural Resources Defense Council

Andrew Sharpless
Chief Executive Officer
Oceana

Catherine Thomasson, MD
Executive Director
Physicians for Social Responsibility

John Seager
President
Population Connection

Michael Brune
Executive Director
Sierra Club

Sandy Newman
President
Voices for Progress

 

 

 

 

19th January 2014

NASA budget of $17.6 billion is approved

President Barack Obama has signed a budget that provides NASA with $17.6 billion for this year – fully funding both the heavy-lift Space Launch System and Orion capsule that will eventually take humans to Mars.

 

The Space Launch System (left) and Orion capsule (right).

 

NASA's budget for 2014 was passed by Congress earlier this week and officially signed by the President on Friday. A total of $17.65 billion has been allocated to the space agency, which is slightly less than the $17.7 billion it had requested. However, some analysts had expected a figure as low as $16.1 billion, due to recent budget cuts and spending concerns arising from the sequester of 2013. For space enthusiasts, the final approved figure is therefore a welcome surprise.

Some highlights from the budget include:

• $1,918 million for the Space Launch System (SLS).

The SLS is a heavy launch vehicle intended to replace the Space Shuttle. It is designed to be upgraded over time with more powerful versions. Initially carrying payloads of 70 metric tons into orbit, the SLS will eventually be fitted with an upper "Earth Departure Stage" capable of lifting at least 130 metric tons. This will be 12 metric tons greater than the Apollo-era Saturn V, making it the largest and most powerful rocket ever built. It will take astronauts and hardware to asteroids, the Moon, Mars, and most of the Lagrange points of the Earth-Moon and Sun-Earth systems. A first unmanned test launch is planned for 2017, with NASA being allocated an extra $200 million to ensure this deadline is met. A manned flight around the Moon, and possibly to an asteroid, is expected in 2021, with manned missions to Mars following in the 2030s. The additional funding in this year's budget will "maintain critical forward momentum" on the program, according to legislators.
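A quick back-of-envelope check of the payload figures above (the Saturn V's capacity of roughly 118 metric tons to low Earth orbit is a commonly cited figure, not stated in the article):

```python
# Sanity check of the SLS payload figures quoted above.
sls_initial_t = 70      # initial SLS payload to orbit, metric tons
sls_evolved_t = 130     # with the upper "Earth Departure Stage"
saturn_v_t = 118        # approx. Saturn V capacity to LEO (assumption)

print(f"Evolved SLS vs. initial: {sls_evolved_t / sls_initial_t:.2f}x the payload")
print(f"Margin over the Saturn V: {sls_evolved_t - saturn_v_t} metric tons")
```

The 12-ton margin matches the figure quoted in the budget summary.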

• $1,197 million for the Orion Multi-Purpose Crew Vehicle (MPCV).

Orion is a small capsule designed to transport up to six astronauts and cargo beyond Earth orbit. It will be integrated with and carried by the SLS rockets. A first unmanned test flight is scheduled for later this year, during which it will fly higher than any spacecraft intended for human use has flown since 1973. Manned flights will commence in the 2020s.

• $5,151 million for science.

This includes $80 million for planning and development of a Europa mission. The next Discovery-class mission will be announced by May 2014, with selection of the mission(s) in September 2015. Meanwhile, NASA's flagship project and Hubble successor – the James Webb Space Telescope – remains funded and on track for delivery in 2018. Among its primary objectives will be capturing images of reionization and "first light" from stars after the Big Bang.

The remaining budget will go towards operational maintenance, space technology, aeronautics, grants, education and other services provided by NASA. Despite this week's good news, however, the longer term picture is less clear for NASA. As shown in the graph below, its budget as a percentage of the federal budget has been gradually declining and is now a mere fraction of its peak in the 1960s. It will be interesting to see how the private sector can influence the agency's strategy in the coming decades.
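The declining budget share can be put in rough numbers. Assuming total federal outlays of about $3.5 trillion for FY2014 and a 1960s peak share of roughly 4.4% (neither figure appears in the article), a quick estimate:

```python
# Rough estimate of NASA's share of federal spending.
nasa_budget = 17.65e9        # approved FY2014 NASA budget, dollars
federal_outlays = 3.5e12     # approx. FY2014 federal outlays (assumption)
peak_share = 4.4             # approx. peak share in the mid-1960s, percent (assumption)

share = 100 * nasa_budget / federal_outlays
print(f"Current share: ~{share:.1f}% of the federal budget")
print(f"Roughly {peak_share / share:.0f}x below the 1960s peak")
```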

 

NASA budget as a percentage of federal budget

 

 

 

 

17th January 2014

A smart contact lens for diabetes sufferers

Globally, an estimated 285 million people have diabetes – a chronic disease that occurs when the pancreas does not produce enough insulin, or when the body cannot effectively use the insulin it produces. Its incidence is growing rapidly, and by 2030, the number of cases is predicted to almost double. By 2050, as many as one in three U.S. adults could be affected if current trends continue.

To keep their blood sugar levels under control, sufferers need to constantly monitor themselves. This can involve pricking their finger to get a blood sample, two to four times per day. For many people, managing this condition is therefore a painful and disruptive process.

To address this problem, Internet giant Google has announced it is developing a smart contact lens. This wearable device will measure glucose levels in tears, using a tiny wireless chip and miniaturised sensor embedded between two layers of soft contact lens material. When glucose levels fall below a certain threshold, tiny LED lights will activate to serve as a warning system for the wearer.

Google admits it is still "early days" for this technology, but there is clearly great potential to improve the lives of diabetes sufferers around the world. To achieve its goal, the company intends to partner with technology firms that have experience bringing products like this to market. You can read more at the Google Official Blog.

 

Credit: Google

 

 

 

 

16th January 2014

$1,000 genome sequencing is finally here

U.S. biotechnology company, Illumina, has demonstrated the first machine capable of sequencing a complete human genome for less than $1,000. This landmark opens the floodgates to mass genome sequencing and will lead to cheaper and faster medical research.

 


 

The Human Genome Project – an international effort to identify and map every gene in the body – was initiated in 1990 at a cost of $3 billion. It was the largest collaborative biological project ever undertaken, involving hundreds of laboratories and requiring 13 years to complete.

In the first decade of the 21st century, new sequencing methods led to costs plummeting at a rate even faster than Moore's Law, the trend of exponentially improving power seen in computer chips. In addition to price, the length of time needed to scan DNA was falling rapidly. By the early 2010s, thousands of human genomes had been decoded worldwide.
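That "faster than Moore's Law" claim can be made concrete. Assuming the per-genome cost fell from roughly $100 million in 2001 to about $10,000 by 2011 (approximate figures from the well-known NHGRI cost curve, not from this article), the implied halving time is far shorter than the ~2 years associated with Moore's Law:

```python
import math

# Implied cost-halving time for genome sequencing vs. Moore's Law.
cost_start = 100e6    # approx. cost per genome in 2001, dollars (assumption)
cost_end = 10e3       # approx. cost per genome in 2011, dollars (assumption)
years = 10

halvings = math.log2(cost_start / cost_end)   # number of 2x price reductions
print(f"{halvings:.1f} halvings in {years} years")
print(f"One halving every {years / halvings:.2f} years (Moore's Law: ~2 years)")
```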

In recent years, however, this trend appeared to reach a plateau, with costs hovering between $3,000 and $5,000. Hopes for a $1,000 genome began to seem unrealistic, and the industry appeared to have hit the law of diminishing returns.

That apparent stagnation has now ended, with U.S. company Illumina achieving a major breakthrough in the form of their HiSeq X Ten Sequencing System. In a press release, they claim to have "broken the sound barrier of human genomics", by enabling $1,000 whole genome sequencing for the first time.

 

human genome sequencing costs graph

 

The HiSeq X Ten will be released in March 2014, priced at $1 million per machine and sold in installations of at least ten units. The figure of $1,000 per genome takes into account the cost of the machines and the chemicals needed to do the sequencing. It is, therefore, not yet economical for smaller research labs, hospitals, or doctors' offices. However, larger research institutes have already expressed an interest – including the Broad Institute in Cambridge, Massachusetts; Macrogen in Seoul, South Korea; and the Garvan Institute of Medical Research in Sydney, Australia. In the coming years, costs will continue to fall while speed and accuracy rise, and eventually the average person will be able to have their genome scanned at a very affordable price.

Other companies, such as 23andMe, have been offering personalised DNA analysis for some time now. These tests, however, only genotype a tiny fraction of a person's genes. The new machine from Illumina, on the other hand, scans the entire human genome – all 3.2 billion base pairs. Five complete genomes can be delivered in a day, or potentially 1,825 per year. Several advanced design features are combined to generate this massive throughput, an order of magnitude faster than before: patterned flow cells (which contain billions of nanowells at fixed locations), a new clustering chemistry (for high occupancy and monoclonality), and state-of-the-art optics.
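The throughput and pricing figures above can be sanity-checked with simple arithmetic. Assuming the five-genomes-per-day rate is per instrument and a four-year depreciation period (both assumptions; the article specifies neither), the hardware's contribution to each $1,000 genome works out as:

```python
# Throughput and hardware-amortisation arithmetic for the figures above.
genomes_per_day = 5      # per instrument (assumption -- not specified in the article)
days_per_year = 365
machine_cost = 1e6       # dollars per HiSeq X instrument
fleet_size = 10          # minimum purchase is ten units
lifetime_years = 4       # hypothetical depreciation period (assumption)

per_instrument_per_year = genomes_per_day * days_per_year   # 1,825 -- as in the text
fleet_per_year = per_instrument_per_year * fleet_size
hardware_per_genome = (machine_cost * fleet_size) / (fleet_per_year * lifetime_years)

print(f"{per_instrument_per_year} genomes/year per instrument")
print(f"{fleet_per_year} genomes/year for a ten-unit installation")
print(f"~${hardware_per_genome:.0f} of each $1,000 genome is hardware amortisation")
```

Under these assumptions the hardware accounts for only a small slice of the $1,000; the rest covers reagents and running costs.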

Jay Flatley, CEO of Illumina: “With the HiSeq X Ten, we’re delivering the $1,000 genome, reshaping the economics and scale of human genome sequencing, and redefining the possibilities for population-level studies in shaping the future of healthcare. The ability to explore the human genome on this scale will bring the study of cancer and complex diseases to a new level. Breaking the ‘sound barrier’ of human genetics not only pushes us through a psychological milestone, it enables projects of unprecedented scale. We are excited to see what lies on the other side.”

 

 

 

 

14th January 2014

New method can wipe out cancer cells in the bloodstream

A new way to potentially stop cancer cells from spreading and moving throughout the bloodstream has been discovered by researchers at Cornell University.

 


 

By attaching a protein to white blood cells, biomedical engineers at Cornell University have demonstrated the annihilation of metastasizing cancer cells moving through the bloodstream. The study, “TRAIL-Coated Leukocytes that Kill Cancer Cells in the Circulation,” is published in the 6th January edition of the journal PNAS.

“These circulating cancer cells are doomed,” said Michael King, Cornell professor of biomedical engineering and the study’s senior author. “About 90% of cancer deaths are related to metastases, but now we’ve found a way to dispatch an army of killer white blood cells that cause apoptosis – the cancer cell’s own death – obliterating them from the bloodstream. When surrounded by these guys, it becomes nearly impossible for the cancer cell to escape.”

Metastasis is the spread of cancer cells to other parts of the body. Surgery and radiation can be effective at treating primary tumors – but difficulty in detecting metastatic cancer cells has made treatment of spreading cancer problematic, say the scientists.

 


 

King and his colleagues injected human blood samples, and later mice, with two proteins: E-selectin (an adhesive) and TRAIL (Tumor Necrosis Factor Related Apoptosis-Inducing Ligand). The TRAIL protein, joined to the E-selectin protein, sticks to leukocytes – white blood cells – which are ubiquitous in the bloodstream. When a cancer cell comes into contact with TRAIL, which becomes unavoidable in the chaotic blood flow, the cancer cell essentially kills itself.

“The mechanism is surprising and unexpected in that this repurposing of white blood cells in flowing blood is more effective than directly targeting the cancer cells with liposomes or soluble protein,” say the authors.

 


 

In the laboratory, King and his colleagues tested this concept’s efficacy. When treating cancer cells with the proteins in saline, they found a 60 percent success rate in killing the cancer cells. In normal laboratory conditions, the saline lacks white blood cells to serve as a carrier for the adhesive and killer proteins. Once the proteins were added to flowing blood, which models forces, mixing and other human-body conditions, however, the success rate in killing the cancer cells jumped to nearly 100 percent.

As this research is newly announced, King says animal trials will continue and he hopes that the research will proceed to human clinical trials sometime in the future.

 

 

 

 

 

11th January 2014

Lions are critically endangered in West Africa

A report published this week concludes that the lion is facing extinction across the entire West African region. The West African lion once ranged continuously from Senegal to Nigeria, but the new paper reveals there are now only an estimated 250 adult lions restricted to four isolated and severely imperiled populations. Only one of those populations contains more than 50 lions.

 

West African male lion. Credit: Jonas Van de Voorde (CC BY-SA 3.0)

 

Led by Dr. Philipp Henschel of conservation group Panthera, and co-authored by an international team from West Africa, the UK, Canada and the USA, this survey appears in the journal PLOS ONE. The report's sobering results represent a massive effort – taking six years and covering 11 countries where lions were presumed to exist in the last two decades. This new, highly detailed information builds on an earlier continent-wide review of lion status produced by Duke University to which Dr. Henschel also contributed. Both surveys were funded by National Geographic's Big Cats Initiative (BCI).

"When we set out in 2006 to survey all the lions of West Africa, the best reports suggested they still survived in 21 protected areas," explains Henschel. "We surveyed all of them, representing the best remaining lion habitat in West Africa. Our results came as a complete shock. All but a few of the areas we surveyed were basically 'paper parks', having neither management budgets nor patrol staff, and had lost all their lions and other iconic large mammals."

The team discovered that West African lions now survive in only five countries: Senegal and Nigeria, plus a single trans-frontier population spanning the shared borders of Benin, Niger and Burkina Faso. They are genetically distinct from the better-known lions of the famous game parks in East and southern Africa. Recent molecular research shows they are closely related to the extinct "Barbary lions" that once roamed North Africa, as well as to the last Asiatic lions surviving in India.

 

lions population map

 

"West African lions have unique genetic sequences not found in any other lions, including in zoos or captivity," explained Dr. Christine Breitenmoser, co-chair of the IUCN/SCC Cat Specialist Group, which determines the conservation status of wild cats around the world. "If we lose the lion in West Africa, we will lose a unique, locally adapted population found nowhere else. It makes their conservation even more urgent."

Lions have disappeared across Africa as human populations and their livestock herds have expanded, competing for land with lions and other wildlife. Wild savannas are converted for agriculture and cattle, the lion's natural prey is hunted out and lions are killed by pastoralists fearing the loss of their herds.

National Geographic explorer and BCI co-founder Dereck Joubert commented: "Every survey we do is inaccurate because as soon as you complete it, it is already out of date; the declines are so rapid. It is a terribly sad state of affairs when you can very accurately count the lions in an area because there are so few of them. This is critical work that again confirms that we are underestimating the rate of decline of lion populations and that the situation requires a global emergency intervention."

Today, fewer than 35,000 lions remain in Africa, occupying about 25% of the species' original range. In West Africa, the lion now survives in less than 50,000 square kilometres – an area smaller than half of New York State – representing only 1% of its original historic range in the region.
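The range figures above also imply how vast the original West African habitat once was: if 50,000 km² is 1% of the historic range, that range once spanned about 5 million km². A quick check (the New York State area of ~141,000 km² is an assumption, not given in the article):

```python
# Range arithmetic implied by the figures above.
current_range_km2 = 50_000    # remaining West African lion range
share_of_historic = 0.01      # "only 1% of its original historic range"
ny_state_km2 = 141_000        # approx. area of New York State (assumption)

historic_range_km2 = current_range_km2 / share_of_historic
print(f"Implied historic range: ~{historic_range_km2 / 1e6:.0f} million km2")
print(f"Remaining range: {current_range_km2:,} km2, "
      f"vs. half of New York State: {ny_state_km2 / 2:,.0f} km2")
```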

Panthera's President, Dr. Luke Hunter: "Lions have undergone a catastrophic collapse in West Africa. The countries that have managed to retain them are struggling with pervasive poverty and little funding for conservation. To save the lion – and many other critically endangered mammals including unique populations of cheetahs, African wild dogs and elephants – will require a massive commitment of resources from the international community."

 

 

 

 

11th January 2014

Implantable device can reduce sleep apnea by 70 percent

Implantation of a sleep apnea device called Inspire Upper Airway Stimulation (UAS) therapy can lead to significant improvements for patients with obstructive sleep apnea (OSA), according to a study published in the New England Journal of Medicine. After one year, patients using the device had an average 70 percent reduction in sleep apnea severity, as well as significant reductions in daytime sleepiness.

 

Credit: Inspire

 

Sleep apnea is a disorder characterised by pauses in breathing, or shallow and infrequent breathing, during sleep. Each of these pauses in breathing, called an apnea, can last from seconds to minutes, and may occur 30 times or more an hour. When normal breathing returns (sometimes accompanied by a loud snort or choking sound), the body moves out of deep sleep and into a lighter sleep. This results in poor overall quality of sleep and excessive tiredness during the daytime – increasing a person's risk for heart attack, stroke, high blood pressure and even death.

It is estimated that seven percent of Americans are affected by at least moderate sleep apnea. Among the middle-aged, the figure is higher, with as many as nine percent of women and 24 percent of men affected – many of them undiagnosed and untreated.

The costs of untreated sleep apnea reach further than health issues alone. In the U.S., the average untreated patient is estimated to incur $1,336 more in health care costs each year than an individual without sleep apnea, which may add up to $3.4 billion per year in additional medical costs.
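Taken together, the two cost figures above imply roughly how many untreated patients the $3.4 billion national estimate assumes:

```python
# Back-of-envelope: how many untreated patients do the cost figures imply?
extra_cost_per_patient = 1336    # additional annual health care cost, dollars
total_extra_cost = 3.4e9         # estimated national total, dollars per year

implied_patients = total_extra_cost / extra_cost_per_patient
print(f"Implied untreated population: ~{implied_patients / 1e6:.1f} million people")
```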

Treatments can include weight loss, upper airway surgeries, oral appliances, and continuous positive airway pressure (CPAP). However, while CPAP can be successful if used regularly, up to half of patients are unable to use it properly – largely due to discomfort with the mask and/or lack of desire to be tethered to a machine.

That's where a new device created at the University of Pittsburgh may help. Director of the UPMC Sleep Medicine Center, and lead author of the study, Dr Patrick Strollo, explains: "Inspire UAS therapy differs from other traditional sleep apnea devices and surgical procedures in that it targets muscle tone of the throat, rather than just the anatomy. Two thirds of patients using the device had successful control of their apneas, although even more reported improvement in snoring, daytime sleepiness and quality of life measures. Eighty-six percent of patients were still using the device every night at the one year mark, which compares very favourably to CPAP. The results of this trial show a huge potential for a new and effective treatment that can help millions of patients."

The device was fitted in three areas: a stimulation electrode was placed on the hypoglossal nerve, which provides innervation to the muscles of the tongue; a sensing lead was placed between rib muscles to detect breathing effort; and a neurostimulator was implanted in the upper right chest, just below the clavicle bone. Patients used a "controller" to turn on the device at night, so it was only used when the patient slept. The Inspire UAS therapy device was able to sense breathing patterns and to stimulate tongue muscles, thereby enlarging and stabilising the airway for improved breathing.

Kathy Gaberson, one of the study participants: "My short-term memory has improved significantly, and the surgery has made a huge difference in my quality of life. My apnea episodes went from 23 times an hour to just two."

 

Credit: Inspire

 

 

 

 

9th January 2014

IBM forms Watson Group to meet growing demand for cognitive innovations

Headquartered in New York City's "Silicon Alley", the new Watson Group formed by IBM will fuel innovative products and startups – introducing cloud solutions to accelerate research, visualise Big Data and enable analytics exploration.

 

 

IBM today announced it will establish the IBM Watson Group, a new business unit dedicated to the development and commercialisation of cloud-delivered cognitive innovations. The move signifies a strategic shift by IBM to accelerate into the marketplace a new class of software, services and apps that can "think", improve by learning, and discover answers and insights to complex questions from massive amounts of Big Data.

IBM will invest more than $1 billion into the Watson Group, focusing on research and development to bring cloud-delivered cognitive applications and services to market. This will include $100 million available for venture investments to support IBM's recently launched ecosystem of start-ups and businesses, which are building a new class of cognitive apps powered by Watson, in the IBM Watson Developers Cloud.

According to technology research firm Gartner, smart machines will be the most disruptive change ever brought about by information technology, and can make people more effective, empowering them to do "the impossible."

The IBM Watson Group will have a new headquarters at 51 Astor Place in New York City's "Silicon Alley" technology hub, leveraging the talents of 2,000 professionals, whose goal is to design, develop and accelerate the adoption of Watson cognitive technologies that transform industries and professions. The new group will tap subject matter experts from IBM's Research, Services, Software and Systems divisions, as well as industry experts who will identify markets that cognitive computing can disrupt and evolve, such as healthcare, financial services, retail, travel and telecommunications.

Nearly three years after its triumph on the TV show Jeopardy!, IBM has advanced Watson from a quiz game innovation into a commercial technology. Now delivered from the cloud and powering new consumer apps, Watson is 24 times faster and 90 percent smaller – IBM has shrunk Watson from the size of a master bedroom to three stacked pizza boxes.

Named after IBM founder Thomas J. Watson, the machine was developed in IBM’s Research labs. Using natural language processing and analytics, Watson handles information akin to how people think, representing a major shift in the ability to quickly analyse, understand and respond to Big Data. Watson’s ability to answer complex questions in natural language with speed, accuracy and confidence will transform decision making across a range of industries.

"Watson is one of the most significant innovations in IBM's 100 year history, and one that we want to share with the world," says IBM Senior Vice President Mike Rhodin (pictured below), who will lead the group. "These new cognitive computing innovations are designed to augment users’ knowledge – be it the researcher exploring genetic data to create new therapies, or a business executive who needs evidence-based insights to make a crucial decision."

 

Mike Rhodin

 

 

 

 

9th January 2014

Next-gen graphics coming in 2015

A Polish company – Better Reality – has been developing a new graphics platform known as "Thorskan". This is able to scan real environments and recreate them in 3D with spectacular resolution and detail. Though it is mainly being used by advertisers and Hollywood studios like 20th Century Fox, Better Reality says it can also work in video games.

Another Polish company, The Farm 51, has indeed been using Thorskan to create a new PC/XB1/PS4 game – Get Even – that is set for release in 2015. This will feature "ambitious new dynamic and photorealistic graphics", a taste of which can be found in the preview below. A trailer was also released yesterday. We recommend watching in full screen HD.

 

 

 

 

 

7th January 2014

Intel at CES 2014

At the Consumer Electronics Show (CES) in Las Vegas, Intel Corporation has been showing off its latest innovative technologies. These include an intelligent 3D camera system, a range of new wearable electronics, and a 22nm dual-core PC the size of an SD card.

 


 

Intel CEO Brian Krzanich has outlined a range of new products, initiatives and strategic relationships aimed at accelerating innovation across a range of mobile and wearable devices. He made the announcements during the pre-show keynote for the 2014 Consumer Electronics Show in Las Vegas, the biggest gathering of the tech industry in the USA.

Krzanich's keynote painted a vision of how the landscape of computing is being reshaped – one in which security is too important not to be embedded in every device. The world is entering a new era of integrated computing, defined not by the device, but by the integration of technology into people's lifestyles in ways that offer new utility and value. As examples, he highlighted several immersive and intuitive technologies that Intel will begin offering in 2014, such as Intel RealSense – hardware and software that will bring human senses to Intel-based devices. This will include 3D cameras that deliver more intelligent experiences – improving the way people learn, collaborate and are entertained.

The first Intel RealSense 3D camera features a best-in-class depth sensor and a full 1080p colour camera. It can detect finger-level movements for highly accurate gesture recognition, and can read facial features to understand movement and emotion. It can also distinguish foregrounds from backgrounds to enable new forms of control, enhance interactive augmented reality (AR), scan items in three dimensions, and more.

This camera will be integrated into a growing spectrum of Intel-based devices including 2 in 1, tablet, Ultrabook, notebook, and all-in-one (AIO) designs. Systems with the new camera will be available beginning in the second half of 2014 from Acer, Asus, Dell, Fujitsu, HP, Lenovo and NEC.

To advance the computer's "hearing" sense, a new generation of speech recognition technology will be available on a variety of systems. This conversational personal assistant works with popular websites and applications. It comes with selectable personalities and allows for an ongoing dialogue with Intel-based devices. People can simply tell it to play music, get answers, connect with friends and find content – all using natural language. The assistant can also check calendars, get maps and directions, find flights or book a dinner reservation. Because it works offline, people can control their device, dictate notes and more without an Internet connection.

 

 

Krzanich then explained how Intel aims to accelerate wearable device innovation. A number of reference designs were highlighted including: smart earbuds providing biometric and fitness capabilities, a smart headset that is always ready and can integrate with existing personal assistant technologies, a smart wireless charging bowl, a smart baby onesie and a smart bottle warmer that will start warming milk when the onesie senses the baby is awake and hungry.

The smart earbuds (pictured below) provide full stereo audio and monitor the wearer's heart rate, while applications on the user's phone keep track of running distance and calories burned. The product includes software to precision-tune workouts by automatically choosing music that matches the target heart rate profile. As an added bonus, the earbuds harvest energy directly from the audio microphone jack, eliminating the need for a battery or additional power source.

 

Intel's smart earbuds

 

The Intel CEO announced collaborations to increase dialogue and cooperation between the fashion and technology industries, to explore and bring to market new smart wearable electronics. He also kicked off the Intel "Make it Wearable" challenge – a global effort aimed at accelerating creativity and innovation with technology. This effort will call upon the smartest and most creative minds to consider factors impacting the proliferation of wearable devices and ubiquitous computing, such as meaningful usages, aesthetics, battery life, security and privacy.

In addition to reference designs for wearable technology, Intel will offer a number of accessible, low-cost entry platforms aimed at lowering entry barriers for individuals and small companies, allowing them to create innovative web-connected wearables or other small form factor devices. Underscoring this point, Krzanich announced Intel Edison – a low-power, 22nm-based computer in an SD card form factor with built-in wireless abilities and support for multiple operating systems. From prototype to production, Intel Edison will enable rapid innovation and product development by a range of inventors, entrepreneurs and consumer product designers when available this summer.

 

Intel Edison

 

"Wearables are not everywhere today, because they aren't yet solving real problems and they aren't yet integrated with our lifestyles," said Krzanich. "We're focused on addressing this engineering innovation challenge. Our goal is: if something computes and connects, it does it best with Intel inside."

Krzanich also discussed how Intel is addressing a critical issue for the industry as a whole: conflict minerals from the Democratic Republic of the Congo (DRC). Intel has achieved a critical milestone: the minerals used in microprocessor silicon and packages manufactured in Intel's factories are now "conflict-free", as confirmed by third-party audits.

"Two years ago, I told several colleagues that we needed a hard goal, a commitment to reasonably conclude that the metals used in our microprocessors are conflict-free," Krzanich said. "We felt an obligation to implement changes in our supply chain to ensure that our business and our products were not inadvertently funding human atrocities in the Democratic Republic of the Congo. Even though we have reached this milestone, it is just a start. We will continue our audits and resolve issues that are found."

 


 

 

 

 

4th January 2014

Ford unveils a solar-powered hybrid car

Ford Motor Company has announced the C-MAX Solar Energi Concept, a first-of-its-kind Sun-powered car with potential to deliver the best of what a plug-in hybrid offers – without depending on the electric grid for fuel.

 

 

Instead of powering its battery from an electrical outlet, the C-MAX Solar Energi harnesses power from the Sun by using a special concentrator that acts like a magnifying glass – directing intense rays to panels on the vehicle roof.

The result is a concept vehicle that takes a day's worth of sunlight to deliver the same performance as the conventional C-MAX Energi plug-in hybrid, which draws its power from the electric grid. The C-MAX Energi has the best combined miles-per-gallon-equivalent (MPGe) rating in its class: 108 MPGe city and 92 MPGe highway, for a combined 100 MPGe. By using renewable power, the solar concept reduces the annual greenhouse gas emissions a typical owner would produce by four metric tons.
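As a sanity check, the combined rating can be reproduced with the EPA's standard harmonic weighting of city and highway figures (55% city, 45% highway). The weighting scheme is an assumption here – the article does not say how the combined number was derived:

```python
# Combined fuel economy is a harmonic (not arithmetic) mean, because it
# averages fuel used per mile rather than miles per unit of fuel.
city_mpge = 108      # quoted city rating
highway_mpge = 92    # quoted highway rating

# Assumed EPA weighting: 55% city, 45% highway
combined = 1 / (0.55 / city_mpge + 0.45 / highway_mpge)
print(round(combined))  # 100, matching the quoted combined figure
```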

“Ford C-MAX Solar Energi Concept shines a new light on electric transportation and renewable energy,” said Mike Tinskey, Ford global director of vehicle electrification and infrastructure. “As an innovation leader, we want to further the public dialog about the art of the possible in moving the world toward a cleaner future.”

C-MAX Solar Energi Concept, which will be shown at the 2014 Consumer Electronics Show (CES) in Las Vegas, is a collaborative project of Ford, SunPower Corp and the Georgia Institute of Technology.

 


 

Strong electrified vehicle sales

The C-MAX Solar Energi Concept debuts as Ford caps a record year of electrified vehicle sales. The company expects to sell 85,000 hybrids, plug-in hybrids and all-electric vehicles for 2013 – the first full year its six new electrified vehicles were available in dealer showrooms.

Ford sold more plug-in vehicles in October and November than both Toyota and Tesla, and it outsold Toyota through the first 11 months of 2013. Plug-in hybrids continue to grow in sales as more customers discover the benefits of using electricity to extend their driving range.

Breakthrough clean technology

SunPower, which has been Ford's solar technology partner since 2011, is providing high-efficiency solar cells for the roof of this concept car. Because a conventional rooftop array would take too long to absorb enough energy for a full charge, Ford turned to the Georgia Institute of Technology for a way to amplify sunlight and make a solar-powered hybrid feasible for daily use.

Researchers developed an off-vehicle solar concentrator (pictured below) with a special Fresnel lens that directs sunlight onto the solar cells, boosting its intensity by a factor of eight. The Fresnel lens is a compact design originally developed for use in lighthouses. Similar in concept to a magnifying glass, this patent-pending system tracks the Sun as it moves from east to west, drawing enough power each day to equal a four-hour battery charge (roughly 8 kilowatt-hours).
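A rough sketch shows why the eightfold boost matters, reading the quoted "8 kilowatts" as a daily energy harvest of about 8 kWh. The 300 W array rating and 5 peak sun hours below are illustrative assumptions, not Ford figures; only the 8× concentration factor comes from the article:

```python
# Daily energy harvested by the roof array, with and without the concentrator.
panel_watts = 300    # assumed unconcentrated output of the roof array
concentration = 8    # boost factor quoted in the article
sun_hours = 5        # assumed peak sun hours per day

unconcentrated_kwh = panel_watts * sun_hours / 1000
concentrated_kwh = unconcentrated_kwh * concentration
print(unconcentrated_kwh, concentrated_kwh)  # 1.5 vs 12.0 kWh per day
```

Under these assumptions the unconcentrated array falls far short of a full daily charge, while the 8× boost comfortably exceeds it – consistent with the article's claim that the concentrator is what makes daily solar charging feasible.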

 


 

With a full charge, the C-MAX Solar Energi Concept will achieve the same range as a conventional C-MAX Energi hybrid – up to 620 miles, including 21 electric-only miles. Additionally, the vehicle still has a charge port, and can be charged by connecting to a station via cord and plug, so that drivers retain the option to power up via the grid, if desired. 

After the C-MAX Solar Energi Concept is shown at CES, Ford and Georgia Tech will begin testing the vehicle in numerous real-world scenarios. The outcome of those tests will help to determine if the concept is feasible as a production car.  

Off-the-grid car

By tapping renewable solar energy with a rooftop solar panel system, the C-MAX Solar Energi Concept is not dependent on the traditional electric grid for its battery power. Research by Ford suggests the Sun could power up to 75 percent of all trips made by an average driver in a solar hybrid car. This could be especially important in places where the electric grid is underdeveloped, unreliable or expensive to use.

The vehicle also reinforces MyEnergi Lifestyle, a concept revealed by Ford and several partners at 2013 CES. MyEnergi Lifestyle uses math, science and computer modelling to help homeowners understand how they can take advantage of energy-efficient home appliances, solar power systems and plug-in hybrid vehicles to significantly reduce monthly expenses while also reducing their overall carbon footprint.

The positive environmental impact from Ford C-MAX Solar Energi could be significant. It would reduce yearly CO2 and other greenhouse gas emissions from the average U.S. car owner by as much as four metric tons – the equivalent of what a U.S. house produces in four months.

If all light-duty vehicles in the United States were to adopt Ford C-MAX Solar Energi Concept technology, annual greenhouse gas emissions could be reduced by approximately 1 billion metric tons.

 


 

 

 

 

1st January 2014

Cloud mystery solved: Global temperatures to rise 4°C by 2100

Global average temperatures will rise at least 4°C by 2100 and potentially more than 8°C by 2200 if carbon dioxide emissions are not reduced, according to new research that shows our climate is more sensitive to CO2 than most previous estimates.

 


 

This research could resolve one of the great unknowns of climate sensitivity: the role of cloud formation, and whether it will have a positive or negative effect on global warming.

Professor Steven Sherwood, from the University of New South Wales, said: "Our research has shown that climate models indicating a low temperature response to a doubling of carbon dioxide from preindustrial times are not reproducing the correct processes that lead to cloud formation."

"When the processes are correct in the climate models, the level of climate sensitivity is far higher. Previously, estimates of the sensitivity of global temperature to a doubling of carbon dioxide ranged from 1.5°C to 5°C. This new research takes away the lower end of climate sensitivity estimates, meaning that global average temperatures will increase by 3°C to 5°C with a doubling of carbon dioxide."

The key to this narrower but much higher estimate lies in observations of the role of water vapour in cloud formation. When water vapour is taken up by the atmosphere through evaporation, the resulting updraughts often rise as high as 15 km to form heavy rains, but they can also rise just a few kilometres before returning to the surface without forming rain. Where updraughts rise only this smaller distance, they reduce total cloud cover, because they pull more vapour away from the higher cloud-forming regions than when only the deep updraughts are present.

Climate models showing a low temperature response to carbon dioxide do not include enough of this lower-level process. They instead simulate nearly all updraughts rising to 15 km. These deeper updraughts alone do not have the same effect, leading to increased reflection of sunlight and reduced sensitivity of the global climate to atmospheric carbon dioxide. However, real-world observations show this behaviour is wrong.

When the processes are correct in the climate model, this produces cycles that take water vapour to a wider range of heights in the atmosphere, causing fewer clouds to form in a warmer climate. This increases the amount of sunlight and heat entering the atmosphere and increases the sensitivity of our climate to carbon dioxide or any other perturbation.

When water vapour processes are correctly represented, the sensitivity of the climate to a doubling of carbon dioxide – which will occur in the next 50 years – means we can expect a temperature increase of at least 3°C and more likely 4°C by 2100.
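These figures follow from the standard logarithmic relation between CO2 concentration and equilibrium temperature, ΔT = S · log2(C/C0), where S is the climate sensitivity per doubling. A minimal sketch, taking S = 4°C as a central estimate from the study and a conventional preindustrial baseline of 280 ppm (an assumption, not stated in the article):

```python
import math

def warming(co2_ppm, sensitivity=4.0, preindustrial=280.0):
    """Equilibrium warming in degrees C for a given CO2 concentration,
    using dT = S * log2(C / C0)."""
    return sensitivity * math.log2(co2_ppm / preindustrial)

print(round(warming(560), 1))  # 4.0 -- a full doubling yields S itself
print(round(warming(400), 1))  # 2.1 -- roughly the 2014 CO2 level
```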

"Climate sceptics like to criticise climate models for getting things wrong, and we are the first to admit they are not perfect, but what we're finding is that the mistakes are being made by those models which predict less warming, not those that predict more," said Professor Sherwood. "Rises in global average temperatures of this magnitude will have profound impacts on the world and the economies of many countries if we don't urgently start to curb our emissions."

The study is published online today in Nature.

 

 

 

 

 

 
     
       
     
   