27th March 2017

Critical step found in DNA repair and cellular aging

A new study on mice has found a possible treatment for DNA damage from aging and radiation. This finding could be especially helpful for astronauts in space, who are at greater risk of DNA damage from cosmic radiation.

 

Credit: David Sinclair, Harvard Medical School

 

An international team – including researchers from Harvard and the University of New South Wales (UNSW) – has made a discovery that could lead to a revolutionary drug for reversing aspects of the aging process, improving DNA repair and ensuring the long-term survival of colonists on Mars.

In a paper published by the journal Science, they describe a critical step in the molecular process that allows cells to repair damaged DNA. Their tests on mice suggest a treatment is possible for humans exposed to radiation. It is so promising that it has attracted the attention of NASA, which believes the treatment can help its Mars mission during the 2030s.

While our cells have an innate capability to repair DNA damage – which happens every time we go out in the Sun, for example – their ability to do this declines as we age. The scientists identified that the metabolite NAD+, which is naturally present in every cell of our body, has a key role as a regulator in protein-to-protein interactions that control DNA repair. Treating mice with a NAD+ precursor, or "booster," called NMN improved their cells' ability to repair DNA damage caused by radiation exposure or old age.

"The cells of the old mice were indistinguishable from the young mice, after just one week of treatment," said the lead author, Professor David Sinclair of UNSW School of Medical Sciences and Harvard Medical School. Human trials of NMN therapy will begin within six months. "This is the closest we are to a safe and effective anti-aging drug that's perhaps only three to five years away from being on the market if the trials go well," says Sinclair.

 

 

 

The work has excited NASA, which faces the challenge of keeping its astronauts healthy during a four-year mission to Mars. Even on short missions, humans can experience accelerated aging from cosmic radiation, and suffer muscle weakness, memory loss and other symptoms when they return. On a trip to Mars the situation would be far worse: five per cent of the astronauts' cells would die and their chances of cancer would approach 100 per cent.

Cosmic radiation is not only an issue for astronauts. We're all exposed to it aboard aircraft, with a London-Singapore-Melbourne flight roughly equivalent in radiation to a chest x-ray. In theory, the same treatment could mitigate any effects of DNA damage for frequent flyers.
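As a rough back-of-envelope check of that comparison, the sketch below uses typical published figures – an assumed cruise-altitude dose rate and chest x-ray dose, not numbers from the study – to show that a long-haul routing does indeed accumulate a dose on the order of a single chest x-ray:

```python
# Illustrative dose comparison. The dose rate and x-ray dose are assumed,
# typical values, not figures given in the article or the study.

DOSE_RATE_CRUISE_USV_PER_H = 5.0   # assumed average cosmic-ray dose at ~11 km altitude
CHEST_XRAY_USV = 100.0             # assumed effective dose of one chest x-ray (0.1 mSv)

flight_hours = 12.8 + 8.0          # assumed London-Singapore + Singapore-Melbourne legs

flight_dose = DOSE_RATE_CRUISE_USV_PER_H * flight_hours
print(f"Flight dose : {flight_dose:.0f} microsieverts")
print(f"Chest x-ray : {CHEST_XRAY_USV:.0f} microsieverts")
print(f"Ratio       : {flight_dose / CHEST_XRAY_USV:.1f}x a chest x-ray")
```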

The other group that could benefit from this work is survivors of childhood cancers. Some 96 per cent of childhood cancer survivors suffer a chronic illness by age 45, including cardiovascular disease, Type 2 diabetes, Alzheimer's disease, and cancers unrelated to the original cancer.

"All of this adds up to the fact they have accelerated ageing, which is devastating," explains Sinclair's colleague, Dr Lindsay Wu. "It would be great to do something about that, and we believe we can with this molecule."

---

 

 

 

24th March 2017

Aging partially reversed in mice by flushing out senescent cells

Dutch scientists have announced a new drug treatment able to reverse aspects of aging in old mice – restoring their stamina, coat of fur and even some organ function – by flushing out "senescent" cells in the body that have stopped dividing. Human trials are now planned.

 


 

Researchers at the Erasmus University Medical Centre in Rotterdam, Netherlands, have found a way to turn back aging. By giving old mice a peptide that disrupts the binding between two proteins, the mice became fitter and more alert, their coat of fur became fuller again, and organ functions improved. This discovery was published yesterday in the leading scientific journal Cell.

The key player in the study is proxofim, a substance developed by the researchers themselves. It disrupts the binding between the proteins FOXO4 and p53. In contrast to existing substances used by researchers to intervene in aging, proxofim was found to have no adverse effects on the health of the mice. "The platelet count and the liver function, for example, remained normal," said Peter de Keizer, a researcher in Erasmus MC's department of Molecular Genetics and a lead author of this study.

Proxofim can deal with so-called "senescent" cells that play a significant role in aging. These are cells that have ceased to divide, but are not really dead: "In fact, their metabolism does continue, which means they continue to secrete all kinds of proteins, including inflammatory cytokines," says De Keizer. "These in turn cause more rapid aging of tissues and poorer organ function. They also play a role in cancer. The senescent cells make cancer less sensitive to chemotherapy and can accelerate the growth of tumours. In other words, we actually want to get rid of these cells."

Proxofim kills these senescent cells "and it stimulates the surrounding stem cells to create new tissue. It is a peptide, a small protein that can easily penetrate into cells."

When applied to mice, this had a major effect. After just three weeks, their running wheel activity nearly tripled, their organ function improved and after ten days their coat of fur became fuller again. The researchers would now like to start clinical trials on humans: "We first want to investigate the safety and efficacy further. We then hope to expand the study to patients with aggressive forms of cancer within one to two years, and then eventually to study geriatric ailments. We do not seek eternal life, but a longer life without ailments and in excellent health would be great."

---

 

 

 

21st March 2017

Could fast radio bursts be powering alien probes?

A speculative hypothesis by Harvard physicists argues that fast radio bursts from distant galaxies could be artificial in origin.

 

Artist's illustration of a light-sail powered by radio beam (red) generated on the surface of a planet. The leakage from such beams as they sweep across the sky would appear as Fast Radio Bursts (FRBs), similar to the new population of sources that was discovered recently at cosmological distances. Credit: M. Weiss/CfA

 

The search for extraterrestrial intelligence has looked for many different signs of alien life, from radio broadcasts to laser flashes, without success. However, newly published research suggests that mysterious phenomena called fast radio bursts could be evidence of advanced alien technology. Specifically, these bursts might be leakage from planet-sized transmitters powering interstellar probes in distant galaxies.

"Fast radio bursts are exceedingly bright given their short duration and origin at great distances, and we haven't identified a possible natural source with any confidence," said theorist Avi Loeb of the Harvard-Smithsonian Centre for Astrophysics. "An artificial origin is worth contemplating and checking."

As the name implies, fast radio bursts are millisecond-long flashes of radio emission. First discovered in 2007, fewer than two dozen have been detected by gigantic radio telescopes like the Parkes Observatory in Australia or the Arecibo Observatory in Puerto Rico. They are inferred to originate from distant galaxies, billions of light-years away.

 

Credit: Duncan Lorimer/West Virginia University

 

Loeb and his co-author Manasvi Lingam (Harvard University) examined the feasibility of creating a radio transmitter strong enough for it to be detectable across such immense distances. They found that, if the transmitter were solar powered, the sunlight falling on an area of a planet twice the size of the Earth would be enough to generate the needed energy. Such a vast construction project is well beyond our current technology – but within the realm of possibility according to the laws of physics.

Lingam and Loeb also considered whether such a transmitter would be viable from an engineering perspective, or whether the tremendous energies involved would melt any underlying structure. Again, they found that a water-cooled device twice the size of Earth could withstand the heat.

They then asked, why build such an instrument in the first place? They argue that the most plausible use of such power is driving interstellar light sails. The amount of power involved would be sufficient to push a payload of a million tons, or about 20 times the largest cruise ships on Earth.

"That's big enough to carry living passengers across interstellar or even intergalactic distances," added Lingam.

To power a light sail, the transmitter would need to focus a beam on it continuously. Observers on Earth would see a brief flash because the sail and its host planet, star and galaxy are all moving relative to us. As a result, the beam sweeps across the sky and only points in our direction for a moment. Repeated appearances of the beam, which were observed but cannot be explained by cataclysmic astrophysical events, might provide clues about its artificial origin.

Loeb admits that this work is speculative. When asked whether he really believes that any fast radio bursts are due to aliens, he replied, "Science isn't a matter of belief, it's a matter of evidence. Deciding what’s likely ahead of time limits the possibilities. It's worth putting ideas out there and letting the data be the judge."

A paper reporting this work is published in the Astrophysical Journal Letters.

---

 

 

 

21st March 2017

Canada to get its first spaceport in 2020

Maritime Launch Services, a Canadian space transport services company founded last year, has announced that it will establish a launch site in Nova Scotia by 2020. This will be Canada's first spaceport, with up to eight launches occurring each year from 2022 onwards.

 

Credit: Maritime Launch Services

 

Maritime Launch Services (MLS), established in Halifax, Nova Scotia, is a joint venture of three U.S. firms managed by a group of aerospace experts with decades of combined experience in the space industry, including time spent working at NASA. With global demand for space launch services set to grow rapidly in the coming years, MLS sees potential for Canada to enter the race.

Following a study of candidate sites across North America, the company has now chosen a site for operations in the Guysborough Municipality near Canso and Hazel Hill in Nova Scotia, Canada. An exhaustive review was carried out during the last year, which assessed 14 potential locations – evaluating criteria such as access to polar/Sun synchronous orbit, very low population density, proximity to multimodal transportation, and level of interest from the community, province and government.

In cooperation with Ukrainian firms Yuzhnoye and Yuzhmash, MLS aims to develop a commercial launch complex for the Cyclone 4M orbital launch vehicle, pictured above. This will bring the long established and mature space technology of Ukraine to Canada, and it is hoped will form a highly competitive launch services company. The new facility will begin construction in 2018, with first launches planned for 2020. If all goes according to plan, up to eight rockets could be launched annually from 2022 onwards. MLS is seeking a 20-year lease for the site from the Canadian government. The US$110m facility will include a launch pad and a processing building, as well as a control centre positioned three kilometres away.

The rocket will be offered with two launch options. Option 1 is a Sun-synchronous orbit launch, between 600 and 800 km in altitude, for smaller satellites with a payload of up to 3,350 kg for US$45 million. Option 2 is a low Earth orbit launch, below 600 km in altitude, allowing a payload of up to 5,000 kg, also for US$45 million. Yuzhnoye and Yuzhmash, providers of the launch vehicle, have been in operation for 62 years and have launched 875 rockets.
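For context, those quoted prices work out to roughly US$9,000–13,500 per kilogram delivered to orbit – a quick calculation from the figures above:

```python
# Implied price per kilogram for the two quoted Cyclone 4M launch options
OPTIONS = {
    "Sun-synchronous orbit (600-800 km)": {"payload_kg": 3350, "price_usd": 45e6},
    "Low Earth orbit (below 600 km)":     {"payload_kg": 5000, "price_usd": 45e6},
}

for name, o in OPTIONS.items():
    per_kg = o["price_usd"] / o["payload_kg"]
    print(f"{name}: ~${per_kg:,.0f} per kg")
```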

"While we have a number of challenges ahead to work through the regulatory processes, approvals and site planning, we are optimistic that we can break ground on the launch complex within a year and meet market demands with our first launch in 2020," said John Isella, CEO of MLS. "The timing is perfect for this venture. Ukraine's independent space industry, and the solid market for these launch services all add to our confidence in this program. The Cyclone 4M rocket will become the standard of the medium-class space launch industry."

"We are pleased that Maritime Launch Services has chosen to invest in our community and we look forward to continued dialogue," said Vernon Pitts, Warden of the Municipality of Guysborough. "Since we were first introduced to this development a few months ago, we have been impressed with the proponents' approach and we will continue to work collaboratively with MLS as the project evolves. It's going to be a great tourist draw."

 

   

 

---

 

 

 

18th March 2017

Artificial "power island" to be built in the North Sea

A trio of European energy firms is collaborating to build a gigantic wind power hub in the middle of the North Sea. Providing clean energy transmission between six neighbouring countries, this will be a major step towards meeting Europe's 2050 climate goals.

 

 

 

Denmark's Energinet and the German and Dutch arm of TenneT will sign an agreement on 23rd March to explore ways to build a giant artificial island in the middle of the North Sea. This would create a new "hub" for the generation and transmission of renewable energy across northern Europe that could provide up to 100,000 megawatts (MW) to Belgium, Denmark, Germany, the Netherlands, Norway and the UK.

Known as the North Sea Wind Power Hub, this project would be located on Dogger Bank, a large sandbank in a shallow area of sea about 100 km (62 mi) off the east coast of England. During the last ice age, around 10,000 BC, this bank was part of Doggerland, a large landmass connecting Europe and the British Isles. Today, the water remains relatively shallow, which – combined with optimal wind conditions and a central location – makes it an ideal site for land reclamation, according to TenneT.

The artificial island could be surrounded with up to 7,000 wind turbines, providing green energy for 80 million Europeans – not only generating and transmitting energy from the North Sea, but simultaneously forming a power link between six countries, enabling them to trade electricity. With an area of 6 sq km, the island would have its own landing strip and harbour. Staff, components and assembly workshops would be stationed there. The exact schedule for construction is currently unknown, and will depend on feasibility studies, but Energinet and TenneT believe the artificial island could be built on Dogger Bank sometime between 2030 and 2050.
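A quick sanity check of those headline numbers is shown below; the capacity factor is an assumption for illustration, not a figure from Energinet or TenneT:

```python
# Headline figures quoted above
hub_capacity_mw = 100_000        # up to 100,000 MW
turbines = 7_000                 # up to 7,000 turbines
people_served = 80_000_000       # 80 million Europeans

per_turbine_mw = hub_capacity_mw / turbines
per_capita_kw = hub_capacity_mw * 1_000 / people_served

print(f"Implied turbine size : ~{per_turbine_mw:.1f} MW each")
print(f"Installed capacity   : ~{per_capita_kw:.2f} kW per person served")

# Assumed offshore capacity factor (not from the announcement)
capacity_factor = 0.45
annual_twh = hub_capacity_mw * capacity_factor * 8760 / 1e6
print(f"Annual output        : ~{annual_twh:.0f} TWh/year at a {capacity_factor:.0%} capacity factor")
```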

 

Credit: TenneT

 

Mel Kroon, the CEO of TenneT, commented on the multi-billion euro plan: "This project can significantly contribute to a completely renewable supply of electricity in Northwest Europe. TenneT and Energinet both have extensive experience in the fields of onshore grids, the connection of offshore wind energy and cross-border connections. Transmission System Operators (TSOs) are best placed to play a leading role in the long-term development of the offshore infrastructure. I am happy that we are going to take this step with our Danish colleagues and I look forward to the participation of other TSOs and possibly other partners."

Peder Østermark Andreasen, the CEO of Energinet, said: "Offshore wind has in recent years proved to be increasingly competitive and it is important to us to constantly focus on further reduction in prices of grid connections and interconnections. We need innovative and large-scale projects so that offshore wind can play an even bigger part in our future energy supply."

The island "could make the wind power of the future a lot cheaper and more effective," said Torben Nielsen, technical director of Energinet. In an interview with the Copenhagen Post, he added: "We haven't let our fantasy gain the upper hand, although it may sound a little crazy and like something out of science fiction. We who have the responsibility of transporting the electricity generated by offshore wind turbines back to land and the consumers must constantly push and make sure that the price continues to fall. That requires innovative, big-scale solutions, and an energy hub in the North Sea is worth thoroughly looking into."

The European targets for reducing CO2 emissions cannot be accomplished by individual member states on their own. Cooperation and synergy will be required on a broader scale – and projects like the North Sea Wind Power Hub could achieve that. Wind power and solar energy complement each other: there is more Sun from spring to autumn, and stronger winds during the colder and darker months of the year. By taking advantage of these variations, Europe and indeed many other parts of the world could fully optimise their energy grids at all times of the year. Longer term, continent-wide supergrids may emerge, allowing the generation and transmission of clean energy on massive scales and very long distances.

---

 

 

 

16th March 2017

New York 2140

The latest novel by Kim Stanley Robinson depicts a scarily plausible future, in which New York has been inundated by rising sea levels.

 


 

Kim Stanley Robinson is an award-winning science fiction author, best known for his Mars trilogy (Red Mars, Green Mars and Blue Mars), which follows the terraforming efforts on Mars over a 200-year period. He also wrote 2312, depicting a number of futuristic concepts including asteroid terrariums and rewilding of extinct species on Earth. His many other books explore a wide range of scientific, environmental, cultural and political themes. He is noted for his use of "hard" science fiction to convey a sense of realism and plausibility.

Now, Robinson is back with his latest novel: New York 2140. With yet another futuristic storyline, it tells the tale of a 22nd century Manhattan that is struggling to survive amid rising sea levels. A synopsis of the book reads as follows:

"The waters rose, submerging New York City.
But the residents adapted and it remained the bustling, vibrant metropolis it had always been. Though changed forever.
Every street became a canal. Every skyscraper an island.
Through the eyes of the varied inhabitants of one building, Kim Stanley Robinson shows us how one of our great cities will change with the rising tides.
And how we too will change."

For those who might be wondering whether New York 2140 is set in the same universe as the Mars trilogy or 2312, Robinson had this to say in an interview with sfsite.com:

"I don't like linking up my various projects into one larger future history. I've never done it, and so of course now it's too late, and I don't regret it. I don't see that the advantages of some larger macro-history are very large, compared to the flexibility that I've gained by making each novel have its own future history. Even within my Mars stories there are a couple alternative historical lines to the main one described in the trilogy. I think it's best to keep on updating one's views on what is "most likely to happen," and write accordingly. And doing it this way means each time I have a chance to invent a whole new history, and even if they are somewhat similar, there's still a lot of pleasure to be had there in the details."

New York 2140 is published this week in hardcover by Orbit. It is also available in e-book and audio formats. A paperback version is scheduled for 2018.

---

 

 

 

14th March 2017

Renewables are now Australia's cheapest energy option

A major new report claims that renewable energy is now Australia's cheapest energy option, even when the cost of storage to make the intermittent power sources reliable is added.

 


 

A major new study into the cost of emissions reductions and energy storage in the Australian power sector indicates that the rising price of gas, coupled with the falling cost of energy storage, has now made renewable energy the cheapest source of "reliable" generation in Australia – surpassing gas as the 'least cost' source of energy supply – even if the Sun is not shining and the wind not blowing.

The study also shows that "clean coal" technologies such as Carbon Capture and Storage (CCS) will not be commercially mature before 2030 – and will therefore not help Australia to meet its 2030 emissions reduction target under the Paris Agreement.

The study, by energy and carbon advisory firm RepuTex, is expected to re-shape thinking on the role of renewable energy in providing affordable, clean and reliable energy – the so-called "energy trilemma" – as Australia seeks to meet its 2030 emissions reduction target.

RepuTex's analysis, supported by extensive consultation with over 45 electricity generators, industrial and commercial consumers and investors, identifies emissions reduction activities in the power sector – such as retrofitting existing coal-fired plants and developing new wind, solar, gas and "clean coal" generation – and maps the size and cost of abatement through to 2030.

Their analysis also calculates the "full cost" for renewables to supply "reliable" power – including the cost of batteries, pumped hydro, or thermal storage – to determine which technologies can supply electricity at least cost, while improving security.

According to RepuTex, advancements in the cost of energy storage technology, coupled with significant rises in the domestic gas price, have now made wind and solar – with storage – competitive with gas in providing system reliability in the form of instantaneous peaking or load-following generation.

This means new renewable facilities, with storage, are the least cost source of firm power, and able to provide energy supply even if the Sun is not shining, or the wind not blowing.
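RepuTex has not published its model, but the "full cost" idea can be illustrated with a simple levelised-cost comparison. The sketch below uses entirely hypothetical cost inputs – not RepuTex's data – purely to show how storage costs can be folded into a firmed renewable price and set against gas:

```python
# Illustrative "full cost" comparison. All inputs are hypothetical placeholders.

def lcoe(capex_per_kw, fixed_om_per_kw_yr, capacity_factor, lifetime_yr,
         discount_rate, fuel_per_mwh=0.0):
    """Levelised cost of energy in $/MWh for 1 kW of installed capacity."""
    annual_mwh = capacity_factor * 8760 / 1000            # MWh per kW per year
    crf = (discount_rate * (1 + discount_rate) ** lifetime_yr /
           ((1 + discount_rate) ** lifetime_yr - 1))       # capital recovery factor
    annual_cost = capex_per_kw * crf + fixed_om_per_kw_yr
    return annual_cost / annual_mwh + fuel_per_mwh

# Hypothetical inputs (a high domestic gas price is assumed)
gas_peaker = lcoe(capex_per_kw=900,  fixed_om_per_kw_yr=15, capacity_factor=0.30,
                  lifetime_yr=25, discount_rate=0.07, fuel_per_mwh=90)
solar      = lcoe(capex_per_kw=1300, fixed_om_per_kw_yr=20, capacity_factor=0.25,
                  lifetime_yr=25, discount_rate=0.07)
storage    = lcoe(capex_per_kw=1400, fixed_om_per_kw_yr=10, capacity_factor=0.15,
                  lifetime_yr=15, discount_rate=0.07)      # cost adder for shifted energy

# Suppose 30% of delivered energy must pass through storage to count as "firm"
firmed_renewable = 0.7 * solar + 0.3 * (solar + storage)

print(f"Gas peaker        : ~${gas_peaker:.0f}/MWh")
print(f"Firmed renewables : ~${firmed_renewable:.0f}/MWh")
```

With these placeholder numbers the firmed renewable supply comes out cheaper than the gas peaker, which is the direction of the result RepuTex reports; the real analysis of course rests on its own detailed cost data.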

 


 

"Traditionally, gas-fired generators have been the least cost technology that could provide energy security, such as load-following and peaking services," explained Bret Harper, head of research at RepuTex. "However, the rising price of gas has increased the levelised cost of any new gas build in Australia. At the same time, the decline in capital costs for new wind and solar projects, and improvements in storage performance, have seen renewable project costs fall. When we consider the 'full cost' of renewables to supply dispatchable power – including storage costs to ensure supply even when the wind is not blowing or the Sun not shining – we find that renewables have overtaken gas as the least cost source of new firm supply," he said.

The analysis is significant for the federal debate on energy security, with findings indicating that load-following wind and solar may now be able to strengthen the grid – overcoming intermittency concerns – while bolstering the case of state governments in South Australia, Queensland and Victoria as they seek to cash in on new renewable investment.

"As older coal and gas-fired generation leave the market, new dispatchable renewables will be able to provide energy during daily peaks, adjust as demand changes throughout the day, or provide reserve peaking generation capacity to alleviate critical situations such as those in South Australia and New South Wales," said Harper. "Moreover, they can now provide that service at 'least cost', surpassing gas. Our view is that this will create a decreasing need for baseload-only facilities, with potential for states to rely on new storage technologies to provide affordable, clean, and secure energy, while improving system reliability."

 

Yallourn coal-fired power plant in Victoria, Australia

 


Findings indicate that four groups of measures have potential to deliver the vast majority of the power sector's emissions reductions by 2030, including distributed generation, the closure of emissions intensive generators, improving the greenhouse gas intensity of existing fossil fuel plants, and investing in renewables and energy storage.

However, while analysis shows there are many opportunities for emissions reductions in the sector, "clean coal" technology is not among the cheapest.

"While clean coal is promoted as a critical emissions reduction technology, findings indicate that cost will be a major barrier to the implementation of a commercial scale project in Australia," said Mr Harper. "We see costs for CCS coming down as low as $100/MWh around 2030, at which point a large-scale project may be feasible if there is any appetite for a baseload-only generator. On that timeline, we assume CCS will play little role to meet Australia's 2030 emissions reduction target."

"Moreover, with a premium placed on flexible generation that can ramp up or down, we see baseload-only generation as being too inflexible to compete in Australia's future electricity system. That is not good news for coal generation, irrespective of how clean it is," said Mr Harper.

The study is expected to provide a new reference point for the cost of emissions reductions, and energy storage technologies, as policymakers seek to solve the "energy trilemma" of providing affordable and reliable energy supply, while meeting Australia's 2030 emissions target. RepuTex notes that a clear market signal is needed to guide investment toward a long-term target well beyond 2030, to better match the investment timeframes of the sector.

"To determine a cost-effective pathway, Australia needs to define its long-term target to better match investment decisions that have lifetimes of 20 to 40 years" said Mr Harper. "Are we aiming for a 26 per cent target by 2030, or 100 per cent clean energy by 2050? Each of those targets have different least-cost pathways, but the same investment timeframe. Identifying a long-term target, with a clear signal on the rate and pace of change, will therefore help to guide the correct investment in the sector."

---

 

 

 

13th March 2017

Soil's contribution to climate change much higher than previously thought

Deeper soil layers are more sensitive to warming than previously thought, scientists have found. By 2100, this could release carbon to the atmosphere at a rate that is 30% of today's human-caused annual emissions.

 


 

Soils could release much more CO2 than expected into the atmosphere as the climate warms, according to new research by scientists from the Department of Energy's Lawrence Berkeley National Laboratory.

Their findings are based on a field experiment that, for the first time, explored what happens to organic carbon trapped in soil when all soil layers are warmed, which in this case extend to a depth of 100 centimetres. The scientists discovered that warming both the surface and deeper soil layers at three experimental plots increased the plots' annual release of CO2 by 34 to 37 percent over non-warmed soil. Much of the CO2 originated from deeper layers, indicating that deeper stores of carbon are more sensitive to warming than previously thought.

The results shed light on what is potentially a big source of uncertainty in climate projections. Soils hold three times as much organic carbon as there is carbon in Earth's atmosphere. In addition, warming is expected to increase the rate at which microbes break down soil organic carbon, releasing more CO2 into the atmosphere and contributing to climate change.

But, until now, the majority of field-based soil warming experiments have focused only on the top five to 20 centimetres of soil – which leaves a lot of carbon unaccounted for. Experts estimate that soils below 20 centimetres in depth contain more than 50 percent of the planet's stock of soil organic carbon. The big questions have been: to what extent do the deeper soil layers respond to warming? And what does this mean for the release of CO2 into the atmosphere?

"We found the response is quite significant," says Caitlin Hicks Pries, a postdoctoral researcher in Berkeley Lab's Climate and Ecosystem Sciences Division. She conducted the research with co-corresponding author Margaret Torn, and Christina Castahna and Rachel Porras, who are also Berkeley Lab scientists.

"If our findings are applied to soils around the globe that are similar to what we studied, meaning soils that are not frozen or saturated, our calculations suggest that by 2100 the warming of deeper soil layers could cause a release of carbon to the atmosphere at a rate that is significantly higher than today, perhaps even as high as 30 percent of today's human-caused annual carbon emissions depending on the assumptions on which the estimate is based," adds Hicks Pries.

 

An innovative deep soil warming experiment in full swing. Scientist Caitlin Hicks Pries downloads soil temperature data, while fellow Berkeley Lab scientists Cristina Castanha (left) and Neslihan Tas (middle) work on an experimental plot in the background. (Credit: Berkeley Lab)

 

The need to better understand the response of all soil depths to warming is underscored by projections that, over the next century, deeper soils will warm at roughly the same rate as surface soils and the air. In addition, Intergovernmental Panel on Climate Change simulations of global average soil temperature, using a "business-as-usual" scenario in which carbon emissions rise in the decades ahead, predict that soil will warm 4° Celsius by 2100.

To study the potential impacts of this scenario, the Berkeley Lab scientists pioneered an innovative experimental setup at the University of California's Blodgett Forest Research Station, which is located in the foothills of California's Sierra Nevada mountains. The soil at the research station is representative of temperate forest soils, which in turn account for about 13.5 percent of soil area worldwide.

The scientists built their experiment around six soil plots that measure three metres in diameter. The perimeter of each plot was ringed with 22 heating cables that were vertically sunk more than two metres underground. They warmed three of the plots 4° Celsius for more than two years, leaving the other three plots unheated to serve as controls.

They monitored soil respiration three different ways over the course of the experiment. Each plot had an automated chamber that measured the flux of carbon at the surface every half hour. In addition, one day each month, Hicks Pries and the team measured surface carbon fluxes at seven different locations at each plot.

A third method probed the all-important underground realm. A set of stainless steel "straws" was installed below the surface at each plot. The scientists used the straws to measure CO2 concentrations once a month at five depths between 15 and 90 centimetres. By knowing these CO2 concentrations and other soil properties, they could model the extent to which each depth contributed to the amount of CO2 released at the surface.
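The study's actual model and parameters are not given in the announcement, but the general flux-gradient approach it describes can be sketched as follows, using Fick's law with made-up placeholder values for the concentration profile and effective diffusivity:

```python
import numpy as np

# Placeholder probe depths, CO2 concentrations and diffusivity (illustrative only)
depths_m = np.array([0.15, 0.30, 0.50, 0.70, 0.90])     # the five probe depths
co2_mol_m3 = np.array([0.40, 0.70, 1.00, 1.25, 1.45])   # assumed concentration profile
D_eff = 2.0e-6                                           # assumed effective diffusivity, m^2/s

# Fick's law: diffusive flux = D * dC/dz. With concentration increasing with
# depth, the flux between adjacent probe depths is directed toward the surface.
flux_up = D_eff * np.diff(co2_mol_m3) / np.diff(depths_m)

# A layer's net contribution is the flux leaving its top minus the flux entering
# its bottom (assume negligible flux from below the deepest probe).
flux_bounds = np.append(flux_up, 0.0)
layer_production = flux_bounds[:-1] - flux_bounds[1:]

for top, bottom, f, p in zip(depths_m[:-1], depths_m[1:], flux_up, layer_production):
    print(f"{top*100:.0f}-{bottom*100:.0f} cm: upward flux ~ {f:.1e} mol m^-2 s^-1, "
          f"layer contribution ~ {p:.1e} mol m^-2 s^-1")
```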

 

One of the experimental heating plots. Credit: Berkeley Lab.

 

They discovered that, of the 34 to 37 percent increase in CO2 released at the three warmed plots, 40 percent of this increase was due to CO2 that came from below 15 centimetres. They also found the sensitivity of soil to warming was similar across the five depths.

The scientists say these findings suggest the degree to which soil organic carbon influences climate change may be currently underestimated.

"There's an assumption that carbon in the subsoil is more stable and not as responsive to warming as in the topsoil, but we've learned that's not the case," says Torn. "Deeper soil layers contain a lot of carbon, and our work indicates it's a key missing component in our understanding of the potential feedback of soils to the planet's climate."

Their work is published in the journal Science.

---

 

 

 

10th March 2017

IBM unveils roadmap for quantum computers

IBM has announced "IBM Q", an initiative to build commercially available universal quantum computing systems.

 

Credit: IBM Research

 

IBM has announced an industry-first initiative to build commercially available universal quantum computing systems. “IBM Q” systems and services will be delivered via the IBM Cloud platform. Current technologies that run on classical computers, such as Watson, can help to identify patterns and insights buried in vast amounts of existing data. By contrast, quantum computers will deliver solutions to important problems where patterns cannot be seen because the data doesn’t exist and the calculations needed to answer questions are too enormous to ever be processed by classical computers.

IBM is also launching a new Application Program Interface (API) for the “IBM Quantum Experience” enabling anyone with an Internet connection to use the quantum processor (via the Cloud) for running algorithms and experiments, working with individual quantum bits, and exploring tutorials and simulations of what might be possible with quantum computing. In the first half of 2017, IBM plans to release a full Software Development Kit (SDK) for users to build simple quantum applications and software programs.

“IBM has invested over decades to growing the field of quantum computing and we are committed to expanding access to quantum systems and their powerful capabilities for the science and business communities,” said Arvind Krishna, senior vice president of Hybrid Cloud and director for IBM Research. “Following Watson and blockchain, we believe that quantum computing will provide the next powerful set of services delivered via the IBM Cloud platform, and promises to be the next major technology that has the potential to drive a new era of innovation across industries.”

 

Credit: IBM Research

 

IBM intends to build IBM Q systems to expand the application domain of quantum computing. A key metric will be the power of a quantum computer expressed by the “Quantum Volume” – which includes the number of qubits, quality of operations, connectivity and parallelism. As a first step to increase Quantum Volume, IBM aims to build commercial IBM Q systems with around 50 qubits in the next few years to demonstrate capabilities beyond today’s classical systems, and plans to collaborate with key industry partners to develop applications that exploit the quantum speedup of the systems.
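IBM does not give a formula in this announcement, but the intuition behind "Quantum Volume" can be illustrated with a toy model in which the useful circuit size is limited both by the number of qubits and by how deep a circuit can run before errors dominate. The simplified form below follows one early formulation; the real metric also weighs connectivity and parallelism, which are ignored here:

```python
# Toy illustration only: quantum volume ~ min(N, achievable depth)^2,
# where achievable depth is taken as roughly 1 / (N * effective error rate).

def quantum_volume(num_qubits: int, effective_error_rate: float) -> float:
    achievable_depth = 1.0 / (num_qubits * effective_error_rate)
    return min(num_qubits, achievable_depth) ** 2

for n, eps in [(5, 5e-2), (16, 1e-2), (50, 1e-3), (50, 1e-4)]:
    print(f"{n:>3} qubits, error rate {eps:g}: quantum volume ~ {quantum_volume(n, eps):.0f}")
```

The point of the toy model is that adding qubits without also reducing the error rate does little to increase the useful "volume" of computation.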

IBM Q systems will be designed to tackle problems that are currently too complex and exponential in nature for classical computing systems to handle. One of the first and most promising applications will be in the area of chemistry. Even for simple molecules like caffeine, the number of quantum states in the molecule can be astoundingly large; so complex that all the conventional computing memory and processing power scientists could ever build could not handle the problem.

IBM’s scientists have recently developed new techniques to efficiently explore the simulation of chemistry problems on quantum processors and experimental demonstrations of various molecules are in progress. In the future, the goal will be to scale to even more complex molecules and try to predict chemical properties with higher precision than possible with classical computers.

Future applications of quantum computing may include:

• Artificial Intelligence: Making facets of artificial intelligence such as machine learning much more powerful when data sets are too large, such as searching images or video
• Cloud Security: Making cloud computing more secure by using the laws of quantum physics to enhance private data safety
• Drug & Materials Discovery: Untangling the complexity of molecular and chemical interactions, leading to the discovery of new medicines and materials
• Financial Services: Finding new ways to model financial data and isolating key global risk factors to make better investments
• Supply Chain & Logistics: Finding the optimal path across global systems of systems for ultra-efficient logistics and supply chains, such as optimising fleet operations for deliveries during the holiday season

 


 

“Classical computers are extraordinarily powerful and will continue to advance and underpin everything we do in business and society,” said Tom Rosamilia, senior vice president of IBM Systems. “But there are many problems that will never be penetrated by a classical computer. To create knowledge from much greater depths of complexity, we need a quantum computer. We envision IBM Q systems working in concert with our portfolio of classical high-performance systems to address problems that are currently unsolvable, but hold tremendous untapped value.”

IBM’s roadmap for scaling to practical quantum computers is based on a holistic approach to advancing all parts of the system. The company will leverage its deep expertise in superconducting qubits, complex high performance system integration, and scalable nanofabrication processes from the semiconductor industry to help advance the quantum mechanical capabilities. The developed software tools and environment will also leverage IBM’s world-class mathematicians, computer scientists, and software and system engineers.

"As Richard Feynman said in 1981, ‘…if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.’ This breakthrough technology has the potential to achieve transformational advancements in basic science, materials development, environmental and energy research, which are central to the missions of the Department of Energy (DOE),” said Steve Binkley, deputy director of science, US Department of Energy. “The DOE National Labs have always been at the forefront of new innovation, and we look forward to working with IBM to explore applications of their new quantum systems."

 

 

 

---

 

 

 

5th March 2017

AI beats top human players at poker

The University of Alberta has announced details of DeepStack, a new artificial intelligence program able to beat professional human players at poker for the first time.

 


 

In 1952, Professor Sandy Douglas created a tic-tac-toe game on the EDSAC, a room-sized computer at the University of Cambridge. One of the first ever computer games, it was developed as part of a thesis on human-computer interaction. Forty-five years later, in 1997, another milestone occurred when IBM's Deep Blue machine defeated Garry Kasparov, the world chess champion. This was followed by Watson, again created by IBM, which appeared on the Jeopardy! game show and beat the top human players in 2011. Yet another breakthrough was Google's DeepMind AlphaGo, which in 2016 defeated the Go world champion Lee Se-dol at a tournament in South Korea.

Now, for the first time ever, an artificial intelligence program has beaten human professional players at heads-up, no-limit Texas hold 'em, a variation of the card game of poker. This historic result in AI has implications far beyond the poker table – from helping to make more decisive medical treatment recommendations to developing better strategic defence planning.

DeepStack has been created by the University of Alberta's Computer Poker Research Group. It bridges the gap between games of "perfect" information – like in checkers, chess, and Go, where both players can see everything on the board – and "imperfect" information games, by reasoning while it plays, using "intuition" honed through deep learning to reassess its strategy with each decision.

"Poker has been a long-standing challenge problem in artificial intelligence," said computer scientist Michael Bowling, principal investigator on the study. "It's the quintessential game of imperfect information, in the sense that players don't have the same information or share the same perspective while they're playing."

Artificial intelligence researchers have long used parlour games to test their theories because the games are mathematical models that describe how decision-makers interact.

"We need new AI techniques that can handle cases where decision-makers have different perspectives," said Bowling. "Think of any real-world problem. We all have a slightly different perspective of what's going on, much like each player only knowing their own cards in a game of poker."

 


 

This latest discovery builds on previous research findings about artificial intelligence and imperfect information games stretching back to the creation of the Computer Poker Research Group in 1996. DeepStack extends the ability to think about each situation during play to imperfect information games using a technique called continual re-solving. This allows the AI to determine the correct strategy for a particular poker situation by using its "intuition" to evaluate how the game might play out in the near future, without thinking about the entire game.

"We train our system to learn the value of situations," said Bowling. "Each situation itself is a mini poker game. Instead of solving one big poker game, it solves millions of these little poker games, each one helping the system to refine its intuition of how the game of poker works. And this intuition is the fuel behind how DeepStack plays the full game."

Thinking about each situation as it arises is important for complex problems like heads-up no-limit hold'em, which has more unique situations than there are atoms in the universe, largely due to players' ability to wager different amounts including the dramatic "all-in." Despite the game's complexity, DeepStack takes action at human speed – with an average of only three seconds of "thinking" time – and runs on a simple gaming laptop.

To test the approach, DeepStack played against a pool of professional human players recruited by the International Federation of Poker. A total of 33 players from 17 countries were asked to play in a 3,000-hand match, over a period of four weeks. DeepStack beat each of the 11 players who finished their match, with only one outside the margin of statistical significance.

A paper on this study, DeepStack: Expert-Level Artificial Intelligence in Heads-Up No-Limit Poker, is published in the journal Science.

---

 

 

 

1st March 2017

Two-way communication in brain-machine interface achieved for the first time

Optical techniques for imaging and stimulating brain activity could lead to a new generation of more precise, bidirectional neural prostheses.

 

Credit: © Daniel Huber, UNIGE

 

Since the early 1970s, scientists have been developing brain-machine interfaces, the main application being the use of neural prostheses in paralysed patients or amputees. A prosthetic limb directly controlled by brain activity can partially restore lost motor function. This is achieved by decoding neuronal activity recorded with electrodes and translating it into robotic movements. However, such systems have limited precision, due to the absence of sensory feedback from the artificial limb.

Neuroscientists at the University of Geneva (UNIGE), Switzerland, looked at whether it was possible to transmit this missing sensation back to the brain by stimulating neural activity in the cortex. They discovered that not only was it possible to create an artificial sensation of neuroprosthetic movements, but that the underlying learning process occurs very rapidly. These findings, published in the scientific journal Neuron, were obtained by using modern imaging and optical stimulation tools, an alternative to the classical electrode approach.

Motor function is at the heart of all behaviour and allows us to interact with the world. Therefore, replacing a lost limb with a robotic prosthesis is the subject of much research, yet successful outcomes are rare. Why is that? Until now, brain-machine interfaces have been operated by relying largely on visual perception: the robotic arm is controlled by looking at it. The direct flow of information between the brain and machine thus remains unidirectional. However, movement perception is not only based on vision, but mostly on proprioception – the sensation of where the limb is located in space.

“We have therefore asked whether it was possible to establish a bidirectional communication in a brain-machine interface: to simultaneously read out neural activity, translate it into prosthetic movement and reinject sensory feedback of this movement back in the brain,” explains Daniel Huber, professor in the Department of Basic Neurosciences at UNIGE.

In contrast to traditional invasive approaches using electrodes, Huber’s team specialises in optical techniques for imaging and stimulating brain activity. Using a method called two-photon microscopy, they routinely measure the activity of hundreds of neurons with single cell resolution: “We wanted to test whether mice could learn to control a neural prosthesis by relying uniquely on an artificial sensory feedback signal”, explains Mario Prsa, researcher at UNIGE and the first author of the study. “We imaged neural activity in the motor cortex. When the mouse activated a specific neuron, the one chosen for neuroprosthetic control, we simultaneously applied stimulation proportional to this activity to the sensory cortex using blue light.”

Neurons of the sensory cortex were rendered photosensitive to this light, allowing them to be activated by a series of optical flashes and thus integrate the artificial sensory feedback signal. The mouse was rewarded upon every above-threshold activation, and just 20 minutes later, once the association was learned, the rodent was able to more frequently generate the correct neuronal activity.
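Conceptually, the experiment is a closed loop: read out the chosen neuron, stimulate the sensory cortex in proportion to its activity, and reward the animal when that activity crosses a threshold. The following sketch uses placeholder functions and arbitrary numbers – it is not the UNIGE group's actual setup – simply to illustrate that loop:

```python
import random

REWARD_THRESHOLD = 0.8     # arbitrary activity threshold for a reward
STIM_GAIN = 10.0           # assumed light pulses per unit of decoded activity

def read_target_neuron_activity() -> float:
    """Stand-in for two-photon imaging of the neuron chosen for control."""
    return random.random()

def stimulate_sensory_cortex(n_pulses: int) -> None:
    """Stand-in for delivering optical flashes to the photosensitised neurons."""
    print(f"  stimulation: {n_pulses} light pulses")

def deliver_reward() -> None:
    print("  reward delivered")

for trial in range(5):
    activity = read_target_neuron_activity()
    print(f"trial {trial}: decoded activity = {activity:.2f}")
    stimulate_sensory_cortex(int(round(activity * STIM_GAIN)))  # feedback proportional to activity
    if activity > REWARD_THRESHOLD:
        deliver_reward()                                        # reinforce above-threshold activation
```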

This means that the artificial sensation was not only perceived, but that it was successfully integrated as feedback on the prosthetic movement. In this manner, the brain-machine interface functions bidirectionally. The Geneva researchers think that the reason this fabricated sensation is so rapidly assimilated is that it most likely taps into very basic brain functions. Feeling the position of our limbs occurs automatically, without much thought, and probably reflects fundamental neural circuit mechanisms. In the future, this type of bidirectional interface could allow robotic arms to be moved more precisely, touched objects to be felt, and the force needed to grasp them to be perceived.

At present, the neuroscientists at UNIGE are examining how to produce a more efficient sensory feedback. They are currently capable of doing it for a single movement – but is it also possible to provide multiple feedback channels in parallel? This research sets the groundwork for developing a new generation of more precise, bidirectional neural prostheses.

---
