
28th October 2014

Reducing human population to a sustainable level could take centuries

A new multi-scenario modelling of world human population concludes that even draconian fertility restrictions or a catastrophic mass mortality won't be enough to solve issues of global sustainability by 2100.

 


 

Published today in the Proceedings of the National Academy of Sciences of the USA, ecologists Professor Corey Bradshaw and Professor Barry Brook from the University of Adelaide's Environment Institute say that our "virtually locked-in" population growth means the world must focus on policies and technologies that reverse rising consumption of natural resources and enhance recycling, for more immediate sustainability gains.

Fertility reduction efforts, however, through increased family-planning assistance and education, should still be pursued, as this will lead to hundreds of millions fewer people to feed by mid-century.

"Global population has risen so fast over the past century that roughly 14% of all the human beings that have ever existed are still alive today. That's a sobering statistic," says Professor Bradshaw, Director of Ecological Modelling. "This is considered unsustainable for a range of reasons – not least being able to feed everyone, as well as the impact on the climate and environment.

"We examined various scenarios for global human population change to the year 2100 by adjusting fertility and mortality rates to determine the plausible range of population sizes at the end of this century. Even a worldwide one-child policy like China's, implemented over the coming century, or catastrophic mortality events like global conflict or a disease pandemic, would still likely result in 5-10 billion people by 2100."

The team constructed nine different scenarios for continuing population, ranging from "business as usual" through various fertility reductions, to highly unlikely broad-scale catastrophes resulting in billions of deaths.
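
The scenarios themselves come from detailed demographic models, but the basic effect at work – population momentum – can be illustrated with a toy age-structured projection. Everything in the sketch below (the starting age pyramid, survival and fertility rates) is an illustrative assumption rather than a figure from the PNAS paper; the point is simply that a bottom-heavy age structure keeps births high for decades even after fertility is cut.

```python
# Toy age-structured projection (illustrative only - not the authors' model).
# All vital rates below are assumed for demonstration purposes.
import numpy as np

AGE_BINS, STEP = 20, 5                    # 5-year age bins: 0-4, 5-9, ..., 95-99
steps = (2100 - 2014) // STEP

# Bottom-heavy starting pyramid, rescaled to ~7.2 billion people in 2014.
pop = np.linspace(0.60, 0.10, AGE_BINS)
pop *= 7.2 / pop.sum()

survival = np.full(AGE_BINS - 1, 0.97)    # assumed fraction surviving each 5-year step
fertility = np.zeros(AGE_BINS)
fertility[3:9] = 0.22                     # assumed births per person per step, ages ~15-44

def project(pop, fertility, steps):
    """Project the total population forward under fixed vital rates."""
    p = pop.copy()
    for _ in range(steps):
        births = (p * fertility).sum()
        p[1:] = p[:-1] * survival         # everyone ages by one bin; the oldest bin dies
        p[0] = births
    return p.sum()

print(f"Business as usual by 2100: {project(pop, fertility, steps):.1f} bn")
print(f"Fertility halved by 2100:  {project(pop, fertility * 0.5, steps):.1f} bn")
```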

"We were surprised that a five-year WWIII scenario – mimicking the same proportion of people killed in the First and Second World Wars combined – barely registered a blip on the human population trajectory this century," says Professor Barry Brook.

"Often when I give public lectures about policies to address global change, someone will claim that we are ignoring the 'elephant in the room' of human population size. Yet, as our models show clearly, while there needs to be more policy discussion on this issue, the current inexorable momentum of the global human population precludes any demographic 'quick fixes' to our sustainability problems.

"Our work reveals that effective family planning and reproduction education worldwide have great potential to constrain the size of the human population and alleviate pressure on resource availability over the longer term. Our great-great-great-great grandchildren might ultimately benefit from such planning, but people alive today will not.

"The corollary of these findings is that society's efforts towards sustainability would be directed more productively towards reducing our impact as much as possible through technological and social innovation."

 

 

 

 

28th October 2014

Blood vessels grown from stem cells in just seven days

Using stem cells from only 25 millilitres of blood, researchers have grown new blood vessels in just seven days – compared to a month for the same process using bone marrow.

 


 

Technology for making new tissues from stem cells has taken a huge leap forward. Two tablespoons of blood are all that is needed to grow a brand new blood vessel in just seven days. This breakthrough is reported from Sahlgrenska Academy and Sahlgrenska University Hospital in Gothenburg, Sweden and published in the journal EBioMedicine.

Three patients, all young children, were missing a vein that goes from the gastrointestinal tract to the liver. The procedure was planned and carried out by Suchitra Sumitran-Holgersson (Professor of Transplantation Biology at Sahlgrenska Academy), and Michael Olausson (Surgeon/Medical Director of the Transplant Centre and Professor at Sahlgrenska Academy).

"We used the stem cells of the patients to grow a new blood vessel that would permit the two organs to collaborate properly," says Michael Olausson.

In developing their new technique, however, they found a way to extract stem cells without taking them from the bone marrow.

"Drilling in the bone marrow is very painful," explains Professor Sumitran-Holgersson. "It occurred to me that there must be a way to obtain the cells from the blood instead."

 

[Image: Michael Olausson and Suchitra Sumitran-Holgersson. Credit: University of Gothenburg]

 

The fact that the patients were so young fuelled her passion to look for a new approach. The method involved taking 25 millilitres (about 2 tablespoons) of blood, the minimum quantity needed to obtain enough stem cells. Sumitran-Holgersson's idea turned out to surpass her wildest expectations – the extraction procedure worked perfectly the very first time.

"Not only that, but the blood itself accelerated growth of the new vein," she says. "The entire process took only a week, as opposed to a month in the [case of bone marrow]. The blood contains substances that naturally promote growth."

Perhaps in the future, these substances might be exploited more fully, to reduce growth times even further.

So far, the team has treated three patients. Two of the three are still doing well and have veins that are functioning as they should. In the third case, the child is under medical surveillance and the outcome is more uncertain. The team is confident they can make further progress.

"We believe that this technological progress can lead to dissemination of the method for the benefit of additional groups of patients, such as those with varicose veins or myocardial infarction, who need new blood vessels," says Professor Holgersson. "Our dream is to be able to grow complete organs as a way of overcoming the current shortage from donors."

 

 

 

 

27th October 2014

Toxic stem cells to fight brain tumours

Scientists at Harvard have announced a new method of using toxic stem cells to fight brain tumours, without killing normal cells or themselves. This procedure could be ready for human clinical trials within five years.

 

[Image: Toxin-producing stem cells (blue) attacking brain tumour cells (green) in a mouse]

 

Brain cancer has a five-year survival rate of only 35% (see "When will cancer be cured?"). Harvard Stem Cell Institute scientists at Massachusetts General Hospital have devised a new way to use stem cells in the fight against this disease. A team led by neuroscientist Khalid Shah, PhD, now has a way to genetically engineer stem cells able to produce tumour-killing toxins.

In the AlphaMed Press journal STEM CELLS, Shah’s team shows how the toxin-secreting stem cells can be used to eradicate cancer cells remaining in mouse brains after their main tumour has been removed. The stem cells are placed at the site encapsulated in a biodegradable gel. This method solves the delivery issue that probably led to the failure of recent clinical trials aimed at delivering purified cancer-killing toxins into patients’ brains. Shah and his team are currently pursuing FDA approval to bring this and other stem cell approaches developed by them to clinical trials.

“Cancer-killing toxins have been used with great success in a variety of blood cancers – but they don’t work as well in solid tumours, because the cancers aren’t as accessible and the toxins have a short half-life,” explains Shah. “A few years ago, we recognised that stem cells could be used to continuously deliver these therapeutic toxins to tumours in the brain, but first we needed to genetically engineer stem cells that could resist being killed themselves by the toxins. Now, we have toxin-resistant stem cells that can make and release cancer-killing drugs.”

 


 

Cytotoxins are deadly to all cells – but since the late 1990s, researchers have been able to “tag” toxins in such a way that they only enter cancer cells with specific surface molecules, making it possible to get a toxin into a cancer cell without posing a risk to normal cells. Once inside a cell, the toxin disrupts the cell’s ability to make proteins and, within days, the cell starts to die.

Shah’s stem cells escape this fate because they are made with a mutation that doesn’t allow the toxin to act inside the cell.  The toxin-resistant stem cells also have an extra bit of genetic code that allows them to make and secrete the toxins. Any cancer cells that these toxins encounter do not have this natural defense and therefore die. Shah and his team induced toxin resistance in human neural stem cells and subsequently engineered them to produce targeted toxins.

“We tested these stem cells in a clinically relevant mouse model of brain cancer, where you resect the tumours and then implant the stem cells encapsulated in a gel into the resection cavity,” he said. “After doing all of the molecular analysis and imaging to track the inhibition of protein synthesis within brain tumours, we do see the toxins kill the cancer cells and eventually prolong survival in animal models of resected brain tumours.”

Chris Mason, professor of regenerative medicine at University College London, says: "This is a clever study, which signals the beginning of the next wave of therapies. It shows you can attack solid tumours by putting 'mini pharmacies' inside the patient, which deliver the toxic payload direct to the tumour. Cells can do so much. This is the way the future is going to be."

Shah next plans to rationally combine the toxin-secreting stem cells with a number of different therapeutic stem cells developed by his team to further enhance their positive results in mouse models of glioblastoma, the most common brain tumour in human adults. Shah predicts that he will bring these therapies into clinical trials within the next five years.

 

 

 

 

26th October 2014

Cheaper silicon means cheaper solar cells

A new method of producing solar cells could reduce the amount of silicon per unit area by 90 per cent compared to the current standard. With the high prices of pure silicon, this could help cut the cost of solar power.

 


 

Researchers at the Norwegian University of Science and Technology (NTNU) have pioneered a new approach to manufacturing solar cells that requires less silicon and can accommodate silicon 1,000 times less pure than is currently the standard. This breakthrough means that solar cells could be made much more cheaply than at present.

“We're using less expensive raw materials, and smaller amounts of them; we have fewer production steps; and our total energy consumption is potentially lower,” explain PhD candidate Fredrik Martinsen and Professor Ursula Gibson, from NTNU's Department of Physics.

The researchers’ solar cells are composed of silicon fibres coated in glass. A silicon core is inserted into a glass tube about 30 mm in diameter. This is then heated so that the silicon melts and the glass softens. The tube is stretched out into a thin glass fibre filled with silicon. The process of heating and stretching makes the fibre up to 100 times thinner.

This is the widely accepted industrial method used to produce fibre optic cables. But the NTNU researchers – in collaboration with Clemson University in the USA – are the first to use silicon-core fibres made this way in solar cells. The active part of these solar cells is the silicon core, with a diameter of about 100 micrometres.

 


 

This production method also enabled them to solve another problem: traditional solar cells require very pure silicon. Manufacturing pure silicon wafers is laborious, energy intensive and expensive. Using their new process, it takes only one-third of the energy to manufacture solar cells compared to the traditional approach of producing silicon wafers.

“We can use relatively dirty silicon – and the purification occurs naturally as part of the process of melting and re-solidifying in fibre form. This means that you save energy, and several steps in production,” says Gibson.

These new solar cells are based on the vertical rod radial-junction design, a relatively new approach.

“The vertical rod design still isn’t common in commercial use. Currently, silicon rods are produced using advanced and expensive nano-techniques that are difficult to scale,” says Martinsen. “But we’re using a tried-and-true industrial bulk process, which can make production a lot cheaper.”

The power produced by these prototype cells is not yet up to commercial standards. The efficiency of modern solar cells is typically about 20 per cent, while NTNU's version has so far managed only 3.6 per cent. However, Martinsen claims their work has great potential for improvement – so this new production method is something we might see appearing in future decades, as nanotechnology continues to advance.

“These are the first solar cells produced this way, using impure silicon. So it isn’t surprising that the power output isn’t very high. It’s a little unfair to compare our method to conventional solar cells, which have had 40 years to fine-tune the entire production process. We’ve had a steep learning curve, but not all the steps of our process are fully developed yet. We’re the first to show that you can make solar cells this way. The results are published and the process is set in motion.”

 

 

 

 

25th October 2014

Dunes visible on comet 67P

In August, the European Space Agency (ESA) achieved a major success when its Rosetta probe rendezvoused with comet 67P. The spacecraft has been returning spectacular images of this strange little world, located midway between the orbits of Mars and Jupiter. Among its latest photographs is the image below showing what appear to be dunes like those found on the deserts and beaches of Earth. Seen from a distance of about 8.8 km (5.5 miles), the scale here is 92 cm/pixel, meaning the dunes are roughly the width of a jumbo jet (as shown in this helpful illustration from Reddit).

The comet is already becoming more active as it approaches the Sun, with jets of dust shooting outward in slowly increasing quantities. Data from the mass spectrometers on Rosetta show that its coma (or atmosphere) contains a surprisingly rich variety of chemicals. Despite its very low density, it would smell terrible, were humans able to experience it.

A surface lander is due to touch down on 12th November. Philae will take seven hours to land and is equipped with a 1024 x 1024 pixel CCD. If all goes according to plan, this will take images both during its final descent phase and on the surface, when the camera will be 30 cm above the ground. Its field of view will be roughly 30 x 30 cm, giving a resolution of 0.3 mm/pixel. You can follow the latest developments on the ESA blog (where many more images can be found) and via Twitter @ESA_Rosetta.
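
As a quick check, the quoted surface resolution follows directly from those numbers: a 30 cm field of view spread over 1,024 pixels is roughly 0.3 mm per pixel.

```python
# Sanity check of the quoted Philae camera resolution (figures from the article above).
fov_mm, pixels = 300, 1024                    # ~30 x 30 cm field of view, 1024 x 1024 CCD
print(f"{fov_mm / pixels:.2f} mm per pixel")  # ~0.29, i.e. roughly 0.3 mm/pixel
```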

*UPDATE*: See also this short sci-fi movie to promote the mission.

 

[Image: dune-like features on comet 67P, photographed by Rosetta]

 

 

 

 

23rd October 2014

Coffee may protect the liver

Drinking three cups of coffee a day is linked to a 25 percent lower risk of abnormal liver enzyme levels, regardless of caffeine content.

 


 

If you're looking for ways to extend your lifespan, then coffee might be a good choice. Researchers at the National Cancer Institute report that it may significantly benefit liver health. Their study, published this month in Hepatology, shows that higher coffee consumption – regardless of how much caffeine it contains – is associated with lower levels of abnormal liver enzymes. This suggests that chemical compounds in coffee other than caffeine may help to protect the liver.

Coffee consumption is highly prevalent, with more than half of all Americans over 18 drinking an average of three cups per day, according to a 2010 report from the National Coffee Association. Moreover, consumption has increased by 1-2% each year since the 1980s. Previous studies have found that coffee may lower the risk of developing diabetes, cardiovascular disease, non-alcoholic fatty liver disease, cirrhosis and liver cancer.

"Prior research found that drinking coffee may have a possible protective effect on the liver," said lead author Dr. Qian Xiao. "However, the evidence is not clear if that benefit may extend to decaffeinated coffee."

For this study, researchers examined the coffee-drinking habits of 28,000 people, using data from a national health survey conducted from 1999-2010. 14,000 of the subjects drank coffee. Several markers were compared to determine liver function, including blood levels of four enzymes. After adjusting for age, sex, race, education, smoking, alcohol consumption and other factors, the researchers found that compared with people who drank no coffee, those who drank three cups a day were about 25 percent less likely to have abnormal liver enzyme levels. Among the 2,000 or so who drank only decaffeinated coffee, the results were similar.

Dr. Xiao concludes: "Our findings link total and decaffeinated coffee intake to lower liver enzyme levels. These data suggest that ingredients in coffee, other than caffeine, may promote liver health. Further studies are needed to identify these components."

In a related development, researchers last month sequenced the coffee genome.

 

 

 

 

21st October 2014

Tractor beam can move objects 100 times further

Laser physicists at the Australian National University have built a reversible tractor beam, able to move objects 0.2 mm in diameter a distance of up to 20 cm (7.9"). This is 100 times further than was possible in previous experiments.

 


 

Tractor beam technology – as depicted in science fiction movies like Star Trek – might become a reality sooner than we think. Following a number of successful experiments in recent years, it is moving further and further into the macro-scale. Laser physicists at the Australian National University (ANU) have now demonstrated the first long-distance optical tractor beam, able to repel and attract objects using a "hollow" beam that is bright around the edges and dark in its centre. It can move particles 0.2 mm in diameter a distance of up to 20 cm (7.9"), about 100 times further than previous attempts.

“Demonstration of a large-scale laser beam like this is a kind of holy grail for laser physicists,” said Prof. Wieslaw Krolikowski, from the Research School of Physics and Engineering at ANU.

The new technique is versatile because it requires only a single laser beam. It could be used, for example, in controlling atmospheric pollution or for the retrieval of tiny, delicate or dangerous particles for sampling. The researchers can also imagine the effect being scaled up.

“Because lasers retain their beam quality for such long distances, this could work over metres,” said co-author Dr Vladlen Shvedov. “Our lab just was not big enough to show it.”

 

[Image: Dr Vladlen Shvedov and Dr Cyril Hnatovsky adjusting the hollow laser beam in their lab at RSPE. Credit: Stuart Hay, ANU]

 

Unlike previous techniques, which used photon momentum to impart motion, the ANU tractor beam relies on the energy of the laser heating up the particles and the air around them. The ANU team demonstrated the effect on gold-coated hollow glass particles, which are trapped in the dark centre of the beam. Energy from the laser hits a particle and travels across its surface, where it is absorbed, creating hotspots. Air particles colliding with these hotspots heat up and shoot away from the surface, causing the particle to recoil in the opposite direction.

To manipulate the particle, the team move the position of the hotspot by carefully controlling the polarisation of the laser beam.

“We have devised a technique that can create unusual states of polarisation in the doughnut-shaped laser beam, such as star-shaped (axial) or ring polarised (azimuthal),” said co-author Dr Cyril Hnatovsky. “We can move smoothly from one polarisation to another and thereby stop the particle or reverse its direction at will.”

The work is published this week in Nature Photonics.

 

 

 

 

21st October 2014

2014 on track for hottest year ever

Globally, 2014 is on track to be the hottest year on record. September 2014 was the hottest September on record, following the hottest August, which was part of the hottest summer on record. The past 12 months – October 2013 to September 2014 – were the warmest 12-month period since records began in 1880.

 


 

The National Oceanic and Atmospheric Administration (NOAA) has released its latest State of the Climate Report. Highlights include:

  • The combined average temperature over global land and ocean surfaces for September 2014 was the highest on record for September, at 0.72°C (1.30°F) above the 20th century average of 15.0°C (59.0°F).

  • The global land surface temperature was 0.89°C (1.60°F) above the 20th century average of 12.0°C (53.6°F), the sixth highest for September on record. For the ocean, the September global sea surface temperature was 0.66°C (1.19°F) above the 20th century average of 16.2°C (61.1°F), the highest on record for September and also the highest on record for any month.

  • The combined global land and ocean average surface temperature for the January–September period (year-to-date) was 0.68°C (1.22°F) above the 20th century average of 14.1°C (57.5°F), tying with 1998 as the warmest such period on record.
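
As a rough consistency check on those September figures, the combined land-and-ocean anomaly should sit close to an area-weighted average of its two components, since land covers roughly 29% of the Earth's surface (the weighting is an assumption of this sketch, not a figure from the NOAA report).

```python
# Area-weighted combination of the September 2014 anomalies quoted above.
land_anom, ocean_anom = 0.89, 0.66   # degrees C above the respective 20th century averages
land_frac = 0.29                     # assumed land fraction of Earth's surface
combined = land_frac * land_anom + (1 - land_frac) * ocean_anom
print(f"{combined:.2f} C")           # ~0.73 C, close to NOAA's combined figure of 0.72 C
```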

Last month, Britain had its driest September since national records began in 1910, with just 20% of the average rainfall for the month. Besides breaking the record itself, this rainfall deficit is especially notable as the preceding eight-month period (January–August) was the wettest such period on record. Meanwhile, 30.6% of the contiguous USA was in drought, with conditions worsening in many regions. Nearly 100% of California and Nevada were in "moderate-to-exceptional" drought.

If 2014 maintains its current trend for the remainder of the year, it will be the warmest calendar year on record, says NOAA. The agency's findings are in strong agreement with those of NASA and the JMA, both of which reported a record-warm September earlier this month. It also seems quite likely that we'll see an El Niño event during the winter, which could send global temperature anomalies even higher.

 

 

 

 

21st October 2014

World's first carbon-capture coal power plant

The world’s first commercial-scale carbon capture and storage (CCS) process on a coal-fired power plant has been officially opened at Canada's Boundary Dam Power Station. This $1.4 billion project will cut CO2 emissions from the plant by 90% and sulphur dioxide emissions by 100%.

 


 

Electric utility company SaskPower’s new process involves retrofitting an old 110-megawatt (MW) coal-fired plant (first commissioned in 1959), adding a solvent-based capture process to strip away carbon dioxide, and then piping the CO2 to a nearby oil field. When fully optimised, it will capture up to a million tonnes of carbon dioxide annually, the equivalent of taking 250,000 cars off the road. The power unit equipped with CCS technology will continue to use coal to power approximately 100,000 homes and businesses in Saskatchewan, near the Canada-U.S. border. The captured CO2 will be used for enhanced oil recovery, with the remainder stored safely and permanently deep underground and continuously monitored.
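
The car-equivalence figure implies an assumed per-vehicle emission rate, which can be backed out from the quoted numbers – about four tonnes of CO2 per car per year, in the usual range for such comparisons.

```python
# Backing out the per-car assumption behind the "250,000 cars" comparison above.
captured_tonnes_per_year = 1_000_000
cars_equivalent = 250_000
print(captured_tonnes_per_year / cars_equivalent, "tonnes of CO2 per car per year")  # 4.0
```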

The Canadian federal government paid $240 million towards the project. The launch was attended by more than 250 people from over 20 countries representing governments, industries and media. Attendees at the event toured the facility and learned how they can access SaskPower’s expertise and knowledge to develop their own CCS initiatives.

“This project is important because it is applicable to 95% of the world’s coal plants,” said Bill Boyd, Saskatchewan Minister of the Economy. “As nations develop emission regulations, they will come to us to see how we continue to provide affordable coal power to customers, but in an environmentally sustainable way.”

This follows news last month of a similar project being developed in Jacksonville, USA. The Environmental Protection Agency (EPA) approved permits allowing the FutureGen Industrial Alliance to capture and store CO2 deep underground – the first project of its kind in the U.S.

“The opening of this new SaskPower plant reinforces the great innovation and development that can take place if you have strong investment and partnerships from the government and industry,” said U.S. Senator Heidi Heitkamp (D-ND). “From my more than a decade working at Dakota Gasification in North Dakota, and from visiting the construction of the SaskPower facility just over a year ago, I understand just how important it is that we look to the future in how we harness our energy. Coal is a key resource in both Canada and the U.S., and through the development of clean coal technology, we can create North American independence and energy security, while also reducing emissions. We need to develop more clean coal plants to make that possible, and in the U.S., we can learn from the steps Canada has taken to find a realistic path forward for coal.”

The economics of CCS are still a major issue, however. At present, SaskPower's project is expensive and depends on having a nearby source of coal alongside an additional revenue stream from the enhanced oil recovery. Environmentalists have also continued to express concerns.

“At the end of the day, many people are going to wonder why SaskPower is investing $1.4-billion in 'clean coal' technology instead of wind, solar or geothermal energy,” said Victor Lau, Saskatchewan Greens Leader. “Our party will be monitoring future developments of this project very carefully.”

 

 

 

 

20th October 2014

Gay marriage in the United States is progressing faster than expected

Back in 2011, we predicted that same-sex marriage would be allowed in every part of the United States by 2024. At the time, some of our readers claimed this was unrealistic and that the process would take considerably longer. We chose that year by projecting the number of states where it had already become legal onto a future trend, combined with a reference from the Des Moines Register that seemed to agree with our forecast.

Only six states (plus the District of Columbia) permitted same-sex marriage in 2011. Since then, another 25 have legalised it, bringing the total to 31 – now a clear majority of the 50 states. This year alone, same-sex marriage has become legal in 14 states. On 6th October 2014, the Supreme Court declined to hear appeals against rulings that had struck down same-sex marriage bans – thus legalising gay marriage in Virginia, Utah, Indiana, Oklahoma and Wisconsin. Within the following week, this was followed by legalisation in Nevada, Colorado, West Virginia, Idaho, North Carolina and Alaska.

There are even more cases to follow. The Sixth Circuit Court of Appeals is now expected to rule on challenges to the denial of same-sex marriage in Kentucky, Michigan, Ohio and Tennessee. Public support has grown at an increasing pace since the 1990s. According to a recent Gallup poll, it now stands at 52%, with 43% against and 5% with no opinion. Support tends to be higher among the younger generations, with 69% of 18-34 year olds in favour and only 38% of those aged 55 or above.

Below is a graph showing the number of states where gay marriage has been legalised (green) and the original trend we predicted back in 2011 (red). Half of the remaining states lie in the southern Bible Belt, a traditional conservative stronghold (see this excellent map and slider from Pew Research). Nevertheless, it seems our prediction will need revising.

 

[Image: graph of the number of US states where gay marriage has been legalised (green) against the trend predicted in 2011 (red)]

 

 

 

 

18th October 2014

The first direct detection of dark matter particles may have been achieved

Astronomers have detected what appears to be a signature of "axions" – dark matter particle candidates. If confirmed, this would be the first direct detection and identification of the elusive substance, which has been a mystery in physics for over 30 years.

 

[Image: XMM-Newton observatory. Credit: ESA]

 

A landmark paper by Professor George Fraser – who tragically died earlier this year – and colleagues from the University of Leicester offers what is potentially the first direct detection of dark matter. This hypothetical form of matter is thought to make up about 85% of the matter in the Universe, but neither emits nor absorbs light or other electromagnetic radiation in any significant way. Its existence is known only from the gravitational pull it exerts on other objects. In other words, it is what holds everything together, and without it, galaxies would unravel and fly apart.

The study – to be published on 20th October in the Monthly Notices of the Royal Astronomical Society – looked at 15 years of measurements taken by the European Space Agency's orbiting XMM-Newton observatory; almost its entire archive of data. A curious signal was seen in the X-ray sky which had no conventional explanation, but is now believed to have been the result of axions. Previous searches for these particles, notably at CERN, and with other spacecraft in Earth orbit, have so far proved unsuccessful.

“The X-ray background – the sky, after the bright X-ray sources are removed – appears to be unchanged whenever you look at it,” says Dr. Andy Read, from the University of Leicester's Department of Physics and Astronomy, who now leads work on the paper. “However, we have discovered a seasonal signal in this X-ray background, which has no conventional explanation, but is consistent with the discovery of axions.”

As the late Professor Fraser explains in the paper: “It appears plausible that axions – dark matter particle candidates – are indeed produced in the core of the Sun and do indeed convert to X-rays in the magnetic field of the Earth.”

 

[Image: A sketch (not to scale) showing axions (blue) streaming out from the Sun, converting in the Earth's magnetic field (red) into X-rays (orange), which are then detected by the XMM-Newton observatory. Credit: University of Leicester]

 

It is predicted that the X-ray signal due to axions will be greatest when looking through the sunward side of the magnetic field, because this is where the field is strongest. Each of these ghostly particles is extraordinarily light, with a vanishingly small mass – just a hundred-billionth that of an electron, or a million times less than that of a neutrino.
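
For readers more used to particle-physics units, that comparison can be converted into an energy scale. Taking the electron rest mass of about 0.511 MeV, a hundred-billionth of it corresponds to a few micro-electronvolts – the sort of scale usually quoted for axion dark matter candidates. (This conversion is our own illustration, not a figure from the paper.)

```python
# Converting "1/100 billionth of the electron mass" into energy units.
electron_mass_ev = 0.511e6                     # electron rest mass, ~0.511 MeV
axion_mass_ev = electron_mass_ev * 1e-11       # a hundred-billionth of that
print(f"~{axion_mass_ev * 1e6:.1f} micro-eV")  # ~5.1 micro-eV
```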

Dr. Read concludes: “These exciting discoveries, in George's final paper, could be truly ground-breaking, potentially opening a window to new physics, and could have huge implications, not only for our understanding of the true X-ray sky, but also for identifying the dark matter that dominates the mass content of the cosmos.”

Professor Martin Barstow, President of the Royal Astronomical Society, commented: “This is an amazing result. If confirmed, it will be the first direct detection and identification of the elusive dark matter particles and will have a fundamental impact on our theories of the Universe.”

We may know a lot more about dark matter in the coming years – thanks to a string of new observatories including the Euclid Space Telescope (2020), the European Extremely Large Telescope (2022) and the Advanced Technology Large-Aperture Space Telescope (2025). Dr. Read's team also plans to double the dataset from XMM-Newton and look at the results with more precision over the next few years.

 

 

 

 

18th October 2014

Lockheed Martin planning a compact fusion reactor within 10 years

This week, Lockheed Martin announced plans for a small-scale fusion power plant to be developed in as little as 10 years. A number of experts have expressed doubts over its viability.

 

[Image: Lockheed Martin's compact fusion reactor concept]

 

If it ever became a reality, fusion power would be truly world-altering – a clean, safe and essentially limitless supply of energy allowing humanity's continued survival for centuries and millennia to come. The international project known as ITER is planned for operation in 2022 and its eventual successor may emerge in the 2040s. Widespread deployment of fusion is not expected until 2070.

U.S. defence giant Lockheed Martin hopes to accelerate progress in this area, by developing what it calls a compact fusion reactor (CFR). This would be around 10 times smaller than conventional tokamak designs: small enough to fit on the back of a truck, while generating 100 megawatts (MW) of power. The company intends to build a prototype within five years – according to its press release – with commercial introduction five years after that. It has several patents pending for the work and is looking for partners in academia, industry and among government laboratories.

As illustrated above, the main improvement over ITER would be the use of a superconducting torus to create a differently shaped magnetic field, able to contain plasma far better than previous configurations. These small reactors could be fitted in U.S. Navy warships and submarines while eliminating the need for other fuel types. They could power small cities of up to 100,000 people, allow planes to fly with unlimited range, or even be used in spacecraft to cut journey times to Mars from six months to a single month. Using a CFR, the cost of desalinated water could fall by 60 percent.

 

 

If this sounds too good to be true, it may well be. Although Lockheed has been successful in its magnetised ion confinement experiments, a number of significant challenges remain for a working prototype with plasma confinement – let alone a commercialised version.

"I think it's very overplayed," University of California nuclear engineering professor Dr. Edward Morse told The Register. "They are being very cagey about divulging details."

"Getting net energy from fusion is such a goddamn difficult undertaking," said University of Texas physicist Dr. Swadesh M. Mahajan, in an interview with Mother Jones. "We know of no materials that would be able to handle anywhere near that amount of heat."

"The nuclear engineering clearly fails to be cost effective," Tom Jarboe told Business Insider in an email.

For these reasons, it is perhaps best to wait for more news and developments before adding the CFR to our timeline. We will, of course, keep you updated on Lockheed's progress as it emerges. You can also discuss this project on our forum.

 

 

 

 

17th October 2014

Wi-Fi up to five times faster coming in 2015

Samsung Electronics has developed a new way of transmitting Wi-Fi data five times faster than was previously possible. The new technology is expected to be available in consumer devices as early as 2015.

 


 

If you've been to a cafe or other public place recently and been frustrated at the slow speed of Wi-Fi, a new breakthrough by Samsung Electronics may soon change that. Researchers at the company have this week developed 60GHz Wi-Fi technology allowing transfer rates of 4.6Gbps, or 575MB per second – 5.3 times faster than the previous maximum speed for consumer devices (866Mbps, or 108MB per second).

Today's generation of Wi-Fi uses the 2.4GHz and 5GHz bands of the radio spectrum. The 60GHz band is currently unlicensed and offers major potential, but previous attempts to exploit it have failed to send data over significant distances, due to path loss and weak penetration properties. Samsung has overcome these issues through a combination of millimetre-wave circuit design, a high-performance modem and a wide-coverage beam-forming antenna. This eliminates co-channel interference, regardless of the number of devices using the same network.

Commercialisation is expected in 2015, with Samsung planning integration into a wide variety of products – including audio-visual and medical devices, as well as telecommunications equipment. It will also help to spur the Internet of Things.

“Samsung prides itself on being at the forefront of technology innovation, and is delighted to have overcome the barriers to the commercialisation of 60GHz millimetre-wave band Wi-Fi technology,” said Paul Templeton, General Manager of Samsung Networks UK. “This breakthrough has opened the door to exciting possibilities for Samsung’s next-generation devices, and has also changed the face of the future development of Wi-Fi technology, promising innovations that were not previously within reach.”

To give an idea of the speed: a 1GB movie will take less than three seconds to transfer between devices, while uncompressed high-definition videos could easily be streamed from mobile devices to TVs in real-time without any delay.
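
Those headline figures are internally consistent, as a quick unit conversion shows (8 bits per byte, so 1 Gbps is 125 MB/s):

```python
# Checking the quoted speeds and the "1GB in under three seconds" claim.
new_rate_gbps, old_rate_mbps = 4.6, 866
new_mb_per_s = new_rate_gbps * 1000 / 8        # ~575 MB/s
old_mb_per_s = old_rate_mbps / 8               # ~108 MB/s
print(f"speed-up: {new_mb_per_s / old_mb_per_s:.1f}x")        # ~5.3x
print(f"1 GB file: {1024 / new_mb_per_s:.1f} s to transfer")  # well under three seconds
```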

 

 

 

 

17th October 2014

Beyond Pluto: New Horizons targets identified

NASA has announced finding several Kuiper Belt Objects that may be targeted by the New Horizons spacecraft, following its flyby of the Pluto system in July 2015.

 


 

Peering into the dim, outer reaches of our Solar System, NASA's Hubble Space Telescope has uncovered three Kuiper Belt Objects (KBOs) that the agency's New Horizons spacecraft could potentially visit after it flies by Pluto in July 2015. The KBOs were detected by a search team who were awarded telescope time for this purpose, following a committee recommendation earlier this year.

"This has been a very challenging search, and it's great that in the end Hubble could accomplish a detection — one NASA mission helping another," said Alan Stern of the Southwest Research Institute (SwRI) in Boulder, Colorado, principal investigator of the New Horizons mission.

The Kuiper Belt is a vast rim of primordial debris encircling our Solar System. KBOs belong to a unique class of Solar System objects that has never been visited by a spacecraft and which contains clues to the origin of our Solar System.

The KBOs that Hubble found are each about 10 times larger than typical comets, but only about 1-2 percent of the size of Pluto. Unlike asteroids, KBOs have not been heated by the Sun, and are thought to represent a pristine, well preserved, deep-freeze sample of what the outer Solar System was like following its birth 4.6 billion years ago. The KBOs found in the Hubble data are thought to be the building blocks of dwarf planets such as Pluto.

The New Horizons team started to look for suitable KBOs in 2011 using some of the largest ground-based telescopes on Earth. They found several dozen KBOs, but none were reachable within the fuel supply available aboard the New Horizons spacecraft.

"We started to get worried that we could not find anything suitable – even with Hubble – but in the end, the space telescope came to the rescue," said team member John Spencer of SwRI. "There was a huge sigh of relief when we found suitable KBOs; we are 'over the moon' about this detection."

 


 

Following an initial proof of concept of the Hubble pilot observing program in June, the New Horizons team was awarded telescope time by the Space Telescope Science Institute for a wider survey in July. When the search was completed in early September, the team identified one KBO that is "definitely reachable" and two other potentially accessible KBOs that will require more tracking over several months to know whether they too are accessible by the New Horizons spacecraft.

This was a needle-in-a-haystack search for the New Horizons team, because the elusive KBOs are extremely small, faint, and difficult to pick out against myriad background stars in the constellation Sagittarius, which is in the present direction of Pluto. The three KBOs identified are each 1 billion miles beyond Pluto. Two of the KBOs are estimated to be as large as 34 miles (55 km) across, and the third is perhaps as small as 15 miles (25 km).

The New Horizons spacecraft, launched in 2006 from Florida, is the first mission in NASA's New Frontiers Program. Once a NASA mission completes its prime mission, the agency conducts an extensive science and technical review to determine whether extended operations are warranted.

The New Horizons team expects to submit such a proposal to NASA in late 2016 for an extended mission to fly by one of the newly identified KBOs. Hurtling across the Solar System, the New Horizons spacecraft would reach the distance of 4 billion miles from the Sun roughly three to four years after its July 2015 Pluto encounter. Accomplishing such a KBO flyby would substantially increase the science return from the New Horizons mission.

 

[Image: Kuiper Belt Object 1110113Y]

 

 

 

 

14th October 2014

Onshore wind is cheaper than coal, gas and nuclear

Generating electricity from onshore wind is cheaper than gas, coal and nuclear when externalities are stacked with the levelised cost of energy and subsidies, according to a new study ordered and endorsed by the European Commission.

 


 

A new report by the energy consultancy firm Ecofys has been analysed by the European Wind Energy Association (EWEA). Data in the report shows that onshore wind now has an approximate cost of €105 per megawatt hour (MWh), which is cheaper than gas (up to €164), nuclear (€133) and coal (€162-233). Offshore wind comes in at €186, and solar PV has a cost of around €217 per MWh.
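
Pulled out of the prose, the per-MWh figures quoted above rank as follows (where the report gives a range, the upper bound is used; this is simply a re-listing of the same numbers):

```python
# Total cost per MWh as quoted above (ranges represented by their upper bounds).
costs_eur_per_mwh = {
    "onshore wind": 105,
    "nuclear": 133,
    "gas": 164,
    "offshore wind": 186,
    "solar PV": 217,
    "coal": 233,
}
for tech, cost in sorted(costs_eur_per_mwh.items(), key=lambda kv: kv[1]):
    print(f"{tech:13} ~ EUR {cost}/MWh")
```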

The total cost of energy production – which factors in externalities such as air quality, climate change and human toxicity among others – shows that coal is more expensive than the highest retail electricity price in the EU. The report puts the figure of external costs of the EU's energy mix in 2012 at between €150 and €310 billion (US$190 and US$394 billion).

Justin Wilkes, deputy chief executive officer of the European Wind Energy Association, said: "This report highlights the true cost of Europe's dependence on fossil fuels. Renewables are regularly denigrated for being too expensive and a drain on the taxpayer. Not only does the Commission's report show the alarming cost of coal but it also presents onshore wind as both cheaper and more environmentally-friendly."

Onshore and offshore wind technologies also have room for significant cost reduction. Coal on the other hand is a fully mature technology and is unlikely to reduce costs any further.

He added: "We are heavily subsidising the dirtiest form of electricity generation while proponents use coal's supposed affordability as a justification for its continued use. The irony is that coal is the most expensive form of energy in the European Union. This report shows that we should use the 2030 climate and energy package as a foundation for increasing the use of wind energy in Europe to improve our competitiveness, security and environment."

 

 

 

 

13th October 2014

Half as much dark matter in Milky Way galaxy as previously thought

New measurements reveal there is half as much dark matter in our galaxy as previously thought, helping to solve the 15-year-old "missing satellite galaxy" problem.

 

[Image: the Milky Way galaxy. Credit: ESO/L. Calçada]

 

New measurements of dark matter in our own Milky Way galaxy reveal there is half as much of the mysterious substance as previously thought. Astronomers from the International Centre for Radio Astronomy Research (ICRAR) used a method developed almost 100 years ago to determine that the mass of dark matter in our galaxy is 800 billion (8 × 10^11) times the mass of the Sun. They probed the edge of the Milky Way, looking closely, for the first time, at the fringes about 5 million trillion kilometres from Earth.

Astrophysicist Dr Prajwal Kafle said we have known for a while that most of the Universe is hidden: “Stars, dust, you and me, all the things that we see, only make up about 4 per cent of the entire Universe. About 25 per cent is dark matter and the rest is dark energy.”

Dr Kafle was able to measure the mass of the dark matter in the Milky Way by studying the speed of stars throughout the galaxy, including the edges, which had never been studied in this detail before. He used a robust technique developed by British astronomer James Jeans in 1915 – decades before the discovery of dark matter. This new calculation helps to solve a mystery that has been haunting theorists for almost two decades.

“The current idea of galaxy formation and evolution – called the Lambda Cold Dark Matter theory – predicts that there should be a handful of big satellite galaxies around the Milky Way that are visible with the naked eye, but we don’t see that,” Dr Kafle said. “When you use our measurement of the mass of dark matter, the theory predicts that there should only be three satellite galaxies out there, which is exactly what we see; the Large Magellanic Cloud, the Small Magellanic Cloud and the Sagittarius Dwarf Galaxy.”

 

 

 

University of Sydney astrophysicist Prof. Geraint Lewis, who was also involved in the research, said the missing satellite problem had been “a thorn in the cosmological side for almost 15 years.”

“Dr Kafle’s work has shown that it might not be as bad as everyone thought, although there are still problems to overcome," he said.

The study also presented a holistic model of the Milky Way, which allowed the scientists to calculate several interesting factors, such as the speed required to leave the galaxy.

“Be prepared to hit 550 kilometres per second if you want to escape the gravitational clutches of our galaxy,” Dr Kafle said. “A rocket launched from Earth needs just 11 kilometres per second to leave its surface.”
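
The Earth comparison in that quote can be verified with the standard point-mass escape-velocity formula, v = sqrt(2GM/r). The galactic figure of about 550 km/s cannot be reproduced so simply, because it depends on the extended mass profile of stars and dark matter derived from the Jeans analysis, so only the Earth case is checked in this sketch.

```python
from math import sqrt

G = 6.674e-11                                  # gravitational constant, m^3 kg^-1 s^-2

def escape_velocity(mass_kg, radius_m):
    """Point-mass escape velocity, v = sqrt(2GM/r)."""
    return sqrt(2 * G * mass_kg / radius_m)

# Earth: reproduces the ~11 km/s figure quoted above.
print(f"{escape_velocity(5.972e24, 6.371e6) / 1000:.1f} km/s")  # ~11.2 km/s
```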

 

 

 

 

10th October 2014

A cure for type 1 diabetes may be imminent

Researchers at Harvard University have turned human embryonic stem cells into cells that produce insulin, a potentially major advance for sufferers of diabetes.

 


 

Harvard researchers have made a giant leap forward in the quest to find a truly effective treatment for type 1 diabetes, a condition that affects an estimated 22 million people worldwide. With human embryonic stem cells as a starting point, the scientists have produced – for the first time, and in the kind of massive quantities needed for cell transplantation and pharmaceutical uses – human insulin-producing beta cells equivalent in almost every way to normally functioning beta cells.

“We are now just one pre-clinical step away from the finish line,” says Prof. Douglas Melton, who led the work and has been researching the disease for nearly 25 years. “You never know for sure that something like this is going to work until you’ve tested it numerous ways. We’ve given these cells three separate challenges with glucose in mice and they’ve responded appropriately; that was really exciting. It was gratifying to know that we could do something that we always thought was possible, but many people felt it wouldn’t work. If we had shown this was not possible, then I would have had to give up on this whole approach. Now I’m really energised.”

Elaine Fuchs, a Professor at Rockefeller University, who is not involved in the research, hailed it as “one of the most important advances to date in the stem cell field, and I join the many people throughout the world in applauding my colleague for this remarkable achievement.”

“For decades, researchers have tried to generate human pancreatic beta cells that could be cultured and passaged long term under conditions where they produce insulin,” Fuchs continued. “Melton and his colleagues have now overcome this hurdle and opened the door for drug discovery and transplantation therapy in diabetes.”

Jose Oberholzer, Associate Professor at the University of Illinois at Chicago, said the work “will leave a dent in the history of diabetes. Doug Melton has put in a life-time of hard work in finding a way of generating human islet cells in vitro. He made it. This is a phenomenal accomplishment.”

 

[Image: Prof. Doug Melton, Harvard University]

 

Type 1 diabetes is an autoimmune metabolic condition in which the body kills off all the pancreatic beta cells that produce the insulin needed for glucose regulation in the body. Thus, the final pre-clinical step in the development of a treatment is to protect the approximately 150 million cells that would have to be transplanted into each patient from attack by the immune system. Melton is collaborating with colleagues on the development of an implantation device to protect the cells. The device currently being tested has thus far protected beta cells implanted in mice from immune attack for many months. “They are still producing insulin,” Melton said.

Cell transplantation as a treatment for diabetes is still essentially experimental, uses cells from cadavers, requires the use of powerful immunosuppressive drugs, and has been available to only a very small number of patients.

Daniel G. Anderson from MIT, who is working with Melton on the implantation device, said the new work by Melton’s lab is “an incredibly important advance for diabetes. There is no question that the ability to generate glucose-responsive, human beta cells through controlled differentiation of stem cells will accelerate the development of new therapeutics. In particular, this advance opens the doors to an essentially limitless supply of tissue for diabetic patients awaiting cell therapy.”

“There have been previous reports of other labs deriving beta cell types from stem cells,” said Melton. “No other group has produced mature beta cells as suitable for use in patients. The biggest hurdle has been to get to glucose sensing, insulin-secreting beta cells, and that’s what our group has done.”

Human transplantation trials using the cells are expected to start in the next few years. Melton's work was published yesterday in the journal Cell.

 

 

 

 

9th October 2014

Fusion reactor concept could be cheaper than coal

The University of Washington is developing a new fusion reactor design that could be one-tenth the cost of ITER – while producing five times the amount of energy.

 

[Image: HIT-SI3]

 

Fusion energy sounds almost too good to be true – zero greenhouse gas emissions, no long-lived radioactive waste, and a nearly unlimited fuel supply. Perhaps the biggest roadblock to adopting fusion energy is that the economics haven't worked out. Fusion power designs aren't cheap enough to outperform systems that use fossil fuels such as coal and natural gas.

Engineers at the University of Washington (UW) hope to change that. They have designed a concept for a fusion reactor that, when scaled up to the size of a large electrical power plant, would rival costs for a new coal-fired plant with similar electrical output. The team will present its reactor design and cost-analysis findings on 17th October at the Fusion Energy Conference in St. Petersburg, Russia.

“Right now, this design has the greatest potential of producing economical fusion power of any current concept,” says Thomas Jarboe, a UW professor of aeronautics and astronautics and an adjunct professor in physics.

The reactor – called the dynomak – began as a class project taught by Jarboe two years ago. After the class had ended, Jarboe and doctoral student Derek Sutherland (who previously worked on a reactor design at MIT) continued to develop and refine the concept.

The design builds on existing technology and creates a magnetic field within a closed space to hold plasma in place long enough for fusion to occur, allowing the hot plasma to react and burn. The reactor itself would be largely self-sustaining, meaning it would continuously heat the plasma to maintain thermonuclear conditions. Heat generated from the reactor would heat up a coolant that is used to spin a turbine and generate electricity, similar to how a typical power reactor works.

“This is a much more elegant solution, because the medium in which you generate fusion is the medium in which you’re also driving all the current required to confine it,” Sutherland says.

 


 

There are several ways to create a magnetic field, which is crucial to keeping a fusion reactor going. The UW’s design is known as a spheromak – meaning it generates the majority of magnetic fields by driving electrical currents into the plasma itself. This reduces the amount of required materials and actually allows researchers to shrink the overall size of the reactor.

Other designs, such as the ITER experimental fusion reactor being built in France – due to be operational in 2022 – have to be much larger than UW’s because they rely on superconducting coils that circle around the outside of the device to provide a similar magnetic field. When compared with the fusion reactor concept in France, the UW’s is much less expensive – about one-tenth the cost of ITER – while producing five times the amount of energy.

The UW researchers factored the cost of building a fusion reactor power plant using their design and compared that with building a coal power plant. They used a metric called “overnight capital costs,” which includes all costs, particularly startup infrastructure fees. A fusion power plant producing a gigawatt (1 billion watts) of power would cost $2.7 billion, while a coal plant of the same output would cost $2.8 billion, according to their analysis.
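
Expressed per watt of capacity, the overnight figures quoted above work out almost identically for the two plants:

```python
# Overnight capital cost per watt, from the figures quoted above (1 GW of output).
fusion_capex, coal_capex, output_w = 2.7e9, 2.8e9, 1e9
print(f"dynomak: ${fusion_capex / output_w:.2f}/W, coal: ${coal_capex / output_w:.2f}/W")
# dynomak: $2.70/W, coal: $2.80/W
```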

“If we do invest in this type of fusion, we could be rewarded because the commercial reactor unit already looks economical,” Sutherland said. “It’s very exciting.”

Right now, the UW’s concept is about one-tenth the size and power output of a final product, which is still years away. The researchers have successfully tested the prototype’s ability to sustain plasma efficiently, and as they further develop and expand the size of the device, they can ramp up to higher-temperature plasma and get significant fusion power output. The team has filed patents on the concept with the UW’s Centre for Commercialisation and plans to continue developing and scaling up its prototypes. The research was funded by the U.S. Department of Energy.

 

 

 

 

8th October 2014

Ocean warming in Southern Hemisphere has been greatly underestimated

The evidence for global warming continues to pour in. A new study of ocean heat content shows that temperatures have been greatly underestimated in the Southern Hemisphere. As a result, the world's oceans have been absorbing between 24 and 58 per cent more energy than previously thought.

 

[Image: Like a fleet of miniature research vessels, more than 3,600 robotic floats provide data on upper layers of the world's ocean currents.]

 

Scientists from Lawrence Livermore National Laboratory in California, using satellite observations and a large suite of climate models, have found that long-term ocean warming in the upper 700 metres of Southern Hemisphere oceans has been greatly underestimated.

"This underestimation is a result of poor sampling prior to the last decade, and limitations of the analysis methods that conservatively estimated temperature changes in data-sparse regions," said LLNL oceanographer Paul Durack, lead author of a paper in the 5th October issue of the journal Nature Climate Change.

Ocean heat storage is important because it accounts for over 90 percent of excess heat associated with global warming. The observed ocean and atmosphere warming is a result of continuing greenhouse gas emissions. The Southern Hemisphere oceans make up 60 percent of the world's oceans.

The researchers found that climate models simulating the relative increase in sea surface height between Northern and Southern hemispheres were consistent with highly accurate altimeter observations. However, the simulated upper-ocean warming in Northern and Southern hemispheres was inconsistent with observed estimates of ocean heat content change. These sea level and ocean heat content changes should have been consistent, suggesting that until recent improvements in observational data, Southern Hemisphere ocean heat content changes were underestimated.

Since 2004, automated profiling floats called Argo (pictured above) have been used to measure global ocean temperatures from the surface down to 2,000 m (6,560 ft). These 3,600 floats currently observing the global ocean provide systematic coverage of the Southern Hemisphere for the first time. Argo float data over the last decade, as well as earlier measurements, show that the ocean has been steadily warming, according to Durack.

"The Argo data is really critical," he said. "Estimates that we had until now have been pretty systematically underestimating the changes. Prior to 2004, research has been very limited by poor measurement coverage. Our results suggest that ocean warming has been underestimated by 24 to 58 percent. The conclusion that warming has been underestimated agrees with previous studies. However, it's the first time that scientists have tried to estimate how much heat we've missed."

 

ocean heat content global warming map

 

Given that most of the excess heat associated with global warming is stored in the oceans, this study has important implications for how scientists view the Earth's overall energy budget. Heat currently stored by the oceans will eventually be released, accelerating warming over land and triggering more extreme climate events.

"We continue to be stunned at how rapidly the ocean is warming," said Sarah Gille, a Scripps Institution of Oceanography professor who was not involved in the study. "Even if we stopped all greenhouse gas emissions today, we'd still have an ocean that is warmer than the ocean of 1950, and that heat commits us to a warmer climate. Extra heat means extra sea level rise, since warmer water is less dense, so a warmer ocean expands."

"An important result of this paper is the demonstration that the oceans have continued to warm over the past decade, at a rate consistent with estimates of Earth’s net energy imbalance," says Prof. Steve Rintoul, from Australia’s Commonwealth Scientific and Industrial Research Organisation. "While the rate of increase in surface air temperatures slowed in the last 10 to 15 years, the heat stored by the planet, which is heavily dominated by the oceans, has steadily increased as greenhouse gases have continued to rise."

These new results are consistent with another new paper that appears in the same issue of Nature Climate Change. Co-author Felix Landerer of NASA's Jet Propulsion Laboratory, who contributed to both studies, says, "Our other new study on deep-ocean warming found that from 2005 to the present, Argo measurements recorded a continuing warming of the upper-ocean. Using the latest available observations, we're able to show that this upper-ocean warming and satellite measurements are consistent."

In related news, a report by Edinburgh's Heriot-Watt University – based on the work of 30 experts – finds that ocean acidity has increased by 26% since pre-industrial times. The resulting damage to coral reefs and other marine ecosystems could approach $1 trillion a year by the end of this century, threatening the livelihoods of 400 million people.

 


 

 

7th October 2014

Premature deaths could be reduced by 40% over next 20 years

New research published in The Lancet suggests that, with sustained international efforts, the number of premature deaths could be reduced by 40% over the next two decades (2010–2030), halving under-50 mortality and preventing a third of deaths at ages 50–69 years.

 

coffin at funeral

 

The Lancet reveals that, between 2000 and 2010, child deaths fell by one-third worldwide, helped by the fourth Millennium Development Goal (MDG) to reduce child deaths by two-thirds; and premature deaths among adults fell by one-sixth, helped by MDG 5 to reduce maternal mortality and MDG 6 to fight AIDS, malaria and other diseases. With expanded international efforts against a wider range of causes, these rates of decrease could accelerate, say the study authors.

The most striking change during 2000–2010 was a two-thirds reduction in childhood deaths from the diseases now controlled by vaccination (diphtheria, pertussis, tetanus, polio, and measles), highlighting what targeted international efforts can achieve.

“Death in old age is inevitable, but death before old age is not”, said co-author Richard Peto, Professor of medical statistics at the University of Oxford, UK. “In all major countries, except where the effects of HIV or political disturbances predominated, the risk of premature death has been decreasing in recent decades, and it will fall even faster over the next few decades if the new UN Sustainable Development Goals get the big causes of death taken even more seriously.”

The United Nations General Assembly has been discussing 17 Sustainable Development Goals for 2016–2030 to replace the MDGs that expire at the end of 2015. The new health goal is “Ensure healthy lives and promote well-being for all at all ages”. The group of 16 authors, writing in The Lancet, call for this new health goal to be accompanied by a specific target: to avoid, in each country, 40% of all premature deaths – that is, of the deaths that would occur in that country's 2030 population if its 2010 death rates continued.

The 40% reduction from 2010 to 2030 in deaths before age 70 would involve reductions of two-thirds in the causes already being targeted by the MDGs, and a one-third reduction in other causes of premature death, such as non-communicable diseases and injuries.

 

  deaths graph
 


(A) Risk of death versus age for the world in 1970 and 2010
(B) and for country income groupings in 2010.
For historical comparison, the 1910 and 2010 risks for England and Wales are given.

 

Lead author Ole Norheim, Professor of global public health at the University of Bergen, Norway, explained, “Based on realistically moderate improvements in current trends, our proposed targets are a two-thirds reduction in child and maternal deaths and in HIV, tuberculosis, and malaria, and a one-third reduction in deaths from non-communicable diseases and injuries. For this, we are going to need improved healthcare, intensified international efforts to control communicable diseases, and more effective prevention and treatment of non-communicable diseases and injuries.”

“The most important cause of non-communicable disease is tobacco use – and one of the key determinants of smoking is the price of cigarettes”, says co-author Prabhat Jha, Director of the Centre for Global Health Research in St Michael’s Hospital, Toronto. “WHO is calling for a 30% reduction in smoking by 2025, and in many countries major increases in excise taxes that double the price of cigarettes are still possible. Such an increase would reduce smoking by about a third, but would increase the total Government tax yield from smoking by about a third.”
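Jha’s figures can be checked with simple arithmetic. On the rough assumption – ours, not the study’s – that the excise take per pack roughly doubles along with the retail price while consumption falls by about a third, the sketch below reproduces the quoted one-third rise in tax yield.

def tax_yield_change(tax_multiplier, consumption_multiplier):
    """Fractional change in total yield = (tax per pack) x (packs sold)."""
    return tax_multiplier * consumption_multiplier - 1.0

# Assumption (not from the study): tax per pack doubles, consumption falls to 2/3.
change = tax_yield_change(tax_multiplier=2.0, consumption_multiplier=2/3)
print(f"Change in tax yield: {change:+.0%}")   # roughly +33%, i.e. up by a third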

With political commitment and sustained efforts to improve health, the current rate of decline in premature death can be further accelerated. “We conclude that a 40% reduction in premature deaths is realistic in each country where mortality in 2030 is not dominated by new epidemics, political disturbances or disasters”, adds Professor Norheim.

Writing in a linked Comment, the Norwegian Ministers of Foreign Affairs and of Health and Care say, “[This] study shows what an important part science could play in the negotiations at the 69th Session of the UN General Assembly. We strongly urge the medical community to develop a common position that can enable the international community to arrive at a single health SDG with a limited number of simple, understandable and measurable targets.”

In another linked Comment, Professor Sir George Alleyne, Director Emeritus of the Pan American Health Organization (PAHO), Washington, DC, USA, and colleagues, write that, “The significant advance in this paper is to introduce quantification to the target-setting process, based on rigorous analysis of mortality trends by age as well as by disease category. The proposed targets focus on premature mortality and avoid more complex metrics which are much harder to measure and track over time. The authors stress the importance of countries adapting the targets to their own circumstances.”

This study was funded by the UK Medical Research Council, Norwegian Agency for Development Co-operation, University of Toronto Centre for Global Health Research, and Bill and Melinda Gates Foundation.

 


 

 

6th October 2014

Autonomous swarm boats to defend U.S. Navy

The Office of Naval Research (ONR) has announced a technological breakthrough that allows unmanned surface vehicles (USV) to not only protect Navy ships, but also, for the first time, autonomously “swarm” offensively on hostile vessels.

 

autonomous swarm boat

 

First-of-its-kind technology – demonstrated on the James River in Virginia – allows unmanned, self-guided vessels to overwhelm an adversary. This is achieved using a combination of sensors and software called CARACaS (Control Architecture for Robotic Agent Command and Sensing). The hardware is small and light enough to be portable and can be installed on almost any boat. It is also inexpensive, at just $2000 for each kit.

These automated patrols could leave warships they're protecting and swarm around potential threats on the water. This technology could be utilised by the U.S. Navy within a year, defence officials say, adding it could help stop attacks like the deadly 2000 bombing of the USS Cole.

“Our Sailors and Marines can’t fight tomorrow’s battles using yesterday’s technology,” said Chief of Naval Research, Matthew Klunder. “This kind of breakthrough is the result of the Navy’s long-term support for innovative research in science and technology.”

Without a human physically needing to be at the controls, the boats can operate in sync with other unmanned vessels – choosing their own routes; swarming to interdict enemy vessels; and escorting/protecting naval assets.
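ONR has not published the internals of CARACaS, but the behaviours described above can be illustrated with a deliberately simple toy model: each boat holds an escort position by default and converges on a threat when one is detected. The positions, speeds and decision rule below are entirely hypothetical.

import math

# Toy model only – not the CARACaS software, whose internals are not public.
def step_towards(boat, target, speed=1.0):
    """Move a boat one step towards a target position (x, y)."""
    dx, dy = target[0] - boat[0], target[1] - boat[1]
    dist = math.hypot(dx, dy) or 1e-9   # avoid division by zero
    return (boat[0] + speed * dx / dist, boat[1] + speed * dy / dist)

def swarm_step(boats, escort_point, threat=None):
    """Each boat escorts the protected ship by default, or converges on a detected threat."""
    goal = threat if threat is not None else escort_point
    return [step_towards(b, goal) for b in boats]

boats = [(0.0, 0.0), (5.0, 2.0), (-3.0, 4.0)]
boats = swarm_step(boats, escort_point=(0.0, 10.0), threat=(20.0, 20.0))
print(boats)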

 

autonomous swarm boats

 

“This networking unmanned platforms demonstration was a cost-effective way to integrate many small, cheap, and autonomous capabilities that can significantly improve our warfighting advantage,” said Admiral Jonathan Greenert, Chief of Naval Operations.

“This multiplies combat power by allowing CARACaS-enabled boats to do some of the dangerous work,” said Dr. Robert Brizzolara, program manager at the ONR. “It will remove our Sailors and Marines from many dangerous situations; for instance when they need to approach hostile or suspicious vessels. If an adversary were to fire on the USVs, no humans would be at risk.”

In the tests, as many as 13 Navy boats were operating together. First they escorted a high-value Navy ship, and then, when a simulated enemy vessel was detected, the boats sped into action, swarming around the threat. This demonstration comes near the anniversary of the USS Cole bombing off the coast of Yemen. In that October 2000 terrorist attack, a small boat laden with explosives was able to get near a guided-missile destroyer and detonate, killing 17 Sailors and injuring 39 others.

Autonomous unmanned surface vehicles could play a vital role in protecting people, ports and commerce. In the future, the capability could be scaled up to include even greater numbers of USVs – and even to other platforms such as drones, helicopters and jet fighters.

"This is something that you might find not only just on our naval vessels. We could certainly see this utilised to protect merchant vessels, to protect ports and harbours; used also to protect offshore oil rigs," Klunder said.

 

 


 

 

6th October 2014

New AI program interacts like a human

Software company IPsoft has announced a new artificial intelligence platform named “Amelia” that makes it possible to automate knowledge work across a wide range of functions. Exposed to the same information as any new hire, she instantly applies that information to solve queries. By shouldering the burden of tedious, often laborious tasks, Amelia partners with human co-workers to achieve new levels of productivity and service quality.

 

 

Whereas most other technologies demand that humans adapt their behaviour in order to interact with ‘smart machines’, Amelia is intelligent enough to interact like a human herself. She learns using the same natural language manuals as her colleagues, but in a matter of seconds. She understands the full semantic meaning of what she reads – rather than simply recognising individual words – by applying context, logic and inferring implications. Independently, rather than through time-intensive programming, Amelia creates her own process map of the information she is given so that she can work out for herself exactly what actions to take, depending on the specific problem being solved. Like a human worker, she learns from her colleagues and by observing their work, is able to continually build up knowledge.
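IPsoft has not disclosed how Amelia’s process maps are implemented, but the idea can be illustrated with a deliberately simple toy: a query is matched against known processes and, if none fits, escalated to a human colleague. The intents and steps below are entirely hypothetical.

# Purely hypothetical process map – not IPsoft's implementation.
PROCESS_MAP = {
    "reset password": ["verify identity", "issue temporary password", "confirm login"],
    "invoice query": ["look up account", "retrieve invoice", "explain charges"],
}

def handle_query(query):
    """Return the action steps for the first process whose name appears in the query."""
    for intent, steps in PROCESS_MAP.items():
        if intent in query.lower():
            return steps
    return ["escalate to a human colleague"]   # fall back when no process matches

print(handle_query("Hi, I need to reset password for my account"))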

In a fraction of the time it traditionally takes to train someone in a new role, Amelia is able to perform at a high level. What is more, as she already speaks over 20 languages, she can support international operations with ease. Her core knowledge of a process needs to be learned only once for her to be able to communicate with customers in their own language.

Much as machines transformed agriculture and manufacturing, cognitive technologies will drive the next evolution of the global workforce. In the future, companies will compete in the digital economy with a digital workforce that comprises a balance of human and virtual employees. Research firm Gartner predicts that by 2017, autonomics and cognitive platforms like Amelia will drive a 60 percent reduction in the cost of managed services. This technology is already being piloted within a number of Fortune 1000 companies, and IPsoft expects to announce new customers and prominent industry partners before the end of this year.

"We want to make sure that human beings can dedicate their time to more valuable tasks. Taking out the more repetitive tasks is I think a noble aspiration for a company," said Frank Lansink, EU CEO of IPsoft, at a briefing in the firm's HQ at 30 St Mary Axe (the Gherkin). "Our purpose is to elevate human beings into a more meaningful role, adding value to society, or to enterprise, or the customer."

 


 

 

4th October 2014

The first baby born from a womb transplant

Doctors in Sweden have announced the first baby born to a mother with a womb transplant. This pioneering operation offers hope to thousands of couples who are unable to conceive children.

 

first baby born from womb transplant

 

In 2013, researchers at the University of Gothenburg completed a series of nine womb transplants on women in Sweden. Among the patients was an unnamed 36-year-old with Mayer-Rokitansky-Küster-Hauser syndrome (MRKH), a rare condition that prevents the uterus from developing. Her ovaries were intact, however, so she could still ovulate. She received a uterus donated by a 61-year-old family friend, who had gone through the menopause around seven years earlier.

Drugs were needed to suppress the immune system, which would otherwise have rejected the organ. Alongside this, IVF was used to produce 11 embryos, which were frozen and stored for later use. In January 2014, a year after the transplant, doctors successfully implanted one of these embryos into the patient's new womb. There were concerns over how well a transplanted uterus would cope with the strains of pregnancy, during which it swells greatly in size. The procedure had been attempted by scientists in the past – but in each case, it led to either a miscarriage or organ failure caused by disease.

On this occasion, however, the operation was successful. There were problems in the 31st week of pregnancy – the mother developed a condition known as pre-eclampsia (characterised by high blood pressure) – but a caesarean section delivered a healthy baby boy weighing 3.9 lb (1.8 kg), a normal weight for that stage of pregnancy. The British medical journal The Lancet has released a photo (below) and is due to publish a full report on the case shortly.

 

first baby born from womb transplant
Credit: The Lancet

 

This milestone in reproductive medicine – the culmination of more than 10 years' research and surgical training – offers hope to thousands of couples who are unable to conceive children. The doctor who led the work, Prof. Mats Brännström, has issued a note of caution, however. In an interview he stated it will be "many, many years" before this operation becomes routine. This is partly because of the extremely high cost, but also because it remains a new and somewhat experimental procedure, only performed by certain specialist surgeons in select centres and requiring various further studies.

Dr Allan Pacey, of the British Fertility Society says: "I think it is brilliant and revolutionary, and opens the door to many infertile women. The scale of it feels a bit like IVF. It feels like a step change. The question is can it be done repeatedly, reliably and safely."

"He’s no different from any other child – but he will have a good story to tell," the father says. "One day, he can look at the newspaper articles about how he was born and know that he was the first in the world to be born this way."

 


 

 

4th October 2014

Elon Musk: Tesla 90% autonomous in 2015

In this interview with CNN Money, Elon Musk says that a Tesla car able to self-drive up to 90% of the time will be launched in 2015. The company will also reveal its next electric vehicle – the model "D" – on 9th October, according to a tweet.

 

 

 


 

 

3rd October 2014

Eastern basin of the Aral Sea has completely dried up

This year marks another milestone for the Aral Sea — a once huge lake in Central Asia that has been shrinking rapidly since the 1960s. For the first time in modern history, its eastern basin has completely dried up.

 

aral sea eastern basin dried up 2000 2014

 

These images, taken by NASA's flagship Terra satellite, show how the Aral Sea has changed in just 14 years. It is now apparent that its eastern basin has completely dried up. The transformation is especially stark when compared to the approximate shoreline location in 1960 (black outline).

"This is the first time the eastern basin has completely dried in modern times," says Philip Micklin, a geographer from Western Michigan University and expert on the Aral Sea. "And it is likely the first time it has completely dried in 600 years, since Medieval desiccation associated with diversion of Amu Darya to the Caspian Sea."

In the 1950s and 60s, the government of the former Soviet Union diverted the Amu Darya and the Syr Darya – the region's two major rivers – in order to irrigate farmland. This diversion began the lake's gradual retreat. By the year 2000, the lake had separated into the North (Small) Aral Sea in Kazakhstan and the South (Large) Aral Sea in Uzbekistan. The South Aral had further split into western and eastern lobes.

 

abandoned boats in the aral sea
The rusting remains of abandoned boats in the Aral Sea, Kazakhstan.

 

The eastern lobe of the South Aral nearly dried in 2009, then saw a huge rebound in 2010. Water levels continued to fluctuate annually in alternately dry and wet years.

According to Micklin, the desiccation in 2014 occurred because there has been less rain and snow in the watershed that starts in the Pamir Mountains; this has greatly reduced water flow on the Amu Darya. In addition, huge amounts of river water continue to be withdrawn for irrigation. The Kok-Aral Dam across the Berg Strait – a channel that connects the northern Aral Sea with the southern part – played some role, but has not been a major factor this year, he said.

Formerly the world's fourth largest lake (pictured below in 1964), the Aral Sea is often described as the worst ecological disaster on the planet. With its eastern half now gone, what remains of the western half is expected to vanish by 2019.

 

aral sea in 1964
Satellite view of the Aral Sea in 1964.

 
