Laser physicists at the Australian National University have built a reversible tractor beam, able to move objects 0.2 mm in diameter a distance of up to 20 cm (7.9"). This is 100 times further than was possible in previous experiments.
Tractor beam technology – as depicted in science fiction movies like Star Trek – might become a reality sooner than we think. Following a number of successful experiments in recent years, it is moving further and further into the macro-scale. Laser physicists at the Australian National University (ANU) have now demonstrated the first long-distance optical tractor beam, able to repel and attract objects using a "hollow" beam that is bright around the edges and dark in its centre. It can move particles 0.2 mm in diameter a distance of up to 20 cm (7.9"), about 100 times further than previous attempts.
“Demonstration of a large-scale laser beam like this is a kind of holy grail for laser physicists,” said Prof. Wieslaw Krolikowski, from the Research School of Physics and Engineering at ANU.
The new technique is versatile because it requires only a single laser beam. It could be used, for example, in controlling atmospheric pollution or for the retrieval of tiny, delicate or dangerous particles for sampling. The researchers can also imagine the effect being scaled up.
“Because lasers retain their beam quality for such long distances, this could work over metres,” said co-author Dr Vladlen Shvedov. “Our lab just was not big enough to show it.”
Dr Vladlen Shvedov and Dr Cyril Hnatovsky adjusting the hollow laser beam in their lab. Credit: Stuart Hay, ANU
Unlike previous techniques, which used photon momentum to impart motion, the ANU tractor beam relies on the energy of the laser heating up the particles and the air around them. The team demonstrated the effect on gold-coated hollow glass particles, which are trapped in the dark centre of the beam. Energy from the laser is absorbed as it strikes a particle, creating hotspots on its surface. Air molecules colliding with these hotspots heat up and shoot away from the surface, causing the particle to recoil in the opposite direction.
To manipulate the particle, the team move the position of the hotspot by carefully controlling the polarisation of the laser beam.
“We have devised a technique that can create unusual states of polarisation in the doughnut-shaped laser beam, such as star-shaped (axial) or ring polarised (azimuthal),” said co-author Dr Cyril Hnatovsky. “We can move smoothly from one polarisation to another and thereby stop the particle or reverse its direction at will.”
Globally, 2014 is on track to be the hottest year on record. September 2014 was the hottest September on record, following the hottest August, which capped the hottest summer on record. The past 12 months — October 2013 to September 2014 — were the warmest 12-month period since records began in 1880.
The combined average temperature over global land and ocean surfaces for September 2014 was the highest on record for September, at 0.72°C (1.30°F) above the 20th century average of 15.0°C (59.0°F).
The global land surface temperature was 0.89°C (1.60°F) above the 20th century average of 12.0°C (53.6°F), the sixth highest for September on record. For the ocean, the September global sea surface temperature was 0.66°C (1.19°F) above the 20th century average of 16.2°C (61.1°F), the highest on record for September and also the highest on record for any month.
The combined global land and ocean average surface temperature for the January–September period (year-to-date) was 0.68°C (1.22°F) above the 20th century average of 14.1°C (57.5°F), tying with 1998 as the warmest such period on record.
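The anomaly figures above can be turned back into absolute averages simply by adding each anomaly to its stated 20th-century baseline. A quick sketch, using only the NOAA values quoted in this article:

```python
# Convert NOAA temperature anomalies (degrees C above the 20th-century
# baseline) back into absolute average temperatures.
def absolute_temp(anomaly_c, baseline_c):
    """Absolute average temperature = baseline + anomaly."""
    return baseline_c + anomaly_c

september_combined = absolute_temp(0.72, 15.0)   # land + ocean, Sept 2014
september_land     = absolute_temp(0.89, 12.0)   # land only
september_ocean    = absolute_temp(0.66, 16.2)   # sea surface
ytd_combined       = absolute_temp(0.68, 14.1)   # Jan-Sept 2014

print(september_combined)  # 15.72 degrees C
```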
Last month, Britain had its driest September since national records began in 1910, with just 20% of the average rainfall for the month. Besides breaking the record itself, this rainfall deficit is especially notable as the preceding eight-month period (January–August) was the wettest such period on record. Meanwhile, 30.6% of the contiguous USA was in drought, with conditions worsening in many regions. Nearly 100% of California and Nevada were in "moderate-to-exceptional" drought.
If 2014 maintains its current trend for the remainder of the year, it will be the warmest calendar year on record, says NOAA. The agency's findings are in strong agreement with those of NASA and the JMA, both of which reported a record warm September earlier this month. It also seems quite likely that an El Niño event will develop during the winter, which could push global temperature anomalies even higher.
The world’s first commercial-scale carbon capture and storage (CCS) process on a coal-fired power plant has been officially opened at Canada's Boundary Dam Power Station. This $1.4 billion project will cut CO2 emissions from the plant by 90% and sulphur dioxide emissions by 100%.
Electric utility company SaskPower’s new process involves retrofitting an old 110-megawatt (MW) coal-fired plant (that was first commissioned in 1959), adding solvent-based processors to strip away carbon dioxide, and then piping the CO2 to a nearby oil field. When fully optimised, it will capture up to a million tonnes of carbon dioxide annually, the equivalent of taking 250,000 cars off the road. The power unit equipped with CCS technology will continue to use coal to power approximately 100,000 homes and businesses in Saskatchewan, near the Canada-U.S. border. The captured CO2 will be used for enhanced oil recovery, with the remainder stored safely and permanently deep underground and continuously monitored.
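The "250,000 cars" comparison implies a per-car emissions figure that is easy to check. A quick sketch — the ~4 tonnes per car per year is the assumption implied by the article's numbers, not a figure stated by SaskPower:

```python
# Implied annual CO2 emissions per car in the comparison above.
captured_tonnes_per_year = 1_000_000   # up to 1 Mt of CO2 captured annually
cars_equivalent = 250_000              # "taking 250,000 cars off the road"

tonnes_per_car = captured_tonnes_per_year / cars_equivalent
print(tonnes_per_car)  # 4.0 tonnes of CO2 per car per year
```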
The Canadian federal government paid $240 million towards the project. The launch was attended by more than 250 people from over 20 countries representing governments, industries and media. Attendees at the event toured the facility and learned how they can access SaskPower’s expertise and knowledge to develop their own CCS initiatives.
“This project is important because it is applicable to 95% of the world’s coal plants,” said Bill Boyd, Saskatchewan Minister of the Economy. “As nations develop emission regulations, they will come to us to see how we continue to provide affordable coal power to customers, but in an environmentally sustainable way.”
This follows news last month of a similar project being developed in Jacksonville, Illinois, USA. The Environmental Protection Agency (EPA) approved permits allowing the FutureGen Industrial Alliance to capture and store CO2 deep underground – the first project of its kind in the U.S.
“The opening of this new SaskPower plant reinforces the great innovation and development that can take place if you have strong investment and partnerships from the government and industry,” said U.S. Senator Heidi Heitkamp (D-ND). “From my more than a decade working at Dakota Gasification in North Dakota, and from visiting the construction of the SaskPower facility just over a year ago, I understand just how important it is that we look to the future in how we harness our energy. Coal is a key resource in both Canada and the U.S., and through the development of clean coal technology, we can create North American independence and energy security, while also reducing emissions. We need to develop more clean coal plants to make that possible, and in the U.S., we can learn from the steps Canada has taken to find a realistic path forward for coal.”
The economics of CCS are still a major issue, however. At present, SaskPower's project is expensive and depends on having a nearby source of coal alongside an additional revenue stream from the enhanced oil recovery. Environmentalists have also continued to express concerns.
“At the end of the day, many people are going to wonder why SaskPower is investing $1.4-billion in 'clean coal' technology instead of wind, solar or geothermal energy,” said Victor Lau, Saskatchewan Greens Leader. “Our party will be monitoring future developments of this project very carefully.”
Back in 2011, we predicted that same-sex marriage would be allowed in every part of the United States by 2024. At the time, some of our readers claimed this was unrealistic and the process would take considerably longer. We chose that year by projecting the number of states where it had already become legal onto a future trend, combined with a reference from the Des Moines Register that seemed to agree with our forecast.
Only six states (plus the District of Columbia) permitted same-sex marriages in 2011. Since then, another 25 have legalised it, bringing the total to 31 – now a clear majority of the 50 states. This year alone, 14 states have seen it legalised. On 6th October 2014, the Supreme Court declined to hear appeals in same-sex marriage cases – thus legalising gay marriage in Virginia, Utah, Indiana, Oklahoma and Wisconsin. In the days that followed, same-sex marriage also became legal in Nevada, Colorado, West Virginia, Idaho, North Carolina and Alaska.
There are even more cases to follow. The Sixth Circuit Court of Appeals is now expected to rule on challenges to the denial of same-sex marriage in Kentucky, Michigan, Ohio and Tennessee. Public support has grown at an increasing pace since the 1990s. According to a recent Gallup poll, it now stands at 52%, with 43% against and 5% with no opinion. Support tends to be higher among the younger generations, with 69% of 18-34 year olds in favour and only 38% of those aged 55 or above.
Below is a graph showing the number of states where gay marriage has been legalised (green) and the original trend we predicted back in 2011 (red). Half of the remaining states lie in the southern Bible Belt, a traditional conservative stronghold (see this excellent map and slider from Pew Research). Nevertheless, it seems our prediction will need revising.
Astronomers have detected what appears to be a signature of "axions" – dark matter particle candidates. If confirmed, this would be the first direct detection and identification of the elusive substance, which has been a mystery in physics for over 30 years.
XMM-Newton observatory. Credit: ESA
A landmark paper by Professor George Fraser – who tragically died earlier this year – and colleagues from the University of Leicester offers what is potentially the first direct detection of dark matter. This hypothetical form of matter comprises around 85% of all matter in the Universe, but neither emits nor absorbs light or other electromagnetic radiation in any significant way. Its existence is only known because of the gravitational pull it has on objects. In other words, it is what holds everything together, and without it, galaxies would unravel and fly apart.
The study – to be published on 20th October in the Monthly Notices of the Royal Astronomical Society – looked at 15 years of measurements taken by the European Space Agency's orbiting XMM-Newton observatory; almost its entire archive of data. A curious signal was seen in the X-ray sky which had no conventional explanation, but is now believed to have been the result of axions. Previous searches for these particles, notably at CERN, and with other spacecraft in Earth orbit, have so far proved unsuccessful.
“The X-ray background – the sky, after the bright X-ray sources are removed – appears to be unchanged whenever you look at it,” says Dr. Andy Read from the University of Leicester's Department of Physics and Astronomy and now leading the paper. “However, we have discovered a seasonal signal in this X-ray background, which has no conventional explanation, but is consistent with the discovery of axions.”
As the late Professor Fraser explains in the paper: “It appears plausible that axions – dark matter particle candidates – are indeed produced in the core of the Sun and do indeed convert to X-rays in the magnetic field of the Earth.”
A sketch (not to scale) showing axions (blue) streaming out from the Sun, converting in the Earth's magnetic field (red) into X-rays (orange), which are then detected by the XMM-Newton observatory. Credit: University of Leicester
It is predicted that the X-ray signal due to axions will be greatest when looking through the sunward side of the magnetic field, because this is where the field is strongest. Each of these ghostly particles is extraordinarily light, with a vanishingly small mass – just 1/100 billionth that of an electron, or a million times less than a neutrino.
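The quoted mass ratio can be sanity-checked against the electron mass. A rough sketch, assuming the short-scale reading of "1/100 billionth" (a factor of 10^-11); the implied mass lands in the micro-electronvolt range commonly quoted for axion models:

```python
# Implied axion mass from the "1/100 billionth of an electron" figure.
ELECTRON_MASS_KG = 9.109e-31   # electron rest mass
ELECTRON_MASS_EV = 5.11e5      # electron rest energy, ~511 keV

ratio = 1e-11                  # 1/100 billionth (assumed short-scale billion)

axion_mass_kg = ELECTRON_MASS_KG * ratio
axion_mass_ev = ELECTRON_MASS_EV * ratio

print(axion_mass_ev)  # ~5e-06 eV, i.e. a few micro-electronvolts
```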
Dr. Read concludes: “These exciting discoveries, in George's final paper, could be truly ground-breaking, potentially opening a window to new physics, and could have huge implications, not only for our understanding of the true X-ray sky, but also for identifying the dark matter that dominates the mass content of the cosmos.”
President of the Royal Astronomical Society, Professor Martin Barstow, said: “This is an amazing result. If confirmed, it will be the first direct detection and identification of the elusive dark matter particles and will have a fundamental impact on our theories of the Universe.”
We may know a lot more about dark matter in the coming years – thanks to a string of new observatories including the Euclid Space Telescope (2020), the European Extremely Large Telescope (2022) and the Advanced Technology Large-Aperture Space Telescope (2025). Dr. Read's team also plans to double the dataset from XMM-Newton and look at the results with more precision over the next few years.
This week, Lockheed Martin announced plans for a small-scale fusion power plant to be developed in as little as 10 years. A number of experts have expressed doubts over its viability.
If it ever became a reality, fusion power would be truly world-altering – a clean, safe and essentially limitless supply of energy allowing humanity's continued survival for centuries and millennia to come. The international project known as ITER is planned for operation in 2022 and its eventual successor may emerge in the 2040s. Widespread deployment of fusion is not expected until 2070.
U.S. defence giant Lockheed Martin hopes to accelerate progress in this area, by developing what it calls a compact fusion reactor (CFR). This would be around 10 times smaller than conventional tokamak designs, small enough to fit on the back of a truck and generating 100 megawatts (MW) of power. The company intends to build a prototype within five years – according to its press release – with commercial introduction five years after that. It has several patents pending for the work and is looking for partners in academia, industry and among government laboratories.
As illustrated above, the main improvement over ITER would be the use of a superconducting torus to create a differently shaped magnetic field, able to contain plasma far better than previous configurations. These small reactors could be fitted in U.S. Navy warships and submarines while eliminating the need for other fuel types. They could power small cities of up to 100,000 people, allow planes to fly with unlimited range, or even be used in spacecraft to cut journey times to Mars from six months to a single month. Using a CFR, the cost of desalinated water could fall by 60 percent.
If this sounds too good to be true, it may well be. Although Lockheed has been successful in its magnetised ion confinement experiments, a number of significant challenges remain for a working prototype with plasma confinement – let alone a commercialised version.
"I think it's very overplayed," University of California nuclear engineering professor Dr. Edward Morse told The Register. "They are being very cagey about divulging details."
"Getting net energy from fusion is such a goddamn difficult undertaking," said University of Texas physicist Dr. Swadesh M. Mahajan, in an interview with Mother Jones. "We know of no materials that would be able to handle anywhere near that amount of heat."
"The nuclear engineering clearly fails to be cost effective," Tom Jarboe told Business Insider in an email.
For these reasons, it is perhaps best to wait for more news and developments before adding the CFR to our timeline. We will, of course, keep you updated on Lockheed's progress as it emerges. You can also discuss this project on our forum.
Samsung Electronics has developed a new way of transmitting Wi-Fi data five times faster than was previously possible. The new technology is expected to be available in consumer devices as early as 2015.
If you've been to a cafe or other public place recently and been frustrated at the slow speed of Wi-Fi, a new breakthrough by Samsung Electronics may soon change that. Researchers at the company have this week developed 60GHz Wi-Fi allowing transfer rates of 4.6Gbps, or 575MB per second – 5.3 times faster than the previous maximum speed for consumer devices (866Mbps, or 108MB per second).
Today's generation of Wi-Fi uses the 2.4GHz and 5GHz bands of the radio spectrum. The 60GHz band is currently unlicensed and offers major potential, but previous attempts to exploit it have failed to send data over significant distances, due to path loss and weak penetration properties. Samsung has overcome these issues through a combination of millimetre-wave circuit design, a high-performance modem and a wide-coverage beam-forming antenna. This eliminates co-channel interference, regardless of the number of devices using the same network.
Commercialisation is expected in 2015, with Samsung planning integration into a wide variety of products – including audio visual, medical devices and telecommunications equipment. It will also help to spur the Internet of Things.
“Samsung prides itself on being at the forefront of technology innovation, and is delighted to have overcome the barriers to the commercialisation of 60GHz millimetre-wave band Wi-Fi technology,” said Paul Templeton, General Manager of Samsung Networks UK. “This breakthrough has opened the door to exciting possibilities for Samsung’s next-generation devices, and has also changed the face of the future development of Wi-Fi technology, promising innovations that were not previously within reach.”
To give an idea of the speed: a 1GB movie will take less than three seconds to transfer between devices, while uncompressed high-definition videos could easily be streamed from mobile devices to TVs in real-time without any delay.
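The speed figures convert straightforwardly between bits and bytes (8 bits per byte, decimal units). A quick check of the numbers quoted above:

```python
# Sanity-check the throughput figures quoted in the article.
def mbps_to_mb_per_s(mbps):
    """Convert megabits per second to megabytes per second."""
    return mbps / 8  # 8 bits per byte

new_speed = mbps_to_mb_per_s(4600)   # 60GHz Wi-Fi: 4.6 Gbps
old_speed = mbps_to_mb_per_s(866)    # previous consumer maximum

speedup = 4600 / 866                 # ratio of the two raw bit rates
movie_transfer_s = 1024 / new_speed  # a 1 GB (1024 MB) movie

print(round(new_speed))            # 575 MB/s
print(round(old_speed))            # 108 MB/s
print(round(speedup, 1))           # 5.3x
print(round(movie_transfer_s, 2))  # well under three seconds
```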
NASA has announced finding several Kuiper Belt Objects that may be targeted by the New Horizons spacecraft, following its flyby of the Pluto system in July 2015.
Peering into the dim, outer reaches of our Solar System, NASA's Hubble Space Telescope has uncovered three Kuiper Belt Objects (KBOs) that the agency's New Horizons spacecraft could potentially visit after it flies by Pluto in July 2015. The KBOs were detected by a search team who were awarded telescope time for this purpose, following a committee recommendation earlier this year.
"This has been a very challenging search, and it's great that in the end Hubble could accomplish a detection — one NASA mission helping another," said Alan Stern of the Southwest Research Institute (SwRI) in Boulder, Colorado, principal investigator of the New Horizons mission.
The Kuiper Belt is a vast rim of primordial debris encircling our Solar System. KBOs belong to a unique class of Solar System objects that has never been visited by spacecraft and contains clues to the origin of our Solar System.
The KBOs that Hubble found are each about 10 times larger than typical comets, but only about 1-2 percent of the size of Pluto. Unlike asteroids, KBOs have not been heated by the Sun, and are thought to represent a pristine, well preserved, deep-freeze sample of what the outer Solar System was like following its birth 4.6 billion years ago. The KBOs found in the Hubble data are thought to be the building blocks of dwarf planets such as Pluto.
The New Horizons team started to look for suitable KBOs in 2011 using some of the largest ground-based telescopes on Earth. They found several dozen KBOs, but none were reachable within the fuel supply available aboard the New Horizons spacecraft.
"We started to get worried that we could not find anything suitable – even with Hubble – but in the end, the space telescope came to the rescue," said team member John Spencer of SwRI. "There was a huge sigh of relief when we found suitable KBOs; we are 'over the moon' about this detection."
Following an initial proof of concept of the Hubble pilot observing program in June, the New Horizons team was awarded telescope time by the Space Telescope Science Institute for a wider survey in July. When the search was completed in early September, the team identified one KBO that is "definitely reachable" and two other potentially accessible KBOs that will require more tracking over several months to know whether they too are accessible by the New Horizons spacecraft.
This was a needle-in-a-haystack search for the New Horizons team, because the elusive KBOs are extremely small, faint, and difficult to pick out against myriad background stars in the constellation Sagittarius, the present direction of Pluto. The three KBOs identified are each about 1 billion miles beyond Pluto. Two of the KBOs are estimated to be as large as 34 miles (55 km) across, and the third is perhaps as small as 15 miles (25 km).
The New Horizons spacecraft, launched in 2006 from Florida, is the first mission in NASA's New Frontiers Program. Once a NASA mission completes its prime mission, the agency conducts an extensive science and technical review to determine whether extended operations are warranted.
The New Horizons team expects to submit such a proposal to NASA in late 2016 for an extended mission to fly by one of the newly identified KBOs. Hurtling across the Solar System, the New Horizons spacecraft would reach the distance of 4 billion miles from the Sun roughly three to four years after its July 2015 Pluto encounter. Accomplishing such a KBO flyby would substantially increase the science return from the New Horizons mission.
Generating electricity from onshore wind is cheaper than gas, coal and nuclear when externalities are stacked with the levelised cost of energy and subsidies, according to a new study ordered and endorsed by the European Commission.
A new report by the energy consultancy firm Ecofys has been analysed by the European Wind Energy Association (EWEA). Data in the report shows that onshore wind now has an approximate cost of €105 per megawatt hour (MWh), which is cheaper than gas (up to €164), nuclear (€133) and coal (between €162 and €233). Offshore wind comes in at €186 and solar PV has a cost of around €217 per MWh.
The total cost of energy production – which factors in externalities such as air quality, climate change and human toxicity among others – shows that coal is more expensive than the highest retail electricity price in the EU. The report puts the figure of external costs of the EU's energy mix in 2012 at between €150 and €310 billion (US$190 and US$394 billion).
Justin Wilkes, deputy chief executive officer of the European Wind Energy Association, said: "This report highlights the true cost of Europe's dependence on fossil fuels. Renewables are regularly denigrated for being too expensive and a drain on the taxpayer. Not only does the Commission's report show the alarming cost of coal but it also presents onshore wind as both cheaper and more environmentally-friendly."
Onshore and offshore wind technologies also have room for significant cost reduction. Coal on the other hand is a fully mature technology and is unlikely to reduce costs any further.
He added: "We are heavily subsidising the dirtiest form of electricity generation while proponents use coal's supposed affordability as a justification for its continued use. The irony is that coal is the most expensive form of energy in the European Union. This report shows that we should use the 2030 climate and energy package as a foundation for increasing the use of wind energy in Europe to improve our competitiveness, security and environment."
New measurements reveal there is half as much dark matter in our galaxy as previously thought, helping to solve the 15-year-old "missing satellite galaxy" problem.
Credit: ESO/L. Calçada
New measurements of dark matter in our own Milky Way galaxy reveal there is half as much of the mysterious substance as previously thought. Astronomers from the International Centre for Radio Astronomy Research (ICRAR) used a method developed almost 100 years ago to discover that the weight of dark matter in our galaxy is 800 billion (8 × 10¹¹) times the mass of the Sun. They probed the edge of the Milky Way, looking closely, for the first time, at the fringes about 5 million trillion kilometres from Earth.
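Those figures are easier to picture in astronomical units. A quick conversion sketch — the constants below are standard reference values, not figures from the ICRAR release:

```python
# Convert the quoted dark matter mass and distance into familiar units.
SOLAR_MASS_KG = 1.989e30       # mass of the Sun
KM_PER_LIGHT_YEAR = 9.461e12
KM_PER_KILOPARSEC = 3.086e16

halo_mass_kg = 8e11 * SOLAR_MASS_KG   # 8 x 10^11 solar masses
fringe_km = 5e18                      # "5 million trillion kilometres"

fringe_ly = fringe_km / KM_PER_LIGHT_YEAR    # roughly 530,000 light-years
fringe_kpc = fringe_km / KM_PER_KILOPARSEC   # roughly 160 kiloparsecs

print(f"{halo_mass_kg:.2e} kg")  # ~1.6e+42 kg
print(round(fringe_ly))
```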
Astrophysicist Dr Prajwal Kafle said we have known for a while that most of the Universe is hidden: “Stars, dust, you and me, all the things that we see, only make up about 4 per cent of the entire Universe. About 25 per cent is dark matter and the rest is dark energy.”
Dr Kafle was able to measure the mass of the dark matter in the Milky Way by studying the speed of stars throughout the galaxy, including the edges, which had never been studied in this detail before. He used a robust technique developed by British astronomer James Jeans in 1915 – decades before the discovery of dark matter. This new calculation helps to solve a mystery that has been haunting theorists for almost two decades.
“The current idea of galaxy formation and evolution – called the Lambda Cold Dark Matter theory – predicts that there should be a handful of big satellite galaxies around the Milky Way that are visible with the naked eye, but we don’t see that,” Dr Kafle said. “When you use our measurement of the mass of dark matter, the theory predicts that there should only be three satellite galaxies out there, which is exactly what we see; the Large Magellanic Cloud, the Small Magellanic Cloud and the Sagittarius Dwarf Galaxy.”
University of Sydney astrophysicist Prof. Geraint Lewis, who was also involved in the research, said the missing satellite problem had been “a thorn in the cosmological side for almost 15 years.”
“Dr Kafle’s work has shown that it might not be as bad as everyone thought, although there are still problems to overcome," he said.
The study also presented a holistic model of the Milky Way, which allowed the scientists to calculate several interesting factors, such as the speed required to leave the galaxy.
“Be prepared to hit 550 kilometres per second if you want to escape the gravitational clutches of our galaxy,” Dr Kafle said. “A rocket launched from Earth needs just 11 kilometres per second to leave its surface.”
Researchers at Harvard University have turned human embryonic stem cells into cells that produce insulin, a potentially major advance for sufferers of diabetes.
Harvard researchers have made a giant leap forward in the quest to find a truly effective treatment for type 1 diabetes, a condition that affects an estimated 22 million people worldwide. With human embryonic stem cells as a starting point, the scientists produced for the first time – in the kind of massive quantities needed for cell transplantation and pharmaceutical uses – human insulin-producing beta cells equivalent in almost every way to normally functioning beta cells.
“We are now just one pre-clinical step away from the finish line,” says Prof. Douglas Melton, who led the work and has been researching the disease for nearly 25 years. “You never know for sure that something like this is going to work until you’ve tested it numerous ways. We’ve given these cells three separate challenges with glucose in mice and they’ve responded appropriately; that was really exciting. It was gratifying to know that we could do something that we always thought was possible, but many people felt it wouldn’t work. If we had shown this was not possible, then I would have had to give up on this whole approach. Now I’m really energised.”
Elaine Fuchs, a Professor at Rockefeller University, who is not involved in the research, hailed it as “one of the most important advances to date in the stem cell field, and I join the many people throughout the world in applauding my colleague for this remarkable achievement.”
“For decades, researchers have tried to generate human pancreatic beta cells that could be cultured and passaged long term under conditions where they produce insulin,” Fuchs continued. “Melton and his colleagues have now overcome this hurdle and opened the door for drug discovery and transplantation therapy in diabetes.”
Jose Oberholzer, Associate Professor at the University of Illinois at Chicago, said the work “will leave a dent in the history of diabetes. Doug Melton has put in a life-time of hard work in finding a way of generating human islet cells in vitro. He made it. This is a phenomenal accomplishment.”
Prof. Doug Melton, Harvard University
Type 1 diabetes is an autoimmune metabolic condition in which the body kills off all the pancreatic beta cells that produce the insulin needed for glucose regulation in the body. Thus, the final pre-clinical step in the development of a treatment involves protecting from immune system attack the approximately 150 million cells that would have to be transplanted into each patient being treated. Melton is collaborating with colleagues on the development of an implantation device to protect the cells. The device currently being tested has thus far protected beta cells implanted in mice from immune attack for many months. “They are still producing insulin,” Melton said.
Cell transplantation as a treatment for diabetes is still essentially experimental, uses cells from cadavers, requires the use of powerful immunosuppressive drugs, and has been available to only a very small number of patients.
Daniel G. Anderson from MIT, who is working with Melton on the implantation device, said the new work by Melton’s lab is “an incredibly important advance for diabetes. There is no question that the ability to generate glucose-responsive, human beta cells through controlled differentiation of stem cells will accelerate the development of new therapeutics. In particular, this advance opens the doors to an essentially limitless supply of tissue for diabetic patients awaiting cell therapy.”
“There have been previous reports of other labs deriving beta cell types from stem cells,” said Melton. “No other group has produced mature beta cells as suitable for use in patients. The biggest hurdle has been to get to glucose sensing, insulin-secreting beta cells, and that’s what our group has done.”
Human transplantation trials using the cells are expected to start in the next few years. Melton's work was published yesterday in the journal Cell.
The University of Washington is developing a new fusion reactor design that could be one-tenth the cost of ITER – while producing five times the amount of energy.
Fusion energy sounds almost too good to be true – zero greenhouse gas emissions, no long-lived radioactive waste, and a nearly unlimited fuel supply. Perhaps the biggest roadblock to adopting fusion energy is that the economics haven't worked out. Fusion power designs aren't cheap enough to outperform systems that use fossil fuels such as coal and natural gas.
Engineers at the University of Washington (UW) hope to change that. They have designed a concept for a fusion reactor that, when scaled up to the size of a large electrical power plant, would rival costs for a new coal-fired plant with similar electrical output. The team will present its reactor design and cost-analysis findings on 17th October at the Fusion Energy Conference in St. Petersburg, Russia.
“Right now, this design has the greatest potential of producing economical fusion power of any current concept,” says Thomas Jarboe, a UW professor of aeronautics and astronautics and an adjunct professor in physics.
The reactor – called the dynomak – began as a class project taught by Jarboe two years ago. After the class had ended, Jarboe and doctoral student Derek Sutherland (who previously worked on a reactor design at MIT) continued to develop and refine the concept.
The design builds on existing technology and creates a magnetic field within a closed space to hold plasma in place long enough for fusion to occur, allowing the hot plasma to react and burn. The reactor itself would be largely self-sustaining, meaning it would continuously heat the plasma to maintain thermonuclear conditions. Heat generated from the reactor would heat up a coolant that is used to spin a turbine and generate electricity, similar to how a typical power reactor works.
“This is a much more elegant solution, because the medium in which you generate fusion is the medium in which you’re also driving all the current required to confine it,” Sutherland says.
There are several ways to create a magnetic field, which is crucial to keeping a fusion reactor going. The UW’s design is known as a spheromak – meaning it generates the majority of magnetic fields by driving electrical currents into the plasma itself. This reduces the amount of required materials and actually allows researchers to shrink the overall size of the reactor.
Other designs, such as the ITER experimental fusion reactor being built in France – due to be operational in 2022 – have to be much larger than UW’s because they rely on superconducting coils that circle around the outside of the device to provide a similar magnetic field. When compared with the fusion reactor concept in France, the UW’s is much less expensive – about one-tenth the cost of ITER – while producing five times the amount of energy.
The UW researchers estimated the cost of building a fusion power plant using their design and compared it with the cost of building a coal power plant. They used a metric called “overnight capital cost” – the cost of constructing a plant as if it were built overnight, covering all construction and startup infrastructure costs but excluding interest accrued during construction. A fusion power plant producing a gigawatt (1 billion watts) of power would cost $2.7 billion, while a coal plant of the same output would cost $2.8 billion, according to their analysis.
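The comparison above reduces to simple per-watt arithmetic. A minimal sketch, using the dollar figures quoted in the analysis (the per-watt division and the ITER cost-per-energy factor are our own arithmetic, not from the study):

```python
# Overnight capital cost per watt for the two 1 GW plants in the UW analysis.
PLANT_OUTPUT_W = 1e9        # 1 gigawatt of electrical output
FUSION_COST_USD = 2.7e9     # dynomak-based fusion plant, as quoted
COAL_COST_USD = 2.8e9       # coal plant of the same output, as quoted

fusion_per_watt = FUSION_COST_USD / PLANT_OUTPUT_W
coal_per_watt = COAL_COST_USD / PLANT_OUTPUT_W
print(f"Fusion: ${fusion_per_watt:.2f}/W vs coal: ${coal_per_watt:.2f}/W")
# → Fusion: $2.70/W vs coal: $2.80/W

# The article also quotes one-tenth of ITER's cost for five times its energy
# output, which implies a ~50-fold advantage in cost per unit of energy.
cost_ratio = 0.1       # dynomak cost relative to ITER
energy_ratio = 5.0     # dynomak energy output relative to ITER
advantage = energy_ratio / cost_ratio
print(f"Cost-per-energy advantage over ITER: ~{advantage:.0f}x")
```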
“If we do invest in this type of fusion, we could be rewarded because the commercial reactor unit already looks economical,” Sutherland said. “It’s very exciting.”
Right now, the UW’s concept is about one-tenth the size and power output of a final product, which is still years away. The researchers have successfully tested the prototype’s ability to sustain plasma efficiently, and as they further develop and expand the size of the device, they can ramp up to higher-temperature plasma and get significant fusion power output. The team has filed patents on the concept with the UW’s Centre for Commercialisation and plans to continue developing and scaling up its prototypes. The research was funded by the U.S. Department of Energy.
The evidence for global warming continues to pour in. A new study of ocean heat content shows that temperatures have been greatly underestimated in the Southern Hemisphere. As a result, the world's oceans are now absorbing between 24 and 58 per cent more energy than previously thought.
Like a fleet of miniature research vessels, more than 3,600 robotic floats provide data on upper layers of the world's ocean currents.
Scientists from Lawrence Livermore National Laboratory in California, using satellite observations and a large suite of climate models, have found that long-term ocean warming in the upper 700 metres of Southern Hemisphere oceans has been greatly underestimated.
"This underestimation is a result of poor sampling prior to the last decade, and limitations of the analysis methods that conservatively estimated temperature changes in data-sparse regions," said LLNL oceanographer Paul Durack, lead author of a paper in the 5th October issue of the journal Nature Climate Change.
Ocean heat storage is important because it accounts for over 90 percent of excess heat associated with global warming. The observed ocean and atmosphere warming is a result of continuing greenhouse gas emissions. The Southern Hemisphere oceans make up 60 percent of the world's oceans.
The researchers found that climate models simulating the relative increase in sea surface height between Northern and Southern hemispheres were consistent with highly accurate altimeter observations. However, the simulated upper-ocean warming in Northern and Southern hemispheres was inconsistent with observed estimates of ocean heat content change. These sea level and ocean heat content changes should have been consistent, suggesting that until recent improvements in observational data, Southern Hemisphere ocean heat content changes were underestimated.
Since 2004, automated profiling floats called Argo (pictured above) have been used to measure global ocean temperatures from the surface down to 2,000 m (6,560 ft). These 3,600 floats currently observing the global ocean provide systematic coverage of the Southern Hemisphere for the first time. Argo float data over the last decade, as well as earlier measurements, show that the ocean has been steadily warming, according to Durack.
"The Argo data is really critical," he said. "Estimates that we had until now have been pretty systematically underestimating the changes. Prior to 2004, research has been very limited by poor measurement coverage. Our results suggest that ocean warming has been underestimated by 24 to 58 percent. The conclusion that warming has been underestimated agrees with previous studies. However, it's the first time that scientists have tried to estimate how much heat we've missed."
Given that most of the excess heat associated with global warming is in the oceans, this study has important implications for how scientists view the Earth's overall energy budget. Heat currently stored by the oceans will eventually be released, accelerating warming on land and triggering more extreme climate events.
"We continue to be stunned at how rapidly the ocean is warming," said Sarah Gille, a Scripps Institution of Oceanography professor who was not involved in the study. "Even if we stopped all greenhouse gas emissions today, we'd still have an ocean that is warmer than the ocean of 1950, and that heat commits us to a warmer climate. Extra heat means extra sea level rise, since warmer water is less dense, so a warmer ocean expands."
"An important result of this paper is the demonstration that the oceans have continued to warm over the past decade, at a rate consistent with estimates of Earth’s net energy imbalance," says Prof. Steve Rintoul, from Australia’s Commonwealth Scientific and Industrial Research Organisation. "While the rate of increase in surface air temperatures slowed in the last 10 to 15 years, the heat stored by the planet, which is heavily dominated by the oceans, has steadily increased as greenhouse gases have continued to rise."
These new results are consistent with another new paper that appears in the same issue of Nature Climate Change. Co-author Felix Landerer of NASA's Jet Propulsion Laboratory, who contributed to both studies, says, "Our other new study on deep-ocean warming found that from 2005 to the present, Argo measurements recorded a continuing warming of the upper-ocean. Using the latest available observations, we're able to show that this upper-ocean warming and satellite measurements are consistent."
In related news, a report by Edinburgh's Heriot-Watt University – based on the work of 30 experts – finds that ocean acidification has increased by 26% since pre-industrial times. It is now causing nearly $1 trillion of damage to coral reefs each year, threatening the livelihoods of 400 million people.
New research published in The Lancet suggests that, with sustained international efforts, the number of premature deaths could be reduced by 40% over the next two decades (2010–2030), halving under-50 mortality and preventing a third of the deaths at ages 50–69 years.
The Lancet reveals that, between 2000 and 2010, child deaths fell by one-third worldwide, helped by the fourth Millennium Development Goal (MDG) to reduce child deaths by two-thirds; and premature deaths among adults fell by one-sixth, helped by MDG 5 to reduce maternal mortality and MDG 6 to fight AIDS, malaria and other diseases. With expanded international efforts against a wider range of causes, these rates of decrease could accelerate, say the study authors.
The most striking change during 2000–2010 was a two-thirds reduction in childhood deaths from the diseases now controlled by vaccination (diphtheria, pertussis, tetanus, polio, and measles), highlighting what targeted international efforts can achieve.
“Death in old age is inevitable, but death before old age is not”, said co-author Richard Peto, Professor of medical statistics at the University of Oxford, UK. “In all major countries, except where the effects of HIV or political disturbances predominated, the risk of premature death has been decreasing in recent decades, and it will fall even faster over the next few decades if the new UN Sustainable Development Goals get the big causes of death taken even more seriously.”
The United Nations General Assembly has been discussing 17 Sustainable Development Goals for 2016–2030 to replace the MDGs that expire at the end of 2015. The new health goal is “Ensure healthy lives and promote well-being for all at all ages”. The group of 16 authors, writing in The Lancet, call for this new health goal to be accompanied by a specific target to avoid in each country 40% of all premature deaths (of the deaths that would occur in the 2030 population of that country, if its 2010 death rates continued).
The 40% reduction from 2010 to 2030 in deaths before age 70 would involve reductions of two-thirds in the causes already being targeted by the MDGs, and a one-third reduction in other causes of premature death, such as non-communicable diseases and injuries.
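The arithmetic behind that combined target can be sketched directly. In the sketch below, the 20% share of premature deaths attributed to MDG-targeted causes is our illustrative assumption (the true share varies by country); the two reduction fractions are those stated in the proposal:

```python
# Overall fall in under-70 deaths, given a two-thirds cut in MDG-targeted
# causes and a one-third cut in all other causes of premature death.
def overall_reduction(share_mdg_causes: float) -> float:
    """Weighted average of the two cause-specific reduction targets."""
    return (2 / 3) * share_mdg_causes + (1 / 3) * (1 - share_mdg_causes)

# With MDG-targeted causes making up 20% of premature deaths (assumed),
# the combined target comes out at the proposed 40%.
print(f"Overall reduction: {overall_reduction(0.2):.0%}")  # → 40%
```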
(A) Risk of death versus age for the world in 1970 and 2010; (B) for country income groupings in 2010. For historical comparison, the 1910 and 2010 risks for England and Wales are given.
Lead author Ole Norheim, Professor of global public health at the University of Bergen, Norway, explained, “Based on realistically moderate improvements in current trends, our proposed targets are a two-thirds reduction in child and maternal deaths and in HIV, tuberculosis, and malaria, and a one-third reduction in deaths from non-communicable diseases and injuries. For this, we are going to need improved healthcare, intensified international efforts to control communicable diseases, and more effective prevention and treatment of non-communicable diseases and injuries.”
“The most important cause of non-communicable disease is tobacco use – and one of the key determinants of smoking is the price of cigarettes”, says co-author Prabhat Jha, Director of the Centre for Global Health Research in St Michael’s Hospital, Toronto. “WHO is calling for a 30% reduction in smoking by 2025, and in many countries major increases in excise taxes that double the price of cigarettes are still possible. Such an increase would reduce smoking by about a third, but would increase the total Government tax yield from smoking by about a third.”
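Jha's figures amount to a simple elasticity calculation. A minimal sketch, assuming excise tax makes up most of the retail price (so doubling the price roughly doubles the tax collected per pack); the one-third fall in smoking is the figure he quotes:

```python
# Effect on government tax yield of doubling the excise tax per pack
# while consumption falls by about a third.
baseline_packs = 100.0        # arbitrary baseline consumption
baseline_tax_per_pack = 1.0   # arbitrary baseline excise tax

new_packs = baseline_packs * (2 / 3)              # smoking falls by a third
new_tax_per_pack = baseline_tax_per_pack * 2      # tax per pack doubles

old_yield = baseline_packs * baseline_tax_per_pack
new_yield = new_packs * new_tax_per_pack          # 2/3 × 2 = 4/3 of old yield
print(f"Tax yield change: {new_yield / old_yield - 1:+.0%}")  # → +33%
```

Fewer packs sold, but more tax per pack: the product rises by about a third even as consumption falls by a third.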
With political commitment and sustained efforts to improve health, the current rate of decline in premature death can be further accelerated. “We conclude that a 40% reduction in premature deaths is realistic in each country where mortality in 2030 is not dominated by new epidemics, political disturbances or disasters”, adds Professor Norheim.
Writing in a linked Comment, the Norwegian Ministers of Foreign Affairs and of Health and Care say, “[This] study shows what an important part science could play in the negotiations at the 69th Session of the UN General Assembly. We strongly urge the medical community to develop a common position that can enable the international community to arrive at a single health SDG with a limited number of simple, understandable and measurable targets.”
In another linked Comment, Professor Sir George Alleyne, Director Emeritus of the Pan American Health Organization (PAHO), Washington, DC, USA, and colleagues, write that, “The significant advance in this paper is to introduce quantification to the target-setting process, based on rigorous analysis of mortality trends by age as well as by disease category. The proposed targets focus on premature mortality and avoid more complex metrics which are much harder to measure and track over time. The authors stress the importance of countries adapting the targets to their own circumstances.”
This study was funded by the UK Medical Research Council, Norwegian Agency for Development Co-operation, University of Toronto Centre for Global Health Research, and Bill and Melinda Gates Foundation.
The Office of Naval Research (ONR) has announced a technological breakthrough that allows unmanned surface vehicles (USV) to not only protect Navy ships, but also, for the first time, autonomously “swarm” offensively on hostile vessels.
First-of-its-kind technology – demonstrated on the James River in Virginia – allows unmanned, self-guided vessels to overwhelm an adversary. This is achieved using a combination of sensors and software called CARACaS (Control Architecture for Robotic Agent Command and Sensing). The hardware is small and light enough to be portable and can be installed on almost any boat. It is also inexpensive, at just $2000 for each kit.
These automated patrols could leave the warships they're protecting and swarm around potential threats on the water. This technology could be utilised by the U.S. Navy within a year, defence officials say, adding that it could help stop attacks like the deadly 2000 bombing of the USS Cole.
“Our Sailors and Marines can’t fight tomorrow’s battles using yesterday’s technology,” said Chief of Naval Research, Matthew Klunder. “This kind of breakthrough is the result of the Navy’s long-term support for innovative research in science and technology.”
Without a human physically needing to be at the controls, the boats can operate in sync with other unmanned vessels – choosing their own routes; swarming to interdict enemy vessels; and escorting/protecting naval assets.
“This networking unmanned platforms demonstration was a cost-effective way to integrate many small, cheap, and autonomous capabilities that can significantly improve our warfighting advantage,” said Admiral Jonathan Greenert, Chief of Naval Operations.
“This multiplies combat power by allowing CARACaS-enabled boats to do some of the dangerous work,” said Dr. Robert Brizzolara, program manager at the ONR. “It will remove our Sailors and Marines from many dangerous situations; for instance when they need to approach hostile or suspicious vessels. If an adversary were to fire on the USVs, no humans would be at risk.”
In the tests, as many as 13 Navy boats were operating together. First they escorted a high-value Navy ship, and then, when a simulated enemy vessel was detected, the boats sped into action, swarming around the threat. This demonstration comes near the anniversary of the USS Cole bombing off the coast of Yemen. In that October 2000 terrorist attack, a small boat laden with explosives was able to get near a guided-missile destroyer and detonate, killing 17 Sailors and injuring 39 others.
Autonomous unmanned surface vehicles could play a vital role in protecting people, ports and commerce. In the future, the capability could be scaled up to include even greater numbers of USVs – and even to other platforms such as drones, helicopters and jet fighters.
"This is something that you might find not only just on our naval vessels. We could certainly see this utilised to protect merchant vessels, to protect ports and harbours; used also to protect offshore oil rigs," Klunder said.
Software company IPsoft has announced a new artificial intelligence platform named “Amelia” that makes it possible to automate knowledge work across a wide range of business functions. Exposed to the same information as any new hire, she instantly applies that knowledge to resolve queries. With Amelia able to shoulder the burden of tedious, laborious tasks, she partners with human co-workers to achieve new levels of productivity and service quality.
Whereas most other technologies demand that humans adapt their behaviour in order to interact with ‘smart machines’, Amelia is intelligent enough to interact like a human herself. She learns using the same natural language manuals as her colleagues, but in a matter of seconds. She understands the full semantic meaning of what she reads – rather than simply recognising individual words – by applying context, logic and inferring implications. Independently, rather than through time-intensive programming, Amelia creates her own process map of the information she is given so that she can work out for herself exactly what actions to take, depending on the specific problem being solved. Like a human worker, she learns from her colleagues and by observing their work, is able to continually build up knowledge.
In a fraction of the time it traditionally takes to train someone in a new role, Amelia is able to perform at a high level. What is more, as she already speaks over 20 languages, she can support international operations with ease. Her core knowledge of a process needs to be learned only once for her to be able to communicate with customers in their own language.
Much as machines transformed agriculture and manufacturing, cognitive technologies will drive the next evolution of the global workforce. In the future, companies will compete in the digital economy with a digital workforce that comprises a balance of human and virtual employees. Research firm Gartner predicts that by 2017, autonomics and cognitive platforms like Amelia will drive a 60 percent reduction in the cost of managed services. This technology is already being piloted within a number of Fortune 1000 companies and IPsoft expects to announce new customers and prominent industry partners before the end of this year.
"We want to make sure that human beings can dedicate their time to more valuable tasks. Taking out the more repetitive tasks is I think a noble aspiration for a company," said Frank Lansink, EU CEO of IPsoft, at a briefing in the firm's HQ at 30 St Mary Axe (the Gherkin). "Our purpose is to elevate human beings into a more meaningful role, adding value to society, or to enterprise, or the customer."
Doctors in Sweden have announced the first baby born to a mother with a womb transplant. This pioneering operation offers hope to thousands of couples who are unable to conceive children.
In 2013, researchers at the University of Gothenburg completed a series of nine womb transplants on women in Sweden. Among the patients was an unnamed 36-year-old with Mayer-Rokitansky-Küster-Hauser syndrome (MRKH), a rare condition that prevents the uterus from developing. Her ovaries were intact, however, so she could still ovulate. She received a uterus donated by a 61-year-old family friend, who had gone through the menopause around seven years earlier.
Drugs were needed to suppress the immune system, which would otherwise have rejected the organ. Alongside this, IVF was used to produce 11 embryos, which were frozen and stored for later use. In January 2014, a year after the transplant, doctors successfully transferred one of these embryos into the patient's new womb. There were concerns over how well a transplanted uterus would cope with the strains of pregnancy, during which it swells greatly in size. The procedure had been attempted by scientists in the past – but in each case, it ended in either miscarriage or organ failure caused by disease.
On this occasion, however, the operation was successful. There were problems in the 31st week of pregnancy – as the mother developed a condition known as pre-eclampsia (characterised by high blood pressure) – but a caesarean section delivered a healthy baby boy weighing 3.9 pounds (1.8 kg); normal for that stage of pregnancy. British medical journal The Lancet has released a photo below and is due to publish a report on the case shortly.
Credit: The Lancet
This milestone in reproductive medicine – the culmination of more than 10 years' research and surgical training – offers hope to thousands of couples who are unable to conceive children. The doctor who led the work, Prof. Mats Brännström, has issued a note of caution, however. In an interview he stated it will be "many, many years" before this operation becomes routine. This is partly because of the extremely high cost, but also because it remains a new and somewhat experimental procedure, only performed by certain specialist surgeons in select centres and requiring various further studies.
Dr Allan Pacey of the British Fertility Society says: "I think it is brilliant and revolutionary, and opens the door to many infertile women. The scale of it feels a bit like IVF. It feels like a step change. The question is: can it be done repeatedly, reliably and safely?"
"He’s no different from any other child – but he will have a good story to tell," the father says. "One day, he can look at the newspaper articles about how he was born and know that he was the first in the world to be born this way."
In this interview with CNN Money, Elon Musk says that a Tesla car able to self-drive up to 90% of the time will be launched in 2015. The company will also reveal its next electric vehicle – the model "D" – on 9th October, according to a tweet.
This year marks another milestone for the Aral Sea – a once huge lake in Central Asia that has been shrinking rapidly since the 1960s. For the first time in modern history, its eastern basin has completely dried up.
These images, taken by NASA's flagship Terra satellite, show how the Aral Sea has changed in just 14 years. It is now apparent that its eastern basin has completely dried up. The transformation is especially stark when compared to the approximate shoreline location in 1960 (black outline).
"This is the first time the eastern basin has completely dried in modern times," says Philip Micklin, a geographer from Western Michigan University and expert on the Aral Sea. "And it is likely the first time it has completely dried in 600 years, since Medieval desiccation associated with diversion of Amu Darya to the Caspian Sea."
In the 1950s and 60s, the government of the former Soviet Union diverted the Amu Darya and the Syr Darya – the region's two major rivers – in order to irrigate farmland. This diversion began the lake's gradual retreat. By the year 2000, the lake had separated into the North (Small) Aral Sea in Kazakhstan and the South (Large) Aral Sea in Uzbekistan. The South Aral had further split into western and eastern lobes.
The rusting remains of abandoned boats in the Aral Sea, Kazakhstan.
The eastern lobe of the South Aral nearly dried in 2009, then saw a huge rebound in 2010. Water levels continued to fluctuate annually in alternately dry and wet years.
According to Micklin, the desiccation in 2014 occurred because there has been less rain and snow in the watershed that starts in the Pamir Mountains; this has greatly reduced water flow on the Amu Darya. In addition, huge amounts of river water continue to be withdrawn for irrigation. The Kok-Aral Dam across the Berg Strait – a channel that connects the northern Aral Sea with the southern part – played some role, but has not been a major factor this year, he said.
Formerly the world's fourth largest lake (pictured below in 1964), the Aral Sea is often described as the worst ecological disaster on the planet. With its eastern half now gone, what remains of the western half is expected to vanish by 2019.