5th December 2016
Construction of practical quantum computers radically simplified
Scientists at the University of Sussex have invented a ground-breaking new method that puts the construction of large-scale quantum computers within reach of current technology.
Quantum computers could solve certain problems – that would take the fastest supercomputer millions of years to calculate – in just a few milliseconds. They have the potential to create new materials and medicines, as well as solve long-standing scientific and financial problems.
Universal quantum computers can be built in principle, but the technological challenges are tremendous. The engineering required to build one is considered more difficult than manned space travel to Mars – until now.
Quantum computing experiments on a small scale using trapped ions (charged atoms) are carried out by aligning individual laser beams onto individual ions with each ion forming a quantum bit. However, a large-scale quantum computer would need billions of quantum bits, therefore requiring billions of precisely aligned lasers, one for each ion.
Instead, scientists at the University of Sussex have invented a simple method where voltages are applied to a quantum computer microchip (without having to align laser beams) – to the same effect. The team also succeeded in demonstrating the core building block of this new method with an impressively low error rate.
Credit: University of Sussex
"This development is a game changer for quantum computing making it accessible for industrial and government use," said Professor Winfried Hensinger, who heads the Ion Quantum Technology Group at the university and is director of the Sussex Centre for Quantum Technologies. "We will construct a large-scale quantum computer at Sussex making full use of this exciting new technology."
Quantum computers may revolutionise society in much the same way as the emergence of classical computers did. "Developing this step-changing new technology has been a great adventure and it is absolutely amazing observing it actually work in the laboratory," said Hensinger's colleague, Dr Seb Weidt.
The Ion Quantum Technology Group forms part of the UK's National Quantum Technology Programme, a £270 million investment by the government to accelerate the introduction of quantum technologies into the marketplace.
A paper on this latest research, 'Trapped-ion quantum logic with global radiation fields', is published in the journal Physical Review Letters.
Professor Winfried Hensinger (left) and Dr Seb Weidt (right).
29th November 2016
The speed of light could be variable, say researchers
Scientists behind a theory that the speed of light is variable – and not constant as Einstein suggested – have produced a model with an exact figure on the spectral index, which they say is testable.
Einstein observed that the speed of light remains the same in any situation, and this meant that measurements of space and time could differ for different observers.
The assumption that the speed of light is fixed, and always has been, underpins many theories in physics, such as Einstein's theory of general relativity. It plays an especially important role in models of what happened during the very early universe, seconds after the Big Bang.
But some researchers have suggested that the speed of light could have been much higher in this early universe. Now, one of this theory's originators, Professor João Magueijo from Imperial College London, working with Dr Niayesh Afshordi at the Perimeter Institute in Canada, has made a prediction that could be used to test the theory's validity.
Large structures, such as galaxies, all formed from fluctuations in the early universe – tiny differences in density from one region to another. A record of these early fluctuations is imprinted on the cosmic microwave background – a map of the oldest light in the universe – in the form of a 'spectral index'.
Working with their theory that the fluctuations were influenced by a varying speed of light in the early universe, Professor Magueijo and Dr Afshordi have now used a model to put an exact figure on the spectral index. The predicted figure and model it is based on are published this month in the peer-reviewed journal Physical Review D.
Cosmologists have been getting ever more precise readings of this figure, so the prediction could soon be tested – either confirming or ruling out the team's model of the early universe. Their figure is a very precise 0.96478. This is close to the current estimate of readings of the cosmic microwave background, which puts it around 0.968, with some margin of error.
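For readers unfamiliar with the term, the spectral index appears in the standard parametrisation of the primordial fluctuation spectrum (a textbook convention, not notation specific to this paper):

```latex
% Power of density fluctuations as a function of spatial scale k:
\mathcal{P}(k) \;\propto\; k^{\,n_s - 1},
% where n_s = 1 would be a perfectly scale-invariant spectrum. The varying-speed-of-light
% model predicts n_s = 0.96478, while current CMB measurements give n_s of roughly 0.968.
```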
"The theory, which we first proposed in the late-1990s, has now reached a maturity point – it has produced a testable prediction. If observations in the near future do find this number to be accurate, it could lead to a modification of Einstein's theory of gravity," explains Professor Magueijo. "The idea that the speed of light could be variable was radical when first proposed – but with a numerical prediction, it becomes something physicists can actually test. If true, it would mean that the laws of nature were not always the same as they are today."
The testability of the varying speed of light theory sets it apart from the more mainstream rival theory: inflation. Inflation says that the early universe went through an extremely rapid expansion phase, much faster than the current rate of expansion of the universe.
Credit: By Yinweichen, [CC BY-SA 3.0], via Wikimedia Commons
These theories are necessary to overcome what physicists call the 'horizon problem'. The universe as we see it today appears to be everywhere broadly the same. For example, it has a relatively homogeneous density.
This could only be true if all regions of the universe were able to influence each other. However, if the speed of light has always been the same, then not enough time has passed for light to have travelled to the edges of the universe and 'evened out' the energy.
As an analogy, to heat up a room evenly, the warm air from radiators at either end has to travel across the room and mix fully. The problem for the universe is that the 'room' – the observed size of the universe – appears to be too large for this to have happened in the time since it was formed.
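A standard way of putting a number on the problem (a textbook figure, not one taken from the paper):

```latex
% If light has always travelled at its present speed, patches of the cosmic microwave
% background separated on the sky by more than roughly
\theta_{\text{causal}} \sim 1\text{--}2^{\circ}
% could never have exchanged signals before the CMB was emitted, yet its temperature is
% uniform to about one part in 100{,}000 across the whole sky.
```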
The varying speed of light theory suggests that the speed of light was much higher in the early universe, allowing the distant edges to be connected as the universe expanded. The speed of light would have then dropped in a predictable way as the density of the universe changed. This variability led the team to their prediction published this month.
The alternative theory is inflation, which attempts to solve this problem by saying that the very early universe "evened out" while incredibly small, and then suddenly expanded, with the uniformity already imprinted on it. While this means the speed of light and the other laws of physics as we know them are preserved, it requires the invention of an 'inflation field' – a set of conditions that only existed at the time.
3rd November 2016
1,000-fold increase in 3-D scanning speed
Researchers at Penn State University report a 1,000-fold increase in the scanning speed for 3-D printing, using a space-charge-controlled KTN beam deflector with a large electro-optic effect.
A major technological advance in the field of high-speed beam-scanning devices has resulted in a speed boost of up to 1000 times, according to researchers in Penn State's College of Engineering. Using a space-charge-controlled KTN beam deflector – a kind of crystal made of potassium tantalate and potassium niobate – with a large electro-optic effect, researchers have found that scanning at a much higher speed is possible.
"When the crystal materials are applied to an electric field, they generate uniform reflecting distributions, that can deflect an incoming light beam," said Professor Shizhuo Yin, from the School of Electrical Engineering and Computer Science. "We conducted a systematic study on indications of speed and found out the phase transition of the electric field is one of the limiting factors."
To overcome this issue, Yin and his team of researchers eliminated the electric field-induced phase transition in a nanodisordered KTN crystal by operating it at a higher temperature. They not only went beyond the Curie temperature (above which a material loses its permanent magnetic or ferroelectric ordering), they went beyond the critical end point (at which the distinction between two coexisting phases – such as a liquid and its vapour – disappears).
Credit: Penn State
This increased the scanning speed from the microsecond range to the nanosecond range, opening the way to improvements in high-speed imaging, broadband optical communications, and ultrafast laser display and printing. The researchers believe this could lead to a new generation of 3-D printers, with objects that once took an hour to print now taking a matter of seconds.
Yin said technology like this would be especially useful in the medical industry, where it would make high-speed imaging possible in real time. For example, eye specialists who use a non-invasive test based on light waves to take cross-sectional pictures of a patient's retina would be able to view a 3-D image of the retina during surgery, so they can see what needs to be corrected as the procedure takes place.
The group's findings are published in the journal Scientific Reports.
29th October 2016
New world record for fusion reactor
Massachusetts Institute of Technology has announced a new record for plasma pressure in an Alcator C-Mod tokamak nuclear fusion reactor – achieving over two atmospheres of pressure for the first time.
Credit: Bob Mumgaard/Plasma Science and Fusion Center
Scientists and engineers from the Plasma Science and Fusion Center at the Massachusetts Institute of Technology (MIT) have made a leap forward in the pursuit of clean energy. They report a new record for plasma pressure in the Institute's Alcator C-Mod tokamak nuclear fusion reactor, pictured above. Plasma pressure is the key ingredient to producing energy from nuclear fusion, and MIT's new result achieves over two atmospheres of pressure for the first time. Senior researcher, Earl Marmar, presented the results at the IAEA Fusion Energy Conference, in Kyoto, Japan, which ran from 17–22 October.
Nuclear fusion has the potential to produce nearly unlimited supplies of clean, safe, carbon-free energy. Fusion is the same process that powers the Sun, and it can be realised in reactors that simulate the conditions of ultrahot miniature "stars" of plasma – superheated gas – that are contained within a magnetic field.
For over half a century, it has been known that to make fusion viable on the Earth's surface, the plasma must be very hot (more than 50 million degrees), must be stable under intense pressure, and must be contained in a fixed volume. Successful fusion also requires that the product of three factors – a plasma's particle density, its confinement time, and its temperature – known as the "triple product", reaches a certain value. Above this value, the energy released from a reactor exceeds the energy required to keep the reaction going.
Pressure, which is the product of density and temperature, accounts for about two-thirds of the challenge. The amount of power produced increases with the square of the pressure – so doubling the pressure leads to a fourfold increase in energy production.
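In symbols – a shorthand for the relations quoted above, rather than a formula taken from the MIT announcement:

```latex
% Plasma pressure is the product of particle density and temperature,
p = n \, k_B T ,
% and fusion power density rises with the square of the pressure,
P_{\text{fusion}} \;\propto\; p^{2} ,
% so doubling the pressure gives (2p)^2 = 4p^2 -- four times the fusion power.
```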
During the 23 years Alcator C-Mod has been in operation, it has repeatedly advanced the record for plasma pressure in a magnetic confinement device. The previous record of 1.77 atmospheres was in 2005 (also at Alcator C-Mod). While setting the new record of 2.05 atmospheres, a 15% improvement, the temperature inside Alcator C-Mod reached over 35 million degrees Celsius, or twice as hot as the centre of the sun. The plasma produced 300 trillion fusion reactions per second and had a central magnetic field strength of 5.7 tesla. It carried 1.4 million amps of electrical current and was heated with over 4 million watts of power. The reaction occurred in a volume of approximately 1 cubic metre (not much larger than a coat closet) and the plasma lasted for two full seconds.
Other fusion experiments conducted in reactors similar to Alcator have reached these temperatures before, but at pressures closer to 1 atmosphere; MIT's result exceeded the next highest pressure achieved in non-Alcator devices by 70 percent.
"This is a remarkable achievement that highlights the highly successful Alcator C-Mod program at MIT," says Dale Meade, former deputy director at the Princeton Plasma Physics Laboratory, who was not directly involved in the experiments. "The record plasma pressure validates the high-magnetic-field approach as an attractive path to practical fusion energy."
"This result confirms that the high pressures required for burning plasma can be best achieved with high-magnetic-field tokamaks such as Alcator C-Mod," says Riccardo Betti, Professor of Mechanical Engineering and Physics and Astronomy at the University of Rochester.
Alcator C-Mod is the world's only compact, high-magnetic-field fusion reactor with advanced shaping in a design called a tokamak, which confines the superheated plasma in a doughnut-shaped chamber. Its high-intensity magnetic field – up to eight tesla, or 160,000 times the Earth's magnetic field – allows the device to create the dense, hot plasmas and keep them stable at such incredibly high temperatures. Its magnetic field is more than double what is typically used in other reactor designs, which quadruples its ability to contain plasma pressure.
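The quadrupling follows from the plasma beta limit – the ratio of plasma pressure to magnetic pressure that a given design can sustain. A rough sketch of the scaling (standard plasma physics, not a figure from the MIT team):

```latex
% At a fixed achievable beta, the maximum confined pressure scales as the square of the field:
\beta = \frac{p}{B^{2}/2\mu_{0}}
\quad\Longrightarrow\quad
p_{\max} = \beta_{\max}\,\frac{B^{2}}{2\mu_{0}} \;\propto\; B^{2},
% so doubling B raises the pressure the device can contain by a factor of four.
```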
Unfortunately, while Alcator C-Mod's contributions to the advancement of fusion energy have been significant, the facility has now been officially closed following this latest experiment. In 2012, the Department of Energy (DOE) decided to cease funding, due to budget pressures from the construction of the ITER project, which is due to be switched on in 2022. Following that decision, Congress restored funding for a few more years – but that funding has now ended.
C-Mod was third in the line of high-magnetic-field tokamaks built and operated at MIT. Unless a new device is announced and constructed, the pressure record just set in C-Mod will likely stand for the next 15 years. ITER will be approximately 800 times larger in volume than Alcator C-Mod, but will operate at a lower magnetic field. ITER is expected to reach 2.6 atmospheres when it reaches full operation by 2032.
The Alcator C-Mod team celebrates the record-setting plasma discharge on its last day of operation.
Credit: Jim Irby/Plasma Science and Fusion Center
20th October 2016
Quantum computers: 10-fold boost in stability achieved
A team at Australia's University of New South Wales has created a new quantum bit that remains in a stable superposition for 10 times longer than previously achieved.
Credit: Arne Laucht/UNSW
Australian engineers have created a new quantum bit which remains in a stable superposition for 10 times longer than previously achieved, dramatically expanding the time during which calculations could be performed in a silicon quantum computer.
The new quantum bit, consisting of the spin of a single atom in silicon merged with an electromagnetic field – known as a 'dressed qubit' – retains quantum information for much longer than 'undressed' atoms, opening up new avenues for building and operating the superpowerful quantum computers of the future.
"We have created a new quantum bit where the spin of a single electron is merged together with a strong electromagnetic field," comments Arne Laucht from the School of Electrical Engineering & Telecommunications at University of New South Wales (UNSW), lead author of the paper. "This quantum bit is more versatile and more long-lived than the electron alone, and will allow us to build more reliable quantum computers."
Building a quantum computer is a difficult and ambitious challenge, but one with the potential to deliver revolutionary tools for otherwise impossible calculations – such as the design of complex drugs and advanced materials, or the rapid search of massive, unsorted databases. Its speed and power lie in the fact that quantum systems can exist in 'superpositions' of many different states at once; a quantum computer treats these as inputs that are all processed at the same time.
"The greatest hurdle in using quantum objects for computing is to preserve their delicate superpositions long enough to allow us to perform useful calculations," said Andrea Morello, Program Manager in the Centre for Quantum Computation & Communication Technology at UNSW. "Our decade-long research program had already established the most long-lived quantum bit in the solid state, by encoding quantum information in the spin of a single phosphorus atom inside a silicon chip placed in a static magnetic field," he said.
What Laucht and colleagues did was push this further: "We have now implemented a new way to encode the information: we have subjected the atom to a very strong, continuously oscillating electromagnetic field at microwave frequencies, and thus we have 'redefined' the quantum bit as the orientation of the spin with respect to the microwave field."
Tuning gates (red), microwave antenna (blue), and single electron transistor used for spin readout (yellow).
Credit: Guilherme Tosi & Arne Laucht/UNSW
The results are striking: since the electromagnetic field steadily oscillates at a very high frequency, any noise or disturbance at a different frequency results in a zero net effect. The UNSW researchers achieved an improvement by a factor of 10 in the time span during which a quantum superposition can be preserved, with a dephasing time of T2*=2.4 milliseconds.
"This new 'dressed qubit' can be controlled in a variety of ways that would be impractical with an 'undressed qubit'," adds Morello. "For example, it can be controlled by simply modulating the frequency of the microwave field, just like an FM radio. The 'undressed qubit' instead requires turning the amplitude of the control fields on and off, like an AM radio. In some sense, this is why the dressed qubit is more immune to noise: the quantum information is controlled by the frequency, which is rock-solid, whereas the amplitude can be more easily affected by external noise."
Since the device is built upon standard silicon technology, this result paves the way to the construction of powerful and reliable quantum processors based on the same fabrication processes already used for today's computers. The UNSW team leads the world in developing silicon quantum computing, and Morello's team is part of a consortium that has struck an A$70 million deal – between UNSW, researchers, business, and the Australian government – to develop a prototype silicon quantum integrated circuit, a major step towards building the world's first quantum computer in silicon.
A functional quantum computer would allow massive increases in speed and efficiency for certain computing tasks – even when compared with today's fastest silicon-based 'classical' computers. In a number of key areas – such as searching enormous databases, solving complicated sets of equations, and modelling atomic systems such as biological molecules or drugs – they would far surpass today's computers. They would also be extremely useful in the finance and healthcare industries, and for government, security and defence organisations.
Quantum computers could identify and develop new medicines by vastly accelerating the computer-aided design of pharmaceutical compounds (minimising lengthy trial and error testing), and develop new, lighter and stronger materials spanning consumer electronics to aircraft. They would also make possible new types of computing applications and solutions that are beyond our ability to foresee.
The UNSW study appears this week in the peer-reviewed journal, Nature Nanotechnology.
30th July 2016
Vortex laser offers hope for Moore's Law
A new laser that travels in a corkscrew pattern is shown to carry at least ten times more information than conventional lasers, potentially offering a way to extend Moore's Law.
Like a whirlpool, a new light-based communication tool carries data in swift, circular motions. This optics advancement could become a central component of the next generation of computers designed to handle society's growing demand for information sharing. It may also help to ease concerns about the predicted end of Moore's Law – the decades-long trend of computers becoming ever smaller, faster and cheaper as more components are packed onto each chip.
"To transfer more data while using less energy, we need to rethink what's inside these machines," says Liang Feng, PhD, assistant professor in the Department of Electrical Engineering at the University at Buffalo's (UB) School of Engineering and Applied Sciences.
For decades, researchers have been able to cram exponentially increasing numbers of components onto silicon-based chips. Their success explains why a typical handheld smartphone has more computing power than the world's most powerful computers of the 1980s, which cost millions in today's dollars and were the size of a large filing cabinet.
But researchers are approaching a bottleneck, in which existing technology may no longer meet society's demand for data. Predictions vary, but many suggest this could happen within the next five years. The problem is being addressed in numerous ways, including optical communications, which use light to carry information. Examples of optical communications range from old lighthouses to the modern fibre optic cables used to watch television and browse the web. Lasers are a key part of today's optical communication systems, and researchers have been manipulating them in various ways – most commonly by funnelling different signals into one path – to pack more information together. But these techniques are also reaching their limits.
The UB-led research team is pushing laser technology further using another light-control method, known as orbital angular momentum, which distributes the laser in a corkscrew pattern with a vortex at the centre. Such vortex lasers are usually too large to work with today's computers, but the team was able to shrink theirs to the point where it is compatible with modern chips. Because the laser beam travels in a corkscrew pattern, encoding information into different vortex twists, it can deliver at least ten times the information of conventional lasers, which move linearly.
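For context, the "twists" referred to above can be written in standard textbook notation (this parametrisation is general to orbital angular momentum beams, not specific to the Science paper):

```latex
% An orbital-angular-momentum beam carries a phase that winds around the beam axis:
E_{\ell}(r,\phi) \;\propto\; A(r)\, e^{\,i\ell\phi}, \qquad \ell = 0, \pm 1, \pm 2, \dots
% Modes with different integer winding numbers \ell are mutually orthogonal, so several
% data streams can share a single beam -- the origin of the tenfold-or-more capacity gain.
```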
However, the vortex laser is just one component of many – such as advanced transmitters and receivers – which will ultimately be needed to continue building more powerful computers and data centres in the future.
The study was published yesterday in the peer-reviewed journal Science. The research was supported with grants from the U.S. Army Research Office, the U.S. Department of Energy and National Science Foundation.
12th June 2016
Four new additions to the periodic table
The International Union of Pure and Applied Chemistry (IUPAC) has proposed the final names of four new additions to the periodic table.
Following earlier reports that the criteria for the discovery of these elements have been fulfilled, the discoverers were invited to propose names, which are now disclosed for public review:
• Nihonium and symbol Nh, for the element 113
• Moscovium and symbol Mc, for the element 115
• Tennessine and symbol Ts, for the element 117
• Oganesson and symbol Og, for the element 118
By Sandbh (Own work) [CC BY-SA 4.0]
The IUPAC Inorganic Chemistry Division has reviewed and considered these proposals and recommends them for acceptance. A five-month public review is now being held, expiring on 8th November 2016, prior to formal approval by the IUPAC Council.
The guidelines for naming elements were recently revised and shared with discoverers to assist in their proposals. Keeping with tradition, a newly discovered element can be named after:
(a) a mythological concept or character (including an astronomical object),
(b) a mineral or similar substance,
(c) a place, or geographical region,
(d) a property of the element, or
(e) a scientist.
Nihonium, with atomic number 113, is a synthetic element (one that can be created in a laboratory but is not found in nature) and is extremely radioactive; its most stable known isotope, nihonium-286, has a half-life of just 20 seconds. The name comes from one of the pronunciations of the Japanese word for Japan (nihon), which literally means "the Land of the Rising Sun". The research team hopes that pride and faith in science will help to restore the trust of those who suffered from the 2011 Fukushima nuclear disaster.
Moscovium, with atomic number 115, is in recognition of the Moscow region and honours the ancient Russian land that is the home of the Joint Institute for Nuclear Research, where the discovery experiments were conducted using the Dubna Gas-Filled Recoil Separator, in combination with the heavy ion accelerator capabilities of the Flerov Laboratory of Nuclear Reactions. Like nihonium, it is extremely radioactive; its most stable isotope has a half-life of only 220 milliseconds. About 100 atoms of moscovium have been observed to date.
Tennessine, with atomic number 117, recognises the contribution of the Tennessee region, including Oak Ridge National Laboratory, Vanderbilt University, and the University of Tennessee at Knoxville, to superheavy element research, including the production and chemical separation of unique actinide target materials for superheavy element synthesis at the High Flux Isotope Reactor (HFIR) and Radiochemical Engineering Development Centre.
Oganesson, with atomic number 118, was discovered by teams at the Joint Institute for Nuclear Research, Dubna (Russia) and Lawrence Livermore National Laboratory (USA). The name is in line with the tradition of honouring a scientist and recognises Professor Yuri Oganessian (born 1933), who played a leading role in discovering the heaviest elements of the periodic table, made significant advances in the nuclear physics of superheavy nuclei and produced experimental evidence for the "island of stability". Oganesson has the highest atomic number and mass of all known elements. It is extremely unstable, due to its high mass, and since 2005 only three or possibly four atoms of the isotope oganesson-294 have been detected.
"It is a pleasure to see that specific places and names (country, state, city, and scientist) related to the new elements is recognised in these four names. Although these choices may perhaps be viewed by some as slightly self-indulgent, the names are completely in accordance with IUPAC rules", commented Jan Reedijk, who corresponded with each team and invited the discoverers to make proposals. "In fact, I see it as thrilling to recognise that international collaborations were at the core of these discoveries and that these new names also make the discoveries somewhat tangible."
Ultimately, and after the lapse of the public review in November, the final recommendations will be published in the journal Pure and Applied Chemistry.
12th February 2016
Gravitational waves detected for the first time
In a historic scientific landmark, researchers have announced the first detection of gravitational waves, as predicted by Einstein's general theory of relativity 100 years ago. This major discovery opens a new era of astronomy.
Credits: R. Hurt/Caltech-JPL
For the first time, scientists have directly observed "ripples" in the fabric of spacetime called gravitational waves, arriving at the Earth from a cataclysmic event in the distant universe. This confirms a major prediction of Einstein’s 1915 general theory of relativity and opens an unprecedented new window onto the cosmos.
The observation was made at 09:50:45 GMT on 14th September 2015, when two black holes collided. However, given the enormous distance involved and the time required for light to reach us, this event actually occurred some 1.3 billion years ago, during the mid-Proterozoic Eon. For context, this is so far back that multicellular life here on Earth was only just beginning to spread. The signal came from the Southern Celestial Hemisphere, in the rough direction of (but much further away than) the Magellanic Clouds.
The two black holes were spinning together as a binary pair, turning around each other several tens of times a second, until they eventually collided at half the speed of light. These objects were 36 and 29 times the mass of our Sun. As their event horizons merged, they became one – like two soap bubbles in a bath. During the fraction of a second that this happened, three solar masses were converted to gravitational waves, and for a brief instant the event hit a peak power output 50 times that of the entire visible universe.
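As a back-of-the-envelope check on those figures, using only the numbers above and E = mc²:

```latex
% About three solar masses were radiated away as gravitational waves:
\Delta m \approx 3\,M_{\odot} \approx 6\times10^{30}\ \text{kg}
\quad\Rightarrow\quad
E = \Delta m\,c^{2} \approx 6\times10^{30}\ \text{kg}\times\left(3\times10^{8}\ \text{m/s}\right)^{2}
\approx 5\times10^{47}\ \text{J},
% released within a fraction of a second -- hence the extraordinary peak power quoted above.
```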
The gravitational waves were detected by both of the twin Laser Interferometer Gravitational-wave Observatory (LIGO) detectors, located in Livingston, Louisiana, and Hanford, Washington, USA. The LIGO Observatories are funded by the National Science Foundation (NSF), and were conceived, built, and are operated by Caltech and MIT. The discovery was published yesterday in the journal Physical Review Letters.
Prof. Stephen Hawking told BBC News: "Gravitational waves provide a completely new way of looking at the Universe. The ability to detect them has the potential to revolutionise astronomy. This discovery is the first detection of a black hole binary system and the first observation of black holes merging. Apart from testing General Relativity, we could hope to see black holes through the history of the Universe. We may even see relics of the very early Universe during the Big Bang at some of the most extreme energies possible."
"There is a Nobel Prize in it – there is no doubt," said Prof. Karsten Danzmann, from the Max Planck Institute for Gravitational Physics and Leibniz University in Hannover, Germany, who collaborated on the study. In an interview with the BBC, he claimed the significance of this discovery is on a par with the determination of the structure of DNA.
"It is the first ever direct detection of gravitational waves; it's the first ever direct detection of black holes and it is a confirmation of General Relativity because the property of these black holes agrees exactly with what Einstein predicted almost exactly 100 years ago."
"We found a beautiful signature of the merger of two black holes and it agrees exactly – fantastically – with the numerical solutions to Einstein equations ... it looked too beautiful to be true."
LIGO measurement of gravitational waves at the Hanford (left) and Livingston (right) detectors, compared to the theoretical predicted values.
By Abbott et al. [CC BY 3.0]
"Scientists have been looking for gravitational waves for decades – but we’ve only now been able to achieve the incredibly precise technologies needed to pick up these very, very faint echoes from across the universe," said Danzmann. "This discovery would not have been possible without the efforts and the technologies developed by the Max Planck, Leibniz Universität, and UK scientists working in the GEO collaboration."
Researchers at the LIGO Observatories were able to measure the tiny and subtle disturbances the waves made to space and time as they passed through the Earth, with machines detecting changes far smaller than the width of an atom. At each observatory, the two-and-a-half-mile (4 km) long L-shaped LIGO interferometer uses laser light split into two beams that travel back and forth along tubes kept at a near-perfect vacuum. The beams are used to monitor the distance between mirrors precisely positioned at the ends of the arms. According to Einstein's theory, the distance between the mirrors will change by an infinitesimal amount when gravitational waves pass by the detector. A change in the lengths of the arms smaller than one-ten-thousandth the diameter of a proton can be detected – equivalent to measuring a distance of several light years to within the width of a human hair.
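As a rough order-of-magnitude check on what that sensitivity means (an estimate from the figures above, not a number quoted by LIGO):

```latex
% One ten-thousandth of a proton diameter (~1.7 x 10^{-15} m) over a 4 km arm:
h = \frac{\Delta L}{L}
\approx \frac{10^{-4}\times 1.7\times10^{-15}\ \text{m}}{4\times10^{3}\ \text{m}}
\approx 4\times10^{-23},
% a dimensionless strain -- the fractional stretching of space the interferometer can register.
```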
"The Advanced LIGO detectors are a tour de force of science and technology, made possible by a truly exceptional international team of technicians, engineers, and scientists," says David Shoemaker of MIT. "We are very proud that we finished this NSF-funded project on time and on budget."
"We spent years modelling the gravitational-wave emission from one of the most extreme events in the universe: pairs of massive black holes orbiting with each other and then merging. And that’s exactly the kind of signal we detected!" says Prof. Alessandra Buonanno, director at the Max Planck Institute for Gravitational Physics in Potsdam.
"With this discovery, we humans are embarking on a marvellous new quest: the quest to explore the warped side of the universe – objects and phenomena that are made from warped spacetime," says Kip Thorne, Feynman Professor of Theoretical Physics at Caltech. "Colliding black holes and gravitational waves are our first beautiful examples."
Advanced LIGO is among the most sensitive instruments ever built. During its next observing stage, it is expected to detect five more black hole mergers and to detect around 40 binary star mergers each year, in addition to an unknown number of more exotic gravitational wave sources, some of which may not be anticipated by current theory.
29th January 2016
New research challenges long-held views on time evolution
Research into the nature of time by Griffith University's Centre for Quantum Dynamics shows how an asymmetry for time reversal might be responsible for making the universe move forwards in time.
New research from Griffith University's Centre for Quantum Dynamics is broadening perspectives on time and space. In a study published by the journal Proceedings of the Royal Society A, Associate Professor Joan Vaccaro challenges the long-held assumption that time evolution – the incessant unfolding of the universe over time – is an elemental part of Nature. In the paper, titled Quantum asymmetry between time and space, she suggests there may be a deeper origin due to a difference between the two directions of time: to the future and to the past.
"If you want to know where the universe came from and where it's going, you need to know about time," she says. "Experiments on subatomic particles over the past 50 years ago show that Nature doesn't treat both directions of time equally.
"In particular, subatomic particles called K and B mesons behave slightly differently, depending on the direction of time. When this subtle behaviour is included in a model of the universe, what we see is the universe changing from being fixed at one moment in time to continuously evolving.
"In other words, the subtle behaviour appears to be responsible for making the universe move forwards in time. Understanding how time evolution comes about in this way opens up a whole new view on the fundamental nature of time itself. It may even help us to better understand bizarre ideas such as travelling back in time."
According to her research, an asymmetry exists between time and space in the sense that physical systems inevitably evolve over time, whereas there is no corresponding ubiquitous translation over space. This asymmetry, long presumed to be elemental, is represented by equations of motion and conservation laws that operate differently over time and space.
However, Associate Professor Vaccaro used a "sum-over-paths formalism" to demonstrate the possibility of a time and space symmetry, meaning the conventional view of time evolution would need to be revisited.
"In the connection between time and space, space is easier to understand because it's simply there. But time is forever forcing us towards the future," says Vaccaro. "Yet while we are indeed moving forward in time, there is also always some movement backwards – a kind of jiggling effect – and it is this movement I want to measure using these K and B mesons."
Associate Professor Vaccaro says the research provides a solution to the origin of dynamics, an issue that has long perplexed science.
11th December 2015
Fusion reactor begins testing in Germany
The first helium plasma test has been successfully conducted at the Wendelstein 7-X fusion device in northeastern Germany. Tests with hydrogen plasma will begin in 2016.
The first helium plasma was produced yesterday in the Wendelstein 7-X fusion device at the Max Planck Institute for Plasma Physics (IPP) in Greifswald, northeastern Germany. Following more than a year of technical preparations and tests, experimental operation has now commenced according to plan. Wendelstein 7-X, the world's largest stellarator-type fusion device, will investigate the suitability of this type of device for a commercial power station.
After nine years of construction work and over a million assembly hours, the Wendelstein 7-X was completed in April 2014. Operational preparations have been underway ever since. Each technical system was tested in turn: the vacuum in the vessels, the cooling system, the superconducting coils and the magnetic field they produce, the control system, and the heating devices and measuring instruments.
On 10th December 2015, the day had arrived: the operating team in the control room started up the magnetic field and initiated the computer-operated experiment control system. This fed around one milligram of helium gas into the evacuated plasma vessel and switched on the microwave heating for a short 1.3 megawatt pulse – and the first plasma was observed by the installed cameras and measuring devices. The exact moment of ignition was captured on video.
“We’re starting with a plasma produced from the noble gas helium,” explains project leader, Professor Thomas Klinger: “We’re not changing over to the actual investigation object, a hydrogen plasma, until next year. This is because it’s easier to achieve the plasma state with helium. In addition, we can clean the surface of the plasma vessel with helium plasmas.”
The first plasma in the machine had a duration of one tenth of a second and achieved a temperature of around one million ºC. “We’re very satisfied”, concludes Dr. Hans-Stephan Bosch, whose division is responsible for the operation. “Everything went according to plan.” The next task will be to extend the duration of the plasma discharges and to investigate the best method of producing and heating helium plasmas using microwaves. After a break for New Year, the confinement studies will continue in January, which will prepare the way for producing the first plasma from hydrogen.
The Wendelstein 7-X fusion device. Photo: IPP, Thorsten Bräuer
The Wendelstein 7-X is the largest fusion device built using the "stellarator" concept, a name that alludes to harnessing the power source of the stars. It is planned to operate with up to 30 minutes of continuous plasma discharge, demonstrating an essential feature of a future power plant: continuous operation. By contrast, tokamaks such as ITER can only operate in pulses without auxiliary equipment.
The Wendelstein 7-X is based on a five field-period Helias configuration. The device is essentially a toroid, consisting of 50 non-planar and 20 planar superconducting magnetic coils, each around 3.5 m high, which induce a magnetic field that prevents the plasma from colliding with the reactor walls. The 20 planar coils are used for adjusting the magnetic field. It aims for a plasma temperature of 60 to 130 million K.
Stellarators were popular in the 1950s and 60s, but the much better results from tokamak designs led to them falling from favour in the 1970s. Wendelstein 7-X, however, aims to put the quality of the plasma equilibrium and confinement on a par with that of a tokamak for the very first time, potentially offering a new pathway to reliable fusion power.
Scheme of coil system (blue) and plasma (yellow) of the Wendelstein 7-X. A magnetic field line is highlighted in green on the plasma surface shown in yellow. Credit: Max Planck Institute for Plasma Physics [CC BY 3.0]
4th December 2015
1,000-fold increase in 3-D imaging resolution
A new system developed by MIT can increase the resolution of conventional 3-D imaging devices by 1,000 times.
Researchers at the Massachusetts Institute of Technology (MIT) have shown that by exploiting the polarisation of light – the physical phenomenon behind polarised sunglasses and most 3-D movie systems – they can increase the resolution of conventional 3-D imaging devices by up to 1,000 times. This technique could lead to high-quality 3-D cameras built into smartphones, or the ability to snap photos of objects and then use 3-D printing to produce accurate replicas. Further out, the work may also improve the ability of driverless cars to see in rain, snow and other reduced-visibility conditions.
"Today, they can miniaturise 3-D cameras to fit on cellphones," says Achuta Kadambi, a PhD student in the MIT Media Lab and one of the system's developers. "But they make compromises to the 3-D sensing, leading to very coarse recovery of geometry. That's a natural application for polarisation, because you can still use a low-quality sensor, and adding a polarising filter gives you something that's better than many machine-shop laser scanners."
The researchers have described their new system – which they call Polarised 3D – in a paper to be presented at the International Conference on Computer Vision later this month.
Their experimental setup consisted of a Microsoft Kinect – which gauges depth using reflection time – combined with an ordinary polarising photographic lens placed in front of its camera. In each experiment, they took three photos of an object, rotating the polarising filter each time, and their algorithms compared the light intensities of the resulting images.
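A minimal per-pixel sketch of what those three rotated-filter measurements can determine is shown below. It uses the standard transmitted-intensity model for a rotating polariser; the filter angles and the synthetic pixel values are made-up illustration numbers, and the processing in the Polarised 3D paper is considerably more involved, since it also fuses these polarisation cues with the Kinect's coarse depth map.

```python
import numpy as np

# Transmitted-intensity model for a rotating polarising filter:
#   I(phi) = A + B*cos(2*phi) + C*sin(2*phi)
# Three measurements at known filter angles determine A, B and C, and hence the
# degree and angle of polarisation at that pixel. Angles below are illustrative.
filter_angles = np.deg2rad([0.0, 45.0, 90.0])

# Design matrix: one row per filter angle
M = np.stack([np.ones_like(filter_angles),
              np.cos(2 * filter_angles),
              np.sin(2 * filter_angles)], axis=1)

def recover(intensities):
    """Solve for (A, B, C) from three intensity readings, then return the
    degree of polarisation and the polarisation angle (radians)."""
    A, B, C = np.linalg.solve(M, intensities)
    degree = np.sqrt(B**2 + C**2) / A
    angle = 0.5 * np.arctan2(C, B)
    return degree, angle

# Synthetic test pixel: mean intensity 1.0, 30% polarised at 20 degrees
true_degree, true_angle = 0.3, np.deg2rad(20.0)
I = 1.0 * (1 + true_degree * np.cos(2 * (filter_angles - true_angle)))

print(recover(I))   # ~ (0.30, 0.349 rad), matching the synthetic inputs
```

The recovered polarisation angle constrains the local orientation of the surface at each pixel, which is the extra geometric information the team combines with the Kinect's depth estimate.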
On its own, at a distance of several metres, the Kinect can resolve physical features as small as a centimetre or so across. But with the addition of the polarisation information, the hybrid system was able to resolve features in the range of tens of micrometres: one-thousandth the size. For comparison, they also imaged several of their test objects with a high-precision laser scanner, which requires that the object be inserted into the scanner bed. Polarised 3D still offered the higher resolution.
A mechanically rotated polarisation filter would probably be impractical in a cellphone camera, but grids of tiny polarisation filters that can overlay individual pixels in a light sensor would work. The paper also offers the tantalising prospect that polarisation systems may help in the development of self-driving cars. Experimental self-driving vehicles of today are reliable under normal illumination conditions – but their vision algorithms go haywire in rain, snow, or fog, due to water particles in the air scattering light in unpredictable ways. Polarised 3D could exploit information contained in interfering waves of light to handle scattering.
Yoav Schechner, associate professor of electrical engineering, comments on the research: "The work fuses two 3-D sensing principles, each having pros and cons. One principle provides the range for each scene pixel – the state of the art for most 3-D imaging systems. The second principle does not provide range. On the other hand, it derives the object slope, locally. In other words, per scene pixel, it tells how flat or oblique the object is."
"The work uses each principle to solve problems associated with the other principle," Schechner explains. "Because this approach practically overcomes ambiguities in polarisation-based shape sensing, it can lead to wider adoption of polarisation in the toolkit of machine-vision engineers."