Leaders of top robotics and AI companies have called on the UN to enact a worldwide ban on killer robots.
The letter, whose signatories hail from 26 countries and include Tesla's Elon Musk, reads as follows:
An Open Letter to the United Nations Convention on Certain Conventional Weapons
As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm. We warmly welcome the decision of the UN's Conference of the Convention on Certain Conventional Weapons (CCW) to establish a Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems. Many of our researchers and engineers are eager to offer technical advice to your deliberations.
We commend the appointment of Ambassador Amandeep Singh Gill of India as chair of the GGE. We entreat the High Contracting Parties participating in the GGE to work hard at finding means to prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilizing effects of these technologies. We regret that the GGE's first meeting, which was due to start today (August 21, 2017), has been cancelled due to a small number of states failing to pay their financial contributions to the UN. We urge the High Contracting Parties therefore to double their efforts at the first meeting of the GGE now planned for November.
Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora's box is opened, it will be hard to close. We therefore implore the High Contracting Parties to find a way to protect us all from these dangers.
A key organiser of the letter, Toby Walsh, Scientia Professor of Artificial Intelligence at the University of New South Wales in Sydney, said: "Nearly every technology can be used for good and bad, and artificial intelligence is no different. It can help tackle many of the pressing problems facing society today: inequality and poverty, the challenges posed by climate change and the ongoing global financial crisis. However, the same technology can also be used in autonomous weapons to industrialise war. We need to make decisions today choosing which of these futures we want. I strongly support the call by many humanitarian and other organisations for a UN ban on such weapons, similar to bans on chemical and other weapons."
Ryan Gariepy, founder of Clearpath Robotics, commented: "The number of prominent companies and individuals who have signed this letter reinforces our warning that this is not a hypothetical scenario, but a very real, very pressing concern which needs immediate action. Unlike other potential manifestations of AI, which still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability."
Stuart Russell, founder and Vice-President of Bayesian Logic, said: "Unless people want to see new weapons of mass destruction – in the form of vast swarms of lethal microdrones – spreading around the world, it's imperative to step up and support the United Nations' efforts to create a treaty banning lethal autonomous weapons. This is vital for national and international security."
If the potential for intelligent life to exist somewhere in the universe is so large, then where is everybody? In a new paper, an astrophysicist argues that species such as ours go extinct soon after attaining high levels of technology.
Credit: T.A.Rector and B.A.Wolpa/NOAO/AURA/NSF
The universe is incomprehensibly vast, with billions of other planets circling billions of other stars. The potential for intelligent life to exist somewhere out there should be enormous.
So, where is everybody?
That's the Fermi paradox in a nutshell. Daniel Whitmire, a retired astrophysicist who teaches mathematics at the University of Arkansas, once thought the cosmic silence indicated we as a species lagged far behind.
"I taught astronomy for 37 years," he says. "I used to tell my students that by statistics, we have to be the dumbest guys in the galaxy. After all we have only been technological for about 100 years while other civilisations could be more technologically advanced than us by millions or billions of years."
Recently, however, he's changed his mind. By applying a statistical concept called the principle of mediocrity – the idea that in the absence of any evidence to the contrary, we should consider ourselves typical, rather than atypical – Whitmire has concluded that instead of lagging behind, our species may be average. That's not good news.
In a paper published this month in the International Journal of Astrobiology, Whitmire argues that if we are typical, it follows that species such as ours go extinct soon after attaining technological knowledge.
This argument is based on two observations: (1) We are the first technological species to evolve on Earth, and (2) We are early in our technological development. (He defines "technological" as a biological species that has developed electronic devices and can significantly alter the planet.)
The first observation seems obvious, but as Whitmire notes in his paper, researchers believe the Earth should be habitable for animal life for at least another billion years into the future. Based on how long it took proto-primates to evolve into a technological species, that leaves enough time for it to happen again up to 23 times. On that time scale, there could have been others before us, but there's nothing in the geologic record to indicate we weren't the first: "We'd leave a heck of a fingerprint if we disappeared overnight," Whitmire notes.
By Whitmire's definition we became "technological" after the Industrial Revolution and the invention of radio, or roughly 100 years ago. According to the principle of mediocrity, a bell curve of the ages of all extant technological civilisations in the universe would put us in the middle 95%. In other words, technological civilisations that last millions of years, or longer, would be highly atypical. Since we are first, other typical technological civilisations should also be first. The principle of mediocrity allows no second acts. The implication is that once species become technological, they flame out and take the biosphere with them.
Whitmire argues that the principle holds for two standard deviations, or in this case about 200 years. But because the distribution of ages on a bell curve skews older (there is no absolute upper limit, but the age can't be less than zero), he doubles that figure and comes up with 500 years, give or take. Assuming a bell-shaped curve is not absolutely necessary – other assumptions give roughly similar results.
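Whitmire's reasoning can be sketched with a short calculation. The following is a toy model only, not his actual analysis: it assumes civilisation ages follow a right-skewed (lognormal) distribution whose median matches our own ~100-year technological age, and asks where the middle 95% lies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: ages of extant technological civilisations drawn from a
# right-skewed (lognormal) distribution with a median of ~100 years,
# matching our own age. The principle of mediocrity says a typical
# observer should fall inside the middle 95% of this distribution.
ages = rng.lognormal(mean=np.log(100), sigma=0.5, size=1_000_000)

lo, hi = np.percentile(ages, [2.5, 97.5])
print(f"middle 95% of ages: {lo:.0f} to {hi:.0f} years")

# Civilisations lasting millions of years sit far out in the upper
# tail -- highly atypical under this model.
print(f"fraction older than 1 million years: {(ages > 1e6).mean():.6f}")
```

The sigma value here is arbitrary; the qualitative point – that mediocrity caps typical lifespans at a few centuries – holds for other reasonable choices of distribution, as Whitmire himself notes.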
There's always the possibility that we are atypical and our species' lifespan will fall somewhere in the outlying 5% of the bell curve. If that's the case, we're back to the nugget of wisdom Whitmire taught his astronomy students for more than three decades.
"If we're not typical, then my initial observation would be correct," he said. "We would be the dumbest guys in the galaxy by the numbers."
Researchers at Brown University report the transmission of data through a terahertz multiplexer at 50 gigabits per second, which could lead to a new generation of ultra-fast Wi-Fi.
Credit: Mittleman lab / Brown University
Multiplexing, the ability to send multiple signals through a single channel, is a fundamental feature of any voice or data communication system. A team of researchers has now demonstrated a method for multiplexing data carried on terahertz waves – high-frequency radiation that could enable the next generation of ultra-high bandwidth wireless networks.
Writing in the journal Nature Communications, they describe the transmission of two real-time video signals through a terahertz multiplexer at an aggregate data rate of 50 gigabits per second, 100 times the peak data rate of today's fastest cellular networks.
"We showed that we can transmit separate data streams on terahertz waves at very high speeds and with very low error rates," said Daniel Mittleman, professor in Brown University's School of Engineering and the paper's corresponding author. "This is the first time anybody has characterised a terahertz multiplexing system using actual data, and our results show that our approach could be viable in future terahertz wireless networks."
Current voice and data networks use microwaves to carry signals wirelessly. But like most forms of information technology, the demand for data transmission is growing exponentially, and quickly becoming more than microwave networks can handle. Terahertz waves have higher frequencies than microwaves and therefore a much larger capacity to carry data. However, scientists have only just begun experimenting with terahertz frequencies and many of the basic components needed for such communication don't exist yet.
A system for multiplexing and demultiplexing (also known as mux/demux) is one of those basic components. It's a technology that allows one cable to carry multiple TV channels, or hundreds of users to access a Wi-Fi network.
The mux/demux approach Mittleman and his colleagues developed uses two metal plates placed parallel to each other to form a waveguide, as shown in the illustration below. One plate has a slit cut into it. When a terahertz wave travels through the waveguide, some of the radiation leaks out of the slit. The angle at which radiation beams escape is dependent upon the frequency of the wave.
"We can put several waves at several different frequencies – each of them carrying a data stream – into the waveguide, and they won't interfere with each other because they're different frequencies; that's multiplexing," Mittleman said. "Each of those frequencies leaks out of the slit at a different angle, separating the data streams; that's demultiplexing."
Due to the nature of terahertz waves, signals in terahertz communications networks will propagate as directional beams, not omnidirectional broadcasts like in existing wireless systems. This directional relationship between propagation angle and frequency is key to enabling mux/demux in terahertz systems. A user at a particular location (and therefore at a particular angle from the multiplexing system) will communicate on a particular frequency.
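The frequency-to-angle relationship can be illustrated with the standard leaky-wave formula for a parallel-plate waveguide in its lowest TE mode. This is a textbook sketch of the underlying physics; the plate separation used below is an assumed example value, not a figure from the paper.

```python
import math

def emission_angle_deg(freq_hz: float, plate_sep_m: float) -> float:
    """Angle (from the waveguide axis) at which radiation of a given
    frequency leaks from the slit of a parallel-plate waveguide in
    its lowest TE mode.

    Standard leaky-wave relation: cos(theta) = sqrt(1 - (f_c/f)^2),
    where f_c = c / (2*b) is the mode's cutoff frequency for plate
    separation b.
    """
    c = 299_792_458.0
    f_c = c / (2 * plate_sep_m)
    if freq_hz <= f_c:
        raise ValueError("frequency below waveguide cutoff")
    cos_theta = math.sqrt(1 - (f_c / freq_hz) ** 2)
    return math.degrees(math.acos(cos_theta))

# Example: a 1 mm plate separation gives a cutoff near 0.15 THz.
# Channels at different frequencies leave the slit at clearly
# different angles -- which is what separates the streams on receive.
for f_thz in (0.2, 0.3, 0.4):
    print(f"{f_thz} THz -> {emission_angle_deg(f_thz * 1e12, 1e-3):.1f} deg")
```

Higher frequencies emerge closer to the waveguide axis, so each user's position fixes the frequency on which it communicates.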
Credit: Mittleman lab / Brown University
In 2015, Mittleman's team first published a paper describing their waveguide concept. For that initial work, they used a broadband terahertz light source to confirm that different frequencies did indeed emerge from the device at different angles. While that was an effective proof of concept, this latest work took the critical step of testing the device with real data.
The team encoded two high-definition television broadcasts onto separate terahertz frequencies and beamed them together into the multiplexer system. Their experiments showed that transmissions were error-free up to 10 gigabits per second, much faster than today's standard Wi-Fi speeds. Error rates increased somewhat when the speed was boosted to 50 gigabits per second (25 gigabits per channel), but remained well within the range that can be fixed using forward error correction, which is commonly used in today's communications networks.
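To illustrate how forward error correction recovers from residual bit errors, here is a minimal Hamming(7,4) codec – a classic textbook scheme, far simpler than the codes used in real networks, but built on the same principle: add redundant parity bits so a flipped bit can be located and corrected.

```python
def encode(d):
    """Encode 4 data bits as a 7-bit Hamming codeword.
    Parity bits sit at positions 1, 2 and 4 (1-indexed)."""
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def decode(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3   # binary syndrome = error position
    if pos:
        c[pos - 1] ^= 1          # flip the faulty bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
codeword = encode(data)
codeword[5] ^= 1                 # simulate a channel error
print(decode(codeword))          # -> [1, 0, 1, 1], error corrected
```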
The researchers plan to continue developing this and other terahertz components. Mittleman recently received a license from the FCC to perform outdoor tests at terahertz frequencies on the Brown University campus.
"We think that we have the highest-frequency license currently issued by the FCC, and we hope it's a sign that the agency is starting to think seriously about terahertz communication," he said. "Companies are going to be reluctant to develop terahertz technologies until there's a serious effort by regulators to allocate frequency bands for specific uses, so this is a step in the right direction."
A major breakthrough in artificial intelligence has been announced, with a computer beating the world's best players at competitive eSports for the first time.
OpenAI is a non-profit research company established in 2015 whose founders include Elon Musk. They report that, for the first time, a computer program has beaten the world's best human players at competitive eSports – in this case, the game Defence of the Ancients 2 (aka Dota 2).
Earlier this year, Google's AlphaGo defeated the world's number one player at Go, a board game similar to chess but with far more complexity. Dota 2, however, is orders of magnitude more complex than even Go. This multiplayer battle arena pits two teams of five against each other, with each team occupying and defending its own base on a map. Each of the ten players independently controls a powerful character, known as a "hero", each with unique abilities and styles of play; there are 113 to choose from. During a match, players and their teams collect experience points and items for their heroes in order to fight the opposing team's heroes and other defences. A team wins by being the first to destroy a large structure in the enemy base called the "Ancient".
This week, at the annual tournament hosted in Seattle, OpenAI has been demonstrating its bot, which mastered the game from scratch by self-play and does not use imitation learning or tree search. According to Greg Brockman, the company's Chief Technology Officer, self-playing is a more effective way for AI to learn complex tasks – as opposed to fighting much weaker enemies, or overwhelmingly strong ones. By playing against a copy of itself, the bot always has a worthy opponent and therefore remembers more useful information. After many thousands of practice runs, it gradually learned which moves worked best and which to avoid. It developed a number of behaviours that are demonstrated in the video below.
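The logic of self-play can be illustrated with a toy example – emphatically not OpenAI's training code: an agent playing "matching pennies" against a copy of itself, each side best-responding to the other's history of moves (a scheme known as fictitious play). Against a fixed weak opponent it would overfit; against itself, it is driven toward the balanced strategy that no opponent can exploit.

```python
# Toy self-play via fictitious play on "matching pennies".
# The "matcher" wins when both choices match; the "mismatcher" wins
# when they differ. Each side best-responds to its opponent's
# empirical history, so each always faces a worthy opponent.
counts = {"A": {"H": 1, "T": 1}, "B": {"H": 1, "T": 1}}

def best_response(role, opp_counts):
    p_heads = opp_counts["H"] / (opp_counts["H"] + opp_counts["T"])
    if role == "matcher":        # wins when the two choices match
        return "H" if p_heads > 0.5 else "T"
    else:                        # wins when the choices differ
        return "T" if p_heads > 0.5 else "H"

freq = {"H": 0, "T": 0}
for step in range(10_000):
    a = best_response("matcher", counts["B"])
    b = best_response("mismatcher", counts["A"])
    counts["A"][a] += 1          # each side observes the other's move
    counts["B"][b] += 1
    freq[a] += 1

# Empirical play converges toward the unexploitable 50/50 mix
print(f"long-run frequency of heads: {freq['H'] / 10_000:.2f}")
```

Convergence of fictitious play in zero-sum games is a classical result (Robinson, 1951); modern self-play systems replace the best-response step with deep reinforcement learning, but the incentive structure is the same.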
Whereas chess uses an 8 x 8 board, and the game Go has a 19 x 19 board (with each cell being either blank or occupied), Dota 2 is vastly more complex with a gaming "board" of 15,000 x 15,000 units, a lot of hidden information, and many more variables with every action. The OpenAI bot was able to handle this level of detail, however.
In a series of 1v1 matches, it went up against the top human champions in the world, beating them all. This included SumaiL (world's best 1v1 player) and Arteezy (top overall player in the world), as well as 27-year-old Danil "Dendi" Ishutin. At times, the movements of the bot were eerily human-like, with Dendi saying it "feels a little like [a] human, but a little like something else."
Elon Musk today praised OpenAI on Twitter. He has previously warned that AI represents a "fundamental risk to the existence of civilisation" and hopes the organisation will foster the creation of safer, more benevolent forms of machine intelligence.
"OpenAI first ever to defeat world's best players in competitive eSports. Vastly more complex than traditional board games like chess & Go," Musk tweeted.
OpenAI is now developing its bot further, and hopes it can take part in a proper five-on-five match next year, as opposed to just 1v1. The company will also explore using its neural network for other games.
Scientists and engineers at the U.S. Army Research Laboratory (ARL) in Maryland have demonstrated a new nanomaterial powder that creates large amounts of energy by simply mixing it with water.
Credit: U.S. Army Research Laboratory
The substance is described as a nano-galvanic aluminium-based powder. When mixed with water, it produces a bubbling reaction that splits the water molecules – each made of two hydrogen atoms and one oxygen atom – releasing hydrogen gas.
"The hydrogen that is given off can be used as a fuel in a fuel cell," said Scott Grendahl, a materials engineer and team leader. "What we discovered is a mechanism for a rapid and spontaneous hydrolysis of water."
It has long been known that hydrogen can be produced by adding a catalyst to aluminium. However, this normally takes time and requires elevated temperatures, added electricity and/or toxic chemicals. By contrast, the nanomaterial powder seen here does not require a catalyst and is very fast. The team calculates that one kilogram of the powder can produce 220 kilowatts of power in just three minutes – a figure that doubles if you include the heat given off by the exothermic reaction.
"That's a lot of power to run any electrical equipment," said Dr. Anit Giri, a physicist for the Weapons and Materials Research Directorate. "These rates are the fastest known without using catalysts such as an acid, base or elevated temperatures."
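A back-of-envelope stoichiometry check gives a feel for the hydrogen yield involved. This assumes, for simplicity, that the powder behaves as pure aluminium – the actual ARL material is a proprietary nanostructured alloy, so these figures are illustrative only.

```python
# Stoichiometry of the aluminium-water reaction:
#   2 Al + 6 H2O -> 2 Al(OH)3 + 3 H2
M_AL = 26.98      # g/mol, molar mass of aluminium
M_H2 = 2.016      # g/mol, molar mass of hydrogen gas
LHV_H2 = 120e6    # J/kg, lower heating value of hydrogen

mol_al = 1000 / M_AL              # moles of Al in 1 kg of powder
mol_h2 = mol_al * 3 / 2           # 3 mol H2 per 2 mol Al
kg_h2 = mol_h2 * M_H2 / 1000

print(f"H2 from 1 kg of Al: {kg_h2:.3f} kg")
print(f"chemical energy in that H2: {kg_h2 * LHV_H2 / 1e6:.1f} MJ")
```

Roughly 110 grams of hydrogen per kilogram of aluminium – which is why the compactness Kucernak mentions below matters: the fuel is stored as a dense solid rather than as pressurised gas.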
As seen in the video, the team demonstrated a small radio-controlled tank powered by the powder-and-water reaction. They believe their discovery has dramatic future potential. The material could be 3D-printed and incorporated into future air or ground robots – self-cannibalising machines that feed off their own structures, then self-destruct after mission completion. It could also serve as a portable power source, allowing recon teams to recharge mobile devices in the field.
"There are other researchers who have been searching their whole lives and their optimised product takes many hours to achieve, say 50% efficiency," Grendahl said. "Ours does it to 100% efficiency in less than three minutes."
"The important aspect of the approach is that it lets you make very compact systems," notes Anthony Kucernak from Imperial College London, who was not involved in this particular study, but is an expert on fuel cell technology. "That would be very useful for systems which need to be very light or operate for long periods on hydrogen, where the use of hydrogen stored in a cylinder is prohibitive."
Researchers in the U.S. have created "smart windows" that rapidly change opacity, depending on how sunny it is. This new technology could cut utility costs.
Engineers at Stanford University have created dynamically changing windows that can switch from transparent to opaque in only a minute – and back again in just 20 seconds – a major improvement over dimming windows currently being installed to reduce cooling costs in some buildings.
The newly designed "smart" windows consist of conductive glass plates outlined with metal ions that spread out over the surface, blocking light in response to an electrical current. The results are described in the 9th August edition of the journal Joule.
"We're excited because dynamic window technology has the potential to optimise the lighting in rooms or vehicles, and save about 20 percent in heating and cooling costs," said Michael McGehee, a professor of materials science and engineering at Stanford and senior author of the study. "It could even change the way people wear sunglasses."
Credit: Barile et al./Joule 2017
The researchers have filed a patent for their new technology and have entered into discussions with glass manufacturers and other potential partners. However, more research is needed to make the surface area of the windows large enough for commercial applications. The prototypes used in the study are only about 4 square inches (25 square centimetres) in size. The team also wants to reduce manufacturing costs to be competitive with dynamic windows already on the market.
"This is an important area that is barely being investigated at universities," McGehee said. "There's a lot of opportunity to keep us motivated."
Commercially available smart windows are made of materials, such as tungsten oxide, that change colour when charged with electricity. But these tend to be expensive, have a blue tint, can take more than 20 minutes to dim, and become less opaque over their lifetime. The Stanford prototype blocks light through the movement of a copper solution over a sheet of indium tin oxide, modified with platinum nanoparticles.
Credit: Barile et al./Joule 2017
When transparent, the window is clear and lets roughly 80 percent of incoming natural light pass through. When dark, the transmission of light drops below five percent. To test its durability, the researchers switched the windows on and off more than 5,000 times and saw no degradation in the transmission of light.
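Those transmission figures imply a substantial optical contrast, which can be quantified directly from the numbers above:

```python
import math

# Reported transmission: ~80% when clear, ~5% when dark.
t_clear, t_dark = 0.80, 0.05

print(f"contrast ratio: {t_clear / t_dark:.0f}:1")            # 16:1
print(f"change in optical density: "
      f"{math.log10(t_clear / t_dark):.2f}")                  # ~1.2
```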
"We've had a lot of moments where we thought, how is it even possible we've made something that works so well so quickly?" McGehee said. "We didn't tweak what was out there. We came up with a completely different solution."
Perhaps in some future decade, with further advances in nanotechnology, smart windows could be developed that respond to sunlight in real time and instantly change colour – while being affordable enough to feature as standard in every building and vehicle.
An international team of astronomers has found that four Earth-sized planets orbit the nearest Sun-like star, Tau Ceti, which lies 12 light years away and is visible to the naked eye. These planets have masses as low as 1.7 Earth masses, making them among the smallest planets ever detected around G-type stars near our Solar System. Two are super-Earths located in the habitable zone, meaning they could support liquid surface water.
The planets were detected by observing tiny wobbles in the movement of Tau Ceti. This required techniques sensitive enough to detect variations in the movement of the star as small as 30 centimetres (12 inches) per second.
"We are now finally crossing a threshold where, through very sophisticated modelling of large combined data sets, we can disentangle the noise due to stellar surface activity from very tiny signals generated by the gravitational tugs of Earth-sized orbiting planets," said the study co-author, Steven Vogt, Professor of Astronomy and Astrophysics at the University of California, Santa Cruz.
Illustration courtesy of Fabo Feng
According to lead author Fabo Feng at the University of Hertfordshire, UK, researchers are now tantalisingly close to the 10-centimetre-per-second limit required for detecting Earth analogues: "Our detection of such weak wobbles is a milestone in the search for Earth analogues and the understanding of the Earth's habitability through comparison with these analogues," said Feng. "We have introduced new methods to remove the noise in the data in order to reveal the weak planetary signals."
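The 10-centimetre-per-second figure follows from the standard radial-velocity formula. A short calculation for an Earth-Sun analogue – assuming a circular, edge-on orbit for simplicity – shows why:

```python
import math

# Radial-velocity semi-amplitude K of a star tugged by a planet on a
# circular, edge-on orbit (standard Doppler-wobble formula).
G = 6.674e-11          # m^3 kg^-1 s^-2
M_SUN = 1.989e30       # kg
M_EARTH = 5.972e24     # kg
YEAR = 3.156e7         # s

def semi_amplitude(m_planet, m_star, period_s):
    return ((2 * math.pi * G / period_s) ** (1 / 3)
            * m_planet / (m_star + m_planet) ** (2 / 3))

k = semi_amplitude(M_EARTH, M_SUN, YEAR)
print(f"Earth's tug on the Sun: {k * 100:.1f} cm/s")  # ~9 cm/s
```

An Earth-mass planet in an Earth-like orbit moves its star by only about 9 cm/s, so detecting true Earth analogues demands precision just below the 10 cm/s threshold Feng describes.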
As shown in the diagram above, the outer two planets around Tau Ceti are likely to be candidate habitable worlds – although a massive debris disc around the star probably reduces their habitability, due to intensive bombardment by asteroids and comets.
The same team also investigated Tau Ceti in 2013, when co-author Mikko Tuomi from the University of Hertfordshire led an effort in developing data analysis techniques and using the star as a benchmark case: "We came up with an ingenious way of telling the difference between signals caused by planets and those caused by a star's activity," he explains. "We realised that we could see how a star's activity differed at different wavelengths and use that information to separate this activity from signals of planets."
The team painstakingly improved the sensitivity of their method and were able to rule out two signals they had identified in 2013 as planets: "But no matter how we look at the star, there seem to be at least four rocky planets orbiting it," Tuomi says. "We are slowly learning to tell the difference between wobbles caused by planets, and those caused by stellar active surface. This enabled us to essentially verify the existence of the two outer, potentially habitable planets in the system."
Sun-like stars are thought to be the best targets in the search for habitable Earth-like planets, due to their similarity to our Sun. Unlike smaller, more common stars – such as the red dwarfs Proxima Centauri and TRAPPIST-1 – they are not so faint that planets in their habitable zones would be tidally locked, with the same side facing the star at all times. Tau Ceti is very similar to the Sun in its size and brightness, and both stars host multi-planet systems.
A paper on these new findings was accepted for publication in the peer-reviewed Astronomical Journal and is available online. The data was obtained by using the HARPS spectrograph (European Southern Observatory, Chile) and Keck-HIRES (W. M. Keck Observatory, Mauna Kea, Hawaii).
The Sun (left) and Tau Ceti (right). Both are G-type stars.
Credit: R.J. Hall [CC-BY-SA-3.0], via Wikimedia Commons
Scientists at the University of Utah report that youthful plasticity has been restored to aging mouse brains by manipulating only a single gene.
Like much of the rest of the body, the brain loses flexibility with age, impacting the ability to learn, remember and adapt. Now, scientists at the University of Utah Health report they can rejuvenate the plasticity of the mouse brain – specifically in the visual cortex – increasing its ability to adapt and change in response to new experiences. Furthermore, manipulating only a single gene is enough to trigger this improvement. The research appears this week in Proceedings of the National Academy of Sciences (PNAS).
"It's exciting because it suggests that by just manipulating one gene in adult brains, we can boost brain plasticity," says lead author Jason Shepherd, Ph.D., Assistant Professor of Neurobiology and Anatomy. "This has implications for potentially reducing normal cognitive decline with aging, or boosting recovery from brain injury after stroke or traumatic brain injury," he says. Additional research will need to be done to determine whether plasticity in humans and mice is regulated in the same way.
In a previous study, Shepherd and his colleagues found that during youth, a "critical window" of brain plasticity is never available to mice lacking a gene known as Arc. Temporarily closing a single eye of a normal young mouse deprived the visual cortex of normal input, and the neurons' electrophysiological response to visual experience changed. By contrast, young mice without Arc could not adapt to the new experience in the same way.
"Given our previous studies, we wondered whether Arc is essential for controlling the critical period of plasticity during normal brain development," explains Shepherd.
Overexpressed Arc in the visual cortex. Credit: Elissa Pastuzyn
If there is no visual plasticity without Arc, then perhaps the gene plays a role in keeping the "critical window" open? In support of this idea, the team's latest investigation finds that in the mouse visual cortex, Arc rises and falls in parallel with visual plasticity. The two peak in teen mice, falling sharply by middle age – suggesting they are linked.
The team probed this connection further in two more ways. First, they tested mice with a strong supply of Arc throughout life. At middle age, these mice responded to visual deprivation as robustly as their juvenile counterparts. By prolonging Arc's availability, the window of plasticity was open for longer.
A second set of experiments raised the bar higher. Viruses were used to deliver Arc to middle age mice, after the critical window had shut. Following this intervention, these older mice responded to visual deprivation just as a youngster would. So even though the window had already closed, Arc enabled it to open once again.
"It was incredible to see that in adult mice, who have gone through normal development and aging, simply overexpressing Arc with a virus restored plasticity," says co-first author Kyle Jenks, a graduate student in Shepherd's lab.
Additional research will now be required to understand precisely how manipulating Arc boosts plasticity. Whether Arc is involved in regulating the plasticity of neurological functions mediated by other brain structures – such as learning, memory, or repair – remains to be tested, but will be examined in the future, says Shepherd.
Hyperloop One – a new transportation system that could revolutionise how people and cargo move around the world – has completed its second phase of testing, reaching speeds 2.7 times faster than in its first phase.
As the only company in the world that has built an operational Hyperloop system, Hyperloop One continues to make history with the successful completion of its second phase of testing. On 29th July, they completed Phase 2, achieving higher test speeds and travelling nearly the full distance of the 500-metre DevLoop track in the Nevada desert. The XP-1, the company's first-generation pod, accelerated for 300 metres and glided above the track using magnetic levitation before braking and coming to a gradual stop.
"This is the beginning, and the dawn of a new era of transportation," said Shervin Pishevar, Executive Chairman and Co-founder of Hyperloop One. "We've reached historic speeds of 310 km an hour, and we're excited to finally show the world the XP-1 going into the Hyperloop One tube. When you hear the sound of the Hyperloop One, you hear the sound of the future."
During phase 2, Hyperloop One achieved record speeds, in a tube depressurised down to the equivalent of air at 38 miles (61 km) above sea level. All components of the system were successfully tested – including the highly efficient electric motor, advanced controls and power electronics, custom magnetic levitation and guidance, pod suspension and vacuum system.
With Hyperloop One, passengers and cargo are loaded into a pod, and accelerate gradually via electric propulsion through a low-pressure tube. The pod quickly lifts above the track using magnetic levitation and glides at airline speeds for long distances due to ultra-low aerodynamic drag.
This latest test reached 192 mph (310 km/h), using only 300 metres of stator for propulsion. With an additional 2,000 metres of stator, the team believes it will eventually be possible to reach 700 mph (1,126 km/h). For comparison, the fastest passenger trains currently operational in the United States are the Acela Express (reaching 150 mph) and the Northeast Regional (125 mph).
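A constant-acceleration sketch puts these numbers in perspective. This is a simplification – the pod's real acceleration profile has not been published – but it shows the scale of the forces involved:

```python
import math

# Phase 2 figures: 310 km/h reached over 300 m of stator.
v1 = 310 / 3.6          # final speed in m/s
s1 = 300                # metres of stator used

a = v1 ** 2 / (2 * s1)  # from v^2 = 2*a*s
print(f"implied mean acceleration: {a:.1f} m/s^2 ({a / 9.81:.2f} g)")

# Speed after the full 2,300 m of stator at the same acceleration:
v2 = math.sqrt(2 * a * (s1 + 2000))
print(f"constant-acceleration extrapolation: {v2 * 3.6:.0f} km/h")
# Reaching the targeted ~1,126 km/h would therefore require a more
# aggressive acceleration profile than Phase 2's average.
```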
"We've proven that our technology works, and we're now ready to enter into discussions with partners, customers and governments around the world about the full commercialisation of our Hyperloop technology," said Hyperloop One CEO Rob Lloyd. "We're excited about the prospects and the reception we've received from governments around the world to help solve their mass transportation and infrastructure challenges."
"Our team of engineers continues to make history at DevLoop. Only a handful of teams would have attempted something so audacious, while far less could have achieved it," said Josh Giegel, President of Engineering and Co-founder of Hyperloop One. "Through tireless preparation, dedication and hard work, we successfully completed Phase 1, proving that Hyperloop One technology works and that Hyperloop is real. Phase 2 was far more difficult as we built upon everything we learned from our initial test and accomplished faster speeds at a farther distance. We're now one step closer to deploying Hyperloop around the world."
Phase 2 vs. Phase 1
Achieved 2.7x faster speed (192 mph vs. 69 mph)
Went 4.5x farther distance (1,433 feet vs. 315 feet)
For the first time, scientists have used the CRISPR system in human embryos to delete faulty DNA responsible for a hereditary heart condition.
Credit: Oregon Health and Science University (OHSU)
Scientists have, for the first time, corrected a disease-causing mutation in early stage human embryos with gene editing. The technique, which uses the CRISPR-Cas9 system, corrected the mutation for a heart condition at the earliest stage of embryonic development so that the defect would not be passed on to future generations.
The work, described today in the journal Nature, is a collaboration between the Salk Institute, Oregon Health and Science University (OHSU) and Korea's Institute for Basic Science. It could pave the way for improved in vitro fertilization (IVF) outcomes, as well as eventual cures for some of the thousands of diseases caused by mutations in single genes.
"Thanks to advances in stem cell technologies and gene editing, we are finally starting to address disease-causing mutations that impact potentially millions of people," says Juan Carlos Izpisua Belmonte, a professor in Salk's Gene Expression Laboratory and a corresponding author of the paper. "Gene editing is still in its infancy, so even though this preliminary effort was found to be safe and effective, it is crucial that we continue to proceed with the utmost caution, paying the highest attention to ethical considerations."
Hypertrophic cardiomyopathy (HCM) is the most common cause of sudden death in otherwise healthy young athletes, and affects approximately 1 in 500 people overall. It is caused by a dominant mutation in the MYBPC3 gene, but often goes undetected until it is too late. Since people with a mutant copy of the MYBPC3 gene have a 50 percent chance of passing it on to their own children, being able to correct the mutation in embryos would prevent the disease not only in affected children, but also in their descendants.
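The 50 percent figure is the standard autosomal dominant pattern: each child of a carrier independently has a 0.5 chance of inheriting the mutant allele. A short sketch (illustrative only, not from the paper) shows how quickly the chance of passing the mutation to at least one child grows with family size:

```python
# Autosomal dominant inheritance: each child of a MYBPC3 carrier has an
# independent 0.5 chance of inheriting the mutant allele.
P_INHERIT = 0.5

def prob_at_least_one_carrier(n_children: int) -> float:
    """Probability that at least one of n children inherits the allele."""
    return 1 - (1 - P_INHERIT) ** n_children

for n in (1, 2, 3):
    print(n, prob_at_least_one_carrier(n))  # 0.5, 0.75, 0.875
```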
The researchers generated induced pluripotent stem cells from a skin biopsy donated by a male with HCM and developed a gene-editing strategy based on CRISPR-Cas9 that would specifically target the mutated copy of the MYBPC3 gene for repair. The targeted mutated MYBPC3 gene was cut by the Cas9 enzyme, allowing the cells' own DNA-repair mechanisms to fix the mutation during the next round of cell division, using either a synthetic DNA sequence or the non-mutated copy of the MYBPC3 gene as a template.
Using IVF techniques, the researchers injected the best-performing gene-editing components into healthy donor eggs newly fertilised with the donor's sperm. Then they analysed all the cells in the early embryos at single-cell resolution to see how effectively the mutation was repaired.
The scientists were surprised by just how safe and efficient the method was. Not only was a high percentage of embryonic cells repaired, but gene correction did not induce any detectable off-target mutations or genome instability – major concerns for gene editing. In addition, the researchers developed a robust strategy to ensure the repair occurred consistently in all the cells of the embryo (spotty repairs can leave some cells still carrying the mutation – see illustration below).
Professor Belmonte and his colleagues emphasise that, although promising, these are very preliminary results and more research will need to be done to ensure no unintended effects occur.
"Our results demonstrate the great potential of embryonic gene editing, but we must continue to realistically assess the risks as well as the benefits," adds Belmonte.
Future work will continue to assess the safety of the procedure and the efficacy of the technique with other mutations.
However, this latest study has already been condemned by Dr David King of the campaign group Human Genetics Alert, who described the research as "irresponsible" and a "race for first genetically modified baby".
"Perhaps the biggest question, and probably the one that will be debated the most, is whether we should be physically altering genes of an IVF embryo at all," said Darren Griffin, a professor of genetics at the University of Kent. "This is not a straightforward question. Equally, the debate on how morally acceptable it is not to act when we have the technology to prevent these life-threatening diseases must also come into play."
Researchers in South Korea have announced a new world record efficiency of 22.1% for perovskite solar cells.
Credit: Ulsan National Institute of Science and Technology (UNIST)
Each year, the efficiency of solar power continues to inch upward, as new technological advances are made. Now, researchers at the Ulsan National Institute of Science and Technology (UNIST), South Korea, have announced the latest breakthrough in perovskite solar cells, which is published in the journal Science.
There are many different types of solar cell. A major advantage of perovskite solar cells (PSCs) is that they can be made with cheaper and more commonly available metals and chemicals, as opposed to the expensive raw materials used in other silicon substitutes. They also offer more flexibility in colour adjustment, enabling them to be fabricated in more aesthetically pleasing ways on rooftops, for example. They can even be processed as additional layers on top of silicon panels, improving the appearance of those more advanced and expensive panels.
The formation of a dense and uniform thin layer on the substrates is crucial for high-performance PSCs. The concentration of defect states, which reduce a cell's performance by decreasing the open-circuit voltage and short-circuit current density, needs to be as low as possible.
The research team at UNIST reports that careful control of the growth conditions of perovskite layers, with management of deficient halide anions, is essential for realising high-efficiency thin-film PSCs based on lead-halide-perovskite absorbers. In their study, they introduced additional iodide ions into the organic cation solution used to form the perovskite layers through an intramolecular exchange process, decreasing the concentration of deep-level defects.
"This study can improve the current record efficiency of perovskite solar cells from 20.1% to 22.1%," says Professor Sang-Il Seok, from the School of Energy and Chemical Engineering at UNIST. "This will accelerate the commercialisation of low-cost, high-performance perovskite solar cells."
The energy conversion efficiency of 22.1% has now been officially certified by the National Renewable Energy Laboratory (NREL) in the USA.
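The jump from 20.1% to 22.1% is two percentage points, which works out to roughly a 10% relative improvement in conversion efficiency; a quick check using the figures quoted above:

```python
# Certified perovskite cell efficiencies quoted in the article
old_eff, new_eff = 0.201, 0.221

absolute_gain = new_eff - old_eff              # 0.02, i.e. 2 percentage points
relative_gain = (new_eff - old_eff) / old_eff  # ~0.0995, roughly a 10% relative gain

print(f"+{absolute_gain * 100:.1f} points, {relative_gain * 100:.1f}% relative")
```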