The B612 Foundation has released a video showing evidence of 26 multi-kiloton asteroid impacts since 2001.
At a press conference yesterday at the Seattle Museum of Flight, three prominent astronauts supporting the B612 Foundation presented a visualisation of new data showing the surprising frequency at which the Earth is hit by asteroids. The astronauts were guests of the Seattle Museum for a special series of public events on Earth Day 2014.
Dr. Ed Lu, former US Shuttle and Soyuz Astronaut and co-founder and CEO of the B612 Foundation, was joined by former NASA Astronaut Tom Jones, President of the Association of Space Explorers, and Apollo 8 Astronaut Bill Anders, first Chairman of the Nuclear Regulatory Commission. They discussed findings recently released from the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO), which operates a network of sensors that monitors the Earth around the clock, listening for the infrasound signature of nuclear detonations.
Between 2000 and 2013, this network detected 26 explosions on Earth ranging in energy from 1 to 600 kilotons – all caused not by nuclear explosions, but rather by asteroid impacts. To put that in perspective, the atomic bomb that destroyed Hiroshima in 1945 exploded with an energy of 15 kilotons. While most of these asteroids exploded too high in the atmosphere to do serious damage on the ground, the evidence is important in estimating the frequency of potential "city-killer-size" asteroid impacts.
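As a rough sense of scale, the reported energies can be compared directly against the Hiroshima figure above (a simple illustrative calculation, not part of the B612 data):

```python
# Scale of the detected impacts relative to the 15-kiloton Hiroshima bomb.
HIROSHIMA_KT = 15
impacts_kt = {"smallest detected": 1, "largest detected": 600}

for name, kt in impacts_kt.items():
    print(f"{name}: {kt} kt = {kt / HIROSHIMA_KT:.1f}x Hiroshima")
```

The largest detected blast was thus roughly forty times the Hiroshima yield.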
The Earth is continuously colliding with fragments of asteroids, the largest in recent times exploding over Tunguska, Siberia in 1908 with an energy of 5–15 megatons. More recently, we witnessed the 600-kiloton impact over Chelyabinsk, Russia in 2013, while impacts greater than 20 kilotons occurred in South Sulawesi, Indonesia in 2009, in the Southern Ocean in 2004, and in the Mediterranean Sea in 2002. It is also important to note that none of these asteroids were detected or tracked in advance by any existing space-based or terrestrial observatory.
"While most large asteroids with the potential to destroy an entire country or continent have been detected, less than 10,000 of the more than a million dangerous asteroids with the potential to destroy an entire major metropolitan area have been found by all existing space or terrestrially-operated observatories," stated Dr. Lu. "Because we don't know where or when the next major impact will occur, the only thing preventing a catastrophe from a 'city-killer' sized asteroid has been blind luck."
The B612 Foundation aims to change that by building the Sentinel Space Telescope Mission, an early warning infrared space telescope for tracking asteroids that would provide many years to deflect an asteroid when it is still millions of miles away. The B612 Sentinel Mission will be the world's first privately funded deep space mission that will create the first comprehensive, dynamic map of our inner Solar System, identifying the current and future locations and trajectories of Earth crossing asteroids. Sentinel will detect and track over 200,000 asteroids in just the first year of operation, after a planned launch in 2018. The spacecraft will be operational until 2024.
The superior light-emitting properties of quantum dots can be applied to solar energy, helping to harvest sunlight more efficiently.
A house window that doubles as a solar panel could be on the horizon, thanks to recent work by Los Alamos National Laboratory researchers in collaboration with scientists from the University of Milano-Bicocca (UNIMIB), Italy. Their project demonstrates that the superior light-emitting properties of quantum dots can be applied to solar energy, helping to harvest sunlight more efficiently.
Quantum dots are ultra-small nanocrystals of semiconductor matter that are synthesized with nearly atomic precision. Their emission colour can be tuned by simply varying their dimensions. Colour tunability is combined with high emission efficiencies approaching 100%. These properties have recently become the basis of a new technology – quantum dot displays – employed, for example, in the newest generation of the Kindle Fire e-reader.
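The size-to-colour relationship comes from quantum confinement: shrinking the crystal raises its effective band gap, shifting emission toward the blue. A much-simplified effective-mass estimate (textbook CdSe constants; the Coulomb and surface terms are ignored, so the numbers are only indicative of the trend) sketches the effect:

```python
import math

# Simplified "particle in a box" estimate of how CdSe quantum-dot emission
# shifts with diameter. Constants are textbook values; Coulomb attraction
# and surface effects are ignored, so treat the output as illustrative only.
H = 6.626e-34        # Planck constant, J*s
M0 = 9.109e-31       # electron rest mass, kg
EV = 1.602e-19       # joules per electron-volt

E_GAP_BULK = 1.74    # bulk CdSe band gap, eV
ME, MH = 0.13, 0.45  # electron / hole effective masses (units of M0)

def emission_wavelength_nm(diameter_nm):
    d = diameter_nm * 1e-9
    confinement = (H**2 / (8 * d**2 * M0)) * (1/ME + 1/MH) / EV  # eV
    return 1240.0 / (E_GAP_BULK + confinement)  # nm

print(emission_wavelength_nm(2.5))  # smaller dot: green-ish emission
print(emission_wavelength_nm(4.0))  # larger dot: red-ish emission
```

The same material thus emits at different colours purely as a function of crystal size, which is the tunability the display technology exploits.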
A luminescent solar concentrator (LSC) is a photon management device, representing a slab of transparent material that contains highly efficient emitters such as dye molecules or quantum dots. Sunlight absorbed in the slab is re-radiated at longer wavelengths and guided towards the slab edge equipped with a solar cell.
Quantum dot LSC devices under ultraviolet illumination.
Lead researcher Victor Klimov explained: “The LSC serves as a light-harvesting antenna – which concentrates solar radiation collected from a large area onto a much smaller solar cell – and this increases its power output.”
“LSCs are especially attractive because, in addition to gains in efficiency, they can enable new interesting concepts such as photovoltaic windows that can transform house facades into large-area energy generation units,” said his colleague, Sergio Brovelli.
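The "antenna" gain Klimov describes can be put in numbers via the slab's geometric gain: the sunlight-collecting face area divided by the edge area where the solar cell sits. The dimensions below are hypothetical, chosen for a window-sized slab:

```python
# Geometric gain of a luminescent solar concentrator: ratio of the
# sunlight-collecting face to the edge area hosting the solar cell.
# Dimensions are hypothetical, for a window-sized slab.
width_cm, height_cm, thickness_cm = 100.0, 100.0, 0.5

face_area = width_cm * height_cm                       # collects sunlight
edge_area = 2 * (width_cm + height_cm) * thickness_cm  # all four edges

geometric_gain = face_area / edge_area
print(geometric_gain)  # -> 50.0
```

The realised power gain is this geometric factor multiplied by the slab's optical efficiency, which is why minimising re-absorption losses inside the slab matters so much.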
To implement their concept, Los Alamos researchers created a series of cadmium selenide/cadmium sulfide (CdSe/CdS) quantum dots, which were then incorporated by the Italian team into large slabs of transparent polymer. The particles are tiny, only about 10 nanometres (nm) across. For comparison, human hairs are typically 50,000 nm wide.
Spectroscopic measurements indicated virtually no losses to re-absorption over distances of tens of centimetres. Tests using simulated solar radiation demonstrated photon-harvesting efficiencies of around 10% per absorbed photon, achievable even in nearly transparent samples – making them well suited for use as photovoltaic windows.
These findings are published in Nature Photonics. According to a report earlier this year, the quantum dot and quantum dot display (QLED) markets are expected to see a 42-fold growth in the next five years, reaching $6.4 billion by 2019.
After years of failed attempts, researchers have successfully cloned human stem cells from adult cells – a breakthrough that could one day lead to diseased or damaged cells being regenerated in patients.
A report in the journal Cell Stem Cell describes how the same cloning technique that produced Dolly the sheep was applied, but with adult human cells. Last year, a team at Oregon Health & Science University used this method to clone stem cells from fetuses.
This time, the cells were derived from two adult donors – aged 35 and 75, respectively.
Nuclear transfer, as the process is called, involves taking a donor's DNA – in this case from skin cells – and inserting it into an "empty" egg cell that has had its own DNA stripped out. The resulting hybrid cell is stimulated to fuse and begin dividing, and within a few days a new line of stem cells is created from the donor DNA. These stem cells are then extracted in the laboratory, where further treatments enable them to develop into specific types of cells – neurons, muscle, insulin-producing cells, or whatever is required. Type 1 diabetics, for example, who are unable to make enough insulin, could in theory generate their own cells that produce the hormone.
Key to the success of this latest breakthrough was the use of caffeine to prevent the hybrid egg from dividing prematurely. The eggs were made to rest for about two hours – rather than 30 minutes – giving the DNA extra time to adjust and interact with its new environment. This delayed reaction apparently "erased" the cell's history, causing it to behave like an entirely new structure.
Of the 77 egg samples, however, only two developed fully into cloned stem cell lines. The process remains highly inefficient and expensive, meaning that only extremely wealthy individuals could benefit at present. Despite this, lead researcher Dr. Robert Lanza and his team at biotech firm Advanced Cell Technology are optimistic that progress will continue. Their experiments have now proved, for the first time, that successful cloning of human stem cells is possible with donors of any age – even the elderly. Lanza now hopes to create a virtual library of cells, using carefully selected DNA donors taken from millions of different samples.
This breakthrough also reignites the debate on full human cloning, its ethical implications and potential for abuse. Marcy Darnovsky, a director at the Center for Genetics and Society, has commented: "If we're going to be having cloned embryos in laboratories around the country, we really need to get our act together and have a law that prohibits human reproductive cloning. Sixty countries have done that."
As to the question of when human cloning might happen – Dr. Paul Knoepfler, from UC Davis School of Medicine, says: "I don't believe that's coming anytime soon, but certainly this kind of technology could be abused by some kind of rogue scientist."
In a sign of the changing times, marijuana is now publicly available from vending machines in Colorado. American Green, part of Tranzbyte Corporation, has begun distributing "Zazzz Machines" containing the drug. These utilise radio-frequency identification (RFID) tags to track the products, along with biometrics to verify a customer's age. They even accept Bitcoin, a new digital currency. The first machine was unveiled on 12th April and is located at the Herbal Elements store in Avon, Colorado. A recent Gallup poll showed a clear majority of Americans (58%) in favour of marijuana being made fully legal, with growing numbers admitting to having tried it. Colorado expects to collect nearly $100 million in tax revenue from recreational marijuana use this year – about 40% more than originally forecast.
A new statistical analysis of temperature data since the year 1500 concludes "with confidence levels greater than 99%, and most likely greater than 99.9%" that recent global warming is not caused by natural factors and is man-made.
A new analysis of temperature data since 1500 all but rules out the possibility that global warming in the modern era (1880–present) is just a natural fluctuation in the Earth’s climate. The study, led by Professor Shaun Lovejoy at McGill University, is published in the peer-reviewed scientific journal Climate Dynamics. It represents a new approach to the question of whether global warming in modern times has been caused by man-made emissions from the burning of fossil fuels. Rather than using complex computer models to estimate the effects of greenhouse-gas emissions, Lovejoy examines historical data to assess the competing hypothesis: that warming over the past century is due to natural, long-term variations in temperature.
“This study will be a blow to any remaining climate-change deniers,” Lovejoy comments. “Their two most convincing arguments – that the warming is natural in origin, and that the computer models are wrong – are either directly contradicted by this analysis, or simply do not apply to it.”
Lovejoy’s study applies statistical methodology to determine the probability that global warming since 1880 is due to natural variability. His conclusion: the natural warming hypothesis may be ruled out “with confidence levels greater than 99%, and most likely greater than 99.9%.”
To assess the natural variability before much human interference, the new study uses “multi-proxy climate reconstructions” developed by scientists in recent years to estimate historical temperatures, as well as fluctuation-analysis techniques from nonlinear geophysics. The climate reconstructions take into account a variety of gauges found in nature – such as tree rings, ice cores, and lake sediments. And the fluctuation-analysis techniques make it possible to understand the temperature variations over a wide range of time scales.
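One widely used fluctuation-analysis tool in this field is the Haar fluctuation: the difference between the averages of the two halves of a time window, computed across a range of window lengths. The sketch below is a minimal illustration of the idea, not Lovejoy's actual analysis pipeline:

```python
import numpy as np

def haar_fluctuation(series, lag):
    """RMS Haar fluctuation at a given timescale: the difference between
    the means of the two halves of each window of length `lag`."""
    half = lag // 2
    diffs = [series[i + half:i + lag].mean() - series[i:i + half].mean()
             for i in range(0, len(series) - lag + 1, half)]
    return float(np.sqrt(np.mean(np.square(diffs))))

# Example on a synthetic uncorrelated series.
rng = np.random.default_rng(42)
noise = rng.standard_normal(4096)
print(haar_fluctuation(noise, 8), haar_fluctuation(noise, 512))
```

On an uncorrelated series the fluctuations shrink as the timescale grows; a persistent trend makes them grow instead – the kind of scale-by-scale signature that helps distinguish forced warming from natural variability.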
For the industrial era, Lovejoy uses carbon dioxide from the burning of fossil fuels as a proxy for all man-made climate influences – a simplification justified by the tight relationship between global economic activity and the emission of greenhouse gases and particulate pollution. “This allows the new approach to implicitly include the cooling effects of particulate pollution that are still poorly quantified in computer models,” he says.
While his new study makes no use of the huge computer models commonly used by scientists to estimate the magnitude of future climate change, Lovejoy’s findings effectively complement those of the Intergovernmental Panel on Climate Change (IPCC), he says. His study predicts, with 95% confidence, that a doubling of carbon dioxide levels in the atmosphere would cause the climate to warm by between 1.9 and 4.2 degrees Celsius. That range is more precise than – but in line with – the IPCC’s prediction that temperatures will rise by 1.5 to 4.5 degrees Celsius if CO2 concentrations double.
“We’ve had a fluctuation in average temperature that’s just huge since 1880 – on the order of about 0.9 degrees Celsius,” Lovejoy says. “This study shows that the odds of that being caused by natural fluctuations are less than one in a hundred and are likely to be less than one in a thousand.
“While the statistical rejection of a hypothesis can’t generally be used to conclude the truth of any specific alternative, in many cases – including this one – the rejection of one greatly enhances the credibility of the other.”
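The logic of the rejection test can be illustrated with a deliberately crude Gaussian toy: assume a standard deviation for century-scale natural fluctuations (the 0.2 °C below is an assumed round number, not Lovejoy's estimate – his analysis uses scaling statistics with fatter-than-Gaussian tails) and ask how probable a 0.9 °C natural excursion would be:

```python
import math

# Toy version of the hypothesis test. sigma is an ASSUMED round number for
# century-scale natural variability, not a figure from the paper; Gaussian
# tails also understate the true probability, which is why the real
# analysis works with scaling (fat-tailed) statistics instead.
sigma = 0.2          # assumed natural std-dev over ~a century, deg C
observed = 0.9       # warming since 1880, deg C

z = observed / sigma
p_natural = math.erfc(z / math.sqrt(2))  # two-sided Gaussian tail probability
print(f"z = {z:.1f}, p = {p_natural:.2e}")
```

Even this crude version lands far below the one-in-a-hundred threshold; the real work in the paper lies in estimating the natural variability itself from the multi-proxy reconstructions.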
Honda this week showcased the newest version of ASIMO, the world's most advanced humanoid robot, for the first time in North America, featuring its latest innovations – including the ability to communicate in sign language and to climb stairs without stopping.
ASIMO – which stands for Advanced Step in Innovative Mobility – was first introduced 14 years ago. Since then, it has made significant advances – including physical improvements like running and hopping on one leg, as well as breakthroughs in dexterity and intelligence, that have furthered Honda's dream of creating humanoid robots to help society.
"This is an exciting project for Honda," said Satoshi Shigemi, senior chief engineer of Honda R&D and the leader of Honda's humanoid robotics program. "Our engineers are working tirelessly to develop new technologies aimed at helping ASIMO work in a real world environment."
The new version of ASIMO has undergone numerous changes to its 4'3", 110-pound body. Developments in the lower body have enhanced stability and balance control, allowing the robot to climb stairs more smoothly, run faster and change direction in a more controlled fashion.
Enhancements in the upper body include major increases in the degrees of freedom available in the robot's hands. Each hand now contains 13 degrees of freedom, which allows ASIMO to perform many more intricate and precise tasks.
The increased hand dexterity provides additional movement in each finger, which also led to the development of ASIMO's new ability to communicate using both American and Japanese sign language. Force sensors in the robot's hands also provide instantaneous feedback allowing ASIMO to use the appropriate amount of force when performing a task. This allows the robot to pick up paper cups without crushing them, for example, but still allows it to use a stronger force when necessary.
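In control-engineering terms, the behaviour described is a closed feedback loop: measure the applied force, compare it with a target, correct. The proportional-control sketch below is purely illustrative – Honda's actual controller is not public:

```python
# Minimal proportional-control sketch of sensor-driven grip force:
# repeatedly nudge the commanded force toward a target based on the
# measured error. Purely illustrative; not Honda's implementation.
def settle_grip(target_n, gain=0.5, steps=50):
    force = 0.0
    for _ in range(steps):
        error = target_n - force      # force-sensor feedback
        force += gain * error         # proportional correction
    return force

print(settle_grip(1.5))   # gentle grip, e.g. for a paper cup
print(settle_grip(40.0))  # firmer grip for a heavier object
```

The same loop structure scales from a light touch to a firm grasp simply by changing the target, which is the behaviour the force sensors enable.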
"It was obvious that overall flexibility was necessary, and many more complex tasks can now be performed because of the improved operational capacity in the hands," Shigemi continued. "But perhaps more importantly, these innovations enhance ASIMO's communication skills, which is essential to interact with human beings."
Advanced technologies derived from research on ASIMO have also benefited other Honda business lines. For example, the Vehicle Stability Assist (VSA) system used in the Honda Civic, along with technologies in Honda's championship-winning MotoGP motorcycles, had their genesis in the company's robotics research program.
Later this summer, the new ASIMO will follow in the footsteps of its predecessor to become a daily performer at Disneyland's Tomorrowland.
NASA has announced the discovery of Kepler-186f, an Earth-sized exoplanet in the habitable zone of its host star, Kepler-186.
This artistic concept is the result of scientists and artists collaborating to help imagine the appearance of the Kepler-186 star system and its planets. Credit: NASA.
The first Earth-sized exoplanet orbiting within the habitable zone of another star has been confirmed by observations with both the W. M. Keck Observatory and the Gemini Observatory. The initial discovery, made by NASA's Kepler Space Telescope, is one of a handful of smaller planets found by Kepler and verified using large ground-based telescopes. It also confirms that Earth-sized planets do exist in the habitable zone of other stars.
"What makes this finding particularly compelling is that this Earth-sized planet, one of five orbiting this star, which is cooler than the Sun, resides in a temperate region where water could exist in liquid form," says Elisa Quintana of the SETI Institute and NASA Ames Research Center, who led the paper published in the current issue of the journal Science. The region in which this planet orbits its star is called the habitable zone, as it is thought that life would most likely form on planets with liquid water.
Steve Howell, Kepler's Project Scientist and a co-author on the paper, adds that neither Kepler nor any other telescope is currently able to directly spot exoplanets of this size and proximity to their host star. "However, what we can do is eliminate essentially all other possibilities, so the validity of these planets is really the only viable option."
With such a small host star, the team employed a technique that eliminated the possibility that either a background star or a stellar companion could be mimicking what Kepler detected. To do this, they obtained extremely high spatial resolution observations from the eight-metre Gemini North telescope on Mauna Kea in Hawaii, using a technique called speckle imaging, as well as adaptive optics (AO) observations from the ten-metre Keck II telescope, Gemini's neighbour on Mauna Kea. Together, these data allowed the team to rule out sources close enough to the star's line-of-sight to confound the Kepler evidence, and conclude that Kepler's detected signal has to be from a small planet transiting its host star.
"The Keck and Gemini data are two key pieces of this puzzle," says Quintana. "Without these complementary observations, we wouldn't have been able to confirm this Earth-sized planet."
The Gemini "speckle" data directly imaged the system, zooming to within about 400 million miles (about 4 AU, approximately equal to the orbit of Jupiter in our Solar System) of the host star and confirming there were no other stellar-sized objects orbiting within this radius from the star. Augmenting this, Keck AO observations probed a larger region around the star but to fainter limits. According to Quintana, "These Earth-sized planets are extremely hard to detect and confirm, and now that we've found one, we want to search for more. Gemini and Keck will no doubt play a large role in these endeavours."
The host star, Kepler-186, is an M1-type dwarf star relatively close to our Solar System at 500 light years and in the constellation of Cygnus. The star is very dim, being over half a million times fainter than the faintest stars we can see with the naked eye. Five small planets have been found orbiting it – four of which are in very short-period orbits and are very hot. The planet designated Kepler-186f, however, is Earth-sized and orbits within the star's habitable zone. The Kepler evidence for this planetary system comes from the detection of planetary transits. These transits can be thought of as tiny eclipses of the host star by a planet (or planets) as seen from the Earth. When such planets block part of the star's light, its total brightness diminishes. Kepler detects that as a variation in the star's total light output and evidence for planets. So far, more than 3,800 candidate planets have been detected by this method with Kepler.
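The transit signal involved is tiny: the fractional dip in brightness equals the ratio of the planet's disk area to the star's. A rough estimate, using approximate published values (about 0.47 solar radii for Kepler-186 and roughly 1.1 Earth radii for Kepler-186f), gives a dip of only a few hundredths of a percent:

```python
# Depth of a planetary transit: fraction of starlight blocked,
# (R_planet / R_star)^2. Radii below are approximate published values;
# exact figures vary slightly between sources.
R_SUN_KM = 696_000
R_EARTH_KM = 6_371

r_star = 0.47 * R_SUN_KM        # Kepler-186, an M1 dwarf
r_planet = 1.1 * R_EARTH_KM     # Kepler-186f, roughly Earth-sized

depth = (r_planet / r_star) ** 2
print(f"brightness dip: {depth:.2%}")
```

Detecting a repeating dimming this shallow is precisely why Kepler's photometric precision, and the ground-based follow-up described above, are both needed.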
The Gemini data utilised the Differential Speckle Survey Instrument (DSSI) on the Gemini North telescope. DSSI is a visiting instrument developed by a team led by Howell, who adds, "DSSI on Gemini rocks! With this combination, we can probe down into this star system to a distance of about 4 times that between the Earth and the Sun. It's simply remarkable that we can look inside other solar systems." DSSI works on a principle that utilises multiple short exposures of an object to capture and remove the noise introduced by atmospheric turbulence, producing images with extreme detail.
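The principle Howell describes – many short exposures defeating atmospheric blur – can be illustrated with a one-dimensional toy in which the only distortion is random image motion. Averaging the frames smears the image, but averaging their power spectra, which are immune to shifts, preserves the fine structure (DSSI's real processing is considerably more sophisticated):

```python
import numpy as np

# 1-D toy of the speckle-imaging principle: random frame-to-frame image
# motion ruins a long exposure, but each short exposure's power spectrum
# is shift-invariant, so averaging power spectra keeps the fine detail.
rng = np.random.default_rng(0)
N = 256
obj = np.zeros(N)
obj[0], obj[16] = 1.0, 0.5        # a "binary star", 16 pixels apart

long_exposure = np.zeros(N)
mean_power = np.zeros(N)
for _ in range(200):
    frame = np.roll(obj, rng.integers(-40, 41))  # random image motion
    long_exposure += frame / 200
    mean_power += np.abs(np.fft.fft(frame)) ** 2 / 200

# Inverse FFT of the mean power spectrum is the object's autocorrelation;
# its off-zero peak recovers the 16-pixel separation.
autocorr = np.real(np.fft.ifft(mean_power))
separation = int(np.argmax(autocorr[1:N // 2])) + 1
print(separation)            # -> 16
print(long_exposure.max())   # smeared well below the original peak of 1.0
```

The long exposure has lost the pair entirely, while the power-spectrum average still encodes their separation exactly – the essence of why short exposures beat turbulence.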
Observations with the W.M. Keck Observatory used the Natural Guide Star Adaptive Optics system with the NIRC2 camera on the Keck II telescope. NIRC2 (the Near-Infrared Camera, second generation) works in combination with the Keck II adaptive optics system to obtain very sharp images at near-infrared wavelengths, achieving spatial resolutions comparable to or better than those achieved by the Hubble Space Telescope at optical wavelengths. NIRC2 is probably best known for helping to provide definitive proof of a central massive black hole at the centre of our galaxy. Astronomers also use NIRC2 to map surface features of Solar System bodies, detect planets orbiting other stars, and study detailed morphology of distant galaxies.
"Observations from Keck and Gemini, combined with other data and numerical calculations, allowed us to be 99.98% confident that Kepler-186f is real," says Thomas Barclay, Kepler scientist and co-author on the paper. "Kepler started this story, and Gemini and Keck helped close it," he adds.
Graphene has the potential to usher in a new era of next generation electronic devices, including flexible displays and wearable technology.
Samsung Electronics have announced a breakthrough synthesis method to speed the commercialisation of graphene, a unique material ideally suited for electronic devices. Samsung Advanced Institute of Technology (SAIT), in partnership with Sungkyunkwan University, became the first in the world to develop this new method.
“This is one of the most significant breakthroughs in graphene research in history,” said the research leaders at SAIT. “We expect this discovery to accelerate the commercialisation of graphene, which could unlock the next era of consumer electronic technology.”
Graphene has 100 times greater electron mobility than silicon, the most widely used material in semiconductors today. It is more durable than steel and has high thermal conductivity as well as flexibility, which makes it the perfect material for use in flexible displays, wearables and other next generation electronic devices.
Through its partnership with Sungkyunkwan University’s School of Advanced Materials Science and Engineering, SAIT uncovered a new method of growing large area, single crystal wafer scale graphene. Engineers around the world have invested heavily in research for the commercialisation of graphene, but have faced many obstacles due to the challenges associated with it. In the past, researchers have found that multi-crystal synthesis – the process of synthesising small graphene particles to produce large-area graphene – deteriorated the electric and mechanical properties of the material, limiting its application range.
The new method developed by SAIT and Sungkyunkwan University synthesises large-area graphene into a single crystal on a semiconductor, maintaining its electric and mechanical properties. The new method repeatedly synthesises single crystal graphene on the current semiconductor wafer scale.
Over the past several decades, the growth of the semiconductor industry has been driven by the ability to grow the area of a silicon wafer, while steadily decreasing the process node. In order to commercialise graphene to displace the industry’s reliance on silicon, it is vital to develop a new method to grow a single crystal graphene into a large area.
The research results are published in ScienceExpress, the advance online edition of Science, one of the world’s most prestigious scientific journals.
Researchers have published the first comprehensive, large-scale data set on how the brain of a mammal is wired, providing a groundbreaking new data resource and fresh insights into how the nervous system processes information.
Credit: Allen Institute for Brain Science
While the human brain contains over 100 billion individual neurons, the mouse brain contains 75 million. However, the two structures are very similar, making comparisons between them possible and informative. As computing power continues to advance exponentially, it is becoming possible to model networks of neurons in ever greater detail: the first complete simulation of a single neuron was achieved in 2005, followed by a neocortical column of 10,000 neurons in 2008, then a cortical mesocircuit of 1,000,000 neurons in 2011. Now, a team of researchers from the Allen Institute in Seattle has reached a milestone of a different kind: mapping the wiring of an entire mouse brain and its 75 million neurons. If trends continue, entire human brains could be modelled within the next decade.
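Taking the simulation milestones above at face value – a 100-fold increase every three years between 2008 and 2011 – a back-of-envelope extrapolation (and it is no more than that) reaches the 100-billion-neuron human scale within roughly a decade:

```python
import math

# Naive extrapolation of the neuron-simulation milestones quoted above.
# Assumes the 2008->2011 pace (100x per 3 years) simply continues,
# which is a big assumption, not a prediction.
milestones = {2005: 1, 2008: 10_000, 2011: 1_000_000}

growth_per_3yr = milestones[2011] / milestones[2008]  # 100x
human_neurons = 100_000_000_000                       # figure quoted above

periods = math.log(human_neurons / milestones[2011]) / math.log(growth_per_3yr)
year = 2011 + 3 * periods
print(round(year))  # late 2010s on this naive trend
```

Whether compute trends, and the biology, cooperate with so simple a curve is of course another matter.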
A landmark study published this month in the journal Nature both describes the publicly available Allen Mouse Brain Connectivity Atlas, and demonstrates the exciting new knowledge that can be gleaned from this valuable resource.
"Understanding how the brain is wired is among the most crucial steps to understanding how the brain encodes information," explains Hongkui Zeng, Senior Director of Research Science at the Allen Institute for Brain Science. "The Allen Mouse Brain Connectivity Atlas is a standardised, quantitative, and comprehensive resource that will stimulate exciting investigations around the entire neuroscience community, and from which we have already gleaned unprecedented details into how structures are connected inside the brain."
Using the data – which took four years of work to collect – the researchers were able to demonstrate highly specific patterns in the connections among different brain regions. The strengths of these connections were found to vary over more than five orders of magnitude, balancing a small number of strong connections against a large number of weak ones.
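A distribution spanning five orders of magnitude, with few strong and many weak links, is characteristic of heavy-tailed (for instance log-normal) connection weights. The sketch below illustrates the shape of such a distribution – the numbers are synthetic, not taken from the Atlas:

```python
import numpy as np

# Synthetic log-normally distributed connection strengths, illustrating
# how a few strong connections coexist with many weak ones across several
# orders of magnitude. Invented numbers, not Allen Atlas data.
rng = np.random.default_rng(7)
strengths = 10 ** rng.normal(loc=-2.5, scale=0.6, size=100_000)

span = np.log10(strengths.max() / strengths.min())
top_1pct_share = np.sort(strengths)[-1000:].sum() / strengths.sum()
print(f"dynamic range: ~{span:.0f} orders of magnitude")
print(f"strongest 1% of connections carry {top_1pct_share:.0%} of total weight")
```

In such a distribution a tiny minority of connections dominates the total wiring weight, which is why quantifying strengths, not just presence or absence of links, matters for a connectome.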
The researchers set out to create a wiring diagram of the brain – known as a "connectome" – to illustrate short and long-range connections using genetically-engineered viruses, able to trace and illuminate individual neurons. To get a truly comprehensive view, imaging data was collected at resolutions smaller than a micrometre from over 1,700 mouse brains.
"The data for the Allen Mouse Brain Connectivity Atlas was collected in a way that’s never been done before," says Zeng. "Standardising the data generation process allowed us to create a 3D common reference space, meaning we could put the data from all of our thousands of experiments next to each other and compare them all in a highly quantitative way at the same time."
The Allen Mouse Brain Connectivity Atlas contains over 1.8 petabytes of data – equivalent to 24 years of continuous HD video. The team behind it has been steadily releasing new data since November 2011; and in March, they released the last major update, though the resource will continue to be updated as technology develops and researchers are able to add more new types of connectivity data.
"The Allen Mouse Brain Connectivity Atlas provides an initial road-map of the brain, at the level of interstate highways and major cities that they link," explains David Anderson, Professor of Biology at the California Institute of Technology. "Smaller road networks and their intersections with the interstates will be the next step, followed by maps of local streets in different municipalities. This information will provide a framework for what we ultimately want to understand: ‘traffic patterns’ of information flow in the brain during various activities such as decision-making, mapping of the physical environment, learning and remembering, and other cognitive or emotional processes."
With the Nature publication, Allen Institute scientists have already begun to demonstrate the power of analysis contained within the Atlas. By analysing their data, Zeng and her team were able to discover several interesting properties of the mouse brain's connectome. For example, there are extensive connections across the two hemispheres with mirror-image symmetry. Pathways belonging to different functional circuits in the brain can be identified and their relationships and intersections visualised in 3D.
The Atlas will serve as an invaluable tool for neuroscientists all over the world, long into the future. "Previously, the scientific community had to rely on incomplete, fragmented data sets, like small pieces of a map but at different scales and resolutions, so it was impossible to see the bigger picture," explains Professor Ed Callaway, in the Systems Neurobiology Laboratories at the Salk Institute for Biological Studies. “Now, we have instant access to complete and consistent data across the entire brain, and the suite of web-based analytic and display tools make it easy to find what you need and to see it in 3D.
"Who you are – all your thoughts and actions your entire life – is based on connections between neurons," Callaway continues. "So if we want to understand any of these processes or how they go wrong in disease, we have to understand how those circuits function. Without an atlas, we couldn’t hope to gain that understanding."
Nanotechnology startup StoreDot has unveiled a ground-breaking battery capable of charging your smartphone and other devices in under 30 seconds.
At Microsoft’s Think Next symposium in Tel Aviv, StoreDot demonstrated the prototype of its ultra-fast-charge battery for the first time. The company specialises in technology inspired by natural processes, and has produced "nanodots" derived from bio-organic material that, thanks to their tiny size, improve both electrode capacitance and electrolyte performance. These nanodots – described as "stable, robust spheres" – have a diameter of just 2.1 nanometres and are made of chemically synthesised peptide molecules: short chains of amino acids that form the building blocks of proteins.
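A quick sanity calculation shows why 30-second charging is a power-delivery challenge as much as a chemistry one. The battery figures below are assumed typical values for a 2014 handset, not StoreDot specifications:

```python
# Rough average power needed to charge a phone battery in 30 seconds.
# Capacity and voltage are assumed typical 2014-smartphone values,
# not StoreDot specifications.
capacity_mah = 2000
voltage = 3.8

energy_wh = capacity_mah / 1000 * voltage   # stored energy, watt-hours
power_w = energy_wh * 3600 / 30             # deliver it in 30 seconds
print(f"average charging power: {power_w:.0f} W")
```

Delivering on the order of a kilowatt to a handset, even briefly, is a very different engineering problem from trickling in the usual five to ten watts.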
StoreDot’s bio-organic devices, which include smartphone displays, provide much more efficient power consumption, and are eco-friendly. While other nanodot and quantum-dot technologies currently in use are heavy metal based, and therefore toxic, StoreDot's are biocompatible and superior to all previous discoveries in the field. Using their method, the company is hoping to synthesize new nanomaterials for use in a wide variety of applications. Nano-crystals in memory chips, for example, could triple the speed of traditional flash memory, while image sensors could be five times more sensitive.
Furthermore, the nanodots are relatively inexpensive, as they originate naturally, and utilise a basic biological mechanism of self-assembly. They can be made from a vast range of bio-organic raw materials that are readily available and environmentally friendly.
The battery demonstrated at the event remains at the prototype stage, with a rather bulky form factor. However, the CEO of StoreDot, Doron Myersdorf, says he is confident that a smaller version can be developed and on the market by 2017.
“The fast-charge battery is the result of our focus on commercialising the materials we have discovered," he explained. "We’re particularly pleased that this innovative nanotechnology, inspired by nature, not only changes the rules of mobile device capabilities, but is also environmentally-friendly.”