South Korea has conducted its first successful orbital launch, using the Naro-1 rocket to place a satellite in orbit around the Earth.
The Naro-1 is South Korea's first carrier rocket. Its maiden flight took place on 25th August 2009 from the country's new spaceport, the Naro Space Center. However, it failed to reach orbit after the payload fairing failed to separate properly. The following year, a second attempt was made – but this too ended in failure, with the rocket exploding two minutes after lift-off.
Yesterday at 07:00 UTC (16:00 KST), a third attempt was made – this time successfully. Officials confirmed that everything had gone as planned; the Naro-1 had reached its target altitude and deployed its satellite 540 seconds after launch. Science Minister Lee Ju-ho told reporters that South Korea would use this "overwhelming moment as a strong, dynamic force" to help drive an independent space programme.
The satellite now in orbit, STSAT-2C, has a mass of 100 kg (220 lb) and is designed to collect climate data. Further plans are being made for a 75-ton thrust engine by 2018 and a 300-ton launch vehicle by 2021. The space program has so far cost US$471 million.
With its successful launch of the Naro-1, South Korea becomes only the 11th independent nation to develop orbital launch capability. This has occurred just one month after the country's immediate neighbour – North Korea – placed its first satellite in orbit with the successful December 2012 launch of the Unha-3 rocket. It also comes in the wake of news that North Korea is planning a third nuclear test. Chinese Navy official Yin Zhuo said that South Korea has been competing with Japan for favour with the United States, and the launch aims to strengthen relations with the U.S.
Looked at from a historical perspective, the graph below appears to show that we're extremely far from having all of the world's nations in this space-faring club. However, a number of game-changing technologies in the coming decades could greatly accelerate this trend.
Interplanetary Media Group – the Mars One subsidiary which manages the IP and media associated with its human mission to Mars – has received its first investments. These funds will be used to finance the Conceptual Design Studies and the launch of the global Astronaut Selection Program.
Kai Staats, Director of Business Development for Mars One states, “Organising a human mission to Mars is a tremendously complex venture. There are many engineering hurdles to overcome and the total funds required are tremendous. Raising a few million [US dollars] in the coming months may seem insignificant in the shadow of the pending billions required, but we are taking it one step at a time. These first few bring tangible demonstration to nearly two years in planning. For us, committed funds in this phase of development are an important indicator we are moving in the right direction.”
In the first half of 2013, Mars One will award the Conceptual Design studies to industry suppliers. These are sophisticated engineering bids, technical plans which lay the foundation for the major components such as the transport vehicles, space suits, life support systems and living modules on Mars. These will substantiate the Mission plan with real-world engineering designs and data.
Mars One will also launch the Astronaut Selection Program, which immediately and directly involves people from around the world. This is a new paradigm for anyone interested in participating in space travel. As Mars One anticipates hundreds of thousands, perhaps more than one million applicants, the infrastructure required to manage such a process professionally is substantial.
Mars One remains open to additional investors. Interested parties may contact Mars One at email@example.com.
About Mars One
Mars One is a not-for-profit organisation that aims to establish a permanent human settlement on Mars in 2023 through the integration of existing, readily available technologies from industry leaders worldwide. Unique in its approach, Mars One intends to fund this decade-long endeavour by involving the whole world as the audience of an interactive, televised broadcast of every aspect of this mission, from launch to landing to living on Mars.
Stem cells from bone marrow or fat can improve recovery after stroke in rats, according to a study published in BioMed Central's open access journal Stem Cell Research & Therapy. Treatment with stem cells improved the amount of brain and nerve repair and the ability of the animals to complete behavioural tasks.
Stroke is among the leading causes of death worldwide, killing 5.7 million each year (just under 10% of total deaths). Approximately 9 million people had a stroke in 2008 and 30 million people have previously had a stroke and are still alive.
Stem cell therapy holds enormous promise for the future, but there are many questions which need to be answered regarding treatment protocols and which cell types to use. This study attempted to address some of these questions.
Rats were treated intravenously with stem cells or saline 30 minutes after a stroke. At 24 hours, the stem cell-treated rats showed a better functional recovery. By two weeks, the animals had near-normal scores in tests. This improvement was seen even though the stem cells did not actually migrate to the damaged area of brain. The treated rats also had higher levels of biomarkers implicated in brain repair – including the growth factor VEGF.
A positive result was seen for both fat (adipose) and bone-marrow derived stem cells. Dr Exuperio Díez-Tejedor from La Paz University Hospital, explained: "Improved recovery was seen regardless of origin of the stem cells, which may increase the usefulness of this treatment in human trials. Adipose-derived cells in particular are abundant and easy to collect without invasive surgery."
Stanford Engineering's Center for Turbulence Research (CTR) has set a new record in computational science by successfully using a supercomputer with more than 1 million computing cores. This was done to solve a complex fluid dynamics problem – the prediction of noise generated by a supersonic jet engine.
Joseph Nichols, a research associate in the centre, worked on the newly installed Sequoia IBM Bluegene/Q system at Lawrence Livermore National Laboratory (LLNL). Sequoia recently topped the list of the world's most powerful supercomputers, boasting 1,572,864 compute cores (processors) and 1.6 petabytes of memory connected by a high-speed five-dimensional torus interconnect.
Because of Sequoia's impressive number of cores, Nichols was able to show for the first time that million-core fluid dynamics simulations are possible – and also to contribute to research aimed at designing quieter aircraft engines.
The physics of noise
The exhausts of high-performance aircraft at takeoff and landing are among the most powerful man-made sources of noise. For ground crews, even for those wearing the most advanced hearing protection available, this creates an acoustically hazardous environment. To the communities surrounding airports, such noise is a major annoyance and a drag on property values.
Understandably, engineers are keen to design new and better aircraft engines that are quieter than their predecessors. New nozzle shapes, for instance, can reduce jet noise at its source, resulting in quieter aircraft.
Predictive simulations – advanced computer models – aid in such designs. These complex simulations allow scientists to peer inside and measure processes occurring within the harsh exhaust environment that is otherwise inaccessible to experimental equipment. The data gleaned from these simulations are driving computation-based scientific discovery as researchers uncover the physics of noise.
More cores, more challenges
Parviz Moin, a Professor in the School of Engineering and Director of CTR: "Computational fluid dynamics (CFD) simulations, like the one Nichols solved, are incredibly complex. Only recently, with the advent of massive supercomputers boasting hundreds of thousands of computing cores, have engineers been able to model jet engines and the noise they produce with accuracy and speed."
CFD simulations test all aspects of a supercomputer. The waves propagating throughout the simulation require a carefully orchestrated balance between computation, memory and communication. Supercomputers like Sequoia divvy up the complex math into smaller parts so they can be computed simultaneously. The more cores you have, the faster and more complex the calculations can be.
And yet, despite the additional computing horsepower, the calculations only become more challenging with more cores. At the one-million-core level, previously innocuous parts of the computer code can suddenly become bottlenecks.
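The divide-and-conquer pattern described above can be sketched in miniature. The toy Python below is purely illustrative (it is not the CharLES code, which is a production CFD solver): a 1D grid is split into chunks, each "core" advances its own chunk of a simple diffusion equation, and neighbouring chunks exchange single-cell "halos" so that changes can propagate across chunk boundaries – the same balance of computation and communication, scaled down by many orders of magnitude.

```python
# Illustrative sketch of domain decomposition (NOT the CharLES code):
# a 1D grid split among "cores", each exchanging one-cell halos.

def diffusion_step(u, alpha=0.1):
    """One explicit diffusion update on the interior of a padded array
    that includes one halo cell on each side."""
    return [u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
            for i in range(1, len(u) - 1)]

def parallel_step(chunks, alpha=0.1):
    """Advance every chunk one step, exchanging halo cells first."""
    new_chunks = []
    for k, chunk in enumerate(chunks):
        # Each chunk borrows one boundary cell from each neighbour;
        # at the domain edges it simply reuses its own boundary value.
        left = chunks[k - 1][-1] if k > 0 else chunk[0]
        right = chunks[k + 1][0] if k < len(chunks) - 1 else chunk[-1]
        new_chunks.append(diffusion_step([left] + chunk + [right], alpha))
    return new_chunks

# Split a 16-cell grid among 4 "cores" and advance it one step.
grid = [0.0] * 8 + [1.0] * 8
chunks = [grid[i:i + 4] for i in range(0, 16, 4)]
chunks = parallel_step(chunks)
flat = [x for c in chunks for x in c]
print(flat)  # the sharp 0/1 step is smoothed across the chunk boundary
```

The real difficulty at a million cores lies in making those halo exchanges, and the load balance between chunks, efficient at scale – which is exactly where previously innocuous code paths become bottlenecks.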
Ironing out the wrinkles
Over the past few weeks, Stanford researchers and LLNL computing staff have been working closely to iron out these last few wrinkles. This week, they were glued to their terminals during the first "full-system scaling" to see whether initial runs would achieve stable run-time performance. They watched eagerly as the first CFD simulation passed through initialisation then thrilled as the code performance continued to scale up to and beyond the all-important one-million-core threshold, and as the time-to-solution declined dramatically.
"These runs represent at least an order-of-magnitude increase in computational power over the largest simulations performed at the Center for Turbulence Research previously," said Nichols. "The implications for predictive science are mind-boggling."
The current simulations were a homecoming of sorts for Nichols. He was inspired to pursue a career in supercomputing as a high-school student when he attended a two-week summer program at the Lawrence Livermore computing facility in 1994, sponsored by the Department of Energy. Back then, he worked on the Cray Y-MP, one of the fastest supercomputers of its time. "Sequoia is approximately 10 million times more powerful than that machine," Nichols noted.
The Stanford ties go deeper still. The computer code used in this study is named CharLES and was developed by former Stanford senior research associate, Frank Ham. This code utilises unstructured meshes to simulate turbulent flow in the presence of complicated geometry.
In addition to jet noise simulations, Stanford researchers are using the CharLES code to study advanced-concept scramjet propulsion systems, used in hypersonic flight at many times the speed of sound.
The European Commission today announced the winners of a multi-billion euro competition in Future and Emerging Technologies (FET). The winning Graphene and Human Brain initiatives are set to receive one billion euros each, to deliver 10 years of world-beating science. Each initiative involves researchers from at least 15 EU Member States and nearly 200 research institutes.
The "Graphene" project will investigate and exploit the unique properties of this revolutionary carbon-based material. Graphene has an extraordinary combination of physical and chemical properties: it is the thinnest material known, it conducts electricity much better than copper, it is 100-300 times stronger than steel and it has unique optical properties. Graphene was first isolated by European scientists in 2004, and the substance is set to become the wonder material of the 21st century, as plastics were to the 20th. Potential applications include replacing silicon in microchips, revolutionising the fields of energy and transportation, transforming health and medicine, and a host of other areas.
The "Human Brain Project" will create the world's largest experimental facility for developing the most detailed ever model of the brain, for studying how the human brain works and ultimately to develop personalised treatments of neurological and related diseases. This research lays the foundations for medical progress with potential to dramatically improve the quality of life for millions of people.
The European Commission will support "Graphene" and the "Human Brain Project" as FET flagships over 10 years through its research and innovation funding programmes. Sustained funding for the full duration of the project will come from the EU's research framework programmes, principally from the Horizon 2020 programme (2014-2020), which is currently being negotiated in the European Parliament and Council.
European Commission Vice President Neelie Kroes said: "Europe's position as a knowledge superpower depends on thinking the unthinkable and exploiting the best ideas. This multi-billion competition rewards home-grown scientific breakthroughs and shows that when we are ambitious we can develop the best research in Europe. To keep Europe competitive, to keep Europe as the home of scientific excellence, EU governments must agree an ambitious budget for the Horizon 2020 programme in the coming weeks."
"Graphene" is led by Prof. Jari Kinaret, from Sweden's Chalmers University. The Flagship involves over 100 research groups, with 136 principal investigators, including four Nobel laureates. "The Human Brain Project" involves scientists from 87 institutions and is led by Prof. Henry Markram of the École Polytechnique Fédérale de Lausanne.
The future of computing and science will be driven by collaboration. The FET flagships programme is a world-leading effort to ride this wave. The flagship race has fostered collaboration on a new scale and duration. Instead of the usual two-to-four year funding cycles, the 10-year duration and massive financial incentive have raised the level of science in the project proposals considerably, which will deliver greater benefits over the long term, including major new technologies and faster innovation.
Every day, scientists learn more about how the world works at the smallest scales. While this knowledge has the potential to help others, it's possible that the same discoveries could also be used in ways that cause widespread harm.
A new article in the journal Nanomedicine – the product of an FBI workshop held at the University of Notre Dame – tackles this complex "dual-use" aspect of nanotechnology research.
"The rapid pace of breakthroughs in nanotechnology, biotechnology, and other fields, holds the promise of great improvements in areas such as medical diagnosis and treatment" says Kathleen Eggleson, author of the study. "But the risk of misuse of these breakthroughs rises along with the potential benefit. This is the essence of the 'dual-use dilemma.'"
The report examines the potential for nano-sized particles (which are measured in billionths of a metre) to breach the blood-brain barrier, a tightly knit layer of cells that affords the brain the highest level of protection – from microorganisms, harmful molecules, etc. – in the human body. Neuroscientists are purposefully engineering nanoparticles that can cross the blood-brain barrier (BBB) to deliver medicines in a targeted and controlled way directly to diseased parts of the brain.
At the same time, the report notes, "nanoparticles designed to cross the BBB constitute a serious threat ... in the context of combat." For example, it is theorized that "aerosol delivery" of some nano-engineered agent in "a crowded indoor space" could cause serious harm to many people at once.
The problem of dual-use research was highlighted last year when controversy erupted over the publication of findings that – with just a handful of modifications – the H5N1 influenza virus ("bird flu") can be altered in a way that would enable it to be transmitted between mammalian populations. After a self-imposed one-year moratorium on this research, several laboratories around the world have announced that they will restart the work in early 2013.
The FBI is actively responding to these developments in the scientific community.
"The law enforcement-security community seeks to strengthen the existing dialogue with researchers," William So of the FBI's Biological Countermeasures Unit says in the study. "Science flourishes because of the open and collaborative atmosphere for sharing and discussing ideas. The FBI believes this model can do the same for our two communities ... [and] create effective safeguards for science and national interests."
The scientists and engineers who conduct nanoscale research have the ability and responsibility to consider the public safety aspects of their research and to act to protect society when necessary, argues Eggleson: "The relationship between science and society is an uneasy one, but it is undeniable on the whole and not something any individual can opt out of in the name of progress for humanity's benefit. Thought about dual-use, and action when appropriate, is inherent to socially responsible practice of nano-biomedical science."
This month has witnessed a number of breakthroughs in solar power research. Here are some of the more significant developments.
Thin-film solar cells: new world record for efficiency
EMPA in Switzerland has developed thin-film solar cells on flexible polymer substrate with a record efficiency of 20.4% for converting sunlight into electricity. This is a significant improvement over the previous record of 18.7% achieved by the same team in May 2011. The cells are based on CIGS (copper indium gallium (di)selenide) and the technology is now awaiting scale-up for industrial uses.
Thin-film, lightweight and flexible solar modules are attractive for applications such as roofs and facades of buildings, automobiles and portable electronics. They can be produced using continuous roll-to-roll manufacturing, with panels coming off the assembly line as if from a printing press. This provides further cost reductions compared to standard silicon technologies. Thin-film solar cells were included in TIME's Best Inventions of 2008.
Gian-Luca Bona, Director of EMPA: "The series of record efficiencies for flexible CIGS solar cells developed at EMPA demonstrates that thin film solar cells can match the excellent performance of polycrystalline silicon cells. Now it is time for the next step – the scale-up of the technology to cover large areas in a cost-efficient roll-to-roll manufacturing process with an industrial partner."
Peel-and-stick solar panels
Stanford researchers have succeeded in developing the world's first "peel-and-stick" thin-film solar cells. Unlike standard thin-film solar, this version does not require any direct fabrication on the final carrier substrate. All the challenges associated with putting solar cells on unconventional materials are avoided with this new process, vastly expanding the number of potential applications. The researchers attached their solar cells to paper, plastic and window glass among various other materials, without losing the original cell efficiency.
Non-conventional substrates are difficult to fabricate on because of their poor surface flatness and their low tolerance of the chemicals and high temperatures used in the production process. The researchers got around these problems by developing a unique metal "sandwich". The base of this sandwich is formed by a silicon/silicon dioxide wafer. A 300-nanometre film of nickel is deposited on top. Thin-film solar cells are then placed on the nickel layer and covered with protective polymer. Thermal release tape is then attached to the top of the solar cells, to facilitate their transfer off the production wafer and onto a new substrate.
The wafer is submerged in water at room temperature and the edge of the thermal release tape is peeled back slightly, allowing water to seep into and penetrate between the nickel and silicon dioxide interface. The solar cell is thus freed from the hard substrate, but still attached to the thermal release tape. After heating to 90°C, the tape is removed, leaving just the solar cell which can be applied to virtually any surface.
Tests prove that the peel-and-stick process leaves the solar cells wholly intact and functional. The silicon wafer base is undamaged and clean after removal, and can be reused for another batch of solar cells. Overall, this new process gives thin-film solar cells a flexibility and attachment potential never seen before – while simultaneously reducing their general cost and weight.
Solar nanowires: ideal diameter identified
About 10 years ago, the first generation of solar nanowires began to appear in research labs – arrays of tiny, semiconducting structures able to convert sunlight into energy. Despite their intricacy and compactness, however, performance lagged far behind other technologies. Researchers were unable to attain efficiencies of greater than 10%.
This is now changing. Scientists from Lund University in Sweden claim to have identified the ideal diameter for nanowires in solar cells: 180 nanometres. Reporting their study in the journal Science, they show how efficiencies of 13.8% can now be achieved using the semiconductor material indium phosphide.
Magnus Borgström, a researcher in semiconductor physics and the principal author: "The right size is essential for the nanowires to absorb as many photons as possible. If they are just a few tenths of a nanometre too small, their function is significantly impaired."
The nanowires are shaped like antennae. They are assembled on surfaces of just one square millimetre that each house four million nanowires. This produces an effect per active surface unit several times greater than today's silicon cells.
Traditional silicon cells for domestic use are relatively cheap, but inefficient because, being composed of a single material, they absorb only a limited part of the light spectrum. Researchers have therefore aimed to combine different types of semiconductor material to utilise a broader part of the spectrum. The disadvantage is that such cells become extremely expensive and can therefore only be used in niche contexts, such as on satellites or military planes.
However, this is not the case with nanowires. Because of their tiny dimensions, the same sort of material combinations can be created with far less effort and complexity, which offers higher efficiency at a lower cost. In the Science article, the researchers have shown that the nanowires can generate power at the same level as a thin film of the same material, even if they cover just one-tenth of the surface. Although still in the laboratory phase, they could eventually be used in large solar power plants.
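As a quick back-of-the-envelope check (our own arithmetic, not taken from the paper), the figures quoted above are self-consistent: four million wires of 180 nm diameter on one square millimetre do indeed cover roughly one-tenth of the surface.

```python
import math

# Sanity check of the quoted figures: four million nanowires,
# each 180 nm in diameter, on one square millimetre of surface.
wires = 4_000_000
diameter_nm = 180.0
area_mm2_in_nm2 = (1e6) ** 2          # 1 mm = 1e6 nm, so 1 mm^2 = 1e12 nm^2

wire_area = math.pi * (diameter_nm / 2) ** 2   # cross-section of one wire, in nm^2
coverage = wires * wire_area / area_mm2_in_nm2

print(f"Each wire covers about {wire_area:.0f} nm^2")
print(f"Total coverage: {coverage:.1%} of the surface")
```

The result comes out at just over 10%, matching the paper's claim that the wires generate full thin-film power while covering only a tenth of the area.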
Image: indium phosphide nanowires, 180 nm in diameter, allowing them to capture more light. Credit: Wallentin et al.
Multi-junction solar cell may break the 50% efficiency barrier
As we reported in more detail here, U.S. Naval Research Laboratory scientists, working with Imperial College London and MicroLink Devices, have proposed a novel triple-junction solar cell with the potential to break the 50 percent conversion efficiency barrier, which is the current goal in multi-junction photovoltaic development. At present, the world record for this type of solar power generation is 44 percent under concentration. The researchers believe their technology is "realistically achievable" by 2016.
The U.S. Army's largest solar array installation
The largest solar power system in the U.S. Army has come online at White Sands Missile Range, New Mexico, and officials marked the occasion with a ribbon-cutting ceremony. The Energy Savings Performance Contract (ESPC) provides the sprawling desert base with a new 4.5 megawatt solar photovoltaic system, guarantees energy savings of 35,358 million British thermal units (MBtu) per year, and cuts energy consumption by 10 percent.
Michael Norton, Huntsville Center Energy Division: "To date, this is the largest solar project in the Army. Projects like this are important because the impact of rising energy prices on installations has resulted in an adverse increase of utility budgets spent on existing, often inefficient or outdated equipment."
"ESPC projects provide energy efficient equipment resulting in a lower utility consumption. Lower utility consumption reduces the DOD utility bills and assists in meeting federal mandates."
Panasonic is developing a form of artificial photosynthesis – the same system used by plants and other organisms to convert water and carbon dioxide into oxygen and carbohydrates, using sunlight. In the future, it is hoped this could be scaled up to industrial-sized facilities, absorbing CO2 from factories and other infrastructure.
The system could produce substances like ethanol as a by-product, for use in generating energy. Panasonic has achieved a world record in terms of solar to chemical energy conversion, with efficiency close to that of real plants. The scale of humanity's challenge in terms of reducing atmospheric CO2 levels is vast, but this new system looks very promising and has major potential if developed further.
Every year, 2 million Americans – at least 1 in 20 patients – contract a hospital infection. Of these, over 100,000 will die from their illness. This is more deaths than breast cancer, HIV/AIDS and road fatalities combined, and it results in $30 billion of additional healthcare costs. One of the most dangerous pathogens, the highly resilient Clostridium difficile (C. diff), is able to survive for months on surfaces. With antibiotics, handwashing and traditional methods of disinfection proving increasingly ineffective, these "superbugs" have the potential to become a major crisis.
Thankfully, technology may come to the rescue once again, as a new machine has been developed that could revolutionise the hospital environment. Using pulses of ultraviolet (UV) light, a robot from Xenex Healthcare is 20 times more effective at eliminating bacteria than standard chemical cleaning. It can treat rooms in just five minutes, and it avoids the environmental impact of discarded plastic containers and heavy disinfectant use. A motion detection system and door guard ensure the safety of patients, visitors and staff.
Although expensive – at $82,000 each – a growing number of hospitals are now using this machine. The first to do so was Cooley Dickinson Hospital, Massachusetts, which has since witnessed an 82 percent drop in C. diff. infections. Last month, Stamford Hospital became the first and only hospital in Connecticut to use the device.
An explosion in extreme wealth is exacerbating inequality and hindering the world's ability to tackle poverty, Oxfam has warned, in a briefing published ahead of the World Economic Forum in Davos next week.
The $240 billion net income in 2012 of the richest 100 billionaires would be enough to make extreme poverty history four times over, according to "The cost of inequality: how wealth and income extremes hurt us all". The agency is calling on world leaders to curb today's income extremes and commit to reducing inequality to at least 1990 levels.
The richest one per cent has increased its income by 60 per cent in the last 20 years, with the financial crisis accelerating rather than slowing this trend.
In its report, Oxfam warns that extreme wealth is economically inefficient, politically corrosive, socially divisive and environmentally destructive. This is corroborated by evidence from past studies by organisations like the Equality Trust, showing a clear correlation between inequality and social problems.
Barbara Stocking, Oxfam Chief Executive: "We can no longer pretend that the creation of wealth for a few will inevitably benefit the many – too often the reverse is true. Concentration of resources in the hands of the top one per cent depresses economic activity and makes life harder for everyone else – particularly those at the bottom of the economic ladder. In a world where even basic resources such as land and water are increasingly scarce, we cannot afford to concentrate assets in the hands of a few and leave the many to struggle over what's left."
Members of the richest one per cent are estimated to use as much as 10,000 times more carbon than the average US citizen.
Oxfam said world leaders should learn from the current success of countries such as Brazil, which has grown rapidly while reducing inequality – as well as from historical successes such as the United States in the 1930s, when President Roosevelt's New Deal helped bring down inequality and tackle vested interests. Roosevelt famously warned: "the political equality we once had won is meaningless in the face of economic inequality."
Stocking added: "We need a global new deal to reverse decades of increasing inequality. As a first step, world leaders should formally commit themselves to reducing inequality to the levels seen in 1990. From tax havens to weak employment laws, the richest benefit from a global economic system which is rigged in their favour. It is time our leaders reformed the system so that it works in the interests of the whole of humanity rather than a global elite."
Closing tax havens – which hold as much as $32 trillion (£20tr), or a third of all global wealth – could yield $189bn (£118bn) in additional tax revenues.
In addition to a tax haven crackdown, elements of a global new deal could include:
a reversal of the trend towards more regressive forms of taxation;
a global minimum corporation tax rate;
measures to boost wages compared with returns available to capital;
increased investment in free public services and safety nets.
The second asteroid mining company in less than a year will soon be announced.
Last year, Planetary Resources was announced as the world's first commercial asteroid mining company. Co-founded by Peter Diamandis and Eric Anderson, its stated goal was "to expand Earth's natural resource base" by developing a series of telescopes, probes and robotic vehicles. With a single asteroid containing more precious metals than ever mined in history, this endeavour could potentially "add trillions of dollars to the global GDP" and "enable humanity's prosperity to continue for centuries to come."
Planetary Resources – with a team of high-profile backers – generated overwhelming interest from the public. Since that press conference, the firm has signed an agreement with Virgin Galactic enabling multiple launches for its spacecraft.
It now appears that a second company has entered the race. Although its website is rather sparse, we understand that Deep Space Industries will be announcing their plans on 22nd January. From the little information gleaned elsewhere, former Astrobotic Technology President David Gump is said to be involved. The firm is developing "a breakthrough process for manufacturing in space" and intends to pursue "an aggressive schedule."
Deep Space Industries will showcase their plans at the Santa Monica Museum of Flying, California. The Science Channel's Geoff Notkin will host the event, which will include a video showing the new spacecraft and the company's other plans. When more information becomes available, we will of course post it here.
The sight of the Millennium Falcon making the "jump to lightspeed" is one of the most iconic images from the Star Wars trilogy. But University of Leicester students have calculated that – in reality – Han, Luke and Leia would not see the light from stars stretching past the ship as we are shown in the movies.
The final year masters students – from the Department of Physics and Astronomy – found that the depiction of hyperspace in Star Wars is light years away from how it would appear in reality. In the films, every star in the sky is seen to stretch before the characters' eyes when the hyperdrive is engaged.
The four students – Riley Connors, Katie Dexter, Joshua Argyle and Cameron Scoular – have shown that this would not be the case. In their studies, they explained how the crew would actually see a central disc of bright light. There would be no sign of stars, because of the Doppler effect – the same phenomenon which causes the siren of an ambulance to become higher in pitch as it moves towards you.
Doppler blue shift is caused by a source of electromagnetic radiation, including visible light, moving towards an observer. The effect means that the wavelength of electromagnetic radiation will be shortened. From the Millennium Falcon crew’s viewpoint, the wavelength of light from stars will decrease and ‘shift’ out of the visible spectrum into the X-ray range.
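The scale of this blueshift can be sketched with the standard relativistic Doppler formula for an approaching source, λ_obs = λ_src·√((1−β)/(1+β)), where β = v/c. The speed used below is purely illustrative – it is not a figure from the students' paper:

```python
import math

def blueshifted_wavelength(lam_nm: float, beta: float) -> float:
    """Observed wavelength (nm) of light from a source approached at speed beta = v/c."""
    return lam_nm * math.sqrt((1 - beta) / (1 + beta))

# Illustrative: green starlight (500 nm) seen from a ship moving at 99.999% of c
beta = 0.99999
shifted = blueshifted_wavelength(500.0, beta)
print(f"500 nm starlight appears at {shifted:.2f} nm")  # ~1.1 nm, well inside the X-ray band
```

At such speeds, visible starlight is compressed to roughly a thousandth of its emitted wavelength, which is why the stars vanish from the visible spectrum entirely.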
They would simply see a central disc of bright light (pictured above) as the Cosmic Microwave Background Radiation is shifted into the visible spectrum. This is radiation left over from the Big Bang, spread fairly uniformly across the universe.
After further investigation, the group determined that the intense X-rays from stars would push the ship back, causing it to slow down. The pressure felt by the ship would be comparable to that felt at the bottom of the Pacific Ocean.
Their calculations also show that Han would need to store extra amounts of energy on his ship to overcome this pressure in order to continue on his journeys.
A team of researchers from Université Laval, CHU de Québec, and pharmaceutical firm GlaxoSmithKline (GSK) has discovered a way to stimulate the brain's natural defence mechanisms in people with Alzheimer's disease.
This major breakthrough – published in the 15th January early edition of PNAS – opens the door to a new treatment for Alzheimer's disease and a vaccine to prevent the illness.
One of the main characteristics of Alzheimer's is the production in the brain of a toxic molecule known as amyloid beta. Microglial cells, the nervous system's defenders, are unable to eliminate this substance, which forms deposits called senile plaques (illustrated above).
The team led by Dr. Serge Rivest identified a molecule that stimulates activity of the brain's immune cells. The molecule, known as MPL (monophosphoryl lipid A), has been used extensively as a vaccine adjuvant by GSK for many years, and its safety is already well established.
In mice with Alzheimer's symptoms, weekly injections of MPL over a 12-week period eliminated up to 80% of senile plaques. In addition, tests measuring the mice's ability to learn new tasks showed significant improvement in cognitive function over the same period.
The researchers see two potential uses for MPL in humans. It could be administered by intramuscular injection, to slow the progression of the illness. It could also be incorporated into a vaccine designed to stimulate the production of antibodies against amyloid beta.
"The vaccine could be given to people who already have the disease to stimulate their natural immunity," said Dr. Rivest. "It could also be administered as a preventive measure to people with risk factors for Alzheimer's disease."
With an 80% reduction in protein deposits, this method is even more successful than another breakthrough which was reported last year, in which turning off cytokines (immune system signal transmitters) reduced plaques in mice by 65%. How well this translates into humans, of course, remains to be seen.
Scientists at the Texas Biomedical Research Institute have, for the first time, demonstrated that baboon embryonic stem cells can totally restore a severely damaged artery. These early results show promise for eventually developing stem cell therapies to restore human tissues or organs damaged by age or disease.
John VandeBerg, Ph.D., Texas Biomed's chief scientific officer: "We first cultured the stem cells in petri dishes under special conditions to make them differentiate into cells that are the precursors of blood vessels, and we saw that we could get them to form tubular and branching structures, similar to blood vessels."
This finding gave VandeBerg and his team the confidence to do more complex experiments, to find out if these cells could actually heal a damaged artery. The results are published in the Journal of Cellular and Molecular Medicine.
The scientists found that cells derived from embryonic stem cells could actually repair experimentally damaged baboon arteries and "are promising therapeutic agents for repairing damaged vasculature of people," according to the authors.
Researchers completely removed the cells that line the inside surface from a segment of artery, and then put cells that had been derived from embryonic stem cells inside the artery. They then connected both ends of the arterial segment to plastic tubing inside a device called a bioreactor which is designed to grow cells and tissues. The scientists then pumped fluid through the artery under pressure as if blood were flowing through it. The outside of the artery was bathed in another fluid to sustain the cells located there.
Three days later, the complex structure of the inner surface was beginning to regenerate, and by 14 days the inside of the artery had been perfectly restored to its complex natural state – going from a non-functional tube to a fully functional artery.
"Just think of what this kind of treatment would mean to a patient who had just suffered a heart attack as a consequence of a damaged coronary artery. And this is the real potential of stem cell regenerative medicine — that is, a treatment with stem cells that regenerates a damaged or destroyed tissue or organ," VandeBerg said.
To show that the artery couldn't heal itself in the absence of stem cells, the researchers took a control arterial segment that also was stripped of the cells on its interior surface, but did not seed it with stem cells. No healing occurred.
Stains for proteins that indicate functional characteristics showed that the healed artery had completely normal function and could do everything that a normal artery does in a healthy individual.
"This is evidence that we can harness stem cells to treat the gravest of arterial injuries," said VandeBerg.
Eventually, scientists hope to be able to take a skin cell, or a white blood cell, or a cell from any other tissue in the body, and induce it to become just like an embryonic stem cell in its capacity to differentiate into any tissue or organ.
"The vision of the future is, for example, for a patient with a pancreas damaged because of diabetes, doctors could take skin cells, induce them to become stem cells, and then grow a new pancreas that is just like the one before disease developed," VandeBerg said.
The Energy Technologies Institute (ETI) in Britain has appointed Blade Dynamics to develop what are expected to be the world's longest wind turbine blades.
Blade Dynamics will construct blades for the ETI of between 80 and 100 metres in length, incorporating carbon fibre rather than conventional glass fibre. This compares with blades now deployed offshore of between 60 and 75 metres in length.
The ETI-commissioned and funded project will be delivered using Blade Dynamics' innovative design and manufacturing processes, which construct blades through the assembly of smaller, more accurate and easily manufactured components, rather than from extremely large and expensive full-length mouldings.
The project will see prototype blades manufactured and ready to be put into production by late 2014. Structural testing for the first blade is then expected to be carried out at a UK test facility. The design of the blades will see them weigh up to 40 per cent less than conventional glass-fibre blades, enabling significant weight and cost savings to be achieved throughout the rest of the turbine system. The design will also help to reduce the cost of the energy produced.
The intended end-use for the blade technology is on the next generation of large offshore wind turbines currently under development with a capacity of 8-10MW. This compares with the 5-6MW capacity turbines currently deployed.
UK Minister for Universities and Science, David Willetts: "This investment will enable Blade Dynamics to develop and demonstrate a potentially world-leading technology. The project could vastly improve the manufacturing process of very large turbine blades, as well as helping to reduce the cost of the energy generated. It shows Britain is leading the way in developing innovative solutions to help with the transition to a low carbon economy."
Paul Trinick, Offshore Wind Project Manager at the ETI: "Offshore wind has the potential to be a much larger contributor to the UK energy system if today's costs can be significantly reduced. Investing in this project to develop larger, more efficient blades is a key step for the whole industry in paving the way for more efficient turbines, which will in turn help bring the costs of generating electricity down."
David Cripps, Senior Technical Manager at Blade Dynamics: "We have worked hard on the design of this blade technology for a number of years now. Financial backing from the ETI for this project allows deployment on ultra-large turbines far sooner than would otherwise have been possible and as a result of this project we will be hiring new engineers and technologists to make this possible. Our driver is to make the generation of electricity through offshore wind both more reliable and more economical. We believe longer, low weight blades to be a key part of the solution – but for such blades to be most effective, we need to design their construction differently."
U.S. Naval Research Laboratory scientists, in collaboration with Imperial College London and MicroLink Devices, have proposed a novel triple-junction solar cell with the potential to break the 50 percent conversion efficiency barrier, which is the current goal in multi-junction photovoltaic development.
"This research has produced a novel, realistically achievable, lattice-matched, multi-junction solar cell design with the potential to break the 50 percent power conversion efficiency mark under concentrated illumination," said Robert Walters, Ph.D., NRL research physicist. "At present, the world record triple-junction solar cell efficiency is 44 percent under concentration and it is generally accepted that a major technology breakthrough will be required for the efficiency of these cells to increase much further."
In multi-junction (MJ) solar cells, each junction is 'tuned' to a different wavelength band in the solar spectrum to increase efficiency. High band gap semiconductor material is used to absorb the short wavelength radiation, with longer wavelengths transmitted to subsequent junctions. In theory, an infinite-junction cell could obtain a maximum power conversion efficiency of nearly 87 percent. The challenge is to develop a semiconductor material system that can attain a wide range of band gaps and be grown with high crystalline quality.
By exploring novel semiconductor materials and applying band structure engineering, via strain-balanced quantum wells, the NRL research team has produced a design for a MJ solar cell that can achieve direct band gaps from 0.7 to 1.8 electron volts (eV) with materials that are all lattice-matched to an indium phosphide (InP) substrate.
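Each junction's band gap sets the longest wavelength it can absorb, via the standard relation λ_cutoff = hc/E_g. As a quick sketch of what the 0.7 to 1.8 eV range corresponds to in wavelength terms (this is textbook unit conversion, not a model of the NRL cell itself):

```python
HC_EV_NM = 1239.84  # Planck constant x speed of light, in eV*nm

def cutoff_wavelength_nm(bandgap_ev: float) -> float:
    """Longest wavelength (nm) that a junction with the given band gap can absorb."""
    return HC_EV_NM / bandgap_ev

for eg in (1.8, 1.4, 0.7):
    print(f"E_g = {eg:.1f} eV -> absorbs out to {cutoff_wavelength_nm(eg):.0f} nm")
# 1.8 eV reaches ~689 nm (visible red); 0.7 eV reaches ~1771 nm (near infrared)
```

Spanning 0.7 to 1.8 eV thus lets the stacked junctions divide up the solar spectrum from the visible well into the infrared, which is what drives the projected efficiency gain.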
"Having all lattice-matched materials with this wide range of band gaps is the key to breaking the current world record," adds Walters. "It is well known that materials lattice-matched to InP can achieve band gaps of about 1.4 eV and below, but no ternary alloy semiconductors exist with a higher direct band gap."
The primary innovation enabling this new path to high efficiency is the identification of InAlAsSb quaternary alloys as a high band gap material layer that can be grown lattice-matched to InP. Drawing from their experience with Sb-based compounds for detector and laser applications, NRL scientists modeled the band structure of InAlAsSb and showed that this material could potentially achieve a direct band-gap as high as 1.8eV. With this result, and using a model that includes both radiative and non-radiative recombination, the NRL scientists created a solar cell design that is a potential route to over 50 percent power conversion efficiency under concentrated solar illumination.
Recently awarded a U.S. Department of Energy (DoE) Advanced Research Projects Agency-Energy (ARPA-E) project, NRL scientists, working with MicroLink and the Rochester Institute of Technology, N.Y., will execute a three-year materials and device development program to realise this new solar cell technology.
Through a highly competitive, peer-reviewed proposal process, ARPA-E seeks out transformational, breakthrough technologies that show fundamental technical promise but are too early for private-sector investment. These projects have the potential to produce game-changing breakthroughs in energy technology, form the foundation for entirely new industries, and to have large commercial impacts.
Cyberpunk 2077 is a role-playing video game, based on the Cyberpunk series of pen-and-paper games. It is being produced by CD Projekt RED, developers of The Witcher and The Witcher 2: Assassins of Kings. Cyberpunk 2077 will feature a decadent futuristic world, in which ultra-modern technology co-exists with a degenerated human society.
The game aims to be a mature, ambitious title with character customisation being strongly tied to the plot. It will have a non-linear story with different character classes. The developers say that it won't be ready until 2015, but a trailer was released this week, which you can see below. For more info, visit the official website. If you're into sci-fi, check out the Fictional Future section of our forum.
The Gravity Recovery and Interior Laboratory (GRAIL) was a NASA mission that ran from March to December 2012, featuring twin spacecraft orbiting the Moon. These used gravitational field mapping to study its interior structure, producing the highest resolution gravity field map ever obtained for a celestial body. This was achieved by measuring tiny changes in the distance between the two probes – as small as 1/10th of a micron.
In unprecedented detail, the maps revealed an abundance of features never before seen – such as tectonic structures, fractures, volcanic landforms, basin rings and crater central peaks. In addition, the bulk average density of the Moon's highland crust was found to be substantially lower than previously assumed. It is hoped that this data will provide a better understanding of how Earth and other rocky planets formed and evolved.
Towards the end of their mission, the GRAIL spacecraft began to operate at lower orbital altitudes. On 17th December, travelling at 3,760 miles per hour (6,050 km/h), they were deliberately crashed onto the surface. Three days prior to this planned impact (on a mountain at the lunar north pole), mission controllers activated two cameras, taking footage of the Moon's far side. The video below was released by NASA's Jet Propulsion Laboratory this week.
The Institution of Mechanical Engineers is calling for urgent action to prevent as much as 2 billion tonnes of the food produced around the world each year from ending up as waste.
A new report by the Institution of Mechanical Engineers finds that as much as 50% of all food produced around the world never reaches a human stomach – due to issues as varied as inadequate infrastructure and storage facilities, through to overly strict sell-by dates, "buy one get one free" offers and consumers demanding cosmetically perfect food.
With UN predictions that there could be about an extra three billion people to feed by the end of this century and an increasing pressure on the resources needed to produce food – including land, water and energy – the Institution is calling for urgent action to tackle this waste.
• between 30% and 50% (about 1.2-2 billion tonnes) of food produced around the world each year is thrown away;
• as much as 30% of UK vegetable crops are not harvested due to them failing to meet exacting standards based on their physical appearance, while up to half of the food that's bought in Europe and the USA is thrown away by the consumer;
• about 550 billion m³ of water is wasted globally in growing crops that never reach the consumer;
• producing 1 kilogram of meat takes 20-50 times more water than producing 1 kilogram of vegetables;
• the demand for water in food production could reach 10–13 trillion m³ a year by 2050. This is 2.5 to 3.5 times greater than the total human use of fresh water today and could lead to more dangerous water shortages around the world;
• there is the potential to provide 60-100% more food, by eliminating losses and waste, while at the same time freeing up land, energy and water resources.
Dr Tim Fox, Head of Energy and Environment at the Institution of Mechanical Engineers said: "The amount of food wasted and lost around the world is staggering. This is food that could be used to feed the world's growing population – as well as those in hunger today. It is also an unnecessary waste of the land, water and energy resources that were used in the production, processing and distribution of this food."
He continued: “The reasons for this situation range from poor engineering and agricultural practices, inadequate transport and storage infrastructure through to supermarkets demanding cosmetically perfect foodstuffs and encouraging consumers to overbuy through buy-one-get-one free offers.
“As water, land and energy resources come under increasing pressure from competing human demands, engineers have a crucial role to play in preventing food loss and waste by developing more efficient ways of growing, transporting and storing foods.
“But in order for this to happen, Governments, development agencies and organisations like the UN must work together to help change people’s mindsets on waste and discourage wasteful practices by farmers, food producers, supermarkets and consumers.”
By 2075, the UN predicts that the world’s population will reach around 9.5 billion, which could mean an extra three billion mouths to feed. A key issue in dealing with this population growth is how to produce more food in a world with resources under competing pressures – particularly given the added stresses caused by global warming and the increasing popularity of eating meat, which requires around 10 times the land resources of food like rice or potatoes.
The world produces about four billion metric tonnes of food per year, but wastes up to half of this food through poor practices and inadequate infrastructure. By improving processes and infrastructure, as well as changing consumer mindsets, we would have the ability to provide 60-100% more food to feed the world’s growing population.
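The report's headline figures are internally consistent, as a quick arithmetic check shows (all numbers taken from the report as quoted above):

```python
# Annual global food production, ~4 billion metric tonnes (per the report)
production_tonnes = 4e9
waste_fraction_low, waste_fraction_high = 0.30, 0.50

waste_low = production_tonnes * waste_fraction_low    # 1.2 billion tonnes
waste_high = production_tonnes * waste_fraction_high  # 2.0 billion tonnes
print(f"Wasted: {waste_low/1e9:.1f}-{waste_high/1e9:.1f} billion tonnes per year")

# In the worst case, only half of production reaches a consumer; recovering the
# wasted half would double the food available, i.e. "up to 100% more food".
kept_worst_case = production_tonnes * (1 - waste_fraction_high)
gain_high = waste_high / kept_worst_case
print(f"Potential extra food if waste were eliminated: up to {gain_high:.0%}")
```

This matches the 1.2-2 billion tonne waste estimate and the upper end of the 60-100% "more food" figure quoted in the report.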
An enzyme treatment which could neutralise the effects of lethal chemicals responsible for the deaths of hundreds of thousands of people across the world has been developed by experts at the University of Sheffield.
Organo-phosphorus agents (OP) are used as pesticides in developing countries, where acute poisoning is common because of insufficient control, poor storage, ready availability and inadequate education amongst farmers. Globally, an estimated 200,000 people die each year from OP poisoning – through occupational exposure, unintentional use and misuse, along with deliberate terrorist activities. OPs include compounds like Tabun (developed by German scientists in 1936, shortly before WWII), Sarin, Soman, Cyclosarin, VX and VR.
In sub-Saharan Africa, the potential cost of pesticide-related illnesses between 2005 and 2020 could reach $90bn (£56bn) according to a UN report issued last year. This would exceed the total amount of international aid for basic health services in the region, excluding HIV/Aids. It is also a particular problem in India, Pakistan and Sri Lanka.
Using a modified human enzyme, Professor Mike Blackburn from the University of Sheffield’s Department of Molecular Biology and Biotechnology worked with Professor Alexander Gabibov of the Shemyakin-Ovchinnikov Institute, Moscow, and Professor Patrick Masson of the Centre de Recherches du Service de Santé des Armées, to create a “bioscavenger”, which was found to protect mice against the nerve agent VR and showed no lasting effects.
In studies performed at the Institute of Bioorganic Chemistry in Pushchino, Russia, a total of eight mice were treated with the new enzyme after being exposed to an otherwise lethal dose of the VR agent – 63 mg – and all survived.
Professor Blackburn said: “This current publication describes a novel method to generate a bioscavenger for the Russian VR organo-phosphorus agent with the key property of being long-acting in the bloodstream. That has been achieved by a combination of chemical surface modification (polysialylation) and biotechnology of production (through the use of an in vitro CHO-based expression system, employing genes encoding butyrylcholinesterase and a proline-rich peptide under special promoter control)."
An international team of scientists has taken the next step in creating nanoscale machines, by designing a multi-component molecular motor that can be moved clockwise and counterclockwise.
Nanotechnology researchers have already learned to control, rotate and switch individual molecules on and off. However, this new study is the first to create a stand-alone molecular motor with multiple parts. Ohio University professor of physics and astronomy, Saw-Wai Hla, led the study alongside Christian Joachim of A*Star in Singapore and CEMES/CNRS in France and Gwenael Rapenne of CEMES/CNRS.
It's an essential step in creating nanoscale devices — quantum machines that operate on different laws of physics than classical machines — that scientists envision could one day be used for everything from powering quantum computers to sweeping away blood clots in arteries.
In the study, published in Nature Nanotechnology, the team shows that they could control the motion of the motor with energy generated by electrons from a scanning tunneling microscope tip. The motor is around 2 nanometres in length and 1 nanometre high and was constructed on a gold crystal surface.
At a temperature of -315ºF (-193ºC), the motor could move independently through thermal excitation. When scientists cooled the sample to -450ºF (-268ºC), the motor stopped rotating. The researchers selectively applied electron energy to different parts of the motor, prompting it to move clockwise and counterclockwise.
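The paired Fahrenheit and Celsius figures above can be verified with the standard conversion formula – this is simple unit arithmetic, not part of the study itself:

```python
def fahrenheit_to_celsius(f: float) -> float:
    """Convert a temperature from degrees Fahrenheit to degrees Celsius."""
    return (f - 32.0) * 5.0 / 9.0

# -315 F is the temperature at which the motor still rotates via thermal excitation;
# -450 F is where rotation stops and motion must be driven by the STM tip's electrons.
for f in (-315.0, -450.0):
    print(f"{f:.0f} F = {fahrenheit_to_celsius(f):.0f} C")
# -315 F ~ -193 C, -450 F ~ -268 C, matching the figures in the text
```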
"If we want to build an actual device based on this motor, we would install electrodes on the surface to create an energy source," Hla said.
To construct the molecular motor, the scientific team designed a stationary base of atoms that is connected to an upper moving part by one atom of ruthenium, which serves as the "ball bearing." The upper piece of the motor features five arms made of iron atoms. The researchers made one arm shorter than the others to be able to track the motion of the machine. The entire device is held upright by using sulfur as an "atomic glue" to secure the motor to the gold surface, Hla explained. The scientists now plan to use this model to build more complex machines with automated components.
Rice University's latest nanotechnology breakthrough was more than 10 years in the making, but it still came with a shock.
Scientists from Rice University, Dutch firm Teijin Aramid, the U.S. Air Force and Israel's Technion Institute have this week unveiled a new carbon nanotube (CNT) fiber that looks and acts like textile thread and conducts electricity and heat like a metal wire. In this week's issue of Science, they describe an industrially scalable process for making the threadlike fibers, which outperform commercially available high-performance materials in a number of ways.
Lead researcher Matteo Pasquali, professor of chemical and biomolecular engineering and chemistry at Rice: “We finally have a nanotube fiber with properties that don’t exist in any other material. It looks like black cotton thread, but behaves like both metal wires and strong carbon fibers.”
Study co-author Marcin Otto, business development manager at Teijin Aramid: “The new CNT fibers have a thermal conductivity approaching that of the best graphite fibers, but with 10 times greater electrical conductivity. Graphite fibers are also brittle, while the new CNT fibers are as flexible and tough as a textile thread."
The phenomenal properties of carbon nanotubes have enthralled scientists from the moment of their discovery in 1991. The hollow tubes of pure carbon, which are barely as wide as a strand of DNA, are about 100 times stronger than steel at one-sixth the weight. Their conductive properties – for both electricity and heat – rival the best metal conductors. They can also serve as light-activated semiconductors, drug-delivery devices and even sponges to soak up oil.
Study co-author Junichiro Kono, Rice professor of electrical and computer engineering: “The research showed that the electrical conductivity of the fibers could be tuned and optimised with techniques that were applied after initial production. This led to the highest conductivity ever reported for a macroscopic CNT fiber.”
The fibers reported in Science have about 10 times the tensile strength and electrical and thermal conductivity of the best previously reported wet-spun CNT fibers. The specific electrical conductivity of the new fibers is on par with copper, gold and aluminum wires, but the new material has advantages over metal wires. For example, the combination of high strength and electrical conductivity could prove useful in data transmission and low-power applications.
“Metal wires will break in rollers and other production machinery if they are too thin,” Pasquali said. “In many cases, people use metal wires that are far thicker than required for the electrical needs, simply because it’s not feasible to produce a thinner wire. Data cables are a particularly good example of this.”
Study co-author Marcin Otto: "We expect this combination of properties will lead to new products with unique capabilities for the aerospace, automotive, medical and smart-clothing markets.”
Tobacco plants bloom when they are just a few months old – and then they die. Now, researchers have located a genetic switch which can keep the plants young for years and permits unbounded growth. In short, an ideal source of biomass.
The life of tobacco plants is short. They grow for around three to four months, followed by flowering, then die. Their size is also limited, with plants only growing to about one-and-a-half to two metres tall. Now, researchers at the Fraunhofer Institute for Molecular Biology and Applied Ecology IME in Münster have located the plant's very own fountain of youth, which means they can keep it forever young. The Münster-based researchers discovered a genetic switch which can prevent the plants from switching from growth to flowering. This also averts the plants' early demise through senescence – and suppresses the factor that halts growth.

"The first of our tobacco plants is now almost eight years old but it still just keeps on growing and growing," says Professor Dirk Prüfer, head of the Department of Functional and Applied Genomics at the IME. "Although we regularly cut it, it's six-and-a-half metres tall. If our greenhouse were a bit higher, it would probably be even bigger. Its stem is already ten centimetres in diameter."

Whereas in normal tobacco plants the leaves, which grow from the bottom of the stem, soon turn yellow and drop off, the IME plant's leaves stay healthy and green. This is why the scientists have christened their modified plant species "forever young".
But what exactly do the researchers do to give their plants eternal youth and make them capable of unbounded growth? "We modify the expression of a certain gene – or rather, the information contained within it – so that the plant's flowering is delayed," explains Prüfer. Researchers then insert the modified gene back into the plant using a bacterium. The role of the bacterium is to act as a sort of shuttle service for the modified gene.
Producing more biomass
The principle is transferable and could be used on other kinds of plants; at the moment, the scientists are also working on potato plants on behalf of a Japanese chemical company. They use their knowledge to get crops to yield a far greater amount of biomass. In the case of potatoes, this means a great deal more starch. "If we want to guarantee security of supply for foodstuffs and plant-based raw materials, the yield per hectare will have to double by 2050, claims the German Bioeconomy Council. This new technology brings us a great deal nearer to that target," reckons Prüfer. "However, our method is only likely to deliver success as long as the flowers of the plant in question play no significant role – sugar beet, for instance. It would make no sense to use the technique on rapeseed." Preventing plants from flowering presents a significant advantage, in that no flowering means no production of seeds or pollen. As a result, plants have no way of reproducing, which means they cannot spread into the environment in an unplanned way.
In the future, the researchers want to go further and be able to disable plants' growth limits using chemical mutagenesis as well – that is to say, using normal growing techniques. This process involves using chemical additives to bring about changes in a seed's DNA sequence. The advantage is that a plant grown in this way would no longer be genetically modified but simply a plant grown using standard techniques. "But in order to be able to do that, we first need to gain a better understanding of the deregulation of genes," says Prüfer, who hopes cultivation experiments might begin next year. Then perhaps normal plants will be in a position to grow tall, too.
Mars One, a not-for-profit organisation which plans to establish a human settlement on Mars in 2023, has issued the base requirements for its pending Astronaut Selection Program. This establishes the first step toward the global selection process which will commence in the first half of 2013.
As we reported in December, Mars One is unlike any space exploration endeavour before it. The company's astronaut program is open to anyone on Earth who meets its base criteria. It is not necessary to have military training, experience flying aircraft, or even a science degree. What is most important is that applicants be intelligent, in good mental and physical health, and willing to dedicate eight years to training and learning before making the journey to their new home on Mars.
Norbert Kraft, a former Senior Research Associate at NASA and Chief Medical Director for Mars One: “In my former work with NASA we established strict criteria for the selection and training of astronauts on long duration space flights. Gone are the days when bravery and the number of hours flying a supersonic jet were the top criteria. Now, we are more concerned with how well each astronaut works and lives with the others, in the long journey from Earth to Mars and for a lifetime of challenges ahead. Psychological stability, the ability to be at your best when things are at their worst is what Mars One is looking for. If you are the kind of person that everyone chooses to have on their island, then we want you to apply too.”
Applicants need to be at least 18 years of age, have a deep sense of purpose, willingness to build and maintain healthy relationships, the capacity for self-reflection and ability to trust. They must be resilient, adaptable, curious, creative and resourceful. Mars One is not seeking specific skill sets such as medical doctors, pilots or geologists. Rather, candidates will receive a minimum of eight years' extensive training while employed by Mars One. While any formal education or real-world experience can be an asset, all skills required on Mars will be learned while in training.
Suzanne Flinkenflögel, the Director of Communications at Mars One: “Well before the official Astronaut Selection Program, we received more than 1,000 emails from individuals who desire to go to Mars. While they may not yet realise the incredible challenges that lay ahead, this show of support for a global selection campaign is so important to us. We are working hard to launch our selection campaign as soon as possible, to open the doors to everyone who aspires to do something tremendous in their lifetime."
The Mars One Foundation will employ the astronauts during their Earth-based training and their life on Mars, and will manage the simulation bases on Earth as well as the human settlement on Mars. Eight robotic cargo missions (2016–2021) will establish a habitable settlement to welcome the humans upon their arrival on Mars. The final astronaut candidates will be selected from the global applicant pool through a combination of critical review by Mars One experts and a global, televised program that will ultimately decide which group of four astronauts will be the first to go.
At the Consumer Electronics Show (CES) in Las Vegas, a company called Tactus has been demonstrating the world's first fully-integrated dynamic touchscreen tablet.
This 7" device showcases the company's tactile touchscreen technology and introduces an entirely new category of product made possible through its Tactus Morphing Tactile™ surface.
By enhancing both function and usability with Tactus, it is now possible to merge the essential capabilities of smartphones, tablets and laptops through a true physical interface. In a world of flat, static devices, Tactus aims to bring new life to touchscreens by enabling real, physical buttons that rise up from a screen's surface on demand, then disappear back into the screen, leaving a flat, transparent surface when no longer needed.
With conventional touchscreens, input errors increase and typing speed decreases for most users. It can also be difficult to tell when a "button" has been pressed on a completely flat screen, and there are no orientation cues to guide fingers to the right location. Tactus aims to solve these problems.
An earlier prototype was seen at last year's Society for Information Display (SID) conference. Since then, a number of design improvements have been made – which include a new coating material to reduce glare, a reduction in the controller's size by 70 percent and a doubling of the speed at which the keyboard activates.
The Consumer Electronics Show (CES) – the biggest technology exhibition of the year – is currently underway in Las Vegas. Among the companies present is Sharp, which has just released a video exploring the future possibilities of "IGZO", a new semiconducting material that has already begun to appear in its products.
IGZO stands for "Indium Gallium Zinc Oxide" and is used as the channel for a transparent thin-film transistor. It replaces amorphous silicon for the active layer of an LCD screen, and, with 40 times higher electron mobility than amorphous silicon, allows either smaller pixels (for screen resolutions higher than HDTV), or much higher reaction speeds for a screen. It is ultra-responsive to touch, drastically minimising the noise caused during touch input. This allows for quick, easy and more natural-feeling writing and smooth lines. It is also far more energy efficient, maintaining onscreen data for a period of time without refreshing the data, even when the current is off.
Sharp is the first company to successfully mass produce IGZO. In April 2012, it was announced that the company would be producing bulk volumes of 32-inch 3840×2160, 10-inch 2560×1600 and 7-inch 1280×800 panels. In addition to IGZO, Sharp is showcasing a range of other next-generation TVs and devices – including its 2013 AQUOS® LED TV lineup, featuring the world's biggest LED TV (90" diagonal).
Toshi Osawa, the CEO and Chairman of Sharp, said: "Whether in your home or in your hand, display technology is everywhere. From game changing IGZO, to stunning Ultra HD products, and large screen televisions, the introductions we are making at CES 2013 will advance people's lives at home, work and everywhere in between."
"DIEGO-SAN" is a creation of the Machine Perception Laboratory at the University of California, San Diego. Funding was provided by the National Science Foundation, which contracted two companies – Kokoro Co. Ltd. and Hanson Robotics – to build the android infant. His face was engineered by David Hanson of Hanson Robotics, while his body was built by Kokoro.
The robot stands 4' 3" (130 cm) tall and weighs 66 lb (30 kg) with a total of 44 pneumatic joints as well as high definition cameras in both eyes. He is intended to be a research platform for studying the cognitive, learning and emotional development of young infants, with a focus on facial expressions.
Dr. Javier Movellan, who heads the Machine Perception Lab at the university, said: "Its main goal is to try and understand the development of sensory motor intelligence from a computational point of view. It brings together researchers in developmental psychology, machine learning, neuroscience, computer vision and robotics. Basically we are trying to understand the computational problems that a baby's brain faces when learning to move its own body and use it to interact with the physical and social worlds."
A simple, precise and inexpensive method for cutting DNA to insert genes into human cells could transform genetic medicine – making routine what are now expensive, complicated and rare procedures for replacing defective genes in order to fix genetic disease or even cure AIDS.
Discovered last year by Jennifer Doudna and Martin Jinek of the Howard Hughes Medical Institute and University of California, Berkeley, and Emmanuelle Charpentier of the Laboratory for Molecular Infection Medicine-Sweden, the technique was labeled a "tour de force" in a 2012 review in the journal Nature Biotechnology.
That review was based solely on the team's June 2012 Science paper, in which the researchers described a new method of precisely targeting and cutting DNA in bacteria.
Two new papers published last week in the journal Science Express demonstrate that the same technique also works in human cells. A paper by Doudna and her team reporting similarly successful results in human cells has been accepted for publication by the new open-access journal eLife.
"The ability to modify specific elements of an organism's genes has been essential to advance our understanding of biology, including human health," said Doudna. "However, the techniques for making these modifications in animals and humans have been a huge bottleneck in both research and the development of human therapeutics.
"This is going to remove a major bottleneck in the field – because it means that essentially anybody can use this kind of genome editing or reprogramming to introduce genetic changes into mammalian or, quite likely, other eukaryotic systems."
"I think this is going to be a real hit," said George Church, professor of genetics at Harvard Medical School and principal author of one of the Science Express papers. "There are going to be a lot of people practicing this method because it is easier and about 100 times more compact than other techniques."
"Based on the feedback we've received, it's possible that this technique will completely revolutionise genome engineering in animals and plants," said Doudna, who also holds an appointment at Lawrence Berkeley National Laboratory. "It's easy to program and could potentially be as powerful as the Polymerase Chain Reaction (PCR)."
The latter technique made it easy to generate millions of copies of small pieces of DNA and permanently altered biological research and medical genetics.
Two developments – zinc-finger nucleases and TALEN (Transcription Activator-Like Effector Nucleases) proteins – have received a lot of attention recently, including being together named one of the top 10 scientific breakthroughs of 2012 by Science magazine. The magazine labeled them "cruise missiles", since both techniques allow researchers to home in on a particular part of a genome and snip the double-stranded DNA there and there only.
Researchers can use these methods to make two precise cuts to remove a piece of DNA and, if an alternative piece of DNA is supplied, the cell will plug it into the cut instead. In this way, doctors can excise a defective or mutated gene and replace it with a normal copy. Sangamo Biosciences, a clinical stage biopharmaceutical company, has already shown that replacing one specific gene in a person infected with HIV can make him or her resistant to AIDS.
Both the zinc finger and TALEN techniques require synthesising a large new gene encoding a specific protein for each new site in the DNA that is to be changed. By contrast, the new technique uses a single protein that requires only a short RNA molecule to program it for site-specific DNA recognition, Doudna said.
In the new Science Express paper, Church compared the new technique, which involves an enzyme called Cas9, with the TALEN method for inserting a gene into a mammalian cell and found it five times more efficient.
"It (the Cas9-RNA complex) is easier to make than TALEN proteins, and it's smaller," making it easier to slip into cells and even to program hundreds of snips simultaneously, he said. The complex also has lower toxicity in mammalian cells than other techniques, he added.
"It's too early to declare total victory" over TALENs and zinc-fingers, Church said, "but it looks promising."
Based on the immune systems of bacteria
Doudna discovered the Cas9 enzyme while studying the immune system of bacteria, which have evolved enzymes that cut DNA to defend themselves against viruses. These bacteria cut up viral DNA and insert pieces of it into their own DNA, from which they make RNA that binds to and deactivates the viruses.
UC Berkeley professor of earth and planetary science Jill Banfield brought this unusual viral immune system to Doudna's attention a few years ago, and Doudna became intrigued. Her research focuses on how cells use RNA (ribonucleic acids), which are essentially the working copies that cells make of the DNA in their genes.
Doudna and her team worked out the details of how the enzyme-RNA complex cuts DNA: the Cas9 protein assembles with two short lengths of RNA, and together the complex binds a very specific area of DNA determined by the RNA sequence. The scientists then simplified the system to work with only one piece of RNA and showed in the earlier Science paper that they could target and snip specific areas of bacterial DNA.
"The beauty of this compared to any of the other systems that have come along over the past few decades for doing genome engineering is that it uses a single enzyme," Doudna said. "The enzyme doesn't have to change for every site that you want to target – you simply reprogram it with a different RNA transcript, which is easy to design and implement."
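To illustrate how simple the "reprogramming" step is in practice, the sketch below scans a DNA string for candidate target sites. It assumes the widely used SpCas9 convention from this line of work – a 20-nucleotide protospacer immediately followed by an "NGG" PAM (protospacer-adjacent motif) – and the sequence and function name are hypothetical examples, not material from the papers.

```python
# Hypothetical sketch: find candidate Cas9 target sites in a DNA sequence.
# Assumes the SpCas9 convention: a 20-nt protospacer immediately followed
# by an "NGG" PAM (protospacer-adjacent motif) on the same strand.

def find_cas9_sites(dna, guide_len=20):
    """Return (position, protospacer) pairs where an NGG PAM follows."""
    dna = dna.upper()
    sites = []
    for i in range(len(dna) - guide_len - 2):
        pam = dna[i + guide_len : i + guide_len + 3]
        if pam[1:] == "GG":          # the "N" in NGG can be any base
            sites.append((i, dna[i : i + guide_len]))
    return sites

seq = "ATGCGTACCTGACTGGATCCAGGTTTACGGATCAGGACTTCCAGGA"
for pos, proto in find_cas9_sites(seq):
    print(pos, proto)
```

In the real system, each protospacer found this way would correspond to a different short guide RNA, while the Cas9 enzyme itself stays the same – which is the point Doudna makes above.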
The three new papers show this bacterial system works beautifully in human cells as well as in bacteria.
"Out of this somewhat obscure bacterial immune system comes a technology that has the potential to really transform the way that we work on and manipulate mammalian cells and other types of animal and plant cells," Doudna said. "This is a poster child for the role of basic science in making fundamental discoveries that affect human health."
Doudna's coauthors include Jinek and Alexandra East, Aaron Cheng and Enbo Ma of UC Berkeley's Department of Molecular and Cell Biology. Doudna's work was sponsored by the Howard Hughes Medical Institute.
Researchers have proposed a method for cooling trapped antihydrogen which could offer 'a major experimental advantage' and help to map the mysterious properties of antimatter that have to date remained elusive.
The new method, developed by a group of researchers from the USA and Canada, could potentially cool trapped antihydrogen atoms to temperatures 25 times colder than already achieved, making them much more stable and a lot easier to experiment on.
The suggested method, which is published today in the Journal of Physics B: Atomic, Molecular and Optical Physics, involves a laser, directed at antihydrogen atoms to give them a 'kick', causing them to lose energy and cool down.
Antihydrogen atoms are formed in an ultra-high vacuum trap by injecting antiprotons into positron plasma. An atomic process causes the antiproton to capture a positron which gives an electronically excited antihydrogen atom.
Typically, the antihydrogen atoms have a lot of energy compared to the trapping depth which can distort the measurements of their properties. As it is only possible to trap very few antihydrogen atoms, the main method for reducing the high energies is to laser cool the atoms to extremely low temperatures.
Co-author of the study, Professor Francis Robicheaux of Auburn University in the USA, said: "By reducing the antihydrogen energy, it should be possible to perform more precise measurements of all of its parameters. Our proposed method could reduce the average energy of trapped antihydrogen by a factor of more than 10.
"The ultimate goal of antihydrogen experiments is to compare its properties to those of hydrogen. Colder antihydrogen will be an important step for achieving this."
This process, known as Doppler cooling, is an established method for cooling atoms; however, because of the restricted parameters that are needed to trap antimatter, the researchers need to be absolutely sure that it is possible.
"It is not trivial to make the necessary amount of laser light at a specific wavelength of 121 nm. Even after making the light, it will be difficult to mesh it with an antihydrogen trapping experiment. By doing the calculations, we've shown that this effort is worthwhile," continued Professor Robicheaux.
Through a series of computer simulations, they showed that antihydrogen atoms could be cooled to around 20 millikelvin; trapped antihydrogen atoms so far have energies up to 500 millikelvin.
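For context, the idealised floor for this technique – the Doppler limit – can be estimated in one line. The numbers below are assumed textbook values (in particular the 2P→1S decay rate), not parameters taken from the paper; the result of a few millikelvin is a theoretical floor, somewhat below the roughly 20 millikelvin the simulations reach in a realistic trap.

```python
# Back-of-envelope Doppler cooling limit for the hydrogen 1S-2P
# (Lyman-alpha, 121.6 nm) transition: T_D = hbar * Gamma / (2 * k_B).
# Gamma is an assumed textbook value for the 2P -> 1S decay rate.

hbar = 1.054571817e-34   # reduced Planck constant, J s
k_B = 1.380649e-23       # Boltzmann constant, J / K
Gamma = 6.3e8            # spontaneous decay rate, s^-1 (assumed)

T_doppler = hbar * Gamma / (2 * k_B)
print(f"Doppler limit: {T_doppler * 1e3:.1f} mK")
```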
In 2011, researchers from CERN reported that they had trapped antimatter for over 1000 seconds – a record. In 2012, the first experiments were performed on antihydrogen whilst it was trapped between a series of magnets. Even though the processes that control the trapping are largely unknown, the researchers believe that the laser cooling should increase the amount of time antihydrogen can be trapped for.
"Whatever the processes are, having slower moving and more deeply trapped antihydrogen should decrease the loss rate," said Professor Robicheaux.
Colder antihydrogen atoms could also be used to measure the gravitational property of antimatter. "No one has ever seen antimatter actually fall in the field of gravity," said co-author Dr Makoto Fujiwara of TRIUMF, Canada's National Laboratory for Particle and Nuclear Physics. "Laser cooling would be a very significant step towards such an observation."
Every particle has an antiparticle. For example, an electron's antiparticle is the positron and a proton's antiparticle is an antiproton.
An antiparticle is exactly the same as its corresponding particle but carries an opposite charge.
If a particle and its corresponding antiparticle meet, they destroy each other. This is known as annihilation.
The combination of one positron and one antiproton creates antihydrogen.
Theories suggest that after the Big Bang, equal amounts of matter and antimatter should have formed. As the Universe today is composed almost entirely of matter, it remains a great mystery why we don't have this symmetry.
Scientists such as the ALPHA collaboration at CERN have been trying to measure the properties of antihydrogen to find clues as to why this asymmetry exists.
In the future, antimatter could be used as a fuel for interplanetary travel – or even interstellar travel – as part of antimatter catalysed nuclear pulse propulsion, or other antimatter-based rocketry such as the redshift rocket. Since the energy density of antimatter is far higher than that of conventional fuels, an antimatter fuelled spacecraft would have a higher thrust-to-weight ratio than a conventional spacecraft.
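The energy-density claim is easy to sanity-check with E = mc². The chemical figure below is an assumed ballpark (hydrogen's heat of combustion), used purely for comparison.

```python
# Rough comparison of energy released per kilogram.
# Annihilation converts the entire rest mass to energy (E = m * c^2);
# the chemical figure is an assumed ballpark for hydrogen combustion.

c = 2.998e8                # speed of light, m/s
annihilation = c ** 2      # J per kg of matter-antimatter mix
chemical = 1.42e8          # J per kg, hydrogen's heat of combustion (assumed)

print(f"Annihilation: {annihilation:.2e} J/kg")
print(f"Ratio vs chemical: {annihilation / chemical:.1e}")
```

The ratio comes out at several hundred million to one, which is why even tiny quantities of antimatter are interesting as propellant.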
At a recent presentation with NASA's Future In-Space Operations working group, an expert claimed that such technology may be possible within 50 or 60 years. Spacecraft could reach Jupiter within four months, potentially opening up parts of the outer Solar System to manned exploration.
For a number of years now, Google has been leading the way in self-driving, autonomous car technology. However, car makers Toyota and Audi are now developing the vehicles themselves, independently of the Internet search giant.
Both companies have confirmed that they will demonstrate self-driving systems at the Consumer Electronics Show (CES), the biggest technology trade show of the year, which begins on 8th January. Toyota released a brief, five-second teaser clip this week, showing its prototype Lexus LS 600h. This is apparently codenamed the AASRV (Advanced Active Safety Research Vehicle) and will "lead the industry into a new automated era."
As you can see in the video below, it appears very similar to a Google self-driving Prius – but as mentioned, Toyota has developed this model entirely independently, with no partnership involved. In addition to the vehicle itself, they will also discuss the state of Intelligent Transport Systems (ITS) research and development, which includes vehicle-to-vehicle and vehicle-to-infrastructure communications technology. This is expected to be fairly widespread by 2019 and could massively reduce the number of casualties on the roads.
As for Audi, there is no video available. However, a spokesperson has stated that its car will include a feature allowing it to find a parking space and park without a driver behind the wheel.
Thanks largely to Google's lobbying efforts, new laws were introduced last year – in California and Nevada – to make self-driving vehicles a reality. It's clear that this technology is moving forward and could soon enter the mainstream. In our recent poll, 70% of readers said they would feel safe riding in a computer-controlled car.
Look up at the night sky and you'll see stars, sure. But you're also seeing planets – billions and billions of them. At least.
That's the conclusion of a new study by astronomers at the California Institute of Technology (Caltech) that provides yet more evidence that planetary systems are the cosmic norm. The team made their estimate while analysing planets orbiting a star called Kepler-32 — planets that are representative, they say, of the vast majority in the galaxy and thus serve as a perfect case study for understanding how most planets form.
"There's at least 100 billion planets in the galaxy — just our galaxy," says John Johnson, assistant professor of planetary astronomy at Caltech and coauthor of the study, which was recently accepted for publication in the Astrophysical Journal. "That's mind-boggling."
"It's a staggering number, if you think about it," adds Jonathan Swift, a postdoc at Caltech and lead author of the paper. "Basically there's one of these planets per star."
The planetary system in question, which was detected by the Kepler space telescope, contains five planets. The existence of two of those planets had already been confirmed by other astronomers. The Caltech team confirmed the remaining three, then analysed the five-planet system and compared it to other systems found by the Kepler mission.
The planets orbit a star that is an M dwarf — a type that accounts for about three-quarters of all stars in the Milky Way. The five planets, which are similar in size to Earth and orbit close to their star, are also typical of the class of planets that the telescope has discovered orbiting other M dwarfs, Swift says. Therefore, the majority of planets in the galaxy probably have characteristics comparable to those of the five planets.
While this particular system may not be unique, what does set it apart is its coincidental orientation: the orbits of the planets lie in a plane that's positioned such that Kepler views the system edge-on. Due to this rare orientation, each planet blocks Kepler-32's starlight as it passes between the star and the Kepler telescope.
By analysing changes in the star's brightness, the astronomers were able to determine the planets' characteristics, such as their sizes and orbital periods. This orientation therefore provides an opportunity to study the system in great detail — and because the planets represent the vast majority of planets that are thought to populate the galaxy, the team says, the system also can help astronomers better understand planet formation in general.
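As a rough illustration of how brightness dips translate into planet sizes, a transit dims the star by approximately (R_planet / R_star)². The sketch below plugs in the article's figures (a star half the Sun's radius; planets of 0.8 to 2.7 Earth radii); it is a simplification that ignores effects such as limb darkening.

```python
# Transit depth sketch: a transiting planet dims its star by roughly
# (R_planet / R_star)^2. Values follow the article's figures for
# Kepler-32 (radius ~ half the Sun's; planets 0.8-2.7 Earth radii).

R_SUN = 6.957e8      # solar radius, m
R_EARTH = 6.371e6    # Earth radius, m

def transit_depth(r_planet_earths, r_star_suns):
    """Fractional dip in starlight during a central transit."""
    ratio = (r_planet_earths * R_EARTH) / (r_star_suns * R_SUN)
    return ratio ** 2

for r in (0.8, 2.7):
    print(f"{r} R_Earth: depth = {transit_depth(r, 0.5) * 1e6:.0f} ppm")
```

The dips are only a few hundred to a few thousand parts per million, which is why space-based photometry like Kepler's is needed to see them.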
"I usually try not to call things 'Rosetta stones,' but this is as close to a Rosetta stone as anything I've seen," Johnson says. "It's like unlocking a language that we're trying to understand — the language of planet formation."
One of the fundamental questions regarding the origin of planets is how many of them there are. Like the Caltech group, other teams of astronomers have estimated that there is roughly one planet per star, but this is the first time researchers have made such an estimate by studying M-dwarf systems, the most numerous population of planets known.
To do that calculation, the Caltech team determined the probability that an M-dwarf system would provide Kepler-32's edge-on orientation. Combining that probability with the number of planetary systems Kepler is able to detect, the astronomers calculated that there is, on average, one planet for every one of the approximately 100 billion stars in the galaxy. But their analysis only considers planets that are in close orbits around M dwarfs — not the outer planets, or those orbiting other kinds of stars. As a result, they say, their estimate is conservative. In fact, says Swift, a more accurate estimate that includes data from other analyses could lead to an average of two planets per star.
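The geometric correction at the heart of this calculation can be sketched in a few lines: a randomly oriented close-in orbit happens to transit with probability of roughly R_star / a, so each detected system stands in for many unseen, non-transiting ones. The orbital distance used below is illustrative, not one of the paper's actual inputs.

```python
# Sketch of the geometric correction behind occurrence-rate estimates:
# a randomly oriented orbit transits with probability ~ R_star / a,
# so each detection represents ~ a / R_star similar systems.
# Numbers are illustrative, not the paper's inputs.

R_SUN = 6.957e8       # solar radius, m
AU = 1.496e11         # astronomical unit, m

r_star = 0.5 * R_SUN  # Kepler-32-like M dwarf (half the Sun's radius)
a = 0.05 * AU         # a close-in orbit, within ~0.1 AU per the article

p_transit = r_star / a
print(f"Transit probability: {p_transit:.3f}")
print(f"Each detection represents ~{1 / p_transit:.0f} similar systems")
```

Multiplying the detected systems up by this factor, over all the M dwarfs Kepler surveys, is what yields the one-planet-per-star figure.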
(Image credit: ESO/M. Kornmesser)
M-dwarf systems like Kepler-32's are quite different from our own solar system. For one, M dwarfs are cooler and much smaller than the sun. Kepler-32, for example, has half the mass of the sun and half its radius. The radii of its five planets range from 0.8 to 2.7 times that of Earth, and those planets orbit extremely close to their star. The whole system fits within just over a tenth of an astronomical unit (the average distance between Earth and the sun) — a distance that is about a third of the radius of Mercury's orbit around the sun. The fact that M-dwarf systems vastly outnumber other kinds of systems carries a profound implication, according to Johnson, which is that our Solar System is extremely rare. "It's just a weirdo," he says.
The fact that the planets in M-dwarf systems are so close to their stars doesn't necessarily mean that they're fiery, hellish worlds unsuitable for life, the astronomers say. Indeed, because M dwarfs are small and cool, their temperate zone — also known as the "habitable zone," the region where liquid water might exist — is also further inward. Even though only the outermost of Kepler-32's five planets lies in its temperate zone, many other M dwarf systems have more planets that sit right in their temperate zones.
As for how the Kepler-32 system formed, no one knows for sure yet. But the team says its analysis places constraints on possible mechanisms. For example, the results suggest that the planets all formed farther away from the star than they are now, and migrated inward over time.
Like all planets, the ones around Kepler-32 formed from a proto-planetary disk — a disk of dust and gas that clumped up into planets around the star. The astronomers estimated that the mass of the disk within the region of the five planets was about as much as that of three Jupiters. But other studies of proto-planetary disks have shown that three Jupiter masses can't be squeezed into such a tiny area so close to a star, suggesting to the Caltech team that the planets around Kepler-32 initially formed farther out.
Another line of evidence relates to the fact that M dwarfs shine brighter and hotter when they are young, when planets would be forming. Kepler-32 would have been too hot for dust — a key planet-building ingredient — to even exist in such close proximity to the star. Previously, other astronomers had determined that the third and fourth planets from the star are not very dense, meaning that they are likely made of volatile compounds such as carbon dioxide, methane, or other ices and gases, the Caltech team says. However, those volatile compounds could not have existed in the hotter zones close to the star.
Finally, the Caltech astronomers discovered that three of the planets have orbits that are related to one another in a very specific way: the second planet's orbital period is twice as long as the first's, and the third planet's is three times as long as the first's. Planets don't fall into this kind of arrangement immediately upon forming, Johnson says. Instead, they must have started their orbits farther away from the star before moving inward over time and settling into their current configuration.
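A quick way to see what such a 1:2:3 arrangement means in practice is to test whether period ratios sit close to small integer fractions. The periods below are illustrative placeholders, not the measured Kepler-32 values.

```python
# Sketch: testing for near-commensurate orbital periods.
# The periods are illustrative placeholders, not measured values.

def near_ratio(p_short, p_long, m, n, tol=0.05):
    """True if p_long / p_short is within tol (fractional) of n / m."""
    return abs((p_long / p_short) - (n / m)) < tol * (n / m)

periods = [2.9, 5.9, 8.8]   # days (illustrative)
print(near_ratio(periods[0], periods[1], 1, 2))   # roughly 1:2
print(near_ratio(periods[0], periods[2], 1, 3))   # roughly 1:3
```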
"You look in detail at the architecture of this very special planetary system, and you're forced into saying these planets formed farther out and moved in," Johnson explains.
The implications of a galaxy chock full of planets are far-reaching, the researchers say. "It's really fundamental from an origins standpoint," says Swift, who notes that because M dwarfs shine mainly in infrared light, the stars are invisible to the naked eye. "Kepler has enabled us to look up at the sky and know that there are more planets out there than stars we can see."
A variant of a gene associated with active personality traits in humans seems to also be involved with living a longer life, UC Irvine and other researchers have found.
This derivative of a dopamine-receptor gene – called the DRD4 7R allele – appears in significantly higher rates in people more than 90 years old and is linked to lifespan increases in mouse studies.
Robert Moyzis, professor of biological chemistry at UC Irvine, and Dr. Nora Volkow, a psychiatrist who conducts research at the Brookhaven National Laboratory, led a research effort that included data from the UC Irvine-led 90+ Study in Laguna Woods, California. Results appear online in The Journal of Neuroscience.
The variant gene is part of the dopamine system, which facilitates the transmission of signals among neurons and plays a major role in the brain network responsible for attention and reward-driven learning. The DRD4 7R allele blunts dopamine signaling, which enhances individuals' reactivity to their environment.
People who carry this variant gene, Moyzis said, seem to be more motivated to pursue social, intellectual and physical activities. The variant is also linked to attention-deficit/hyperactivity disorder, along with addictive and risky behaviors.
"While the genetic variant may not directly influence longevity," Moyzis said, "it is associated with personality traits that have been shown to be important for living a longer, healthier life. It's been well documented that the more you're involved with social and physical activities, the more likely you'll live longer. It could be as simple as that."
Numerous studies – including a number from the 90+ Study – have confirmed that being active is important for successful aging, and it may deter the advancement of neurodegenerative diseases, such as Alzheimer's.
Prior molecular evolutionary research led by Moyzis and Chuansheng Chen, UC Irvine professor of psychology & social behavior, indicated that this "longevity allele" was selected for during the nomadic out-of-Africa human exodus more than 30,000 years ago.
In the new study, the UC Irvine team analysed genetic samples from 310 participants in the 90+ Study. This "oldest-old" population had a 66 percent increase in individuals carrying the variant relative to a control group of 2,902 people between the ages of 7 and 45. The presence of the variant also was strongly correlated with higher levels of physical activity.
Next, Volkow, neuroscientist Panayotis Thanos and their colleagues at the Brookhaven National Laboratory found that mice without the variant had a 7 percent to 9.7 percent decrease in lifespan compared with those possessing the gene, even when raised in an enriched environment.
While it's evident that the variant can contribute to longevity, Moyzis said further studies must take place to identify any immediate clinical benefits from the research. "However, it is clear that individuals with this gene variant are already more likely to be responding to the well-known medical adage to get more physical activity," he added.
LG and Samsung both announced 55-inch OLED TVs last year, but LG is the first to make theirs commercially available.
OLED stands for "Organic Light Emitting Diode". Organic LEDs emit their own light through organic compounds in response to electrical input, as opposed to LCD or LCD LED displays which require separate backlighting. This allows each individual pixel in the OLED screen to emit red, green and blue colour to create a picture, while the lack of backlighting creates darker blacks and an ultra-thin screen. Pictures are extremely vibrant and natural in appearance, with consistent colour and superior contrast.
After many years of research and development, LG Electronics (LG) has announced that it will begin accepting pre-orders for its eagerly-awaited 55-inch WRGB OLED TV (Model 55EM9700) in South Korea this month, with deliveries scheduled to begin in February. Other markets where the next-generation TV is being sold will be announced in the next several weeks along with their prices. The announcement comes just days before the 2013 Consumer Electronics Show (CES), where an early version of the TV last year was awarded “Best of Show.”
More than 1,400 LG retail stores in South Korea will begin accepting orders from consumers for KRW 11 million – approximately US $10,000 – starting today (3rd January). As the first and only company to announce availability of the next-generation TV technology, LG is prepared to ramp up quickly to take the lead in the OLED segment that is expected to reach 7.2 million units in 2016, by which time it should be much more affordable.
“We are extremely pleased to be able to make this announcement at the start of the new year because we believe that OLED will usher in a whole new era of home entertainment,” said Havis Kwon, President and CEO of LG’s Home Entertainment Company. “Not since colour TV was first introduced 60 years ago has there been a more transformational moment. When high definition TV was first introduced 15 years ago, the public’s reaction was ‘wow!’ but when customers see our razor-thin OLED TV for the first time, they’re left speechless. That’s as clear an indicator as any that OLED TV is much more than just an incremental improvement to current television technology.”
Only 4 millimeters (0.16 inches) thin and weighing less than 10 kilograms (22 pounds), LG’s OLED TVs produce astoundingly vivid and realistic pictures thanks to their superior WRGB technology. In addition to the standard three colours, LG’s unique Four-Colour Pixel system features a white sub-pixel, which works in conjunction with the conventional red, green and blue setup to perfect the colour output. LG’s exclusive "Colour Refiner" delivers even greater tonal enhancement, resulting in images that are more vibrant and natural than anything seen before. The 55-inch OLED TV also offers an infinite contrast ratio, which maintains optimal contrast levels regardless of ambient brightness or viewing angle.
Even before its launch, LG’s OLED TV was turning heads all over the world. In addition to being named Best of Show at CES 2012, the influential Industrial Designers Society of America recognised the TV with a coveted IDEA Award. Meanwhile, LG received the European Display Achievement 2012-2013 Award from the European Imaging and Sound Association (EISA). And to cap it off, LG’s OLED received Korea’s Good Design President Award in October.
Trees and insects wage constant war on each other. Insects burrow and munch on bark; trees deploy lethal and disruptive defences in the form of chemicals. But in a rapidly warming world, where temperatures and seasonal change are in flux, the tide of battle may be shifting in some insects' favour, according to a new study.
In a report published yesterday in the Proceedings of the National Academy of Sciences, a team of scientists from the University of Wisconsin-Madison reports a rising threat to whitebark pine forests in the northern Rocky Mountains as native mountain pine beetles climb ever higher, attacking trees that have not evolved strong defences to stop them.
The whitebark pine forests of the western United States and Canada are the highest-elevation forest ecosystems capable of sustaining trees. They are a critical habitat for iconic species such as the grizzly bear, and play an important role in governing the hydrology of the mountain west by shading snow and regulating the flow of meltwater.
"Warming temperatures have allowed tree-killing beetles to thrive in areas that were historically too cold for them most years," explains Ken Raffa, a UW-Madison professor of entomology and a senior author of the new report. "Tree species at these high elevations never evolved strong defences."
A warming world has not only made it easier for the mountain pine beetle to invade new and defenceless ecosystems, but has also allowed it to survive milder winter weather and erupt in large outbreaks capable of killing entire stands of trees, regardless of their composition.
"A subject of much concern in the scientific community is the potential for cascading effects of whitebark pine loss on mountain ecosystems," says Phil Townsend, professor of forest ecology and also a senior author of the study.
The beetle's historic host is the lodgepole pine – a tree common at lower elevations. Normally, the insects, which are about the size of a grain of rice, play a key role in regulating the health of a forest by attacking old or weakened trees and fostering the development of a younger forest.
However, recent years have been characterised by unusually hot and dry summers and mild winters, which have allowed insect populations to boom. This has led to an infestation of mountain pine beetles at higher elevations, described as the most significant insect blight ever seen in North America. In 2011, a similar study by the U.S. Fish and Wildlife Service found that whitebark pine forests could be extinct within 120 years if trends continue. This would have major implications for the ability of Earth's biosphere to absorb greenhouse gases such as CO2 from the atmosphere.
Lodgepole pines (unlike whitebark pines) co-evolved with bark beetles at lower elevations. Over time, they evolved countermeasures such as volatile toxic compounds and agents that disrupt the beetle's chemical communication system. Despite these strong defences, the lodgepole pine is still the preferred menu item for the mountain pine beetle, suggesting that the beetle has not yet shifted its host preference to whitebark pines. "Nevertheless, at elevations consisting of pure whitebark pine, the mountain pine beetle readily attacks it," says Townsend.
The study, conducted in the Greater Yellowstone Ecosystem – one of the last nearly intact ecosystems in the Earth's northern temperate regions – also revealed that insects preying on or competing with mountain pine beetles are staying in their preferred lodgepole pine habitat. That, says Townsend, is a concern because tree-killing bark beetles will encounter fewer of these enemies in fragile, high-elevation stands.
Whitebark pines are a vital food source for wildlife, including black and grizzly bears, and birds such as Clark's nutcracker, which is essential to the forest ecology because the bird's seed caches help regenerate the forests. With their broad crowns, high-elevation whitebark pines also act as snow fences, helping to slowly release water into mountain streams and extending stream flow into mountain valleys well into the summer.
"Loss of the canopy will lead to greater desiccation during the winter and faster melting in the summer – due to loss of tree canopies for shade," says Townsend.
In a separate study last month, the U.S. Geological Survey reported that climate change is already having major effects on ecosystems and species. Plants and animals are shifting their geographic ranges and the timing of their life events – such as flowering, laying eggs or migrating – at faster rates than documented even just a few years ago.
Mismatches in the availability and timing of natural resources can influence species' survival. For example, if insects emerge before the arrival of migrating birds that rely on them for food, it can adversely affect bird populations. Species that must live at high altitudes, or in cold water within a narrow temperature range – such as salmon – face even greater risks. Earlier thaw and shorter winters can extend growing seasons for pests. This can substantially alter the benefits that humans derive from ecosystems, such as clean water, wood products and food.