2nd February 2016
Graphene shown to safely interface with neurons in the brain
Researchers in Europe have demonstrated that graphene can be successfully interfaced with neurons, while maintaining the integrity of these vital nerve cells. It is believed this could lead to greatly improved brain implants.
A new study published in the journal ACS Nano demonstrates how it is possible to interface graphene with neurons, whilst maintaining the integrity of these vital nerve cells. The research was part of the EU's Graphene Flagship – a €1 billion project that aims to bring graphene from laboratories into commercial applications within 10 years. The study involved a collaboration between nanotechnologists, chemists, biophysicists and neurobiologists from the University of Trieste in Italy, the University of Castilla-La Mancha in Spain and the Cambridge Graphene Centre in the UK.
Prof. Laura Ballerini, lead neuroscientist in the study, said: "For the first time, we interfaced graphene to neurons directly, without any peptide coating used in the past to favour neuronal adhesion. We then tested the ability of neurons to generate electrical signals known to represent brain activities and found that the neurons retained unaltered their neuronal signalling properties. This is the first functional study of neuronal synaptic activity using uncoated, graphene-based materials."
Using electron microscopy and immuno-fluorescence in rat brain cell cultures, the researchers observed that the neurons interfaced well with the untreated graphene electrodes – remaining healthy, transmitting normal electric impulses and, importantly, showing no adverse glial reaction which can lead to damaging scar tissue. This is therefore the first step towards using pristine, graphene-based material for a neuro-interface.
Graphene-based electrodes implanted in the brain could restore sensory functions for amputees or paralysed patients, or treat individuals with epilepsy or motor disorders such as Parkinson's disease. Further into the future, perhaps they could be used to enhance or upgrade the abilities of normal, healthy people too, bringing the age of transhumanism closer to reality.
Too often, the modern electrodes used for neuro-interfaces (based on tungsten or silicon) suffer partial or complete loss of signal over time. This is often caused by scar tissue that forms during electrode insertion, and by the electrode's rigidity, which prevents it from moving with the natural movements of the brain. Graphene, by contrast, appears to be a highly promising material for solving these problems: it has excellent conductivity, flexibility, biocompatibility and stability within the body.
"Hopefully this will pave the way for better deep brain implants to both harness and control the brain, with higher sensitivity and fewer unwanted side effects," said Ballerini.
"These initial results show how we are just at the tip of the iceberg when it comes to the potential of graphene and related materials in bio-applications and medicine," said Professor Andrea Ferrari, Director of the Cambridge Graphene Centre. "The expertise developed at the Cambridge Graphene Centre allows us to produce large quantities of pristine material in solution, and this study proves the compatibility of our process with neuro-interfaces."
29th January 2016
Pen-sized microscope identifies cancer cells
Researchers at the University of Washington have developed a new handheld, pen-sized microscope that could identify cancer cells in doctors' offices and operating rooms.
Surgeons removing a malignant brain tumour don't want to leave cancerous material behind. But they're also trying to protect healthy brain matter and minimise neurological harm. Once they open up a patient's skull, there's no time to send tissue samples to a pathology lab – where they are typically frozen, sliced, stained, mounted on slides and investigated under a bulky microscope – to clearly distinguish between cancerous and normal brain cells.
But a handheld, miniature microscope being developed by University of Washington (UW) engineers could allow surgeons to "see" at a cellular level in the operating room and determine precisely where to stop cutting. This technology, made in collaboration with Memorial Sloan Kettering Cancer Center, Stanford University and the Barrow Neurological Institute, is outlined in the February 2016 issue of Biomedical Optics Express.
"Surgeons don't have a very good way of knowing when they're done cutting out a tumour," said Jonathan Liu, senior author on the paper and a UW assistant professor of mechanical engineering. "They're using their sense of sight, sense of touch, pre-operative images of the brain – and oftentimes it's pretty subjective. Being able to zoom and see at the cellular level during the surgery would really help them to accurately differentiate between tumour and normal tissues and improve patient outcomes."
Similarly, dentists who find a suspicious-looking lesion in a patient's mouth will often have to cut it out and send it to a lab to be biopsied for oral cancer, a process that subjects patients to an invasive procedure and overburdens pathology labs. A miniature microscope with high enough resolution to see changes at a cellular level could be used in dental or dermatological clinics to assess which lesions or moles are normal and which need to be biopsied.
Real-time microscope images (bottom) illuminate similar details in mouse tissues as the images (top) produced during an expensive, multi-day process at a clinical pathology lab. Credit: University of Washington
"The microscope technologies that have been developed over the last couple of decades are expensive and still pretty large," said Milind Rajadhyaksha, at the Memorial Sloan Kettering Cancer Centre in NYC, co-author on the study. "So there's a need for creating much more miniaturised microscopes."
The new microscope developed by UW combines technologies in a compact and novel way, generating high-quality images at faster speeds than existing bulkier devices. It uses "dual-axis confocal microscopy" to illuminate and more clearly see through opaque tissue, capturing details up to half a millimetre beneath the tissue surface, where some types of cancerous cells originate. For example, the team produced images of fluorescent blood vessels in a mouse ear. In their paper, they demonstrate how their invention has sufficient resolution to see subcellular details.
"For brain tumour surgery, there are often cells left behind that are invisible to the neurosurgeon. This device will really be the first to let you identify these cells during the operation and determine exactly how much further you can reduce this residual," said project collaborator Nader Sanai, a professor of neurosurgery at the Barrow Neurological Institute in Phoenix. "That's not possible to do today."
Human clinical trials are expected to start in 2017 and the team hopes it can be introduced into surgeries by 2018-2020.
22nd January 2016
Brain implant will connect a million neurons with superfast bandwidth
A neural interface being created by the United States military aims to greatly improve the resolution and connection speed between biological and non-biological matter.
The Defense Advanced Research Projects Agency (DARPA) – a branch of the U.S. military – has announced a new research and development program known as Neural Engineering System Design (NESD). This aims to create a fully implantable neural interface able to provide unprecedented signal resolution and data-transfer bandwidth between the human brain and the digital world.
The interface would serve as a translator, converting between the electrochemical language used by neurons in the brain and the ones and zeros that constitute the language of information technology. A communications link would be achieved in a biocompatible device no larger than a cubic centimetre. This could lead to breakthrough treatments for a number of brain-related illnesses, as well as providing new insights into possible future upgrades for aspiring transhumanists.
“Today’s best brain-computer interface systems are like two supercomputers trying to talk to each other using an old 300-baud modem,” says Phillip Alvelda, program manager. “Imagine what will become possible when we upgrade our tools to really open the channel between the human brain and modern electronics.”
Among NESD’s potential applications are devices that could help restore sight or hearing, by feeding digital auditory or visual information into the brain at a resolution and experiential quality far higher than is possible with current technology.
Neural interfaces currently approved for human use squeeze a tremendous amount of information through just 100 channels, with each channel aggregating signals from tens of thousands of neurons at a time. The result is noisy and imprecise. In contrast, the NESD program aims to develop systems that communicate clearly and individually with any of up to one million neurons in a given region of the brain.
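A rough sense of the jump in scale can be taken from the numbers above. In the toy calculation below, the "tens of thousands" of neurons per channel is taken as 10,000 – a lower-bound figure chosen purely for illustration, not stated by DARPA:

```python
# Illustrative comparison of today's approved interfaces vs. the NESD goal.
current_channels = 100          # channels in currently approved interfaces
neurons_per_channel = 10_000    # "tens of thousands", lower bound (assumption)
nesd_neurons = 1_000_000        # NESD target: individually addressed neurons

# Today, ~1 million neurons are blurred together across 100 channels;
# NESD aims to resolve a similar number of neurons individually.
aggregated = current_channels * neurons_per_channel
ratio = nesd_neurons // current_channels

print(f"Current: {current_channels} channels blur ~{aggregated:,} neurons")
print(f"NESD: {nesd_neurons:,} neurons resolved individually -> {ratio:,}x more channels")
```

Even under these conservative assumptions, the program implies a four-order-of-magnitude increase in channel count.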
To achieve these ambitious goals and ensure the technology is practical outside of a research setting, DARPA will integrate and work in parallel with numerous areas of science and technology – including neuroscience, synthetic biology, low-power electronics, photonics, medical device packaging and manufacturing, systems engineering, and clinical testing. In addition to the program’s hardware challenges, NESD researchers will be required to develop advanced mathematical and neuro-computation techniques, to transcode high-definition sensory information between electronic and cortical neuron representations and then compress and represent the data with minimal loss.
The NESD program aims to recruit a diverse roster of leading industry stakeholders willing to offer state-of-the-art prototyping, manufacturing services and intellectual property. In later phases of the program, these partners could help transition the resulting technologies into commercial applications. DARPA will invest up to $60 million in the NESD program between now and 2020.
21st January 2016
Nanoparticles kill 90% of antibiotic-resistant bacteria
Light-activated nanoparticles able to kill over 90% of antibiotic-resistant bacteria have been demonstrated at the University of Colorado.
Salmonella bacteria under a microscope. Photo by NIAID / Wikipedia.
Antibiotic-resistant bacteria such as Salmonella, E. coli and Staphylococcus infect some two million people and kill 23,000 in the U.S. each year. Efforts to defeat these so-called "superbugs" have consistently fallen short, due to the bacteria's ability to adapt rapidly and develop immunity to common antibiotics such as penicillin. In 2014, the World Health Organisation declared this a "major global threat" and warned that the world is heading for a post-antibiotic era, in which even common infections and minor injuries which have been treatable for decades can once again kill.
In this ever-escalating evolutionary battle with drug-resistant bacteria, we may soon have an advantage, however, thanks to adaptive, light-activated nanotherapy developed by scientists at the University of Colorado Boulder. Their latest research suggests that the solution to this big global problem might be to think small – very small.
In findings published by the journal Nature Materials, researchers at the Department of Chemical and Biological Engineering and the BioFrontiers Institute describe new light-activated nanoparticles known as "quantum dots." These dots, which are 20,000 times smaller than the width of a human hair and resemble the tiny semiconductors used in consumer electronics, successfully killed 92% of drug-resistant bacterial cells in a lab-grown culture.
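As a back-of-envelope check on the quoted size, assuming a typical human hair width of about 75 micrometres (a common reference figure, not stated in the article):

```python
# Implied quantum dot size from the "20,000 times smaller" comparison.
hair_width_nm = 75_000               # ~75 micrometres, in nanometres (assumption)
dot_size_nm = hair_width_nm / 20_000

print(f"Implied quantum dot size: ~{dot_size_nm:.2f} nm")  # ~3.75 nm
```

A few nanometres is indeed the typical scale of semiconductor quantum dots, so the comparison is consistent.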
"By shrinking these semiconductors down to the nanoscale, we're able to create highly specific interactions within the cellular environment that only target the infection," said Prashant Nagpal, senior author of the study.
Credit: University of Colorado Boulder / BioFrontiers Institute
Previous research has shown that metal nanoparticles – created from silver and gold, among various other metals – can be effective at combating antibiotic-resistant infections, but can indiscriminately damage surrounding cells as well. Quantum dots, however, can be tailored to particular infections thanks to their light-activated properties. The dots remain inactive in darkness, but can be "activated" on command by exposing them to light, allowing researchers to tune the wavelength so that only the infected cells are targeted and killed.
"While we can always count on these superbugs to adapt and fight the therapy, we can quickly tailor these quantum dots to come up with a new therapy and therefore fight back faster in this evolutionary race," said Nagpal.
The specificity of this innovation may help reduce or eliminate the potential side effects of other treatment methods, as well as provide a path forward for future development and clinical trials.
"Antibiotics are not just a baseline treatment for bacterial infections, but HIV and cancer as well," said Anushree Chatterjee, an assistant professor in the Department of Chemical and Biological Engineering at CU-Boulder and a senior author of the study. "Failure to develop effective treatments for drug-resistant strains is not an option, and that's what this technology moves closer to solving."
Nagpal and Chatterjee are co-founders of PRAAN Biosciences, a Colorado-based startup that can sequence genetic profiles using just a single molecule – technology that may aid in the diagnosis and treatment of superbug strains. The authors have filed a patent on their new quantum dot technology.
21st January 2016
Tiny electronic implants that monitor brain injury, then melt away
Researchers have developed a new class of small, thin electronic sensors that monitor temperature and pressure within the skull – crucial health parameters after a brain injury or surgery – then melt away when no longer needed. This eliminates the need for additional surgery to remove the monitors and reduces the risk of infection and haemorrhage.
Similar sensors can be adapted for postoperative monitoring in other body systems as well, the researchers say. Led by John A. Rogers, a professor of materials science and engineering at the University of Illinois at Urbana-Champaign, and Wilson Ray, a professor of neurological surgery at the Washington University School of Medicine in St. Louis, the researchers have published their work in the journal Nature.
"This is a new class of electronic biomedical implants," said Professor Rogers. "These kinds of systems have potential across a range of clinical practices, where therapeutic or monitoring devices are implanted or ingested, perform a sophisticated function, and then resorb harmlessly into the body after their function is no longer necessary."
After a traumatic brain injury or brain surgery, it is crucial to monitor the patient for swelling and pressure on the brain. Current monitoring technology is bulky and invasive, Rogers said, and the wires restrict the patient's movement and hamper physical therapy during recovery. Because they require continuous, hard-wired access into the head, such implants also carry the risk of allergic reactions, infection and haemorrhage, and could even exacerbate the inflammation they are meant to monitor.
"If you simply could throw out all the conventional hardware and replace it with very tiny, fully implantable sensors capable of the same function, constructed out of bioresorbable materials in a way that also eliminates or greatly miniaturises the wires, then you could remove a lot of the risk and achieve better patient outcomes," Rogers said. "We were able to demonstrate all of these key features in animal models, with a measurement precision that's just as good as that of conventional devices."
The new devices incorporate dissolvable silicon technology developed by Rogers' group. The sensors, smaller than a grain of rice, are built on extremely thin sheets of silicon – which are naturally biodegradable – that are configured to function normally for a few weeks, then dissolve away, completely and harmlessly in the body's own fluids.
Rogers' group teamed with Illinois materials science and engineering professor Paul V. Braun to make the silicon platforms sensitive to clinically relevant pressure levels in the intracranial fluid surrounding the brain. They also added a tiny temperature sensor and connected it to a wireless transmitter roughly the size of a postage stamp, implanted under the skin but on top of the skull.
The Illinois group worked with clinical experts in traumatic brain injury at Washington University to implant the sensors in rats, testing for performance and biocompatibility. They found that the temperature and pressure readings from the dissolvable sensors matched those of conventional monitoring devices in accuracy.
"The ultimate strategy is to have a device that you can place in the brain – or in other organs in the body – that is entirely implanted, intimately connected with the organ you want to monitor and can transmit signals wirelessly to provide information on the health of that organ, allowing doctors to intervene if necessary to prevent bigger problems," said Rory Murphy, a neurosurgeon at Washington University and co-author of the paper. "After the critical period that you actually want to monitor, it will dissolve away and disappear."
The researchers are moving toward human trials for this technology, as well as extending its functionality for other biomedical applications.
"We have established a range of device variations, materials and measurement capabilities for sensing in other clinical contexts," Rogers said. "In the near future, we believe that it will be possible to embed therapeutic function – such as electrical stimulation or drug delivery – into the same systems while retaining the essential bioresorbable character."
Dissolution of NFC system. All images credit: University of Illinois
26th December 2015
New genes associated with extreme longevity identified
A new Big Data statistical method has identified five longevity loci, providing clues about the physiological mechanisms of successful aging.
Centenarians – that is, people who live to be 100 or more – make up around 0.1% of the 40 million U.S. adults aged 65 and older. These individuals demonstrate successful aging as they remain active and alert even at very old ages. In a study this month, scientists at Stanford University and the University of Bologna have uncovered new clues about the basis for longevity, by finding genetic loci associated with extreme lifespans.
Previous research has indicated that centenarians have health and dietary habits similar to the average person, suggesting that factors in their genetic make-up could contribute to successful aging. However, prior studies had identified only a single gene (APOE, known to be involved in Alzheimer's) that differed between centenarians and normal agers. The results from this latest study indicate that, in fact, several disease variants may be absent in centenarians compared with the general population.
Disease GWAS show substantial genetic overlap with longevity. Shown are results for coronary artery disease and Alzheimer's disease. The y-axis is the observed P values for longevity, and the x-axis is the expected P values under the null hypothesis that the disease is independent of longevity. Cyan, blue and purple lines show the P values for longevity of the top 100, 250 and 500 disease SNPs from independent genetic loci, respectively. Red lines show the background distribution of longevity P values for all independent genetic loci tested in both the longevity and disease GWAS. The grey diagonal line corresponds to the threshold for nominal significance (P ≤ 0.05) for longevity.
The report by Kristen Fortney and colleagues, published in PLOS Genetics, is an example of using Big Data to glean information about an extremely complicated trait such as longevity. To find the longevity genes, they first developed a new statistical method, known as informed genome-wide association studies (iGWAS). This took advantage of existing data from 14 diseases to narrow the search for genes associated with longevity. By using their iGWAS method, the scientists found five longevity loci, providing valuable clues about the physiological mechanisms for healthy aging. These loci are known to be involved in various processes – including cell senescence, autoimmunity and cell signalling, as well as Alzheimer's disease.
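The core intuition of iGWAS can be sketched with toy data: take the top-ranked SNPs from a disease GWAS and ask whether their longevity P values are smaller than chance would predict. The function, thresholds and simulated data below are purely illustrative – they are not the authors' actual pipeline:

```python
import random

def enrichment(disease_p, longevity_p, top_n=100, alpha=0.05):
    """Fraction of the top-N disease SNPs that are nominally significant
    for longevity, compared with the alpha fraction expected by chance."""
    top_snps = sorted(disease_p, key=disease_p.get)[:top_n]  # smallest P first
    hits = sum(longevity_p[s] < alpha for s in top_snps)
    return hits / top_n, alpha  # observed vs. expected fraction

# Toy data: 10,000 SNPs, with SNPs 0-99 sharing signal for both traits.
random.seed(0)
disease_p = {i: random.random() for i in range(10_000)}
longevity_p = {i: random.random() for i in range(10_000)}
for i in range(100):        # inject shared disease/longevity signal
    disease_p[i] /= 1_000
    longevity_p[i] /= 1_000

obs, exp = enrichment(disease_p, longevity_p)
print(f"observed hit fraction {obs:.2f} vs. {exp:.2f} expected by chance")
```

When the observed fraction far exceeds the chance expectation, the disease loci are informative about longevity – which is what lets iGWAS narrow the search.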
The incidence of nearly all diseases increases with age, so understanding the genetic factors for successful aging could have a large impact on health. Future work may lead to a better understanding of precisely how these genes enable successful aging. Future studies could also identify additional longevity genes by recruiting a greater number of centenarians for analysis.
8th December 2015
Genes for longer and healthier life identified
From a 'haystack' of 40,000 genes in three different organisms, scientists have found genes that are involved in physical aging. Influencing just one of these genes extends the healthy lifespan of laboratory animals – and possibly that of humans, too.
Driven by the quest for eternal youth, humankind has spent centuries obsessed with the question of how exactly it is that we age. With advancements in molecular genetics in recent decades, the search for genes involved in the aging process has greatly accelerated. Until now, this was mostly limited to the genes of individual model organisms such as the nematode C. elegans, which revealed that around 1% of its genes could influence life expectancy. However, researchers have long assumed that such genes arose early in the course of evolution and are present in all living beings whose cells have a nucleus – from yeast to humans.
Researchers at ETH Zurich and the JenAge consortium in Germany have now systematically gone through the genomes of three different organisms in search of the genes associated with the aging process that are present in all three species – and thus, derived from a common ancestor. Although they are found in different organisms, these so-called orthologous genes are closely related to each other, and they are all found in humans, too.
To detect them, the researchers examined around 40,000 genes in the nematode C. elegans, zebrafish and mice. By screening them, the scientists wanted to determine which genes are regulated in an identical manner in all three organisms at each comparable aging stage: young, mature and old. As a measure of gene activity, they recorded the amount of messenger RNA (mRNA) molecules found in the cells of these animals. mRNA is the transcript of a gene and the blueprint of a protein. When there are many copies of an mRNA of a specific gene, the gene is very active; it is said to be "upregulated". Fewer mRNA copies, by contrast, are a sign of low activity.
From this information, the researchers used statistical models to establish an intersection of genes that were regulated in the same manner in the worms, fish and mice. This showed that the three organisms have only 30 genes in common that significantly influence the aging process.
From left to right: C. elegans nematode, zebrafish and mouse.
Credit: Bob Goldstein [CC BY-SA 3.0]
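The screening logic described above – keeping only genes regulated in the same direction in all three organisms – is at heart a set intersection. A minimal sketch, in which bcat-1 comes from the study but all other gene names and regulation directions are invented for illustration:

```python
# Hypothetical per-species results: each dict maps an orthologous gene
# to its direction of regulation with age ("up" or "down").
worm  = {"bcat-1": "down", "gene-A": "up",   "gene-B": "up"}
fish  = {"bcat-1": "down", "gene-A": "up",   "gene-C": "down"}
mouse = {"bcat-1": "down", "gene-A": "down", "gene-C": "up"}

# Keep only genes present in all three species AND regulated
# in the same direction in each of them.
shared = {
    g for g in worm.keys() & fish.keys() & mouse.keys()
    if worm[g] == fish[g] == mouse[g]
}
print(shared)   # only the concordant gene survives: {'bcat-1'}
```

The actual study applied statistical models rather than exact matching, but the same intersection principle reduced 40,000 candidates to just 30 shared aging genes.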
By conducting experiments in which the mRNA of each corresponding gene was selectively blocked, the researchers pinpointed its effect on the aging process in nematode worms. For around a dozen of these genes, blocking them extended the lifespan by at least five percent.
One of these genes proved to be particularly influential: the bcat-1 gene. "When we blocked the effect of this gene, it significantly extended the mean lifespan of the nematode by up to 25 percent," says Professor Michael Ristow, coordinating author of the recently published study and Professor of Energy Metabolism at ETH.
When the gene activity of bcat-1 was inhibited, branched-chain amino acids accumulated in the tissue, triggering a molecular signalling cascade that increased longevity. Moreover, the timespan during which the worms remained healthy was extended. As a measure of vitality, the researchers observed the accumulation of aging pigments, the speed at which the creatures moved, and how often the nematodes successfully reproduced. All of these parameters improved markedly.
Professor Ristow has no doubt that the same mechanism occurs in humans: "We looked only for the genes that are conserved in evolution and therefore exist in all organisms including humans," he says. A follow-up study is already planned. "However, we can't measure the life expectancy of humans for obvious reasons," he adds. Instead, they plan to incorporate various health parameters, such as cholesterol or blood sugar levels in their study to obtain indicators on the health status of their subjects.
Multiple branched-chain amino acids are already being used to treat liver damage and also feature in sports nutrition products. This follow-up study will deliver new and important indicators on how the aging process could be influenced and how age-related diseases might be prevented.
"However, the point is not for people to grow even older – but rather, to stay healthy for longer," the researchers argue. Given the unfavourable demographics and steadily increasing life expectancy, it is important to extend the healthy life phase – or "healthspan" – and not to simply reach an even higher age that is characterised by chronic diseases. With such preventive measures, elderly people could greatly improve their quality of life, while at the same time cutting their healthcare costs by more than half.
28th November 2015
Cell survival genes identified
By switching off, one by one, almost 18,000 genes — about 90 per cent of the entire human genome — scientists have identified the genes that are essential for cell survival. This could improve our understanding of which genes are most important in diseases like cancer.
Cancer cells grown in a dish. Credit: University of Toronto
Scientists from the University of Toronto's Donnelly Centre have mapped out the genes that keep our cells alive, creating a long-awaited foothold for understanding how our genome works and which genes are crucial in diseases like cancer. A team of researchers led by Professor Jason Moffat has switched off, one by one, almost 18,000 genes — 90% of the entire human genome — to find the genes that are essential for cell survival.
The data, published on 25th November in the peer-reviewed journal Cell, reveals a "core" set of 1,500 essential genes. This lays the foundation for reaching the long-standing goal in biomedical research of pinpointing a role for every single gene in the human genome.
By turning genes off in five different cancer cell lines — including brain, retinal, ovarian, and two kinds of colorectal cancer cells — the team uncovered that each set of cells relies on a unique set of genes that can be targeted by specific drugs. This finding raises hope of devising new treatments that would target only cancerous cells, leaving the healthy tissue unharmed.
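The split between "core" and cancer-specific essential genes amounts to set operations over per-cell-line hit lists. A toy sketch – the idea of distinct cell lines follows the study, but the gene names and hit lists below are invented for illustration:

```python
# Hypothetical knockout-screen results: for each cancer cell line,
# the set of genes whose loss kills that line.
essential = {
    "brain":      {"CORE_1", "CORE_2", "BRAIN_ONLY"},
    "retinal":    {"CORE_1", "CORE_2", "RETINAL_ONLY"},
    "colorectal": {"CORE_1", "CORE_2", "CRC_ONLY"},
}

# Core essentials: required for survival in every cell line tested.
core = set.intersection(*essential.values())

# Context-specific essentials: unique dependencies of each line,
# i.e. candidate drug targets that could spare healthy tissue.
specific = {line: hits - core for line, hits in essential.items()}

print("core:", core)                          # {'CORE_1', 'CORE_2'}
print("brain-specific:", specific["brain"])   # {'BRAIN_ONLY'}
```

It is precisely the genes outside the core set – the per-line dependencies – that Moffat highlights as the interesting targets for selective therapy.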
"It's when you get outside the core set of essential genes, that it starts to get interesting in terms of how to target particular genes in different cancers and other disease states," says Moffat.
Sequencing of the human genome in 2003 allowed scientists to compile a list of parts – our 20,000 genes – which form our cells and bodies. But despite this major achievement, they still didn't understand the function of each individual gene, or how some genes make us sick when they go wrong. For this, scientists realised they would have to switch genes off, one by one across the entire genome to determine what processes go wrong in the cells. But the available tools were either inaccurate or too slow.
The recent arrival of the gene editing technology CRISPR has finally made it possible to turn genes off swiftly and with pinpoint accuracy, kicking off a global race among multiple competing research teams. The Toronto study, along with a paper from Harvard and MIT published recently in Science, found that roughly 8% of our genes are essential for cell survival.
These findings show that the majority of human genes play more subtle roles in the cell, because switching them off doesn't kill the cell. But if two or more such genes are mutated at the same time, or the cells are under environmental stress, their loss starts to matter.
Because different cancers have different mutations, they tend to rely on different sets of genes to survive. Professor Moffat's team has identified distinct sets of "smoking gun" genes for each of the tested cancers – each set susceptible to different drugs.
"We can now interrogate our genome at unprecedented resolution in human cells that we grow in the lab with incredible speed and accuracy," he says. "In short order, this will lead to a functional map of cancer that will link drug targets to DNA sequence variation."
Already, his team has shown how this can work. In their study, a widely prescribed diabetes drug called metformin successfully killed brain cancer cells and those of one form of colorectal cancer – but was useless against the other cancers studied. However, the antibiotics chloramphenicol and linezolid were effective against another form of colorectal cancer, and not against brain or other cancers studied. These results illustrate the clinical potential of the data in pointing to more precise treatments for the different cancers – and show the value of personalised medicine.
"The Moffat group has developed a powerful CRISPR library that could be used by investigators around the world to identify new strategies for the treatment of cancer," says Dr. Aaron Schimmer from Princess Margaret Cancer Centre in Toronto, who was not involved in the study. "I would be interested in using this tool to identify new treatment approaches for acute myeloid leukaemia – a blood cancer with a high mortality rate."
23rd November 2015
Global drug spending to increase 30% by 2020
Global spending on medicines is predicted to rise by 30% over the next five years – driven by expensive new drugs, price hikes, aging populations and increased generic drug use in developing countries, according to a new forecast by IMS Health.
More than half of the world’s population will live in countries where medicine use will exceed one dose per person per day by 2020 – up from 31 percent in 2005, as the medicine use gap between the developed and “pharmerging” markets narrows. According to new research by the IMS Institute for Healthcare Informatics, total spending on medicines will reach $1.4 trillion by 2020, due to greater patient access to chronic disease treatments and breakthrough innovations in drug therapies. Global spending is forecast to grow at a 4-7 percent compound annual rate over the next five years.
The report, Global Medicines Use in 2020: Outlook and Implications, found that total global spending on pharmaceuticals will increase by $349 billion on a constant-dollar basis, compared with $182 billion during the past five years. Spending is measured at the ex-manufacturer level before adjusting for rebates, discounts, taxes and other adjustments that affect net sales received by manufacturers. The impact of these factors is estimated to reduce growth by $90 billion, or approximately 25 percent of the growth forecast through 2020.
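The headline figures are internally consistent: $1.4 trillion in 2020 after $349 billion of constant-dollar growth implies roughly $1.05 trillion in 2015, and the implied compound annual growth rate lands inside the forecast 4-7 percent band. A quick check:

```python
# Consistency check on the IMS forecast figures (USD, constant dollars).
total_2020 = 1.4e12      # forecast global medicine spending in 2020
growth = 349e9           # forecast five-year increase
total_2015 = total_2020 - growth

# Compound annual growth rate over the five-year window.
cagr = (total_2020 / total_2015) ** (1 / 5) - 1
print(f"2015 base: ${total_2015 / 1e12:.2f} trillion, implied CAGR: {cagr:.1%}")  # ~5.9%

assert 0.04 <= cagr <= 0.07   # within the 4-7% forecast band
```

The implied rate of about 5.9 percent sits comfortably in the middle of the 4-7 percent range IMS quotes.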
“During the next five years, we expect to see a surge of innovative medicines emerging from R&D pipelines, as well as technology-enabled advances that will deliver measurable improvements to health outcomes,” said Murray Aitken, IMS Health senior vice president and executive director of the IMS Institute for Healthcare Informatics. “With unprecedented treatment options, greater availability of low-cost drugs and better use of evidence to inform decision making, stakeholders around the world can expect to get more ‘bang for their medicine buck’ in 2020 than ever before.”
In its latest study, the IMS Institute highlights the following findings:
• Global medicine use in 2020 will reach 4.5 trillion doses, up 24 percent from 2015. Most of the global increase in use of medicines will take place in pharmerging markets, with India, China, Brazil and Indonesia representing nearly half of that growth. Volumes in developed markets will remain relatively stable and trend toward original branded products, as use of specialty medicines becomes more widespread. Generics, non-original branded and over-the-counter (OTC) products will account for 88 percent of total medicine use in pharmerging markets by 2020, and provide the greatest contribution to increased access to medicines in those countries. Newer specialty medicines, which typically have low adoption rates in pharmerging countries lacking the necessary healthcare infrastructure, will represent less than one percent of the total volume in those markets.
• Global spending will grow by 29-32 percent through 2020, compared with an increase of 35 percent in the prior five years. Spending levels will be driven by branded drugs primarily in developed markets, along with the greater use of generics in pharmerging markets – offset by the impact of patent expiries. Brand spending in developed markets will rise by $298 billion as new products are launched and as price increases are applied in the U.S., most of which will be offset by off-invoice discounts and rebates. Patent expiries are expected to result in $178 billion in reduced spending on branded products, including $41 billion in savings on biologics as biosimilars become more widely adopted. Many of the newest treatments are specialty medicines used to address chronic, rare or genetic diseases and yielding significant clinical value. By 2020, global spending on these medicines is expected to reach 28 percent of the total.
• More than 90 percent of U.S. medicines will be dispensed as generics by 2020. Generic medicines will continue to provide the vast majority of prescription drug usage in the U.S., rising from 88 percent to 91-92 percent of all prescriptions dispensed by 2020. Spending on medicines in the U.S. will reach $560-590 billion, a 34 percent increase over 2015 on an invoice price basis. While invoice price growth – which does not reflect discounts and rebates received by payers – is expected to continue at historic levels during the next five years, net price trends for protected brands will remain constrained by payers and competition, resulting in 5-7 percent annual price increases. The Affordable Care Act (ACA) will continue to affect medicine spending during the next five years, largely due to expanded insurance coverage. By 2020, there will be broad adoption of ACA provisions that encourage greater care coordination and move at least one-third of spending to an outcomes or performance basis.
• More than 225 medicines will be introduced by 2020, with one-third focused on treating cancer. Disease treatments in 2020 will be transformed by the increased number and quality of new drugs in clusters of innovation around cancer, hepatitis C, autoimmune disorders, heart disease and an array of rare diseases. During the next five years, an additional 75 new orphan drugs are expected to be available for dozens of therapeutic areas that currently have limited or no treatment options. By 2020, technology will be enabling more rapid changes to treatment protocols, increasing patient engagement and accountability, shifting patient-provider interaction, and accelerating the adoption of behaviour changes that will improve patient adherence to treatments. Every patient with multiple chronic conditions will have the potential to use wearables, mobile apps and other technologies to manage their health and to interact with providers, fellow patients and family members. The ubiquity of smartphones, tablets, apps and related wearable devices, as well as electronic medical records and exponentially increasing real-world data volumes, will open new avenues to connect healthcare while offering providers and payers new mechanisms to control costs.
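The 34 percent U.S. increase cited above implies a 2015 baseline that the report excerpt does not state directly. A quick sketch, in which the baseline figures are inferred here rather than quoted by IMS:

```python
# Derive the implied 2015 U.S. medicine spend from the 2020 forecast range
# ($560-590 bn) and the stated 34% increase, on an invoice-price basis.

growth = 1.34  # 34% increase over 2015

for spend_2020 in (560, 590):
    spend_2015 = spend_2020 / growth
    print(f"2020: ${spend_2020} bn -> implied 2015 baseline: ${spend_2015:.0f} bn")
# Both ends of the range imply a 2015 baseline of roughly $420-440 bn.
```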
The full report, including a detailed description of the methodology, is available at theimsinstitute.org. It can also be downloaded as an app via iTunes at https://itunes.apple.com. The study was produced independently as a public service, without industry or government funding.
22nd November 2015
Genetically modified salmon approved by FDA
For the first time, the U.S. Food and Drug Administration (FDA) has approved genetically modified fish for human consumption.
AquaBounty Technologies, Inc., a biotechnology company focused on enhancing productivity in aquaculture, announced this week that the FDA has approved its application for the production, sale and consumption of "AquAdvantage Salmon". This Atlantic salmon has been genetically enhanced to reach market size in less time than conventional farmed Atlantic salmon.
Ronald Stotish, Ph.D., CEO of AquaBounty, commented: "AquAdvantage Salmon is a game-changer that brings healthy and nutritious food to consumers in an environmentally responsible manner without damaging the ocean and other marine habitats. Using land-based aquaculture systems, this rich source of protein and other nutrients can be farmed close to major consumer markets in a more sustainable manner."
The U.S. currently imports over 90% of its seafood – and more specifically, over 95% of the Atlantic salmon it consumes. AquAdvantage Salmon will create the opportunity to grow an economically viable, domestic aquaculture industry. Through greater efficiency and localised production, AquaBounty claims it can increase productivity while reducing the costs and environmental impacts of current salmon farming operations. Land-based aquaculture systems can provide a continuous supply of fresh, safe, traceable and sustainable GM salmon to communities across the U.S. and do so with a lower carbon footprint. This offers an alternative approach to fish farming that does not exploit the oceans.
Jack Bobo, Senior Vice President and Chief Communications Officer at parent company Intrexon, stated: "The U.S. Dietary Guidelines Advisory Committee encourages Americans to eat a wide variety of seafood, including wild caught and farmed, as part of a healthy diet rich in healthy fatty acids. However, this must occur in an environmentally friendly and sustainable manner. FDA's approval of the AquAdvantage Salmon is an important step in this direction."
The AquAdvantage fish program is based on a molecular modification that results in more rapid growth during early development. A gene responsible for growth hormone regulation is taken from a Pacific Chinook salmon, combined with a promoter from an ocean pout, then added to the Atlantic salmon's 40,000 genes. This makes it grow year-round, instead of only during spring and summer, without affecting its ultimate size or other qualities. The GM fish grows to market size in 16 to 18 months, rather than three years.
The AquAdvantage program has other qualities that improve its sustainability credentials. The fish require 25% less feed than other Atlantic salmon on the market today. When farmed in land-based facilities close to major metropolitan areas, they will travel only a short distance to the consumer. Not only will this make them the freshest fish on the market, it will significantly cut the journey from farm to table. Unlike salmon imported from Norway and Chile, which travel thousands of miles by airfreight and are then trucked to markets, AquaBounty's salmon will have a carbon footprint 23 to 25 times smaller.
The FDA determined that the approval of the GM technology would not have a significant environmental impact, because of multiple and redundant measures taken to contain the fish and prevent their escape into the wild. These measures include a series of physical barriers placed in the tanks and in the plumbing that carries water out of the facilities to block the eggs and fish. Furthermore, the AquAdvantage Salmon are reproductively sterile, so that even in the highly unlikely event of an escape, they would be unable to interbreed or establish populations in the wild. The FDA will maintain regulatory oversight of the production and facilities and will conduct inspections to confirm these containment measures remain adequate.
Despite a lengthy and detailed review process, however, the FDA's approval has provoked an angry response from some, who have questioned the safety aspects and object to the fact that no labelling will be required to indicate the fish were genetically engineered. The Center for Food Safety (CFS), a non-profit organisation working to protect human health and promote organic food methods, has already announced plans to sue the FDA and prevent the modified salmon from being sold in the U.S.
"The review process by FDA was inadequate, failed to fully examine the likely impacts of the salmon's introduction and lacked a comprehensive analysis," said executive director Andrew Kimbrell in a press statement, citing the 2 million people who filed public comments in opposition, the largest number of comments the FDA has ever received on any issue. "This decision sets a dangerous precedent, lowering the standards of safety in this country. CFS will hold FDA to their obligations to the American people."
Globally, traditional "capture" fisheries have been on a plateau since the late 1980s due to unsustainable yields. Aquaculture is now among the fastest growing industries in the agricultural sector and is projected to supply the majority of the world's seafood by the mid-2020s, overtaking wild catch harvests by weight. With fisheries collapsing from over-exploitation, pollution, climate change and other problems, aquaculture is likely to become a sustainable and vitally important industry of the 21st century.
20th November 2015
Self-healing sensor brings 'electronic skin' closer to reality
Scientists have developed a self-healing, flexible sensor that mimics the self-healing properties of human skin. Cuts or scratches to the sensors "heal" themselves in less than one day.
Flexible sensors have been developed for use in consumer electronics, robotics, health care, and space flight. Future possible applications could include the creation of ‘electronic skin’ and prosthetic limbs that allow wearers to ‘feel’ changes in their environments.
One problem with current flexible sensors, however, is that they can be easily scratched or otherwise damaged, potentially destroying their functionality. Researchers in the Department of Chemical Engineering at the Technion – Israel Institute of Technology in Haifa, Israel, inspired by the healing properties of human skin, have developed materials that can be integrated into flexible devices to "heal" incidental scratches or damaging cuts that might compromise device functionality. The advance uses a new kind of synthetic polymer (a large molecule composed of many repeated smaller units) with self-healing properties that mimic human skin, meaning that e-skin "wounds" can heal themselves in remarkably short time – less than a day.
A paper outlining the characteristics and applications of the unique, self-healing sensor has been published in the current issue of Advanced Materials.
“The vulnerability of flexible sensors used in real-world applications calls for the development of self-healing properties similar to how human skin heals,” said self-healing sensor co-developer Professor Hossam Haick. “Accordingly, we have developed a complete, self-healing device in the form of a bendable and stretchable chemiresistor where every part – no matter where the device is cut or scratched – is self-healing.”
The new sensor comprises a self-healing substrate, high conductivity electrodes, and molecularly modified gold nanoparticles. "The gold particles on top of the substrate and between the self-healing electrodes are able to 'heal' cracks that could completely disconnect electrical connectivity," explains Prof. Haick.
Once healed, the polymer substrate of the self-healing sensor demonstrates sensitivity to volatile organic compounds (VOCs), with detection capability down to tens of parts per billion. It also demonstrates superior healability over an extreme temperature range of -20 degrees C to 40 degrees C. This property, said the researchers, can extend applications of the self-healing sensor to areas of the world with extreme climates. From sub-freezing cold to equatorial heat, the self-healing sensor is environment-stable.
The healing polymer works quickest, said the researchers, when the temperature is between 0 degrees C and 10 degrees C, when moisture condenses and is then absorbed by the substrate. Condensation makes the substrate swell, allowing the polymer chains to begin to flow freely and, in effect, begin “healing.” Once healed, the nonbiological, chemiresistor still has high sensitivity to touch, pressure and strain, which the researchers tested in demanding stretching and bending tests.
Another unique feature is that the electrode resistance increases after healing, and the sensor can survive 20 or more cutting/healing cycles. Essentially, healing makes the self-healing sensor even stronger. The researchers noted in their paper that "the healing efficiency of this chemiresistor is so high that the sensor survived several cuttings at random positions."
The researchers are currently experimenting with carbon-based self-healing composites and self-healing transistors.
“The self-healing sensor raises expectations that flexible devices might someday be self-administered, which increases their reliability,” explained co-developer Dr. Tan-Phat Huynh, also of the Technion, whose work focuses on the development of self-healing electronic skin. “One day, the self-healing sensor could serve as a platform for biosensors that monitor human health using electronic skin.”
9th November 2015
Fastest ever brain-computer interface for spelling
Researchers in China have achieved high-speed spelling with a noninvasive brain-computer interface.
Brain–computer interfaces (BCIs) are a relatively new and emerging technology allowing direct communication between the brain and an external device. They are used for assisting, augmenting or repairing cognitive or sensory-motor functions. Research on BCIs began in the 1970s and the first neuroprosthetic devices implanted in humans appeared in the mid-1990s.
The past 20 years have seen major progress in BCIs. However, they are still limited by low communication rates, caused by interference from spontaneous electroencephalography (EEG) signals. Now, a team of researchers from Tsinghua University in China, the State Key Laboratory of Integrated Optoelectronics, the Institute of Semiconductors (IOS), and the Chinese Academy of Sciences have developed a greatly improved system. Their EEG-based BCI speller can achieve information transfer rates (ITRs) of 60 characters (∼12 words) per minute – by far the highest ever reported for BCI spellers, whether noninvasive or invasive. In some of the tests, they reached up to 5.32 bits per second. For comparison, most other systems in recent years have managed only 1-2 bits per second.
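The reported figures are internally consistent, which can be checked with the Wolpaw formula – the standard measure of ITR in the BCI literature. The sketch below assumes 40 equally likely targets and one selection per second; that timing breakdown is an illustrative assumption, not a detail quoted from the paper:

```python
import math

def wolpaw_bits_per_selection(n_targets, accuracy):
    """Wolpaw information transfer rate, in bits per selection,
    for n_targets equally likely choices at a given accuracy."""
    if accuracy >= 1.0:
        return math.log2(n_targets)
    if accuracy <= 0.0:
        return 0.0
    p = accuracy
    return (math.log2(n_targets)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n_targets - 1)))

# A 40-character speller at perfect accuracy carries log2(40) bits per pick:
bits = wolpaw_bits_per_selection(40, 1.0)
print(f"{bits:.2f} bits/selection")  # 5.32

# At one selection per second that is ~5.32 bits/s and 60 characters/minute,
# matching the peak rates reported for the speller.
```

Note how sensitive the rate is to accuracy: any misclassification both loses a character and subtracts information, which is why the phase-consistent stimulus design matters.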
According to the researchers, they achieved this via an extremely high consistency of frequency and phase between the visual flickering signals and the elicited single-trial steady-state visual evoked potentials. Specifically, they developed a new joint frequency-phase modulation (JFPM) method to tag 40 characters with 0.5-second-long flickering signals, and created a user-specific target identification algorithm using individual calibration data. A paper describing this breakthrough appears in the 3rd November edition of the journal Proceedings of the National Academy of Sciences (PNAS).
In the not-too-distant future, this kind of technology could be applied to other uses, besides medicine. For example, it could be incorporated into smartphones and other consumer electronics to allow texting, typing or other on-screen actions by thought power alone. A partnership between the Japanese government and private sector aims to achieve this by 2020. With continued progress in the speed of BCIs, a new form of "virtual telepathy" could emerge within a few decades.