3rd August 2015
New technique for nanoscale images of the brain
A new technique for obtaining nanoscale images of the brain at higher resolution than ever before is announced by Boston scientists.
Multiple synapses of the same axon innervate multiple spines of the same postsynaptic cell. An extreme example
in which one axon (blue) innervates five dendritic spines (orange, labelled 1–5) of a basal dendrite (green) is shown.
Arrows point to other varicosities (swellings) of this axon that are innervating dendritic spines of other neurons.
Credit: Narayanan Kasthuri et al./Cell.
A new imaging tool developed by Boston scientists could do for the brain what the Hubble Space Telescope did for astronomy. In the first demonstration of how the technology works, published in the journal Cell, the researchers look inside the brain of an adult mouse at a scale previously unachievable, generating images at 3 nanometre (nm) pixel resolution. The inventors' long-term goal is to make the resource available to the scientific community in the form of a national brain observatory.
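The scale of the data challenge is easy to underestimate. A rough back-of-envelope sketch, assuming illustrative voxel dimensions of 3 × 3 × 30 nm (3 nm pixels in-plane, with a nominal ~30 nm section thickness) and one byte per voxel – figures chosen for this sketch, not taken from the paper – shows why a "brain observatory" implies petabytes of storage:

```python
# Back-of-envelope estimate of raw data volume for nanoscale brain imaging.
# Voxel dimensions and bytes-per-voxel are illustrative assumptions.

def imaging_data_bytes(volume_um3, voxel_nm=(3, 3, 30), bytes_per_voxel=1):
    """Return the raw data size in bytes for a tissue volume given in cubic micrometres."""
    voxel_volume_nm3 = voxel_nm[0] * voxel_nm[1] * voxel_nm[2]
    volume_nm3 = volume_um3 * 1e9          # 1 um^3 = 1e9 nm^3
    return volume_nm3 / voxel_volume_nm3 * bytes_per_voxel

# A single cubic millimetre of cortex is 1e9 cubic micrometres:
size_pb = imaging_data_bytes(1e9) / 1e15
print(f"~{size_pb:.1f} PB per cubic millimetre")
```

Even under these generous assumptions, a cubic millimetre of tissue runs to several petabytes of raw imagery – which is why the article's point about falling storage costs matters.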
"I'm a strong believer in bottom-up science, which is a way of saying that I would prefer to generate a hypothesis from the data and test it," says senior study author Jeff Lichtman, of Harvard University. "For people who are imagers, being able to see all of these details is wonderful and we're getting an opportunity to peer into something that has remained somewhat intractable for so long. It's about time we did this, and it is what people should be doing about things we don't understand."
The researchers have begun the process of mining their imaging data by looking first at an area of the brain that receives sensory information from mouse whiskers, which help the animals orient themselves and are even more sensitive than human fingertips. The scientists used a program called VAST, developed by co-author Daniel Berger of Harvard and the Massachusetts Institute of Technology, to assign different colours and piece apart each individual "object" (e.g., neuron, glial cell, blood vessel cell, etc.).
"The complexity of the brain is much more than what we had ever imagined," says study first author Narayanan "Bobby" Kasthuri, of the Boston University School of Medicine. "We had this clean idea of how there's a really nice order to how neurons connect with each other, but if you actually look at the material it's not like that. The connections are so messy that it's hard to imagine a plan to it, but we checked and there's clearly a pattern that cannot be explained by randomness."
The researchers see great potential in the tool's ability to answer questions about what a neurological disorder actually looks like in the brain, as well as what makes the human brain different from other animals and different between individuals. Who we become is very much a product of the connections our neurons make in response to various life experiences. To be able to compare the physical neuron-to-neuron connections in an infant, a mathematical genius, and someone with schizophrenia would be a leap in our understanding of how our brains shape who we are (or vice versa).
The cost and data storage demands for this type of research are still high, but the researchers expect expenses to drop over time (as has been the case with genome sequencing). To facilitate data sharing, the scientists are now partnering with Argonne National Laboratory with the hopes of creating a national brain laboratory that neuroscientists around the world can access within the next few years.
"It's bittersweet that there are many scientists who think this is a total waste of time as well as a big investment in money and effort that could be better spent answering questions that are more proximal," Lichtman says. "As long as data is showing you things that are unexpected, then you're definitely doing the right thing. And we are certainly far from being out of the surprise element. There's never a time when we look at this data that we don't see something that we've never seen before."
20th July 2015
Nanowires boost solar fuel cell efficiency tenfold
Nanowires have been used by Dutch researchers to boost solar fuel cell efficiency tenfold, while using 10,000 times less precious material.
Researchers at Eindhoven University of Technology (EUT) and the Foundation for Fundamental Research on Matter (FOM) in the Netherlands have demonstrated a highly promising prototype of a solar cell that generates fuel, rather than electricity. The material gallium phosphide enables their cell to produce the clean fuel hydrogen gas from liquid water. By processing the gallium phosphide using tiny nanowires, the yield is boosted by a factor of ten, while using 10,000 times less precious material.
Electricity produced by a solar cell can be used to set off chemical reactions. If this generates a fuel, then one speaks of solar fuels – a hugely promising replacement for polluting fuels. One possibility is to split liquid water using the electricity that is generated (electrolysis). Along with oxygen, this produces hydrogen gas that can be used as a clean fuel in the chemical industry or combusted in fuel cells – in cars, for example – to drive engines.
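The underlying chemistry is the water-splitting reaction 2 H₂O → 2 H₂ + O₂, in which two electrons are consumed per hydrogen molecule. Faraday's law of electrolysis then links electric charge directly to the amount of hydrogen produced. A minimal sketch (the current and duration below are purely illustrative, and an ideal 100%-efficient electrolyser is assumed):

```python
# Faraday's law: moles of H2 = charge / (2 * F), since two electrons
# are needed per H2 molecule. Assumes an ideal electrolyser.

FARADAY = 96485.0  # coulombs per mole of electrons

def hydrogen_grams(current_amps, hours):
    """Mass of H2 (grams) produced by an ideal electrolyser at the given current."""
    charge = current_amps * hours * 3600        # coulombs
    moles_h2 = charge / (2 * FARADAY)           # 2 e- per H2 molecule
    return moles_h2 * 2.016                     # molar mass of H2, g/mol

print(f"{hydrogen_grams(1.0, 1.0):.3f} g of H2 from 1 A sustained for 1 hour")
```

The tiny mass per amp-hour illustrates why conversion efficiency – the 2.9 percent figure discussed below – is the metric researchers fight over.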
To connect an existing silicon solar cell to a battery that splits the water may well be an efficient solution now, but is very expensive. Many researchers are therefore trying to develop a semiconductor material able to both convert sunlight to an electrical charge and split the water, all in one; a kind of "solar fuel cell". Researchers at EUT and FOM see their dream candidate in gallium phosphide (GaP), a compound of gallium and phosphorus that also serves as the basis for specific coloured LEDs.
GaP has good electrical properties, but it cannot easily absorb light when it consists of a large flat surface, as used in solar cells. The researchers overcame this problem by making a grid of tiny GaP nanowires, each 500 nanometres long and just 90 nanometres thick (a nanometre is a millionth of a millimetre). This design immediately boosted the yield of hydrogen to 2.9 percent – a tenfold improvement and a record for GaP cells, though still some way off the 15 percent achieved by silicon cells coupled to a battery.
Research leader and EUT professor Erik Bakkers said it’s not simply about the yield, where there is still plenty of scope for improvement: “For the nanowires, we needed 10,000 times less precious GaP material than in cells with a flat surface. That makes these kinds of cells potentially a great deal cheaper. In addition, GaP is also able to extract oxygen from the water – so you then actually have a fuel cell in which you can temporarily store your solar energy. In short, for a solar fuels future we cannot ignore gallium phosphide any longer.”
The researchers describe their breakthrough in the journal Nature Communications.
13th July 2015
7 nanometre chips enable Moore's Law to continue
Researchers have announced a breakthrough in the manufacture of 7 nanometre (nm) computer chips, enabling the trend of Moore's Law to continue for the next few years.
IBM Research has announced the semiconductor industry's first 7nm (nanometre) node test chips with functioning transistors. The breakthrough was accomplished in partnership with GLOBALFOUNDRIES and Samsung at SUNY Polytechnic Institute's Colleges of Nanoscale Science and Engineering (SUNY Poly CNSE) and could result in the ability to place more than 20 billion tiny switches – transistors – on the fingernail-sized chips that power everything from smartphones to spacecraft.
To achieve the higher performance, lower power and scaling benefits promised by 7nm technology, researchers had to bypass conventional semiconductor manufacturing approaches. Among the novel processes and techniques pioneered in this collaboration were a number of industry-first innovations, most notably Silicon Germanium (SiGe) channel transistors and Extreme Ultraviolet (EUV) lithography integration at multiple levels.
Industry experts consider 7nm technology crucial to meeting the anticipated demands of future cloud computing and Big Data systems, cognitive computing, mobile products and other emerging "exponential" technologies. This accomplishment was part of IBM's $3 billion, five-year investment in chip R&D announced last year.
"For business and society to get the most out of tomorrow's computers and devices, scaling to 7nm and beyond is essential," said Arvind Krishna, senior vice president and director of IBM Research. "That's why IBM has remained committed to an aggressive basic research agenda that continually pushes the limits of semiconductor technology. Working with our partners, this milestone builds on decades of research that has set the pace for the microelectronics industry, and positions us to advance our leadership for years to come."
Microprocessors utilising 22nm and 14nm technology power today's servers, cloud data centres and mobile devices, and 10nm technology is well on the way to becoming mature. The IBM Research-led alliance achieved close to 50 percent area scaling improvements over today's most advanced technology, introduced SiGe channel material for transistor performance enhancement at 7nm node geometries, developed process innovations to stack transistors at below-30nm pitch, and fully integrated EUV lithography at multiple levels. These techniques and scaling could result in at least a 50 percent power/performance improvement for next-generation systems that will power the Big Data, cloud and mobile era. These new 7nm chips are expected to start appearing in computers and other gadgets in 2017-18.
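The "close to 50 percent area scaling" figure follows from simple geometry: if linear feature dimensions shrink in proportion to the node name, area shrinks with the square of that ratio. (A caveat worth flagging: node names are partly marketing labels and no longer map exactly to physical dimensions, so this is a heuristic, not an exact rule.)

```python
# Why a 10nm -> 7nm shrink gives "close to 50 percent" area scaling:
# area scales as the square of the linear dimension ratio.

def area_scaling(old_nm, new_nm):
    """Fraction of the old area that the new node occupies, assuming ideal scaling."""
    return (new_nm / old_nm) ** 2

reduction = 1 - area_scaling(10, 7)
print(f"Area reduction from 10nm to 7nm: {reduction:.0%}")  # 51%
```

The same square law is why each node transition has historically roughly doubled transistor density – the engine behind Moore's Law.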
8th July 2015
The world's first 2TB consumer SSDs
Samsung has announced the first 2 terabyte solid state drives for the consumer market – continuing the exponential trend in data storage.
Samsung has announced two new SSDs – the 850 Pro and 850 EVO – both offering double the capacity of the previous generation. The 2.5" form factor drives can greatly boost performance for desktops and laptops. They will be especially useful for accessing and storing 4K video, which often involves enormous file sizes. Available capacities range from 120GB, 250GB, 500GB and 1TB all the way up to 2TB.
The 850 Pro is designed for power users needing the maximum possible speed, while the 850 EVO is less powerful but somewhat cheaper. The 850 Pro features up to 550MBps sequential read and 520MBps sequential write rates and 100,000 random I/Os per second (IOPS). The 850 EVO has 540MBps sequential read and 520MBps write rates, with up to 90,000 random IOPS. Both models feature 3D V-NAND technology, which stacks 32 layers of transistors on top of each other. The drives also use multi-level cell (MLC) and triple-level cell (TLC) technology – 2 and 3 bits per cell, respectively – for even greater memory density.
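To put the headline numbers in perspective, here is a quick sketch of how long it would take to fill an entire 2TB drive at the quoted sustained sequential write rate. It uses decimal units (1 TB = 10¹² bytes, 1 MB = 10⁶ bytes), and real-world throughput varies with workload, so treat it as a best-case figure implied by the spec sheet rather than a benchmark:

```python
# Time to fill a drive at its quoted sustained sequential write rate.
# Decimal units assumed; real workloads will be slower.

def fill_time_minutes(capacity_tb, write_mbps):
    """Minutes needed to write capacity_tb terabytes at write_mbps megabytes/second."""
    seconds = capacity_tb * 1e12 / (write_mbps * 1e6)
    return seconds / 60

print(f"{fill_time_minutes(2, 520):.0f} minutes to fill 2 TB at 520 MB/s")
```

Filling two terabytes in about an hour is something no consumer spinning disk of the era could approach.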
Until recently, consumers were forced to choose between speed and size when it came to upgrading their hard drives. For pure speed, a solid state drive was the best option, while larger sizes were typically catered for with slower and clunkier spinning drives. These new terabyte-scale SSDs are going to change that – combining both high speed and high capacity. Price may still be an issue, as Samsung's new product line doesn't come cheap. The 2TB version of the 850 Pro will retail for $999.99 and the 850 EVO for $799.99. However, given the trend in price performance witnessed in earlier generations of data storage, it is likely these high capacity SSDs will soon be a lot cheaper.
"Samsung experienced a surge in demand for 500 gigabyte (GB) and higher capacity SSDs with the introduction of our V-NAND SSDs," says Un-Soo Kim, Senior Vice President of Branded Product Marketing, Memory Business, in a press release from Samsung. "The release of the 2TB SSD is a strong driver into the era of multi-terabyte SSD solutions. We will continue to expand our ultra-high performance and large density SSD product portfolio and provide a new computing experience to users around the globe."
19th June 2015
First full genome of a living organism sequenced and assembled using technology the size of a smartphone
Researchers in Canada and the U.K. have for the first time sequenced and assembled de novo the full genome of a living organism, the bacterium Escherichia coli, using Oxford Nanopore’s MinION device – a genome sequencer that can fit in the palm of your hand.
The findings, published this week in the journal Nature Methods, provide proof of concept for the technology, and the methods lay the groundwork for using it to sequence full (as opposed to partial) genomes of increasingly complex organisms – eventually including humans – said Jared Simpson, Principal Investigator at the Ontario Institute for Cancer Research and a lead author on the study.
“The amazing thing about this device is that it is many times smaller than a normal sequencer – you just attach it to a laptop using a USB cable,” said Simpson. “And while our work is a demonstration of the capabilities of the technology, the most significant advance is in the methods. We were able to mathematically model nanopore sequencing and develop ways to reconstruct complete genomes off this tiny sequencer.”
While standard sequencing platforms can either generate vast amounts of data or read stretches of the genome long enough to allow complete reconstruction – but not both – the Nanopore device has the potential to achieve both goals, according to Simpson: “Long reads are necessary to assemble the most repetitive parts of genomes, but we need a lot of reads if we want to sequence human genomes. The small size of the MinION suggests there is room to scale up and sequence larger and more complex samples,” Simpson said.
A drawback of the technology is that the single reads it produces are currently less accurate than the reads produced by larger devices. Strong bioinformatics tools are needed to correct errors. The methods Simpson and colleagues developed are able to overcome the error rate and compute a more accurate final sequence.
"This was a fantastic example of a successful long distance research collaboration between Canada and the U.K.,” said Dr. Nicholas Loman, a co-lead author on the paper and an Independent Research Fellow from the Institute of Microbiology and Infection at University of Birmingham. “We explored new ways of working, including hosting a hackathon to explore new algorithm development and using shared computing resources on the Medical Research Council funded Cloud Infrastructure for Microbial Bioinformatics (CLIMB) based in the U.K. Midlands and Wales."
The method of assembly the authors devised had three stages. First, overlaps between sequence reads are detected and corrected using a multiple alignment process. Then the corrected reads are assembled using the Celera assembler and finally the assembly is refined using a probabilistic model of the electric signals caused by DNA moving through the nanopore.
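The intuition behind the first stage – using overlapping, error-prone reads to recover an accurate sequence – can be illustrated with a toy sketch. The real method builds a full multiple alignment and a probabilistic model of the nanopore's electrical signal; this sketch only assumes the reads are already aligned and takes a simple per-column majority vote, which is enough to show why redundancy defeats random per-read errors:

```python
# Toy illustration of consensus-based error correction: given several
# pre-aligned, equal-length reads with independent errors, a per-column
# majority vote recovers the underlying sequence. This is a simplification
# of the paper's multiple-alignment approach, not the actual algorithm.

from collections import Counter

def consensus(aligned_reads):
    """Majority-vote consensus over equal-length, pre-aligned reads."""
    return "".join(
        Counter(column).most_common(1)[0][0]   # most frequent base per column
        for column in zip(*aligned_reads)
    )

reads = [
    "ACGTACGT",   # error-free read
    "ACCTACGT",   # error at position 2
    "ACGTACTT",   # error at position 6
    "ACGTACGT",
]
print(consensus(reads))  # ACGTACGT
```

Because each read's errors land at different positions, the correct base wins the vote at every column – the same redundancy principle that lets the pipeline overcome the MinION's higher single-read error rate.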
“This work has incredible potential,” said Dr. Tom Hudson, President and Scientific Director of the Ontario Institute for Cancer Research. “Scaled up, this technology could one day be used to sequence tumour genomes. The device’s portable nature would allow for sequencing to become far more accessible, bringing the option of more personalised diagnosis and treatment to more patients.”
As the speed, accuracy and cost of whole genome sequencing continues to improve, a wide range of practical applications will become possible. Investigators at crime scenes, for example, could analyse biological evidence without having to return to the laboratory. Foreign aid workers in developing nations could identify viruses and verify water quality. Food inspectors could check for harmful pathogens in restaurants. Wildlife biologists could study genes in the field.
20th May 2015
A breakthrough in large-scale graphene fabrication
One of the barriers to using graphene at a commercial scale could be overcome using a new method demonstrated by researchers at the Department of Energy's Oak Ridge National Laboratory (ORNL).
Graphene – a material stronger and stiffer than carbon fibre – has enormous commercial potential, but has been impractical to employ on a large scale, with researchers limited to using only small flakes of it. Now, using chemical vapour deposition, a team at ORNL has fabricated polymer composites containing 2-inch-by-2-inch sheets of the one-atom-thick material, with its hexagonally arranged carbon atoms.
The findings, reported in the journal Applied Materials & Interfaces, could help usher in a new era of flexible electronics and change the way this reinforcing material is viewed and ultimately used.
"Before our work, superb mechanical properties of graphene were shown at a micro scale," said Ivan Vlassiouk, who led the research. "We have extended this to a larger scale, which considerably extends the potential applications and market for graphene."
While most approaches to polymer nanocomposite fabrication employ tiny flakes of graphene or other carbon nanomaterials that are difficult to disperse in the polymer, the team used larger sheets of graphene. This eliminates the flake dispersion and agglomeration problems, allowing the material to conduct electricity better with less actual graphene in the polymer.
"In our case, we were able to use chemical vapour deposition to make a nanocomposite laminate that is electrically conductive – with graphene loading that is fifty times less compared to current state-of-the-art samples," said Vlassiouk. This is key to making the material competitive on the market.
If Vlassiouk and his team can reduce the cost and demonstrate scalability, researchers envision graphene being used in aerospace (structural monitoring, flame-retardants, anti-icing, conductive), the automotive sector (catalysts, wear-resistant coatings), structural applications (self-cleaning coatings, temperature control materials), electronics (displays, printed electronics, thermal management), energy (photovoltaics, filtration, energy storage) and manufacturing (catalysts, barrier coatings, filtration).
28th March 2015
10TB solid state drives may soon be possible
An innovative new process architecture can extend Moore's Law for flash storage – bringing significant improvements in density while lowering the cost of NAND flash.
Intel Corporation – in partnership with Micron – has announced the availability of 3D NAND, the world's highest-density flash memory. Flash is the storage technology used inside the lightest laptops, fastest data centres, and nearly every cellphone, tablet and mobile device.
3D NAND works by stacking the components in vertical layers with extraordinary precision to create devices with three times higher data capacity than competing NAND technologies. This enables more storage in a smaller space, bringing significant cost savings, low power usage and higher performance to a range of mobile consumer devices, as well as the most demanding enterprise deployments.
As data cells begin to approach the size of individual atoms, traditional "planar" NAND is nearing its practical scaling limits. This poses a major challenge for the memory industry. 3D NAND is poised to make a dramatic impact by keeping flash storage aligned with Moore's Law, the exponential trend of performance gains and cost savings, driving more widespread use of flash storage in the future.
"3D NAND technology has the potential to create fundamental market shifts," said Brian Shirley, vice president of Memory Technology and Solutions at Micron Technology. "The depth of the impact that flash has had to date – from smartphones to flash-optimised supercomputing – is really just scratching the surface of what's possible."
One of the most significant aspects of this breakthrough is in the foundational memory cell itself. Intel and Micron used a floating gate cell, a universally utilised design refined through years of high-volume planar flash manufacturing. This is the first use of a floating gate cell in 3D NAND, which was a key design choice to enable greater performance, quality and reliability.
The data cells are stacked vertically in 32 layers to achieve 256Gb multilevel cell (MLC) and 384Gb triple-level cell (TLC) dies within a standard package. This can enable gum stick-sized SSDs with 3.5TB of storage and standard 2.5-inch SSDs with greater than 10TB. Because capacity is achieved by stacking cells vertically, individual cell dimensions can be considerably larger. This is expected to increase both performance and endurance and make even the TLC designs well-suited for data centre storage.
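The capacity figures hang together arithmetically: a 384Gb TLC die is 48GB (384 divided by 8 bits per byte), so the 750GB fingertip-sized package mentioned below implies a stack of roughly 16 such dies. A quick sanity check (the 16-die packaging inference is ours; the article itself states only die and package capacities):

```python
# Sanity-checking the 3D NAND capacity figures. The dies-per-package
# count is inferred from the stated numbers, not given in the article.

def die_capacity_gb(gigabits):
    """Convert a die capacity in gigabits to gigabytes."""
    return gigabits / 8

tlc_die = die_capacity_gb(384)        # 48.0 GB per TLC die
dies_per_package = 750 / tlc_die      # ~15.6, i.e. roughly a 16-die stack
print(f"{tlc_die:.0f} GB per die, ~{round(dies_per_package)} dies per 750 GB package")
```

The same arithmetic explains the 10TB 2.5-inch SSD claim: a few hundred such dies spread across a standard drive enclosure.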
Key product features of this 3D NAND design include:
• Large Capacities – Triple the capacity of existing technology, up to 48GB of NAND per die, enabling 750GB to fit in a single fingertip-sized package.
• Reduced Cost per GB – First-generation 3D NAND is architected to achieve better cost efficiencies than planar NAND.
• Fast – High read/write bandwidth, I/O speeds and random read performance.
• Green – New sleep modes enable low-power use by cutting power to inactive NAND die (even when other dies in the same package are active), dropping power consumption significantly in standby mode.
• Smart – Innovative new features improve latency and increase endurance over previous generations, and also make system integration easier.
The 256Gb MLC version of 3D NAND is sampling with select partners today, and the 384Gb TLC design will be sampling later this spring. The fab production line has already begun initial runs, and both devices will be in full production by the fourth quarter of this year. Both companies are also developing individual lines of SSD solutions based on 3D NAND technology and expect those products to be available within the next year.
21st March 2015
Google files patent for wearable medical device
Google has filed a patent application for a wearable medical device, able to use nanoparticles to detect and treat illnesses such as cancer.
For those wishing to protect their health and extend their lifespan, a futuristic medical device may become available in the next several years. Details of this wearable technology – known as a Nanoparticle Phoresis – have been published online by Google, via the World Intellectual Property Organisation.
The patent application describes a strap, or band, mounted on the lower arm. Similar in appearance to a wristwatch, it would "automatically modify or destroy one or more targets in the blood that have an adverse health effect." This would be achieved by beaming energy into blood vessels to stimulate cells and molecules, increasing their effectiveness at fighting diseases. It could even be used on synthetic nanoparticles. Millions of these tiny objects would be introduced into the wearer's bloodstream, then activated by magnets in the wristband and directed to specific locations.
In addition to its physical treatment abilities, the Nanoparticle Phoresis could generate vast amounts of data – not only helpful to the user, but also to researchers and doctors. It could accept inputs from the wearer regarding his or her health state, such as "feeling cold," "feeling tired," "pollen allergy symptoms today," "stressed," "feeling energetic," etc. According to the patent, these user inputs "may be used to complement any other physiological parameter data that the wearable device may collect and establish effective signal levels for and timing of modification of the target."
Analysts forecast that wearable technology will see huge growth in the coming years, with unit sales potentially reaching into the hundreds of millions. This new device from Google – if successfully developed – could become part of that rapidly evolving ecosystem. Initially aimed at patients who are seriously ill, this product (or its derivatives) could also be offered to mainstream consumers who aren't necessarily in bad health, but wish to monitor and improve their well-being.
For those with a needle phobia, injections might be possible using high-pressure jets. Although the patent itself makes no mention of this, we can speculate that such a procedure could eventually be incorporated into a wristwatch form factor. Similar to the "hypospray" on Star Trek, these jets would deliver medicine without a needle ever puncturing the skin. High-pressure jet injection was covered on our blog in May 2012.
Looking further ahead, the prospects become even more exciting. Bill Maris – who helped form Google Calico – this month stated his belief that humans will live to be many centuries old in the future, while today's cancer treatments will seem "primitive" within just 20 years. His comments echo those of futurist and inventor Ray Kurzweil, also employed at Google and currently involved in AI research for the company. Kurzweil predicts that nanoparticles will be superseded by nanobots – small and compact enough to feature motors, sensors and other tools, allowing them to be controlled with extreme precision directly inside cells. If this idea sounds like science fiction, then consider this: a handheld smartphone today contains more processing power than a room-sized supercomputer of the 1980s. With ongoing advances in miniaturisation, together with new materials such as graphene, the future trend seems inevitable.
As humans become ever more dependent on technology, our bodies will gradually begin to incorporate these and similar devices on a permanent basis. Later in the 21st century, the line between man and machine could become blurred.