An innovative new process architecture can extend Moore's Law for flash storage – bringing significant improvements in density while lowering the cost of NAND flash.
Intel Corporation – in partnership with Micron – has announced the availability of 3D NAND, the world's highest-density flash memory. Flash is the storage technology used inside the lightest laptops, fastest data centres, and nearly every cellphone, tablet and mobile device.
3D NAND works by stacking the components in vertical layers with extraordinary precision to create devices with three times higher data capacity than competing NAND technologies. This enables more storage in a smaller space, bringing significant cost savings, low power usage and higher performance to a range of mobile consumer devices, as well as the most demanding enterprise deployments.
As data cells begin to approach the size of individual atoms, traditional "planar" NAND is nearing its practical scaling limits. This poses a major challenge for the memory industry. 3D NAND is poised to make a dramatic impact by keeping flash storage aligned with Moore's Law, the exponential trend of performance gains and cost savings, driving more widespread use of flash storage in the future.
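The "exponential trend" referred to here is the familiar doubling cadence. A minimal sketch of what that implies for density (the two-year doubling period is our illustrative assumption, not a figure from Intel or Micron):

```python
# Illustrative sketch of Moore's Law-style scaling: density doubles roughly
# every two years (the period is an assumption for illustration only).

def density_multiplier(years: float, doubling_period: float = 2.0) -> float:
    """How many times denser storage becomes after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

print(density_multiplier(10))  # -> 32.0 after a decade of doubling
```

Even a modest doubling period compounds quickly, which is why falling off this curve is such a concern for planar NAND.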
"3D NAND technology has the potential to create fundamental market shifts," said Brian Shirley, vice president of Memory Technology and Solutions at Micron Technology. "The depth of the impact that flash has had to date – from smartphones to flash-optimised supercomputing – is really just scratching the surface of what's possible."
One of the most significant aspects of this breakthrough lies in the foundational memory cell itself. Intel and Micron chose a floating gate cell, a widely used design refined through years of high-volume planar flash manufacturing. This is the first use of a floating gate cell in 3D NAND, a key design choice for greater performance, quality and reliability.
The data cells are stacked vertically in 32 layers to achieve 256Gb multilevel cell (MLC) and 384Gb triple-level cell (TLC) dies within a standard package. This can enable gum stick-sized SSDs with 3.5TB of storage and standard 2.5-inch SSDs with greater than 10TB. Because capacity is achieved by stacking cells vertically, individual cell dimensions can be considerably larger. This is expected to increase both performance and endurance and make even the TLC designs well-suited for data centre storage.
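The two die capacities quoted above follow directly from the bits-per-cell arithmetic: both dies share the same physical cell array, with MLC storing two bits per cell and TLC three. A quick check (the binary-gigabit convention is our assumption):

```python
# Same physical array, different bits per cell: 256Gb MLC vs 384Gb TLC.
GIGABIT = 2 ** 30

mlc_bits = 256 * GIGABIT    # 256Gb MLC die at 2 bits per cell
cells = mlc_bits // 2       # number of physical cells in the array

tlc_bits = cells * 3        # the same array read at 3 bits per cell
print(tlc_bits // GIGABIT)  # -> 384, matching the quoted TLC die
```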
Key product features of this 3D NAND design include:
• Large Capacities – Triple the capacity of existing technology, up to 48GB of NAND per die, enabling 750GB to fit in a single fingertip-sized package.
• Reduced Cost per GB – First-generation 3D NAND is architected to achieve better cost efficiencies than planar NAND.
• Fast – High read/write bandwidth, I/O speeds and random read performance.
• Green – New sleep modes enable low-power use by cutting power to inactive NAND die (even when other dies in the same package are active), dropping power consumption significantly in standby mode.
• Smart – Innovative new features improve latency and increase endurance over previous generations, and also make system integration easier.
The 256Gb MLC version of 3D NAND is sampling with select partners today, and the 384Gb TLC design will be sampling later this spring. The fab production line has already begun initial runs, and both devices will be in full production by the fourth quarter of this year. Both companies are also developing individual lines of SSD solutions based on 3D NAND technology and expect those products to be available within the next year.
Google has filed a patent application for a wearable medical device, able to use nanoparticles to detect and treat illnesses such as cancer.
For those wishing to protect their health and extend their lifespan, a futuristic medical device may become available in the next several years. Details of this wearable technology – known as a Nanoparticle Phoresis – have been published online by Google, via the World Intellectual Property Organisation.
The patent application describes a strap, or band, mounted on the lower arm. Similar in appearance to a wristwatch, it would "automatically modify or destroy one or more targets in the blood that have an adverse health effect." This would be achieved by beaming energy into blood vessels to stimulate cells and molecules, increasing their effectiveness at fighting diseases. It could even be used on synthetic nanoparticles. Millions of these tiny objects would be introduced into the wearer's bloodstream, then activated by magnets in the wristband and directed to specific locations.
In addition to its physical treatment abilities, the Nanoparticle Phoresis could generate vast amounts of data – not only helpful to the user, but also to researchers and doctors. It could accept inputs from the wearer regarding his or her health state, such as "feeling cold," "feeling tired," "pollen allergy symptoms today," "stressed," "feeling energetic," etc. According to the patent, these user inputs "may be used to complement any other physiological parameter data that the wearable device may collect and establish effective signal levels for and timing of modification of the target."
Analysts forecast that wearable technology will see huge growth in the coming years, with unit sales potentially reaching into the hundreds of millions. This new device from Google – if successfully developed – could become part of that rapidly evolving ecosystem. Initially aimed at patients who are seriously ill, this product (or its derivatives) could also be offered to mainstream consumers who aren't necessarily in bad health, but wish to monitor and improve their well-being.
For those with a needle phobia, injections might be possible using high-pressure jets. Although the patent itself makes no mention of this, we can speculate that such a procedure would eventually be incorporated into a wristwatch form factor. Similar to the "hypospray" on Star Trek, these jets would ensure that the skin is not punctured. High-pressure jet injection was covered on our blog in May 2012.
Looking further ahead, the prospects become even more exciting. Bill Maris – who helped form Google Calico – this month stated his belief that humans will live to be many centuries old in the future, while today's cancer treatments will seem "primitive" within just 20 years. His comments echo those of futurist and inventor Ray Kurzweil, also employed at Google and currently involved in AI research for the company. Kurzweil predicts that nanoparticles will be superseded by nanobots – small and compact enough to feature motors, sensors and other tools, allowing them to be controlled with extreme precision directly inside cells. If this idea sounds like science fiction, then consider this: a handheld smartphone today contains more processing power than a room-sized supercomputer of the 1980s. With ongoing advances in miniaturisation, together with new materials such as graphene, the future trend seems inevitable.
As humans become ever more dependent on technology, our bodies will gradually begin to incorporate these and similar devices on a permanent basis. Later in the 21st century, the line between man and machine could become blurred.
Basic wound healing has been advanced with a synthetic platelet that accumulates at sites of injury, helping blood to clot and bleeding to stop three times faster. The synthetic platelets mimic the size, disc shape, flexibility and surface proteins of real platelets.
Artificial platelets made by the University of California and Case Western Reserve University have been shown to halt bleeding in mouse experiments much faster than nature can on its own. For the first time, researchers have managed to mimic the shape, size, flexibility and surface chemistry of real blood platelets together on albumin-based particle platforms. The researchers believe these four design factors together are vital for inducing clots to form faster at vascular injury sites while preventing harmful clots from forming elsewhere in the body.
The new technology, reported in the journal ACS Nano, is aimed at stemming bleeding in patients suffering from traumatic injury, undergoing surgeries or suffering clotting disorders from platelet defects or a lack of platelets. Further, it could be used to deliver drugs to target sites in patients suffering atherosclerosis, thrombosis or other platelet-involved pathologic conditions.
Anirban Sen Gupta, associate professor of biomedical engineering at Case Western Reserve, previously designed peptide-based surface chemistries that mimic the clot-relevant activities of real platelets. Building on this work, he now focuses on incorporating the morphological and mechanical cues naturally present in platelets to further refine the artificial platelet design.
"Morphological and mechanical factors influence the margination of natural platelets to the blood vessel wall, and only when they are near the wall can the critical clot-promoting chemical interactions take place," he said.
These cues motivated Sen Gupta to team up with Samir Mitragotri, a professor of chemical engineering at the University of California. In his laboratory, Mitragotri has recently developed albumin-based technologies to mimic the geometry and mechanical properties of red blood cells and platelets. Together, the team has developed artificial platelet-like nanoparticles (PLNs) that combine morphological, mechanical and surface chemical properties of natural platelets.
The researchers believe this refined design can simulate natural platelets' ability to collide effectively with larger and softer red blood cells in systemic blood flow. The collisions cause "margination" – pushing the platelets out of the main flow and closer to the blood vessel wall – increasing the probability of them interacting with an injury site. The surface coatings enable the artificial platelets to anchor to injury-site-specific proteins, von Willebrand Factor and collagen, while inducing the natural and artificial platelets to aggregate faster at the injury site.
Testing in mouse models showed that injection of the artificial platelets formed clots at the site of injury three times faster than natural platelets alone in the control mice. Their ability to interact selectively with injury-site proteins, combined with mechanical flexibility like that of natural platelets, enables these artificial versions to ride safely through the smallest blood vessels without causing damage.
Albumin, a protein found in blood serum and eggs, is already used in cancer drugs and considered a safe material. Artificial platelets that don't become involved in a clot and continue to circulate are metabolised within one to two days. The researchers believe their new artificial platelet design may be even more effective in larger volume flows where margination to the blood vessel wall is more prominent. They will soon begin testing that capability.
In addition to stemming bleeding, Sen Gupta believes this technology could also be useful in delivering clot-busting medicines directly to clots, to treat heart attack or stroke without having to systemically suspend the body's coagulation mechanism. The artificial platelets may also be used to deliver cancer medicines to metastatic tumours with high platelet interactions.
The fastest ever integrated circuit has been announced by DARPA – achieving one terahertz (10¹² Hz), or a trillion cycles per second.
Guinness World Records has officially recognised DARPA's Terahertz Electronics program for creating the fastest solid-state amplifier integrated circuit ever measured. The ten-stage common-source amplifier operates at a speed of one terahertz (1,000,000,000,000 Hz), or one trillion cycles per second — 150 gigahertz faster than the existing world record of 850 gigahertz set in 2012.
“This breakthrough could lead to revolutionary technologies such as high-resolution security imaging systems, improved collision-avoidance radar, communications networks with many times the capacity of current systems and spectrometers that could detect potentially dangerous chemicals and explosives with much greater sensitivity,” said Dev Palmer, DARPA program manager.
Developed by Northrop Grumman Corporation, the Terahertz Monolithic Integrated Circuit (TMIC) exhibits power gains several orders of magnitude beyond the current state of the art, using a super-scaled 25-nanometre gate length. Gain, which is measured logarithmically in decibels – similar to how earthquake intensity is measured on the Richter scale – describes the ability of an amplifier to increase the power of a signal from input to output. The Northrop Grumman TMIC showed a measured gain of nine decibels at 1.0 terahertz and 10 decibels at 1.03 terahertz. By contrast, current smartphone technology operates at one to two gigahertz and wireless networks at 5.7 gigahertz.
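Because decibels are logarithmic, the quoted gains translate into linear power ratios as follows (this is the standard dB conversion, not anything specific to the TMIC):

```python
import math

def db_to_power_ratio(gain_db: float) -> float:
    """Convert a power gain in decibels to a linear output/input power ratio."""
    return 10 ** (gain_db / 10)

def power_ratio_to_db(ratio: float) -> float:
    """Inverse conversion: linear power ratio back to decibels."""
    return 10 * math.log10(ratio)

print(round(db_to_power_ratio(9), 2))  # 9 dB at 1.0 THz  -> ~7.94x power
print(db_to_power_ratio(10))           # 10 dB at 1.03 THz -> 10.0x power
```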
“Gains of six decibels or more start to move this research from the laboratory bench to practical applications — nine decibels of gain is unheard of at terahertz frequencies,” said Palmer. “This opens up new possibilities for building terahertz radio circuits.”
For years, researchers have been looking to exploit the tremendously high-frequency band beginning above 300 gigahertz, where the wavelengths are less than one millimetre. The terahertz level has proven somewhat elusive, though, due to a lack of effective means to generate, detect, process and radiate the necessary high-frequency signals.
Current electronics using solid-state technologies have largely been unable to access the sub-millimetre band of the electromagnetic spectrum due to insufficient transistor performance. To address the “terahertz gap,” engineers have traditionally used frequency conversion — converting alternating current at one frequency to alternating current at another frequency — to multiply circuit operating frequencies up from millimetre-wave frequencies. This approach, however, restricts the output power of electrical devices and adversely affects signal-to-noise ratio. Frequency conversion also increases device size, weight and power requirements.
DARPA has made a series of strategic investments in terahertz electronics through its HiFIVE, SWIFT and TFAST programs. Each program has built on the successes of the previous one, providing the foundational research necessary for frequencies to reach the terahertz threshold.
An international team has announced "the most significant breakthrough in a decade" toward developing DNA-based electrical circuits.
The central technological revolution of the 20th century was the development of computers, leading to the communication and Internet era. The main measure of this evolution has been miniaturisation: making machines smaller. A computer with the memory of the average laptop today was the size of a tennis court in the 1970s. Yet while scientists made great strides in reducing the size of individual components through microelectronics, they have been less successful at reducing the distance between transistors, the main element of our computers. These spaces between transistors have been much more challenging and extremely expensive to miniaturise – an obstacle that limits the future development of computers.
Molecular electronics, which uses molecules as building blocks for the fabrication of electronic components, was seen as the ultimate solution to the miniaturisation challenge. To date, however, no one has actually been able to make complex electrical circuits using molecules. The only known molecules that can be pre-designed to self-assemble into complex miniature circuits, which could in turn be used in computers, are DNA molecules. Nevertheless, nobody has so far been able to demonstrate reliably and quantitatively the flow of electrical current through long DNA molecules.
Now, an international group led by Prof. Danny Porath, at the Hebrew University of Jerusalem, reports reproducible and quantitative measurements of electricity flow through long molecules made of four DNA strands. The research, which could re-ignite interest in the use of DNA-based wires and devices in the development of programmable circuits, appears in the journal Nature Nanotechnology under the title "Long-range charge transport in single G-quadruplex DNA molecules."
Prof. Porath is affiliated with the Hebrew University's Institute of Chemistry and its Centre for Nanoscience and Nanotechnology. The molecules were produced by the group of Alexander Kotlyar from Tel Aviv University, who has been collaborating with Porath for 15 years. The measurements were performed mainly by Gideon Livshits, a PhD student in the Porath group. The research was carried out in collaboration with groups from Denmark, Spain, the US, Italy and Cyprus.
According to Prof. Porath, "This research paves the way for implementing DNA-based programmable circuits for molecular electronics, which could lead to a new generation of computer circuits that can be more sophisticated, cheaper and simpler to make."
A new method of producing solar cells could reduce the amount of silicon per unit area by 90 per cent compared to the current standard. With the high prices of pure silicon, this could help cut the cost of solar power.
Researchers at the Norwegian University of Science and Technology (NTNU) have pioneered a new approach to manufacturing solar cells that requires less silicon and can accommodate silicon 1,000 times less pure than is currently the standard. This breakthrough means that solar cells could be made much more cheaply than at present.
“We're using less expensive raw materials, and smaller amounts of them, we have fewer production steps and our total energy consumption is potentially lower,” explains PhD candidate Fredrik Martinsen and Professor Ursula Gibson, from NTNU's Department of Physics.
The researchers’ solar cells are composed of silicon fibres coated in glass. A silicon core is inserted into a glass tube about 30 mm in diameter. This is then heated so that the silicon melts and the glass softens. The tube is stretched out into a thin glass fibre filled with silicon. The process of heating and stretching makes the fibre up to 100 times thinner.
This is the widely accepted industrial method used to produce fibre optic cables. But the NTNU researchers – in collaboration with Clemson University in the USA – are the first to use silicon-core fibres made this way in solar cells. The active part of these solar cells is the silicon core, with a diameter of about 100 micrometres.
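Because the silicon and glass volumes are conserved during the draw, thinning the fibre stretches it dramatically. A rough sketch of the geometry (the preform length here is our illustrative number, not one from the paper):

```python
# Volume conservation: shrinking the diameter by a factor k multiplies the
# length by k squared, since cross-sectional area scales with diameter squared.

def drawn_length(preform_length_m: float, thinning_factor: float) -> float:
    """Length of fibre produced by drawing a preform `thinning_factor` times thinner."""
    return preform_length_m * thinning_factor ** 2

# A 0.5 m preform drawn 100x thinner yields 5 km of silicon-core fibre.
print(drawn_length(0.5, 100))  # -> 5000.0 metres
```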
This production method also enabled them to solve another problem: traditional solar cells require very pure silicon. Manufacturing pure silicon wafers is laborious, energy intensive and expensive. Using their new process, it takes only one-third of the energy to manufacture solar cells compared to the traditional approach of producing silicon wafers.
“We can use relatively dirty silicon – and the purification occurs naturally as part of the process of melting and re-solidifying in fibre form. This means that you save energy, and several steps in production,” says Gibson.
These new solar cells are based on the vertical rod radial-junction design, a relatively new approach.
“The vertical rod design still isn’t common in commercial use. Currently, silicon rods are produced using advanced and expensive nano-techniques that are difficult to scale,” says Martinsen. “But we’re using a tried-and-true industrial bulk process, which can make production a lot cheaper.”
The power produced by these prototype cells is not yet up to commercial standards. The efficiency of modern solar cells is typically about 20 per cent, while NTNU's version has only managed 3.6 per cent. However, Martinsen claims their work has great potential for improvement – so this new production method is something we might see appearing in future decades, as nanotechnology continues to advance.
“These are the first solar cells produced this way, using impure silicon. So it isn’t surprising that the power output isn’t very high. It’s a little unfair to compare our method to conventional solar cells, which have had 40 years to fine-tune the entire production process. We’ve had a steep learning curve, but not all the steps of our process are fully developed yet. We’re the first to show that you can make solar cells this way. The results are published and the process is set in motion.”
Laser physicists have found a way to make atomic-force microscope probes 20 times more sensitive and capable of detecting forces as small as the weight of an individual virus.
The technique – developed by researchers in the Quantum Optics Group of the Australian National University, Canberra – uses laser beams to cool a nanowire probe to minus 265 degrees Celsius.
“The level of sensitivity achieved after cooling is accurate enough for us to sense the weight of a large virus, 100 billion times lighter than a mosquito,” said Professor Ping Koy Lam, the leader of the Quantum Optics Group.
This could be used to improve the resolution of atomic-force microscopes, which are state-of-the-art tools for measuring nanoscopic structures and the tiny forces between molecules. Atomic force microscopes can achieve ultra-sensitive measurements of microscopic features by scanning a wire probe over a surface. However, such probes – around 500 times finer than a human hair – are prone to vibration.
“At room temperature the probe vibrates, just because it is warm, and this can make your measurements noisy,” said co-author Dr Ben Buchler. “We can stop this motion by shining lasers at the probe.”
The force sensor was a 200 nm-wide silver gallium nanowire coated with gold (image credit: Quantum Optics Group, Australian National University).
“The laser makes the probe warp and move due to heat. But we have learned to control this warping effect, and were able to use the effect to counter the thermal vibration of the probe,” said Giovanni Guccione, a PhD student on the team.
However, the probe cannot be used while the laser is on, as the laser effect overwhelms the sensitive probe. The laser therefore has to be switched off, and measurements made quickly within the few milliseconds before the probe heats up again. By averaging measurements over a number of heating/cooling cycles, accurate values can be determined.
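The averaging step is standard noise reduction: each cooling/measurement cycle yields one noisy reading in the brief laser-off window, and the error of the mean shrinks roughly as 1/√N. A toy illustration of that principle (our sketch, not the ANU group's actual protocol):

```python
import random

def measure_once(true_force: float, noise_std: float = 1.0) -> float:
    """One noisy reading taken in the short window after the laser turns off."""
    return true_force + random.gauss(0.0, noise_std)

def measure_averaged(true_force: float, cycles: int) -> float:
    """Average readings over many heating/cooling cycles to suppress noise."""
    return sum(measure_once(true_force) for _ in range(cycles)) / cycles

random.seed(42)  # deterministic demo
single_error = abs(measure_once(5.0) - 5.0)
averaged_error = abs(measure_averaged(5.0, 10_000) - 5.0)
print(single_error, averaged_error)  # compare one-shot vs averaged error
```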
“We now understand this cooling effect really well,” says Harry Slatyer, another PhD student. “With clever data processing, we might be able to improve the sensitivity, and even eliminate the need for a cooling laser.”
Sensitive electro-optical imaging and target-acquisition systems will achieve new levels of range and sensitivity thanks to a UK company’s breakthrough in developing a "super black" material.
Vantablack – created by Surrey NanoSystems – is revolutionary in its ability to be applied to lightweight, temperature-sensitive structures such as aluminium whilst absorbing 99.96% of incident radiation, the highest level ever recorded.
“Vantablack is a major breakthrough by UK industry in the application of nanotechnology to optical instrumentation", says Ben Jensen, the company's Chief Technology Officer. "It reduces stray-light, improving the ability of sensitive telescopes to see the faintest stars, and allows the use of smaller, lighter sources in space-borne black body calibration systems. Its ultra-low reflectance improves the sensitivity of terrestrial, space and air-borne instrumentation.”
Vantablack is the result of applying a low-temperature carbon nanotube growth process. The manufacture of "super-black" carbon nanotube materials has traditionally required high temperatures – preventing their direct application to sensitive electronics or materials with relatively low melting points. This, along with poor adhesion, prevented their application to critical space and air-borne instrumentation. Over a period of two years, the development and testing programme by Surrey NanoSystems successfully transferred its low-temperature manufacturing process from silicon to aluminium structures and pyroelectric sensors. Qualification to European Cooperation on Space Standardisation (ECSS) standards was also achieved.
Vantablack has the highest thermal conductivity and lowest mass-volume of any material that can be used in high-emissivity applications. It has virtually undetectable levels of outgassing and particle fallout – thus eliminating a key source of contamination in sensitive imaging systems. It can withstand launch shock, staging and long-term vibration, and is suitable for coating internal components, such as apertures, baffles, cold shields and Micro Electro Mechanical Systems (MEMS)-type optical sensors.
“We are now scaling up production to meet the requirements of our first customers in the defence and space sectors, and have already delivered our first orders. Our strategy includes both the provision of a sub-contract coating service from our own UK facility, and the formation of technology transfer agreements with various international partners”, added Jensen.
As a spin-off from its work in applying nanomaterials to semiconductor device fabrication, Surrey NanoSystems’ manufacturing process also enables Vantablack to be applied to flat and three-dimensional structures in precise patterns with sub-micron resolution.
Researchers have announced the creation of an imaging technology more powerful than anything that has existed before, and is fast enough to observe life processes as they actually happen at the molecular level.
Chemical and biological actions can now be measured as they are occurring or, in old-fashioned movie parlance, one frame at a time. This will allow the creation of improved biosensors to study everything from nerve impulses to cancer metastasis as it occurs.
The measurements, created by short pulse lasers and bioluminescent proteins, are made in femtoseconds – one femtosecond being one-millionth of one-billionth of a second. A femtosecond, compared to one second, is about the same as one second compared to 32 million years. That's a pretty fast shutter speed, and one that will change the way biological research and physical chemistry are done, scientists say.
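The analogy above checks out numerically (the year length is the only assumption here):

```python
# 1 second contains 1e15 femtoseconds; how many years is 1e15 seconds?
femtoseconds_per_second = 1e15
seconds_per_year = 365.25 * 24 * 3600  # ~3.156e7 seconds in a year

years = femtoseconds_per_second / seconds_per_year
print(round(years / 1e6, 1))  # -> 31.7 (million years), i.e. ~32 million
```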
“With this technology we’re going to be able to slow down the observation of living processes and understand the exact sequences of biochemical reactions,” said Chong Fang, assistant professor of chemistry in the Oregon State University (OSU) College of Science, and lead author of the study. “We believe this is the first time ever that you can really see chemistry in action inside a biosensor,” he said. “This is a much more powerful tool to study, understand and tune biological processes.”
The system uses advanced pulse laser technology that is fairly new and builds upon the use of “green fluorescent proteins” that are popular in bioimaging and biomedicine. These remarkable proteins glow when light is shone upon them. Their discovery in 1962, and the applications that followed, were the basis for a Nobel Prize in 2008.
Existing biosensor systems, however, are created largely by random chance or trial and error. By comparison, the speed of the new approach will allow scientists to “see” what is happening at the molecular level and create whatever kind of sensor they want by rational design. This will improve the study of everything from cell metabolism to nerve impulses, how a flu virus infects a person, or how a malignant tumor spreads.
“For decades, to create the sensors we have now, people have been largely shooting in the dark,” Fang said. “This is a fundamental breakthrough in how to create biosensors for medical research from the bottom up. It’s like daylight has finally come.”
The technology, for instance, can follow the proton transfer associated with the movement of calcium ions – one of the most basic aspects of almost all living systems, and also one of the fastest. This movement of protons is integral to everything from respiration to cell metabolism and even plant photosynthesis. Scientists will now be able to identify what is going on, one step at a time, and then use that knowledge to create customised biosensors for improved imaging of life processes.
“If you think of this in photographic terms,” Fang said, “we now have a camera fast enough to capture the molecular dance of life. We’re making molecular movies. And with this, we’re going to be able to create sensors that answer some important, new questions in biophysics, biochemistry, materials science and biomedical problems.”
Findings on the new technology were published yesterday in Proceedings of the National Academy of Sciences (PNAS).
Researchers have overcome a major issue in carbon nanotube technology by developing a flexible, energy-efficient hybrid circuit combining carbon nanotube thin film transistors with other thin film transistors. This hybrid could take the place of silicon as the traditional transistor material used in electronic chips, since carbon nanotubes are more transparent, flexible, and can be processed at a lower cost.
Prof. Chongwu Zhou – along with graduate students from the University of Southern California – developed this energy-efficient circuit by integrating carbon nanotube (CNT) thin film transistors (TFT) with thin film transistors comprised of indium, gallium and zinc oxide (IGZO).
“I came up with this concept in January 2013,” said Dr. Zhou, who works in the Department of Electrical Engineering. “Before then, we were working hard to try to turn carbon nanotubes into n-type transistors and then one day, the idea came to me. Instead of working so hard to force nanotubes to do something that they are not good for, why don’t we just find another material which would be ideal for n-type transistors – in this case, IGZO – so we can achieve complementary circuits?”
Carbon nanotubes are so small that they can only be viewed through a scanning electron microscope. The hybridisation of carbon nanotube thin films (p-type) with IGZO thin films (n-type) creates circuits that operate complementarily, reducing power loss and increasing efficiency. The IGZO thin film transistors were included to provide the power efficiency needed for longer battery life; had only carbon nanotubes been used, the circuits would not have been power-efficient. By combining the two materials, their strengths are joined and their weaknesses hidden.
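The complementary pairing works like a CMOS inverter: the p-type CNT transistor conducts when the input is low, the n-type IGZO transistor when it is high, so there is never a static current path between supply and ground. A toy logic model (ours, purely for illustration):

```python
def cnt_igzo_inverter(v_in: int, vdd: int = 1) -> int:
    """Idealised complementary inverter: p-type CNT pull-up, n-type IGZO pull-down."""
    p_on = (v_in == 0)   # p-type CNT TFT conducts on a low input
    n_on = (v_in == 1)   # n-type IGZO TFT conducts on a high input
    assert p_on != n_on  # complementary: never both on, so no static leakage
    return vdd if p_on else 0

print([cnt_igzo_inverter(v) for v in (0, 1)])  # -> [1, 0]
```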
Zhou likened the coupling of carbon nanotube TFTs and IGZO TFTs to the Chinese philosophy of yin and yang: “It’s like a perfect marriage. We are very excited about this idea of hybrid integration and we believe there is a lot of potential for it.”
Potential applications for this kind of integrated circuitry are numerous – including Organic Light Emitting Diodes (OLEDs), radio frequency identification (RFID) tags, sensors, wearable electronics, and flash memory devices. Even heads-up displays on vehicle dashboards could soon be a reality.
The new technology also has major medical implications. Currently, memory used in computers and phones is made on silicon substrates, the surface on which memory chips are built. To obtain medical information from a patient, such as heart rate or brainwave data, stiff electrodes are placed at several fixed locations on the patient’s body. With this new hybridised circuit, however, sensing electrodes could be placed all over the patient’s body on a single large but flexible device.
With this development, Zhou and his team have circumvented the difficulty of creating n-type carbon nanotube TFTs and p-type IGZO TFTs by hybridising p-type carbon nanotube TFTs with n-type IGZO TFTs, demonstrating large-scale integration of circuits. As a proof of concept, they achieved a large-scale ring oscillator consisting of over 1,000 transistors. Until now, carbon nanotube-based circuits had reached a maximum of 200 transistors.
“We believe this is a technological breakthrough, as no one has done this before,” said Haitian Chen, research assistant and electrical engineering PhD student at USC. “This gives us further proof that we can make larger integrations, so we can make more complicated circuits for computers and other electronics.”
The next step for Zhou and his team will be to use the CNT-IGZO hybrid to build more complex circuits that achieve more sophisticated functions and computations, as well as to build circuits on flexible substrates.
“The possibilities are endless, as digital circuits can be used in any electronics,” Chen said. “One day we’ll be able to print these circuits as easily as newspapers.”
Zhou and Chen believe that carbon nanotube technology, including this new CNT-IGZO hybrid, will be commercialized in the next 5-10 years.
“I believe that this is just the beginning of creating hybrid integrated solutions,” said Zhou. “We will see a lot of interesting work coming up.”
Polymer scientists at the University of Akron in Ohio have developed a transparent electrode that could change the face of smartphones, literally, by making their displays shatterproof.
In a recently published paper, the researchers show how a transparent layer of nanowire-based electrodes on a polymer surface can be extraordinarily tough and flexible, withstanding repeated scotch tape peeling and bending tests. This could revolutionise and ultimately replace conventional touchscreens, according to Yu Zhu, UA assistant professor of polymer science. The indium tin oxide (ITO) coatings currently in use are more brittle, more likely to shatter, and increasingly costly to manufacture.
“These two pronounced factors drive the need to substitute ITO with a cost-effective and flexible conductive transparent film,” Zhu says, adding that the new film provides the same degree of transparency as ITO, yet offers greater conductivity. The novel film retains its shape and functionality after tests in which it has been bent 1,000 times. Due to its flexibility, the transparent electrode can be fabricated in economical, mass-quantity rolls.
“We expect this film to emerge on the market as a true ITO competitor,” Zhu says. “The annoying problem of cracked smartphone screens may be solved once and for all with this flexible touchscreen.”
At the Computex conference in Taipei, chipmaker Intel has revealed a fanless mobile PC reference design using the first of its next-generation 14nm "Broadwell" processors.
The 2-in-1 reference design has a 12.5-inch screen, is just 7.2 mm thick with its keyboard detached, and weighs 670 grams. The Surface Pro 3 – for comparison – is 9.1 mm thick and weighs 800 grams. It includes a media dock that provides additional cooling for a burst of performance. The next-generation chip is purpose-built for 2-in-1s and will hit the market later in 2014. Called the Intel Core M, it will be the most energy-efficient Intel Core processor in the company's history, with power usage cut by up to 45 percent, resulting in 60 percent less heat. The majority of designs based on this new chip are expected to be fanless, with up to 32 hours of battery life, offering both a lightning-fast tablet and a razor-thin laptop.
Intel is also delivering innovation and performance for the most demanding PC users. During the conference, the company introduced its 4th generation Core i7 and i5 processor "K" SKU – the first from Intel to deliver four cores at up to 4 GHz base frequency. This desktop processor, built for enthusiasts, enables new levels of overclocking capability. Production shipments begin this month.
Intel also outlined progress towards a vision to deliver 3-D camera and voice recognition technologies to advance more natural, intuitive interaction with computing devices. The latest RealSense software development kit will be made available in the third quarter of 2014, providing opportunity for developers of all skill levels to create user interfaces.
Computer processors continue to get smaller, faster and cheaper thanks to Moore's Law – expanding the scale and potential for technology in everything from cloud computing and the Internet of Things, to mobile phones and wearable technology.
"The lines between technology categories are blurring as the era of integrated computing takes hold where form factor matters less than the experience delivered when all devices are connected to each other and to the cloud," said Renée James, Intel Corporation President. "Whether it's a smartphone, smart shirt, ultra-thin 2-in-1, or a new cloud service delivered to smart buildings outfitted with connected systems, together Intel and the Taiwan ecosystem have the opportunity to accelerate and deliver the value of a smart, seamlessly connected and integrated world of computing."
Scientists have transferred data by quantum teleportation over a distance of 10 feet with a zero percent error rate.
Teleporting people through space, as done in Star Trek, is impossible with our current knowledge of physics. Teleporting information is another matter, however, thanks to the extraordinary world of quantum mechanics. Researchers at Delft University of Technology in the Netherlands have succeeded in transferring the information contained in a qubit – the quantum equivalent of a classical bit – to a different quantum bit over a distance of three metres (10 feet), without the information having travelled through the intervening space. This was achieved with a zero percent error rate.
The breakthrough is a vital step towards a future quantum network for communication between ultra-fast quantum computers – a "quantum internet". Quantum computers will solve many important problems that even today's best supercomputers are unable to tackle. Furthermore, a quantum internet will enable completely secure information transfer, as eavesdropping will be fundamentally impossible in such a network. To achieve teleportation, researchers in this study made use of an unusual phenomenon known as entanglement.
"Entanglement is arguably the strangest and most intriguing consequence of the laws of quantum mechanics," argues the head of the research project, Prof. Ronald Hanson. "When two particles become entangled, their identities merge: their collective state is precisely determined, but the individual identity of each of the particles has disappeared. The entangled particles behave as one, even when separated by a large distance. The distance in our tests was three metres – but in theory, the particles could be on either side of the universe. Einstein didn't believe in this prediction and he called it 'spooky action at a distance'. Numerous experiments, on the other hand, agree with the existence of entanglement."
This animation shows schematically how to teleport the state of a spin between two distant locations.
Using entanglement as a means of communication has been achieved in previous work by scientists – but the error rates have been so high as to make those methods impractical for real-world applications. In this new effort, Hanson has solved the error rate problem, bringing it down to zero. His team is the first to have succeeded in teleporting information accurately between qubits in different computer chips: "The unique thing about our method is that the teleportation is guaranteed to work 100%," he says. "The information will always reach its destination, so to speak. And, moreover, it also has the potential of being 100% accurate."
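The protocol behind such experiments can be illustrated with a toy state-vector simulation. The sketch below is a minimal numpy model of the textbook teleportation protocol – not the Delft group's actual hardware or code – in which Alice teleports an arbitrary single-qubit state to Bob using a shared entangled pair, two measurements and two classical correction bits. As in the ideal protocol, Bob recovers the state exactly, whatever the measurement outcome.

```python
import numpy as np

# Single-qubit gates
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def kron(*ops):
    """Tensor product of several operators (qubit 0 = leftmost factor)."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def cnot(control, target, n=3):
    """CNOT on an n-qubit register, built by permuting basis states."""
    dim = 2 ** n
    U = np.zeros((dim, dim), dtype=complex)
    for i in range(dim):
        bits = [(i >> (n - 1 - k)) & 1 for k in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(b << (n - 1 - k) for k, b in enumerate(bits))
        U[j, i] = 1
    return U

def teleport(alpha, beta, seed=0):
    """Teleport alpha|0> + beta|1> from qubit 0 (Alice) to qubit 2 (Bob)."""
    rng = np.random.default_rng(seed)
    psi = np.array([alpha, beta], dtype=complex)               # state to send
    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # shared pair
    state = np.kron(psi, bell)
    # Alice entangles her two qubits, then rotates into the Bell basis
    state = kron(H, I, I) @ (cnot(0, 1) @ state)
    # Alice measures qubits 0 and 1; probabilities come from the amplitudes
    amps = state.reshape(2, 2, 2)
    p01 = np.sum(np.abs(amps) ** 2, axis=2)   # joint distribution of (m0, m1)
    m0, m1 = np.unravel_index(rng.choice(4, p=p01.ravel()), (2, 2))
    bob = amps[m0, m1] / np.sqrt(p01[m0, m1])  # Bob's collapsed qubit
    # Bob applies corrections based on the two classical bits Alice sends
    if m1:
        bob = X @ bob
    if m0:
        bob = Z @ bob
    return bob

out = teleport(0.6, 0.8)
print(np.round(out, 6))   # Bob holds exactly 0.6|0> + 0.8|1>
```

Note that the two classical correction bits must travel by an ordinary channel, which is why teleportation transfers a quantum state deterministically without permitting faster-than-light signalling.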
Hanson's team produces solid-state qubits from electrons trapped in diamonds at very low temperatures and manipulated with lasers: "We use diamonds because 'mini prisons' for electrons are formed in this material whenever a nitrogen atom is located in the position of one of the carbon atoms. The fact that we're able to view these miniature prisons individually makes it possible to study and verify an individual electron and even a single atomic nucleus. We're able to set the spin (rotational direction) of these particles in a predetermined state, verify this spin and subsequently read out the data. We do all this in a material that can be used to make chips out of. This is important, as many believe that only chip-based systems can be scaled up to a practical technology," he explains.
This image shows the experimental setup used to teleport the state of a spin between two distant diamonds. The diamonds are hosted in two low-temperature microscopes that can be seen at the far corners of the table.
Hanson plans to repeat the experiment this summer over a much larger distance of 1,300 m (4,265 ft), using chips located in various buildings on the university campus. This experiment could be the first to meet the criteria of the "loophole-free Bell test", and could provide the ultimate evidence to disprove Einstein's rejection of entanglement. Various groups, including Hanson's, are currently striving to be the first to realise a loophole-free Bell test – considered the Holy Grail within quantum mechanics.
The results of this study are published this week in Science.
Researchers at the University of Texas at Austin have built the smallest, fastest and longest-running synthetic nanomotor to date.
The team’s nanomotor is an important step toward developing miniature machines that could one day move through the body to administer insulin for diabetics when needed, or target and treat cancer cells without harming good cells.
With the goal of powering these yet-to-be invented devices, UT Austin engineers focused on building a reliable, ultra-high-speed nanomotor that can convert electrical energy into mechanical motion, on a scale 500 times smaller than a grain of salt. The researchers' three-part device can rapidly mix and pump biochemicals and move through liquids, which is important for future applications.
With all its dimensions under 1 micrometre in size, the nanomotor could fit inside a human cell and is capable of rotating for 15 continuous hours at a speed of 18,000 RPM – the same as a motor in a jet airplane engine. Previous nanomotors ran significantly slower, from 14 RPM to 500 RPM, and rotated for only a few seconds up to a few minutes.
Looking forward, nanomotors could advance the field of nanoelectromechanical systems (NEMS), an area focused on developing miniature machines that are more energy efficient and less expensive to produce. In the near future, the UT Austin researchers believe their work may provide a new approach to controlled biochemical drug delivery to live cells.
Using components similar to those that control electrons in microchips, researchers have designed a new device that can sort, store, and retrieve individual cells for study.
An international research team has developed a chip-like device that could be scaled up to sort and store hundreds of thousands of individual living cells in a matter of minutes. The chip is similar to random-access memory (RAM), but moves cells rather than electrons. It is hoped the cell-sorting system will revolutionise medical research by allowing the fast, efficient control and separation of individual cells, which could then be studied in vast numbers.
“Most experiments grind up a bunch of cells and analyse genetic activity by averaging the population of an entire tissue rather than looking at the differences between single cells within that population,” says Benjamin Yellen, associate professor at Duke University's Pratt School of Engineering. “That’s like taking the eye colour of everyone in a room and finding that the average colour is grey, when not a single person in the room has grey eyes. You need to be able to study individual cells to understand and appreciate small but significant differences in a similar population.”
Yellen and his collaborator – Cheol Gi Kim, from the Daegu Gyeongbuk Institute of Science and Technology (DGIST) in South Korea – printed thin electromagnetic components like those found on microchips onto a slide. These patterns create magnetic tracks and elements like switches, transistors and diodes to guide magnetic beads and single cells tagged with magnetic nanoparticles through a thin liquid film.
Like a series of small conveyor belts, localised rotating magnetic fields move the beads and cells along specific directions etched on a track, while built-in switches direct traffic to storage sites on the chip. The result is an integrated circuit that controls small magnetic objects much like the way electrons are controlled on computer chips.
In their study, the engineers demonstrate a 3-by-3 grid of compartments that allow magnetic beads to enter but not leave. By tagging cells with magnetic particles and directing them to different compartments, the cells can be separated, sorted, stored, studied and retrieved.
In a random-access memory chip, similar logic circuits manipulate electrons on a nanometre scale, controlling billions of compartments in a square inch. Cells are much larger than electrons, however, which would limit the new devices to hundreds of thousands of storage spaces per square inch.
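The RAM analogy can be made concrete with a toy model. The sketch below is hypothetical Python, not the authors' control software: it treats the chip as a grid of addressable compartments, where `store` routes a tagged cell into a compartment and `retrieve` releases it, mirroring the write and read operations of a memory array.

```python
class CellArray:
    """Toy model of the magnetic cell-sorting chip: compartments
    addressed by (row, column), like words in a RAM array."""

    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        self.grid = [[None] * cols for _ in range(rows)]

    def store(self, row, col, cell_id):
        """'Write': switch a tagged cell into an empty compartment."""
        if self.grid[row][col] is not None:
            raise ValueError(f"compartment ({row}, {col}) is occupied")
        self.grid[row][col] = cell_id

    def retrieve(self, row, col):
        """'Read': release and return the cell held at an address."""
        cell_id, self.grid[row][col] = self.grid[row][col], None
        return cell_id

# A 3-by-3 array, like the grid demonstrated in the study
chip = CellArray(3, 3)
chip.store(0, 2, "cell-A")
chip.store(1, 1, "cell-B")
print(chip.retrieve(1, 1))   # releases cell-B; its compartment is free again
```

The real device does this physically, of course – the "addresses" are magnetic tracks and switches rather than logic gates – but the access pattern is the same.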
But Yellen and Kim say that’s still plenty small for their purposes.
“You need to analyse thousands of cells to get the statistics necessary to understand which genes are being turned on and off in response to pharmaceuticals or other stimuli,” said Yellen. “And if you’re looking for cells exhibiting rare behaviour, which might be one cell out of a thousand, then you need arrays that can control hundreds of thousands of cells.”
“Our technology can offer new tools to improve our basic understanding of cancer metastasis at the single cell level, how cancer cells respond to chemical and physical stimuli, and to test new concepts for gene delivery and metabolite transfer during cell division and growth,” said Kim.
They now plan to demonstrate a larger grid of 8-by-8 or 16-by-16, then scale it up to hundreds of thousands of compartments with cells. If successful, their technology would lend itself well to manufacturing, giving scientists around the world access to single-cell experimentation.
“Our idea is a simple one,” said Kim. “Because it is a system similar to electronics and is based on the same technology, it would be easy to fabricate. That makes the system relevant to commercialisation.”
Researchers have demonstrated, for the first time, the maximum theoretical limit of energy needed to control the magnetisation of a single atom. This fundamental work has major implications for magnetic research and future nanotechnology.
Magnetic devices like hard drives, magnetic random access memories (MRAMs), molecular magnets and quantum computers depend on the manipulation of magnetic properties. In an atom, magnetism arises from the spin and orbital momentum of its electrons. "Magnetic anisotropy" describes how an atom’s magnetic properties depend on the orientation of the electrons' orbits, relative to the structure of a material. It also provides directionality and stability to magnetisation. Publishing in Science, researchers led by Ecole Polytechnique Fédérale de Lausanne (EPFL) combine various experimental and computational methods to measure, for the first time, the energy needed to change the magnetic anisotropy of a single Cobalt atom.
Their methodology and findings could impact a range of fields – from studies of single atom and single molecule magnetism, to quantum computing and the design of spintronic device architectures.
In theory, every atom or molecule has the potential to be magnetic, since this depends on the movement of its electrons. Electrons move in two ways: spin, which can be loosely thought of as spinning around themselves; and orbit, which refers to an electron’s movement around the nucleus of its atom. Spin and orbital motion give rise to magnetisation, similar to an electric current circulating in a coil and producing a magnetic field. The spinning direction of the electrons therefore defines the direction of the magnetisation in a material.
The magnetic properties of a material have a certain "preference" or "stubbornness" towards a specific direction. This phenomenon is referred to as "magnetic anisotropy," and is described as the "directional dependence" of a material’s magnetism. Changing its "preference" requires a certain amount of energy. The total energy of a material’s magnetic anisotropy is a fundamental obstacle when it comes to downscaling of technology like MRAMs, computer hard drives and even quantum computers, which use different electron spin states as distinct information units, or "qubits".
The team at EPFL – in collaboration with scientists from ETH Zurich, the Paul Scherrer Institute and IBM Almaden Research Center – developed a method to determine the maximum possible magnetic anisotropy for a single Cobalt atom. This metal is widely used in permanent magnets, as well as in magnetic recording materials for data storage applications.
The researchers used a technique called "inelastic electron tunnelling spectroscopy" to probe the quantum spin states of a single cobalt atom bound to a layer of magnesium oxide (MgO). This technique uses an atom-sized scanning tip that allows the passage (or "tunnelling") of electrons to the cobalt atom. As electrons tunnelled through, they transferred energy and induced changes in the atom's spin properties.
The experiments revealed the maximum magnetic anisotropy energy of a single atom (~58 millielectron volts) and the longest spin lifetime for a single transition metal atom. When placed on the ultra-thin layer of magnesium oxide, these individual cobalt atoms were found to have three times the magnetism, atom for atom, of a layer of pure cobalt. In addition, these single-atom magnets were very stable against external perturbations, which is a prerequisite for technological applications. These fundamental findings open the way for a better understanding of magnetic anisotropy and present a single-atom model system that could be used as a future "qubit".
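To put the ~58 meV figure in context, it can be compared with the thermal energy available to flip the spin. The back-of-the-envelope conversion below uses only the standard value of the Boltzmann constant; it is an illustration, not a calculation from the paper.

```python
K_B_EV = 8.617333262e-5   # Boltzmann constant, eV per kelvin (CODATA value)
barrier_ev = 58e-3        # measured anisotropy barrier, ~58 meV

# Temperature at which thermal energy k_B * T matches the barrier
t_equiv = barrier_ev / K_B_EV
print(f"{t_equiv:.0f} K")  # roughly 670 K, far above room temperature (~300 K)
```

The barrier alone does not determine real-world stability (spin lifetime matters too), but it gives a rough sense of why such a large anisotropy is attractive for miniaturised magnetic storage.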
"Quantum computing uses quantum states of matter – and magnetic properties are such a quantum state," says Harald Brune, from EPFL. "They have a lifetime, and you can use the individual surface-adsorbed atoms to make qubits. Our system is a model for such a state. It allows us to optimise the quantum properties, and it is easier than previous ones, because we know exactly where the cobalt atom is in relation to the MgO layer."
"Miniaturisation is limited physically by the atomic structure of the material," said Professor Pietro Gambardella, ETH Zurich. "In our work, we have now shown that it is possible to create stable magnetic components out of single atoms; i.e. the smallest possible structure."
The superior light-emitting properties of quantum dots can be applied to solar energy, helping to more efficiently harvest sunlight.
A house window that doubles as a solar panel could be on the horizon, thanks to recent work by Los Alamos National Laboratory researchers in collaboration with scientists from the University of Milano-Bicocca (UNIMIB), Italy. Their project demonstrates that the superior light-emitting properties of quantum dots can be applied to solar energy, helping to harvest sunlight more efficiently.
Quantum dots are ultra-small nanocrystals of semiconductor matter that are synthesized with nearly atomic precision. Their emission colour can be tuned by simply varying their dimensions. Colour tunability is combined with high emission efficiencies approaching 100%. These properties have recently become the basis of a new technology – quantum dot displays – employed, for example, in the newest generation of the Kindle Fire e-reader.
A luminescent solar concentrator (LSC) is a photon management device, representing a slab of transparent material that contains highly efficient emitters such as dye molecules or quantum dots. Sunlight absorbed in the slab is re-radiated at longer wavelengths and guided towards the slab edge equipped with a solar cell.
Quantum dot LSC devices under ultraviolet illumination.
Lead researcher Victor Klimov explained: “The LSC serves as a light-harvesting antenna – which concentrates solar radiation collected from a large area onto a much smaller solar cell – and this increases its power output.”
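The concentration Klimov describes comes from simple geometry: light collected over the slab's large face is funnelled to a much smaller edge fitted with a solar cell. The sketch below uses purely hypothetical dimensions (not figures from the study) to show how the geometric gain is calculated.

```python
# Illustrative slab dimensions in centimetres (hypothetical values)
width, height, thickness = 10.0, 10.0, 0.5

face_area = width * height     # area that collects sunlight
edge_area = width * thickness  # one edge fitted with a solar cell

geometric_gain = face_area / edge_area
print(geometric_gain)   # 20.0 – light from 100 cm2 funnelled onto 5 cm2
```

The realised gain is always lower than this geometric figure, since some light escapes or is re-absorbed in transit – which is why the team's near-elimination of re-absorption losses over tens of centimetres matters.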
“LSCs are especially attractive because, in addition to gains in efficiency, they can enable new interesting concepts such as photovoltaic windows that can transform house facades into large-area energy generation units,” said his colleague, Sergio Brovelli.
To implement their concept, Los Alamos researchers created a series of cadmium selenide/cadmium sulfide (CdSe/CdS) quantum dots, which were then incorporated by the Italian team into large slabs of transparent polymer. The particles are tiny, only about 10 nanometres (nm) across. For comparison, human hairs are typically 50,000 nm wide.
Spectroscopic measurements indicated virtually no losses to re-absorption on distances of tens of centimetres. Tests using simulated solar radiation demonstrated high photon harvesting efficiencies of around 10% per absorbed photon – achievable in nearly transparent samples – perfectly suited for utilisation as photovoltaic windows.
These findings are published in Nature Photonics. According to a report earlier this year, the quantum dot and quantum dot display (QLED) markets are expected to see a 42-fold growth in the next five years, reaching $6.4 billion by 2019.
Graphene has the potential to usher in a new era of next generation electronic devices, including flexible displays and wearable technology.
Samsung Electronics have announced a breakthrough synthesis method to speed the commercialisation of graphene, a unique material ideally suited for electronic devices. Samsung Advanced Institute of Technology (SAIT), in partnership with Sungkyunkwan University, became the first in the world to develop this new method.
“This is one of the most significant breakthroughs in graphene research in history,” said the laboratory leaders at SAIT. “We expect this discovery to accelerate the commercialisation of graphene, which could unlock the next era of consumer electronic technology.”
Graphene has 100 times greater electron mobility than silicon, the most widely used material in semiconductors today. It is more durable than steel and has high heat conductivity as well as flexibility, which makes it the perfect material for use in flexible displays, wearables and other next generation electronic devices.
Through its partnership with Sungkyunkwan University’s School of Advanced Materials Science and Engineering, SAIT uncovered a new method of growing large-area, single-crystal, wafer-scale graphene. Engineers around the world have invested heavily in research towards the commercialisation of graphene, but have faced many obstacles. In the past, researchers found that multi-crystal synthesis – the process of synthesising small graphene particles to produce large-area graphene – deteriorated the electric and mechanical properties of the material, limiting its range of applications.
The new method developed by SAIT and Sungkyunkwan University synthesises large-area graphene into a single crystal on a semiconductor, maintaining its electric and mechanical properties. The new method repeatedly synthesises single crystal graphene on the current semiconductor wafer scale.
Over the past several decades, the growth of the semiconductor industry has been driven by the ability to increase the area of a silicon wafer while steadily shrinking the process node. For graphene to displace the industry’s reliance on silicon, it is vital to develop a new method of growing single crystal graphene over a large area.
The research results are published in Science, one of the world’s most prestigious journals, via its advance online edition, ScienceExpress.
Nanotechnology startup company, StoreDot, has unveiled a ground-breaking battery capable of charging your smartphone and other devices in under 30 seconds.
At Microsoft’s Think Next symposium in Tel Aviv, StoreDot demonstrated the prototype of its ultra-fast-charge battery for the first time. This company specialises in technology that is inspired by natural processes. They have produced "nanodots" derived from bio-organic material that, due to their size, have both increased electrode capacitance and electrolyte performance. These nanodots – described as "stable, robust spheres" – have a diameter of just 2.1 nanometres and are made of chemically synthesized peptide molecules, short chains of amino acids that form the building blocks of proteins.
StoreDot’s bio-organic devices, which include smartphone displays, provide much more efficient power consumption, and are eco-friendly. While other nanodot and quantum-dot technologies currently in use are heavy metal based, and therefore toxic, StoreDot's are biocompatible and superior to all previous discoveries in the field. Using their method, the company is hoping to synthesize new nanomaterials for use in a wide variety of applications. Nano-crystals in memory chips, for example, could triple the speed of traditional flash memory, while image sensors could be five times more sensitive.
Furthermore, the nanodots are relatively inexpensive, as they originate naturally, and utilise a basic biological mechanism of self-assembly. They can be made from a vast range of bio-organic raw materials that are readily available and environmentally friendly.
The battery demonstrated at the event remains in the prototype stage, with a rather bulky form factor. However, the CEO of StoreDot, Doron Myersdorf, says he is confident that a smaller version can be developed and on the market by 2017.
“The fast-charge battery is the result of our focus on commercialising the materials we have discovered," he explained. "We’re particularly pleased that this innovative nanotechnology, inspired by nature, not only changes the rules of mobile device capabilities, but is also environmentally-friendly.”