24th November 2014
Ocean Spiral – an underwater city
Japanese engineering firm Shimizu Corp has announced plans for "Ocean Spiral", an underwater city that would form a nine-mile (15 km) structure plunging down to the sea floor. Costing three trillion yen ($25 billion), it would feature residential, hotel and business zones at its top, with resource development facilities at its base to harvest rare earth metals and minerals. Electrical power could be generated by exploiting the wide differences in water temperature between the top and bottom of the ocean. Construction would be achieved with industrial-scale 3D printers using resin components instead of concrete. Shimizu believes the technology required for this project could be available by 2030. The company has been behind a number of previous futuristic concepts, including a "Luna Ring" of solar panels going around the Moon and a floating botanical city that could absorb CO2.
“We had this in Japan in the 1980s when the same corporations were proposing underground and ‘swimming’ cities and 1 kilometre-high towers as part of the rush to development during the height of the bubble economy," says Christian Dimmer, assistant professor in urban studies at Tokyo University. “It’s good that many creative minds are picking their brains as to how to deal with climate change, rising sea levels and the creation of resilient societies – but I hope we don’t forget to think about more open and democratic urban futures in which citizens can take an active role in their creation, rather than being mere passengers in a corporation’s sealed vision of utopia.”
For more information on the Ocean Spiral, see its press release.
21st November 2014
AI software can identify objects in photos and videos at near-human levels
A new AI software program developed by researchers at Google and Stanford University can recognise objects in photos and videos at near-human levels of understanding.
It was only recently that computer systems became smart enough to identify unknown objects in photographs. Even then, it has generally been limited to individual objects. Now, two separate teams of researchers at Google and Stanford University have created software able to describe entire scenes. This could lead to much better and more intelligent algorithms in the future.
Stanford's work, entitled "Deep Visual-Semantic Alignments for Generating Image Descriptions", explains how specific details found in photographs and videos can be translated into written text. Google's version of the technology, in a study titled "Show and Tell: A Neural Image Caption Generator", produced similar results.
While each team used a slightly different approach, they both combined deep convolutional neural networks with recurrent neural networks that excel at text analysis and natural language processing. The programs were able to "learn" from each new interaction, with algorithms enabling the system to improve its accuracy by scanning scene after scene, looking for patterns, and then using the accumulation of previously described scenes to extrapolate what is being depicted in the next unknown image.
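The encoder-decoder data flow described above can be sketched in miniature. The toy below is an untrained illustration only: the vocabulary, layer sizes and single-layer recurrent decoder are invented for the sketch and do not reflect either paper's actual architecture. A fixed random vector stands in for the convolutional network's image features, and a simple RNN greedily emits one word per step.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["<start>", "<end>", "a", "person", "riding", "motorcycle", "dirt", "road", "on"]
FEAT, HID = 16, 8   # pretend CNN feature size, RNN hidden size

# Stand-in for CNN output: in the real systems this comes from a deep
# convolutional network applied to the image.
image_features = rng.normal(size=FEAT)

# Randomly initialised (untrained) decoder parameters.
W_ih = rng.normal(scale=0.1, size=(HID, len(VOCAB)))   # input word -> hidden
W_hh = rng.normal(scale=0.1, size=(HID, HID))          # hidden -> hidden
W_fh = rng.normal(scale=0.1, size=(HID, FEAT))         # image features -> hidden
W_ho = rng.normal(scale=0.1, size=(len(VOCAB), HID))   # hidden -> word scores

def caption(features, max_len=10):
    """Greedy decoding: condition on the image features at every step
    and pick the highest-scoring word until <end> or max_len."""
    h = np.zeros(HID)
    word = VOCAB.index("<start>")
    out = []
    for _ in range(max_len):
        x = np.eye(len(VOCAB))[word]                  # one-hot input word
        h = np.tanh(W_ih @ x + W_hh @ h + W_fh @ features)
        word = int(np.argmax(W_ho @ h))               # greedy word choice
        if VOCAB[word] == "<end>":
            break
        out.append(VOCAB[word])
    return out

print(caption(image_features))
```

With untrained weights the output is of course gibberish drawn from the toy vocabulary; the point is only the flow of information from image features into the recurrent word generator, which the real systems train end-to-end on millions of captioned images.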
"The system can analyse an unknown image and explain it in words and phrases that make sense," says Fei-Fei Li, a professor of computer science and director of the Stanford Artificial Intelligence Lab. "This is an important milestone. It's the first time we've had a computer vision system that could tell a basic story about an unknown image by identifying discrete objects and also putting them into some context."
These latest algorithms are being trained on a visual dictionary – the ImageNet project – with a database of more than 14 million objects. Each object is described by a mathematical term, or vector, that enables the machine to recognise the shape the next time it is encountered. Those mathematical definitions are linked to the words humans would use to describe the objects.
“I was amazed that even with the small amount of training data that we were able to do so well,” said Oriol Vinyals, a Google computer scientist who worked with members of the Google Brain project. “The field is just starting, and we will see a lot of increases.”
In the near term, computer vision systems that can discern the story in a picture will enable people to search photo or video archives and find highly specific images. Eventually, these advances will lead to robotic systems able to navigate unknown situations. Driverless cars would also be made safer. However, it also raises the prospect of even greater levels of government surveillance.
"A group of young people playing a game of Frisbee."
"A person riding a motorcycle on a dirt road."
"A pizza sitting on top of a pan on top of a stove."
19th November 2014
Lightning strikes will increase due to global warming
Global warming will cause lightning strikes in the U.S. to increase 50% by 2100, according to a study led by the University of California, Berkeley.
New climate models predict a 50 percent increase in lightning strikes across the United States during this century as a result of warming temperatures associated with climate change.
Reporting in the peer-reviewed journal Science, UC Berkeley climate scientist David Romps and his colleagues look at predictions of precipitation and cloud buoyancy in 11 different climate models and conclude that their combined effect will generate more frequent electrical discharges to the ground.
“With warming, thunderstorms become more explosive,” says Romps, an assistant professor of earth and planetary science and a faculty scientist at Lawrence Berkeley National Laboratory. “This has to do with water vapour, which is the fuel for explosive deep convection in the atmosphere. Warming causes there to be more water vapour in the atmosphere – and if you have more fuel lying around, when you get ignition, it can go big time.”
More lightning strikes mean more human injuries; estimates of people struck each year range from hundreds to nearly a thousand, with many deaths. But another significant impact of increased lightning strikes would be more wildfires, since half of all fires – and often the hardest to fight – are ignited by lightning, Romps said. More lightning would also generate more nitrogen oxides in the atmosphere, exerting a strong control on atmospheric chemistry.
While some studies have shown changes in lightning associated with seasonal or year-to-year variations in temperature, there have been no reliable analyses to indicate what the future may hold. Romps and graduate student Jacob Seeley hypothesised that two atmospheric properties — precipitation and cloud buoyancy — together might be a predictor of lightning, and looked at observations during 2011 to see if there was a correlation.
“Lightning is caused by charge separation within clouds, and to maximise charge separation, you have to loft more water vapour and heavy ice particles into the atmosphere,” he said. “We already know that the faster the updrafts, the more lightning, and the more precipitation, the more lightning.”
Precipitation – the total amount of water hitting the ground in the form of rain, snow, hail or other forms – is basically a measure of how convective the atmosphere is, and convection generates lightning. The ascent speeds of those convective clouds are determined by a factor called CAPE — convective available potential energy — which is measured by balloon-borne instruments, called radiosondes, released around the United States twice a day.
“CAPE is a measure of how potentially explosive the atmosphere is – that is, how buoyant a parcel of air would be if you got it convecting, if you got it to punch through overlying air into the free troposphere,” Romps said. “We hypothesised that the product of precipitation and CAPE would predict lightning.”
Using U.S. Weather Service data on precipitation, radiosonde measurements of CAPE and lightning-strike counts from the National Lightning Detection Network at the University of Albany, State University of New York (UAlbany), they concluded that 77 percent of the variations in lightning strikes could be predicted from knowing just these two parameters.
“We were blown away by how incredibly well that worked to predict lightning strikes,” he said.
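The hypothesis can be illustrated with synthetic numbers (all values below are invented for the sketch, not the study's data): if flash counts really scale with the product of precipitation and CAPE, then a one-variable least-squares fit on that product should explain most of the variance, just as the team found with real observations.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 365
precip = rng.gamma(shape=2.0, scale=3.0, size=n)     # mm/day, made up
cape = rng.gamma(shape=2.0, scale=500.0, size=n)     # J/kg, made up

# Synthetic "truth": flashes proportional to P x CAPE, plus noise.
product = precip * cape
flashes = 0.002 * product + rng.normal(scale=5.0, size=n)

# Ordinary least squares of flashes on the single predictor P x CAPE.
slope, intercept = np.polyfit(product, flashes, 1)
pred = slope * product + intercept
r2 = 1 - np.sum((flashes - pred) ** 2) / np.sum((flashes - np.mean(flashes)) ** 2)
print(f"variance explained: {r2:.0%}")
```

Here most of the variance is recovered by construction; the striking part of the real result is that two routinely measured quantities did nearly as well against actual lightning-strike counts.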
The intensity of lightning flashes averaged over the year in the lower 48 states during 2011. Data from NLDN.
They then looked at 11 different climate models that predict precipitation and CAPE through this century and are archived in the most recent Coupled Model Intercomparison Project (CMIP5). CMIP was established as a resource for climate scientists, providing a repository of output from global climate models that can be used for comparison and validation.
“With CMIP5, we now have for the first time the CAPE and precipitation data to calculate these time series,” Romps said.
On average, the models predicted an 11 percent increase in CAPE in the U.S. per degree Celsius rise in global average temperature by the end of the 21st century. Because the models predict little average precipitation increase nationwide over this period, the product of CAPE and precipitation gives about a 12 percent rise in cloud-to-ground lightning strikes per degree in the contiguous U.S., or a roughly 50 percent increase by 2100 if Earth sees the expected 4-degree Celsius increase (7 degrees Fahrenheit) in temperature.
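The headline figure is easy to sanity-check with back-of-envelope arithmetic (a rough bracket, not the paper's actual calculation): linearly scaling 12 percent per degree over 4 °C gives 48%, while compounding it gives 57%, bracketing the quoted roughly 50% increase.

```python
per_degree = 0.12           # ~12% more cloud-to-ground strikes per deg C
warming_c = 4.0             # expected warming by 2100

linear = per_degree * warming_c                  # simple scaling: 48%
compounded = (1 + per_degree) ** warming_c - 1   # compounding: ~57%
print(f"linear: {linear:.0%}, compounded: {compounded:.0%}")
```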
Exactly why CAPE increases as the climate warms is still an area of active research, Romps said, though it is clear that it has to do with the fundamental physics of water. Warm air typically contains more water vapour than cold air; in fact, the amount of water vapour that air can “hold” increases exponentially with temperature. Since water vapour is the fuel for thunderstorms, lightning rates can depend very sensitively on temperature.
In the future, Romps plans to look at the distribution of lightning-strike increases around the U.S. and also explore what lightning data can tell climatologists about atmospheric convection.
15th November 2014
Hottest October on record
A new global temperature record for October has been set, according to data from the Japan Meteorological Agency (JMA).
Globally, last month was the hottest October on record – by far – according to data just released by the Japan Meteorological Agency (JMA). This follows the hottest March–May, June, August and September, also recorded this year. Near-surface land and sea surface temperatures were 0.67°C (1.2°F) higher than the 20th century average. Despite oft-repeated claims of a "pause", it seems increasingly likely that 2014 is on course to be the hottest year since the JMA began record-keeping in 1891. Data from the National Oceanic and Atmospheric Administration (NOAA) – the U.S. equivalent of Japan's agency – shows a similar trend, with October 2013 to September 2014 being the warmest 12-month period since its records began in 1880. These records have occurred even without the latest El Niño, which has yet to begin, meaning that 2015 could be even hotter.
The Intergovernmental Panel on Climate Change (IPCC) has just released the final part of its Fifth Assessment Report. This further discusses the future impacts of climate change and – it is hoped – will pave the way for a global, legally binding treaty on carbon emissions at the UN Climate Change Conference in Paris during late 2015. This week in Beijing, Chinese President Xi Jinping met with Barack Obama to announce a "historic" agreement that would see U.S. emissions fall 26%-28% below 2005 levels by 2025, while China's would peak by 2030. By announcing these targets now, they hope to inject momentum into the global climate negotiations and inspire other countries to join in coming forward with ambitious actions as soon as possible, preferably before the first quarter of 2015. The two Presidents resolved to work closely together over the next year to address major impediments to reaching a successful treaty in Paris. UN climate chief, Christiana Figueres, said: "These two crucial countries have today announced important pathways towards a better and more secure future for humankind."
Unfortunately for Barack Obama, the U.S. midterm election was a disaster for the Democrats. They will now lose control of the Senate for the first time since January 2007, with Republicans also increasing their majority in the House. The incoming Senate Majority Leader, Mitch McConnell, stated that his top priority is to "get the EPA reined in" and to dismantle the new emissions rules for coal power plants. In a related development, the controversial Keystone XL pipeline was approved by the House yesterday in a 252-161 vote. The 875-mile (1,408 km) pipeline will carry tar sands oil from Alberta, Canada, to the US state of Nebraska, where it joins pipes running down to Texas. While creating only 35 permanent jobs, it will transport 51 coal plants' worth of CO2 and do nothing to lower U.S. gas prices.
Meanwhile, the G20 summit now underway in Brisbane, Australia, has seen hundreds of people staging a "head in the sand" protest over the lack of discussions on climate change. Australian Prime Minister, Tony Abbott, recently declared that "coal is good for humanity" while opening a new coal plant and expressing his belief that "the trajectory should be up and up and up in the years and decades to come ... The future for coal is bright."
A new report from the Overseas Development Institute (ODI) and Oil Change International highlights the fact that G20 governments are now spending almost £56bn ($90bn) a year on finding new oil, gas and coal reserves. This is despite clear evidence that two-thirds of fossil fuels must be left in the ground to avoid tipping the world into a climate catastrophe. Phasing out these perverse subsidies may form a crucial part of the negotiations at the Paris conference in 2015.
The science of global warming is clearer than ever. Back in April, a report by McGill University concluded "with confidence levels greater than 99% and most likely greater than 99.9%" that recent warming is not caused by natural factors but is man-made. A new generation of supercomputers – able to crunch hundreds of terabytes' worth of data – has led to what one researcher calls "a golden age for high-resolution climate modelling" with accurate simulations of intense weather and climate events. These models will only get better in the years ahead. On current trends, it should be possible to achieve resolutions down to a square metre by 2030. And yet, even without these models or the IPCC, we know the problem is real.
13th November 2014
Genomes of the world's oldest people are published
The genomes from 17 of the oldest people have been published. Researchers were unable to find genes associated with extreme longevity.
Supercentenarians are the world's oldest people, living beyond 110 years of age. There are 74 alive worldwide, with 22 in the USA. The longest confirmed human lifespan on record is that of Jeanne Calment (1875–1997), a French woman who reached 122 years and 164 days. The oldest person alive today is Misao Okawa, a Japanese woman aged 116. She is the last living Japanese person to have been born during the 1800s.
In a study published yesterday in the journal PLOS ONE, whole-genome sequencing was performed on 17 supercentenarians to explore the genetic basis of extreme human longevity. The researchers – Hinco Gierman and colleagues from Stanford University – were unable to find any rare protein-altering variants significantly associated with extreme longevity, compared to control genomes. They did, however, find that one supercentenarian carries a variant associated with a heart condition – one that evidently had little effect on their health, given that this person lived beyond 110. In line with recommendations from the American College of Medical Genetics and Genomics, the authors reported the result to this individual as an incidental finding.
Although the authors didn't find significant association with extreme longevity, they have publicly published the genomes, making them available as a resource for future studies on longevity.
10th November 2014
Synthetic platelets could accelerate healing of injuries
Basic wound healing has been advanced with a synthetic platelet that accumulates at sites of injury, helping blood to clot and stopping bleeding three times faster. The synthetic platelets mimic the size, disc shape, flexibility and surface proteins of real platelets.
Artificial platelets developed by researchers at the University of California and Case Western Reserve University have been shown in mouse experiments to halt bleeding much faster than nature can on its own. For the first time, the shape, size, flexibility and surface chemistry of real blood platelets have been mimicked together on albumin-based particle platforms. The researchers believe these four design factors together are vital for inducing clots to form faster at vascular injury sites while preventing harmful clots from forming elsewhere in the body.
The new technology, reported in the journal ACS Nano, is aimed at stemming bleeding in patients suffering from traumatic injury, undergoing surgeries or suffering clotting disorders from platelet defects or a lack of platelets. Further, it could be used to deliver drugs to target sites in patients suffering atherosclerosis, thrombosis or other platelet-involved pathologic conditions.
Anirban Sen Gupta, associate professor of biomedical engineering at Case Western Reserve, previously designed peptide-based surface chemistries that mimic the clot-relevant activities of real platelets. Building on this work, he now focuses on incorporating morphological and mechanical cues that are naturally present in platelets to further refine their design.
"Morphological and mechanical factors influence the margination of natural platelets to the blood vessel wall, and only when they are near the wall can the critical clot-promoting chemical interactions take place," he said.
These cues motivated Sen Gupta to team up with Samir Mitragotri, a professor of chemical engineering at the University of California. In his laboratory, Mitragotri has recently developed albumin-based technologies to mimic the geometry and mechanical properties of red blood cells and platelets. Together, the team has developed artificial platelet-like nanoparticles (PLNs) that combine morphological, mechanical and surface chemical properties of natural platelets.
The researchers believe this refined design can simulate natural platelets' ability to collide effectively with larger and softer red blood cells in systemic blood flow. The collisions cause "margination" – pushing the platelets out of the main flow and closer to the blood vessel wall – increasing the probability of them interacting with an injury site. The surface coatings enable the artificial platelets to anchor to injury-site-specific proteins, von Willebrand Factor and collagen, while inducing the natural and artificial platelets to aggregate faster at the injury site.
Testing in mouse models showed that injection of the artificial platelets formed clots at the site of injury three times faster than natural platelets alone in the control mice. The ability to interact selectively with injury site proteins, as well as remaining mechanically flexible like natural platelets, enables these artificial versions to safely ride through the smallest of blood vessels without causing damage.
Albumin, a protein found in blood serum and eggs, is already used in cancer drugs and considered a safe material. Artificial platelets that don't become involved in a clot and continue to circulate are metabolised within one to two days. The researchers believe their new artificial platelet design may be even more effective in larger volume flows where margination to the blood vessel wall is more prominent. They will soon begin testing that capability.
In addition to stemming bleeding, Sen Gupta believes this technology could also be useful in delivering clot-busting medicines directly to clots, to treat heart attack or stroke without having to systemically suspend the body's coagulation mechanism. The artificial platelets may also be used to deliver cancer medicines to metastatic tumours with high platelet interactions.
8th November 2014
Amazon launches "Echo" speaker with interactive AI
Online retail giant Amazon has unveiled a new hi-tech speaker system with a wide range of interactive features. Called Echo, the cylindrical device is controlled by your voice, activated by a special "wake word" and uses far-field listening to hear from anywhere in the room. It can provide real-time information, music, news, weather, a timer/alarm, and many more services – even telling jokes. Crisp vocals with dynamic bass are fine-tuned to deliver an immersive sound from 360° omni-directional speakers.
With an always-on connection, it uses the cloud to continually learn and increase functionality over time – adapting to speech patterns, vocabulary and users' personal preferences. For now, Echo is only available to those with an invitation, but you can request an invite on its product page. It is currently priced at $199, but Prime members can obtain it for $99 for a limited time. Although its technology appears impressive, to some people it might seem rather Orwellian. Let us know your opinion in the comments below.
7th November 2014
The world's first solar-powered road
A project creating the first solar-powered bicycle path will be officially opened in the Netherlands next week. If successful, it could be applied to 20% of the country's roads in the future.
Developed by the Netherlands' TNO research institute, SolaRoad is the first road in the world that converts sunlight into electricity. The pilot project, just a hundred metres long, will be used as a bike path and consists of concrete modules each measuring 2.5 by 3.5 metres. In one travelling direction, solar cells are fitted underneath a top layer of extremely strong glass, about 1 cm thick, with a dirt- and abrasion-resistant coating.
The other travelling direction contains no solar cells and is used to test various top layers. In time, energy generated from the road will be used for practical applications in street lighting, traffic systems, electric cars (which will drive on the surface) and households. This first section of SolaRoad is located in Krommenie, along the provincial road N203, next to the Texaco garage on the railway track side (see Google Street View).
For a three-year period, various measurements will be taken and tests performed to enable SolaRoad to undergo further development. The tests must answer questions such as: How does it behave in practice? How much energy does it produce? What is it like to cycle over? In the run-up to the surface being laid, laboratory tests were conducted to ensure all safety and other requirements were met. The modules were found to successfully carry the weight of heavy vehicles such as tractors, though how they respond to longer term wear and tear remains to be seen.
A spokesperson for the project, Sten de Wit, claims that up to 20% of the Netherlands' 140,000 km (87,000 miles) of road could potentially be adapted. The pilot road will be officially opened on 12th November by Dutch Minister of Economic Affairs, Henk Kamp.
A similar concept – Solar Roadways – is being developed in the US, though its technical and financial viability seems to have come under a lot of criticism in the blogosphere and elsewhere. Perhaps this Dutch effort can prove to be more successful.
7th November 2014
The clearest ever image of planets forming around a young star
The birth of planets has been revealed in astonishing detail by a telescope with 10 times the resolution of Hubble.
The Atacama Large Millimetre Array (ALMA) is a new radio telescope in northern Chile that became fully operational in 2013. Costing $1.4 billion, it is the most expensive ground-based telescope in the world, and the most sensitive at millimetre and submillimetre wavelengths. A cluster of 66 high-precision antennas work in unison to achieve phenomenal resolution.
ALMA was designed to open an entirely new "window" on the universe. Its capabilities have been demonstrated once again with a stunning image released by astronomers this week, showing extraordinarily fine detail in the planet-forming disk around a young star. These new results are a major step forward in the understanding of protoplanetary disks and the formation of planets.
HL Tau is a million-year-old Sun-like star, located 450 light-years from Earth in the constellation of Taurus. The photo seen here exceeds all expectations and reveals a series of concentric and bright rings, separated by gaps. These new substructures have never been seen before and are believed to show the possible positions of planets forming in the dark patches – similar to how our own Solar System would have looked more than 4 billion years ago.
ALMA Deputy Director, Stuart Corder: "These features are almost certainly the result of young planet-like bodies forming in the disk. This is surprising, since such young stars are not expected to have large planetary bodies capable of producing the structures we see in this image."
Catherine Vlahakis, Deputy Program Scientist: "When we first saw this image, we were astounded at the spectacular level of detail. HL Tauri is no more than a million years old, yet already its disk appears to be full of forming planets. This one image alone will revolutionise theories of planet formation."
ALMA's new high-resolution capabilities were achieved by spacing the antennas up to 15 kilometres apart. This baseline at millimetre wavelengths enabled a resolution of 35 milliarcseconds – equivalent to a penny seen from over 110 kilometres away. An even larger cluster of telescopes known as the Square Kilometre Array is planned for operation in 2024. This will have 50 times the resolution of ALMA and 500 times that of Hubble.
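The "penny from over 110 kilometres" comparison checks out with the small-angle formula, angle ≈ size / distance (the 19.05 mm penny diameter below is our assumption, taken from the US one-cent coin):

```python
import math

# Milliarcseconds per radian: degrees -> arcseconds -> milliarcseconds.
MAS_PER_RAD = 180 / math.pi * 3600 * 1000

penny_m = 0.01905       # assumed penny diameter (US cent, 19.05 mm)
distance_m = 110_000    # 110 km

# Small-angle approximation: angular size = diameter / distance.
angle_mas = penny_m / distance_m * MAS_PER_RAD
print(f"{angle_mas:.0f} mas")   # prints "36 mas"
```

That works out to about 36 milliarcseconds, consistent with the quoted 35 mas resolution.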
5th November 2014
Scientists uncover potential drug to tackle 'undruggable' fault in third of cancers
Scientists have found a possible way to halt one of the most common faults in many types of cancer, according to research presented at the National Cancer Research Institute (NCRI) Cancer Conference in Liverpool today.
Molecular structure of KRAS, part of the Ras family of proteins. The three Ras genes (HRAS, KRAS and NRAS) are the most common oncogenes in human cancer.
A team of scientists at the Max Planck Institute of Molecular Physiology in Germany has uncovered a new strategy and new potential drug to target an important signalling protein in cells called Ras, which is faulty in a third of cancers. When the Ras protein travels from the centre of a cell to the edge of the cell membrane, it becomes ‘switched on’ and sends signals which tell cells to grow and divide. Faulty versions of this protein cause too many of these signals to be produced – leading to cancer.
For decades, scientists have been attempting to target Ras, but with little success. The reason the protein is so difficult to target is because it lacks an obvious spot on its surface that potential drug molecules can fit into in order to switch it off, like a key closing a lock.
But now the researchers have shown that instead of directly targeting the faulty protein itself, they can stop it moving to the surface of the cell by blocking another protein which transports Ras – preventing it from triggering cancer in the first place. By targeting a link in the chain reaction that switches on the Ras protein, the scientists have opened opportunities to develop new treatments in the future.
Dr Herbert Waldmann at the Max Planck Institute of Molecular Physiology, said: “We’ve been scratching our heads for decades to find a solution to one of the oldest conundrums in cancer research. And we’re excited to discover that it’s actually possible to completely bypass this cancer-causing protein rather than attack it directly.
“We’re making new improvements on compounds for potential drugs, although the challenge still lies in developing a treatment that exploits this discovery without ruining the workings of healthy cells.”
Professor Matt Seymour, clinical research director at the NCRI, said: “This is an exciting approach to targeting one of the most common faults in cancer, which could lead to a new way of treating the disease. The research is still at a very early stage, and it will be years before it can benefit patients – but it is a key step forward in the field.”
4th November 2014
DARPA circuit achieves speed of 1 terahertz (THz)
The fastest ever integrated circuit has been announced by DARPA – achieving one terahertz (10¹² Hz), or a trillion cycles per second.
Guinness World Records has officially recognised DARPA's Terahertz Electronics program for creating the fastest solid-state amplifier integrated circuit ever measured. The ten-stage common-source amplifier operates at a speed of one terahertz (1,000,000,000,000 Hz), or one trillion cycles per second — 150 billion cycles per second faster than the existing world record of 850 gigahertz, set in 2012.
“This breakthrough could lead to revolutionary technologies such as high-resolution security imaging systems, improved collision-avoidance radar, communications networks with many times the capacity of current systems and spectrometers that could detect potentially dangerous chemicals and explosives with much greater sensitivity,” said Dev Palmer, DARPA program manager.
Developed by Northrop Grumman Corporation, the Terahertz Monolithic Integrated Circuit (TMIC) exhibits power gains several orders of magnitude beyond the current state of the art, using a super-scaled 25 nanometer gate-length. Gain, which is measured logarithmically in decibels, similar to how earthquake intensity is measured on the Richter scale, describes the ability of an amplifier to increase the power of a signal from the input to the output. The Northrop Grumman TMIC showed a measured gain of nine decibels at 1.0 terahertz and 10 decibels at 1.03 terahertz. By contrast, current smartphone technology operates at one to two gigahertz and wireless networks at 5.7 gigahertz.
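Because gain is logarithmic, the reported decibel figures can be converted to linear power ratios with the standard 10·log₁₀ definition; a quick calculation shows what 9 and 10 dB mean in practice:

```python
# Convert a decibel power gain to a linear power ratio:
# gain_dB = 10 * log10(P_out / P_in), so P_out / P_in = 10^(dB / 10).
def db_to_power_ratio(db):
    return 10 ** (db / 10)

print(db_to_power_ratio(9))    # ~7.9x output power at 1.0 THz
print(db_to_power_ratio(10))   # exactly 10x output power at 1.03 THz
```

So the TMIC's 9 dB gain at 1 THz amplifies signal power nearly eightfold per pass.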
“Gains of six decibels or more start to move this research from the laboratory bench to practical applications — nine decibels of gain is unheard of at terahertz frequencies,” said Palmer. “This opens up new possibilities for building terahertz radio circuits.”
For years, researchers have been looking to exploit the tremendously high-frequency band beginning above 300 gigahertz where the wavelengths are less than one millimetre. The terahertz level has proven to be somewhat elusive though, due to a lack of effective means to generate, detect, process and radiate the necessary high-frequency signals.
Current electronics using solid-state technologies have largely been unable to access the sub-millimetre band of the electromagnetic spectrum due to insufficient transistor performance. To address the “terahertz gap,” engineers have traditionally used frequency conversion — converting alternating current at one frequency to alternating current at another frequency — to multiply circuit operating frequencies up from millimetre-wave frequencies. This approach, however, restricts the output power of electrical devices and adversely affects signal-to-noise ratio. Frequency conversion also increases device size, weight and power requirements.
DARPA has made a series of strategic investments in terahertz electronics through its HiFIVE, SWIFT and TFAST programs. Each program has built on the successes of the previous one, providing the foundational research necessary for frequencies to reach the terahertz threshold.
3rd November 2014
3D printer is 10 times faster than current models
Hewlett-Packard (HP) has unveiled a 3D printer that it claims will be 10 times faster than current models.
HP has introduced its vision for the future of computing and 3D printing by unveiling its new "Blended Reality" ecosystem. Designed to break down the barriers between the digital and physical worlds, this ecosystem is underpinned by two key advancements:
- HP Multi Jet Fusion: A revolutionary technology engineered to resolve critical gaps in the combination of speed, quality and cost, and deliver on the potential of 3D printing.
- Sprout by HP: A first-of-its-kind Immersive Computing platform that will redefine the user experience and that creates a foundation for future immersive technologies.
"We are on the cusp of a transformative era in computing and printing," said Dion Weisler, executive vice president, Printing & Personal Systems (PPS). "Our ability to deliver Blended Reality technologies will reduce the barriers between the digital and physical worlds, enabling us to express ourselves at the speed of thought – without filters, without limitations. This ecosystem opens up new market categories that can define the future, empowering people to create, interact and inspire like never before."
"As we examined the existing 3D print market, we saw a great deal of potential but also saw major gaps in the combination of speed, quality and cost," said Stephen Nigro, vice president of Inkjet and Graphic Solutions at HP. "HP Multi Jet Fusion is designed to transform manufacturing across industries by delivering on the full potential of 3D printing with better quality, increased productivity, and break-through economics."
Multi Jet Fusion is built on HP Thermal Inkjet technology and features a unique synchronous architecture that significantly improves the commercial viability of 3D printing and has the potential to change the way we think about manufacturing.
- 10 times faster: Images entire surface areas versus one point at a time to achieve breakthrough functional build speeds, 10 times faster than the fastest technology in the market today.
- New levels of quality, strength and durability: A multi-agent printing process utilising HP Thermal Inkjet arrays that simultaneously apply multiple liquid agents to produce best-in-class quality, combining greater accuracy, resiliency and uniform part strength along all three axes.
- Accuracy and detail: Capable of delivering fully functional parts with more accuracy, finer details and smooth surfaces, and able to manipulate part and material properties, including form, texture, friction, strength, elasticity, electrical, thermal properties and more.
- Achieves breakthrough economics: Unifies and integrates various steps of the print process to reduce running time, cost, energy consumption and waste to significantly improve 3D printing economics.
Sprout – the first product available in HP's Blended Reality ecosystem – combines the power of an advanced desktop computer with an immersive, natural user interface to create a new computing experience. As shown in the image above, this puts a scanner, depth sensor, hi-resolution camera and projector into a single device, allowing users to take physical items and seamlessly merge them into a digital workspace. The system also delivers an unmatched collaboration platform, allowing users in multiple locations to collaborate on and manipulate a single piece of digital content in real-time.
"We live in a 3D world, but today we create in a 2D world on existing devices," said Ron Coughlin, senior vice president, Consumer PC & Solutions, HP. "Sprout by HP is a big step forward in reimagining the boundaries of how we create and engage with technology to allow users to move seamlessly from thought to expression."
Together, HP says these advancements have the potential to revolutionise production and offer small businesses a new way to produce goods and parts for customers. HP aims to invite open collaboration and partnerships in 2015 to further develop its 3D print system, with general consumer availability in the second half of 2016.