Blog » AI & Robotics

 
     
 

10th March 2014

Table tennis: man vs machine

In 1997, Deep Blue became the first computer to win against a human chess champion, when it defeated Garry Kasparov. In 2011, IBM's Watson competed on the Jeopardy! quiz show against former winners Brad Rutter and Ken Jennings, defeating them both. Now, another competition between man and machine is about to unfold. On Tuesday 11th March, KUKA – a German manufacturer of high-end industrial robots – will open its first plant in Shanghai, China. The opening will be celebrated with a table tennis match between its KR AGILUS robot and Timo Boll, the German champion. The event is intended to demonstrate the speed, precision and flexibility of KUKA's industrial robots.

 

 

1st February 2014

Philosophy of mind and robotics with Dan Barry

Dan Barry is an engineer and scientist, currently serving as the Co-Chair of Artificial Intelligence and Robotics at Singularity University. In 2005, he started his own company, Denbar Robotics, which creates robotic assistants for home and commercial use. In 2011, he co-founded 9th Sense, a company that sells telepresence robots. He has seven patents, has published over 50 articles in scientific journals, and is a former NASA astronaut. In this video, Barry asks the question: "How are we going to know that a robot is self-aware?"

 

 

9th January 2014

IBM forms Watson Group to meet growing demand for cognitive innovations

Headquartered in New York City's "Silicon Alley", the new Watson Group formed by IBM will fuel innovative products and startups – introducing cloud solutions to accelerate research, visualise Big Data and enable analytics exploration.

 

 

IBM today announced it will establish the IBM Watson Group, a new business unit dedicated to the development and commercialisation of cloud-delivered cognitive innovations. The move signifies a strategic shift by IBM to accelerate into the marketplace a new class of software, services and apps that can "think", improve by learning, and discover answers and insights to complex questions from massive amounts of Big Data.

IBM will invest more than $1 billion into the Watson Group, focusing on research and development to bring cloud-delivered cognitive applications and services to market. This will include $100 million available for venture investments to support IBM's recently launched ecosystem of start-ups and businesses, which are building a new class of cognitive apps powered by Watson, in the IBM Watson Developers Cloud.

According to technology research firm Gartner, smart machines will be the most disruptive change ever brought about by information technology, and can make people more effective, empowering them to do "the impossible."

The IBM Watson Group will have a new headquarters at 51 Astor Place in New York City's "Silicon Alley" technology hub, leveraging the talents of 2,000 professionals, whose goal is to design, develop and accelerate the adoption of Watson cognitive technologies that transform industries and professions. The new group will tap subject matter experts from IBM's Research, Services, Software and Systems divisions, as well as industry experts who will identify markets that cognitive computing can disrupt and evolve, such as healthcare, financial services, retail, travel and telecommunications.

Nearly three years after its triumph on the TV show Jeopardy!, IBM has advanced Watson from a quiz game innovation into a commercial technology. Now delivered from the cloud and powering new consumer apps, Watson is 24 times faster and 90 percent smaller – IBM has shrunk Watson from the size of a master bedroom to three stacked pizza boxes.

Named after IBM founder Thomas J. Watson, the machine was developed in IBM’s Research labs. Using natural language processing and analytics, Watson handles information akin to how people think, representing a major shift in the ability to quickly analyse, understand and respond to Big Data. Watson’s ability to answer complex questions in natural language with speed, accuracy and confidence will transform decision making across a range of industries.

"Watson is one of the most significant innovations in IBM's 100 year history, and one that we want to share with the world," says IBM Senior Vice President Mike Rhodin (pictured below), who will lead the group. "These new cognitive computing innovations are designed to augment users’ knowledge – be it the researcher exploring genetic data to create new therapies, or a business executive who needs evidence-based insights to make a crucial decision."

 

[Image: Mike Rhodin, IBM Watson]

 

7th January 2014

Intel at CES 2014

At the Consumer Electronics Show (CES) in Las Vegas, Intel Corporation has been showing off its latest innovative technologies. These include an intelligent 3D camera system, a range of new wearable electronics, and a 22nm dual-core PC the size of an SD card.

 


 

Intel CEO Brian Krzanich has outlined a range of new products, initiatives and strategic relationships aimed at accelerating innovation across a range of mobile and wearable devices. He made the announcements during the pre-show keynote for the 2014 Consumer Electronics Show in Las Vegas, the biggest gathering of the tech industry in the USA.

Krzanich's keynote painted a vision of how the landscape of computing is being reshaped, one in which security is too important not to be embedded in every device. The world is entering a new era of integrated computing, defined not by the device but by the integration of technology into people's lifestyles in ways that offer new utility and value. As examples, he highlighted several immersive and intuitive technologies that Intel will begin offering in 2014, such as Intel RealSense – hardware and software that will bring human senses to Intel-based devices. This will include 3D cameras that deliver more intelligent experiences – improving the way people learn, collaborate and are entertained.

The first Intel RealSense 3D camera features a best-in-class depth sensor and a full 1080p colour camera. It can detect finger-level movements, enabling highly accurate gesture recognition, and can track facial features to interpret movement and emotion. It can also distinguish foregrounds from backgrounds, allowing touch-free control, enhanced interactive augmented reality (AR), simple 3D scanning of objects, and more.
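
To make the foreground/background capability concrete, here is a minimal sketch, assuming only a per-pixel depth map as input (illustrative only – this is not Intel's RealSense SDK), of how depth data lets software separate a user from the scene behind them:

```python
# Minimal sketch, not Intel's SDK: given a depth map in millimetres,
# keep only pixels nearer than a cutoff to isolate the user from the background.
import numpy as np

def foreground_mask(depth_mm: np.ndarray, cutoff_mm: float = 900.0) -> np.ndarray:
    """True where a pixel is closer than cutoff_mm and has a valid reading."""
    valid = depth_mm > 0                  # 0 typically means "no depth measured"
    return valid & (depth_mm < cutoff_mm)

# Usage idea: composite the user over a new background in a video call.
# frame[~mask] = new_background[~mask], where mask = foreground_mask(depth)
```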

This camera will be integrated into a growing spectrum of Intel-based devices, including 2-in-1, tablet, Ultrabook, notebook, and all-in-one (AIO) designs. Systems with the new camera will be available beginning in the second half of 2014 from Acer, Asus, Dell, Fujitsu, HP, Lenovo and NEC.

To advance the computer's "hearing" sense, a new generation of speech recognition technology will be available on a variety of systems. This conversational personal assistant works with popular websites and applications. It comes with selectable personalities, and allows for ongoing dialogue with Intel-based devices. People can simply tell it to play music, get answers, connect with friends and find content – all by using natural language. The assistant can also check calendars, fetch maps and directions, find flights or book a dinner reservation. Because it works offline, people can control their device, dictate notes and more without an Internet connection.

 

 

Krzanich then explained how Intel aims to accelerate wearable device innovation. A number of reference designs were highlighted including: smart earbuds providing biometric and fitness capabilities, a smart headset that is always ready and can integrate with existing personal assistant technologies, a smart wireless charging bowl, a smart baby onesie and a smart bottle warmer that will start warming milk when the onesie senses the baby is awake and hungry.

The smart earbuds (pictured below) provide full stereo audio and monitor heart rate and pulse, while applications on the user's phone keep track of running distance and calories burned. The product includes software to precision-tune workouts by automatically choosing music that matches the target heart rate profile. As an added bonus, it harvests energy directly from the audio microphone jack, eliminating the need for a battery or additional power source to charge the product.

 

[Image: Intel's smart earbuds]

 

The Intel CEO announced collaborations to increase dialogue and cooperation between the fashion and technology industries, to explore and bring to market new smart wearable electronics. He also kicked off the Intel "Make it Wearable" challenge – a global effort aimed at accelerating creativity and innovation with technology. This effort will call upon the smartest and most creative minds to consider factors impacting the proliferation of wearable devices and ubiquitous computing, such as meaningful usages, aesthetics, battery life, security and privacy.

In addition to reference designs for wearable technology, Intel will offer a number of accessible, low-cost entry platforms aimed at lowering entry barriers for individuals and small companies, allowing them to create innovative web-connected wearables or other small form factor devices. Underscoring this point, Krzanich announced Intel Edison – a low-power, 22nm-based computer in an SD card form factor with built-in wireless abilities and support for multiple operating systems. From prototype to production, Intel Edison will enable rapid innovation and product development by a range of inventors, entrepreneurs and consumer product designers when available this summer.

 


 

"Wearables are not everywhere today, because they aren't yet solving real problems and they aren't yet integrated with our lifestyles," said Krzanich. "We're focused on addressing this engineering innovation challenge. Our goal is: if something computes and connects, it does it best with Intel inside."

Krzanich also discussed how Intel is addressing a critical issue for the industry as a whole: conflict minerals from the Democratic Republic of the Congo (DRC). Intel has achieved a critical milestone: the minerals used in microprocessor silicon and packages manufactured in Intel's factories are now "conflict-free", as confirmed by third-party audits.

"Two years ago, I told several colleagues that we needed a hard goal, a commitment to reasonably conclude that the metals used in our microprocessors are conflict-free," Krzanich said. "We felt an obligation to implement changes in our supply chain to ensure that our business and our products were not inadvertently funding human atrocities in the Democratic Republic of the Congo. Even though we have reached this milestone, it is just a start. We will continue our audits and resolve issues that are found."

 


 

20th December 2013

'Transcendence' teaser trailer

A teaser has been released for an upcoming movie about the technological singularity – a hypothetical point when AI begins to vastly exceed human intelligence. Due for release on 17th April 2014, 'Transcendence' will star Johnny Depp, Rebecca Hall, Morgan Freeman and Cillian Murphy. It marks the directorial debut of Wally Pfister, with Christopher Nolan and Emma Thomas as executive producers.

The movie synopsis is as follows:

Dr. Will Caster (Johnny Depp) is the foremost researcher in the field of Artificial Intelligence, working to create a sentient machine that combines the collective intelligence of everything ever known with the full range of human emotions. His highly controversial experiments have made him famous, but they have also made him the prime target of anti-technology extremists who will do whatever it takes to stop him.

However, in their attempt to destroy Will, they inadvertently become the catalyst for him to succeed—to be a participant in his own transcendence. For his wife Evelyn (Rebecca Hall) and best friend Max Waters (Paul Bettany), both fellow researchers, the question is not if they can…but if they should.

Their worst fears are realized as Will's thirst for knowledge evolves into a seemingly omnipresent quest for power, to what end is unknown. The only thing that is becoming terrifyingly clear is there may be no way to stop him.

The teaser below has a Matrix-like feel to it. However, although The Matrix dealt with AI, virtual reality and other futuristic technology, no major movie has focused specifically on the "technological singularity" itself – until now. Transcendence could therefore be a pivotal moment in terms of introducing the concept to a mainstream audience and bringing greater public awareness of exponential trends. For more information, visit the official website.

 

 

14th December 2013

Valkyrie: NASA's superhero robot

The DARPA Robotics Challenge is a $2 million prize competition run by the Defense Advanced Research Projects Agency. Open since October 2012 and concluding in December 2014, it aims to develop robots that can do "complex tasks in dangerous, degraded, human-engineered environments."

Among the competing entries is "Valkyrie" – a humanoid machine being developed by NASA. This is a variant of the earlier "Robonaut" that was delivered to the International Space Station in February 2011. As seen in the video below, it now features legs and interchangeable arms with 44 axes of movement, alongside a wide array of cameras and sensors. Valkyrie will eventually walk around untethered, pick up and manipulate objects while navigating a variety of terrain and even have the ability to drive vehicles.

In the future, it is hoped that robots like Valkyrie will be used in missions to Mars. Nicolaus Radford, Principal Investigator and team leader of the NASA JSC Dextrous Robotics Lab, explains: "NASA saw a considerable overlap between what the DRC was trying to accomplish and NASA's goals as an agency. We want to get to Mars. Likely, NASA will send robots ahead of the astronauts to the planet. These robots will start preparing the way for the human explorers, and when the humans arrive, the robots and the humans will work together."

For more information, see the official website.

• In a related story, Google yesterday acquired Boston Dynamics, which has multi-million dollar contracts with DARPA and is behind such robots as ATLAS, BigDog and PETMAN. This is just the latest in a whole series of robotics companies recently bought by Google.

 

 

11th December 2013

Crime-predicting robot to patrol streets from 2015

Knightscope, a Silicon Valley-based robotics company, is developing the K5 Autonomous Data Machine. This 5-foot-tall mobile robot is equipped with night-time video cameras, thermal imaging, license plate recognition, radar, audio and other sensors – in combination with behavioural analysis software – which it can use to predict crimes. The model seen in this video is only a prototype, but the full version being launched in 2015 will have facial recognition software and the ability to detect chemical or biological weapons, along with airborne pathogens. Later models will include the ability to traverse curbs and other terrain. Clients will be able to rent these machines for $6.25/hour, or $1,000/month, which is competitive with low-wage human security guards. For more info, see the company's website at knightscope.com.

 

 

7th December 2013

Ultra-thin 'electronic skin' provides diagnosis and therapy

Researchers have developed a futuristic new medical device, resembling an electronic tattoo, which provides continuous patient monitoring and treatment.

 


 

An international team from the University of Illinois at Urbana/Champaign and the National Institute of Biomedical Imaging and Bioengineering (NIBIB) has created this form of "electronic skin". The device, measuring just 1 x 2 cm (0.39 x 0.78"), adheres like a sticking plaster and is highly flexible, conforming to contours and remaining in place even when skin is stretched or pinched. It provides non-invasive measurements of blood flow and temperature from any part of the body, with minimal patient discomfort, while delivering therapeutic functions.

The array features a combination of miniature power coils, transistors, sensors and heating elements. Its readings were compared against those of an infrared camera to assess how well each detects local variations in skin temperature and blood flow, with a range of mental and physical stimuli used to trigger readings. The results were virtually identical using the two methods, meaning the electronic skin matches the "gold standard" of infrared technology. Another test, using pulses of heat from the array, demonstrated its success in accurately measuring skin perspiration and overall hydration.

Future versions will incorporate a wireless power coil and antenna for remote data transfer. New sensors could eventually be developed that reveal blood cell counts, the precise levels of a circulating medication, or the activity of metabolites (such as alcohols, antioxidants, nucleotides, organic and amino acids, sugars and vitamins). The heating elements could deliver heat therapy to specific regions – increasing blood flow in the affected area for accelerated healing, pain relief, decreased joint stiffness, muscle spasm relief, or reduced inflammation. It could even incorporate actuators that deliver an electrical charge, or nanoparticles.

Such diagnostic and therapeutic functions could be performed while patients go about their daily business, with data relayed via cellphone to a doctor or AI program. Looking further into the future, these devices might be incorporated into clothing and shoes. Perhaps eventually, later this century, they will be sufficiently compact and distributed that almost every part of the human body could be treated and monitored in real-time. With a comprehensive merging of the organic and inorganic, the age of transhumanism would truly be upon us.

 

2nd December 2013

"Amazon Prime Air" will use drones for 30 minute delivery

Online retailer Amazon has revealed a new rapid delivery method that will use unmanned aerial vehicles to send packages to customers within 30 minutes. Assuming the Federal Aviation Administration (FAA) approves it, this futuristic service – "Amazon Prime Air" – could be introduced by 2015. Read more at the company's press release.

 

 

13th November 2013

It takes human researchers 12 years "to do what this robot can do in a week."

Every week at a National Institutes of Health (NIH) drug-testing lab, a robotics system performs millions of experiments faster and with greater precision than any human could. The simple goal: to find new treatments and cures.

 

 

 

31st October 2013

Flying sphere exploits collisions to move around

A new flying robot called GimBall has been created by a team at the Swiss Federal Institute of Technology in Lausanne. Using a compass, altitude sensor and novel gyroscopic system, the machine can navigate its way through cramped and cluttered environments – maintaining stability even during collisions.

The researchers believe it could be useful in difficult or dangerous situations – such as smoke-filled areas – in which most ordinary robots would struggle to operate. GimBall will be presented at the iREX conference, a robot exhibition in Tokyo, Japan, from 5th-9th November.

 

 

29th October 2013

Artificial intelligence breakthrough: CAPTCHA 'Turing Test' is passed

A new software algorithm is capable of solving CAPTCHAs – a test commonly used in computing to determine whether or not the user is human.

 


 

Vicarious, a startup developing artificial intelligence software, has announced that its algorithms can now reliably solve modern CAPTCHAs, including Google's reCAPTCHA, the world's most widely used test of a machine's ability to act human.

A CAPTCHA (which stands for "Completely Automated Public Turing test to tell Computers and Humans Apart") is considered broken if an algorithm is able to achieve a precision of at least 1%. Leveraging core insights from machine learning and neuroscience, the Vicarious AI can achieve success rates of up to 90% on modern CAPTCHAs from Google, Yahoo, PayPal, Captcha.com, and others. This advancement, the company says, renders text-based CAPTCHAs no longer effective as a Turing test.
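
To see why 1% is the operative threshold, consider a minimal evaluation harness (a hypothetical sketch – Vicarious has not published its code). At 1% precision an attacker needs around 100 attempts per successful solve; at the 90% reported here, barely more than one.

```python
# Hypothetical harness for measuring a CAPTCHA solver's precision.
# A text CAPTCHA scheme counts as broken once precision reaches 1%,
# because an automated attacker can simply retry until a solve succeeds.

def precision(solver, labelled_set):
    """Fraction of CAPTCHAs transcribed exactly right.

    labelled_set: iterable of (image, correct_text) pairs.
    solver: callable mapping an image to its guessed text.
    """
    correct = sum(1 for image, answer in labelled_set if solver(image) == answer)
    return correct / len(labelled_set)
```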

"Recent AI systems like IBM’s Watson and deep neural networks rely on brute force: connecting massive computing power to massive datasets. This is the first time this distinctively human act of perception has been achieved, and it uses relatively minuscule amounts of data and computing power. The Vicarious algorithms achieve a level of effectiveness and efficiency much closer to actual human brains", said Vicarious co-founder D. Scott Phoenix.

 

 

"Understanding how brain creates intelligence is the ultimate scientific challenge. Vicarious has a long-term strategy for developing human level artificial intelligence, and it starts with building a brain-like vision system. Modern CAPTCHAs provide a snapshot of the challenges of visual perception, and solving those in a general way required us to understand how the brain does it", said co-founder Dr. Dileep George.

Solving CAPTCHA is the first public demonstration of Recursive Cortical Network (RCN) technology. Although commercial applications are still many years away, RCN will have broad implications for robotics, medical image analysis, image and video search, and many other fields.

"We should not underestimate the significance of Vicarious crossing this milestone," said Facebook co-founder and board member Dustin Moskovitz. "This is an exciting time for artificial intelligence research, and they are at the forefront of building the first truly intelligent machines."

 

16th October 2013

Lowering the cost of solar installation with robots and cleaning technology

California-based startup company, Alion Energy, has developed a new automated system that is seven times faster than conventional solar installation. In addition, its cleaning technology can boost the efficiency of the panels. It is hoped this new system could significantly lower costs, making solar more competitive with fossil fuels.

 

 

16th October 2013

A blueprint for restoring touch with a prosthetic hand

New research is laying the groundwork for touch-sensitive prosthetic limbs that could provide real-time sensory information to amputees via direct interface with the brain.

 

[Image: robotic hand. Credit: PNAS, 2013]

 

The research, published in Proceedings of the National Academy of Sciences, marks an important step toward new technology that – if developed successfully – would increase the functionality of robotic prosthetic limbs, making them act more like real limbs.

“To restore sensory motor function of an arm, you not only have to replace the motor signals that the brain sends to the arm to move it around, but you also have to replace the sensory signals that the arm sends back to the brain,” said the study’s senior author, Sliman Bensmaia, PhD, assistant professor at the University of Chicago. “We think the key is to invoke what we know about how the brain of the intact organism processes sensory information, and then try to reproduce these patterns of neural activity through stimulation of the brain.”

Bensmaia’s research is part of Revolutionising Prosthetics – a multi-year DARPA project that aims to create a modular, artificial upper limb to restore natural motor control and sensation in amputees. Managed by the Johns Hopkins University Applied Physics Laboratory, it has brought together an interdisciplinary team of experts from academic institutions, government agencies and private companies.

Bensmaia and colleagues at the University of Chicago are working specifically on the sensory aspects of these limbs. In a series of experiments with monkeys, whose sensory systems closely resemble those of humans, they identified patterns of neural activity that occur during natural object manipulation and then successfully induced these patterns through artificial means.

 


 

The first set of experiments focused on contact location, or sensing where the skin has been touched. The animals were trained to identify several patterns of physical contact with their fingers. Researchers then connected electrodes to areas of the brain corresponding to each finger and replaced physical touches with electrical stimuli delivered to the appropriate areas of the brain. The result: the animals reacted the same way to artificial stimulation as they did to physical contact.

Next, the researchers focused on the sensation of pressure. In this case, they developed an algorithm to generate the appropriate amount of electrical current to elicit a sensation of pressure. Again, the animals’ response was the same whether the stimuli were felt through their fingers or through artificial means.

Finally, the team studied the sensation of contact events. When the hand first touches or releases an object, it produces a burst of activity in the brain. Again, the researchers established that these bursts of brain activity can be mimicked through electrical stimulation.

The result of these experiments is a set of instructions that can be incorporated into a robotic prosthetic arm to provide sensory feedback to the brain through a neural interface. Bensmaia believes such feedback will bring these devices closer to being tested in human clinical trials.
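
The three findings map naturally onto a control rule: which electrode to drive (contact location), how much current to deliver (pressure), and a transient burst (contact events). The sketch below is a hypothetical illustration of that mapping – not the study's published algorithm – and every constant in it is invented:

```python
# Hypothetical fingertip-sensor-to-stimulation mapping, mirroring the three
# experiments described above. All constants are illustrative, not from the paper.

def stimulation(pressure_n, prev_pressure_n, electrode_for_finger,
                gain_ma_per_n=0.5, burst_ma=2.0, contact_threshold_n=0.05):
    """Return (electrode, current_mA) for one control tick."""
    current = gain_ma_per_n * pressure_n        # graded pressure -> graded current
    touching = pressure_n > contact_threshold_n
    was_touching = prev_pressure_n > contact_threshold_n
    if touching != was_touching:                # contact made or broken
        current += burst_ma                     # transient onset/offset burst
    return electrode_for_finger, current
```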

“The algorithms to decipher motor signals have come quite a long way, where you can now control arms with seven degrees of freedom. It’s very sophisticated. But I think there’s a strong argument to be made that they will not be clinically viable until the sensory feedback is incorporated,” Bensmaia said. “When it is, the functionality of these limbs will increase substantially.”

 

[Image: robotic prosthetic hand. Credit: Johns Hopkins University Applied Physics Laboratory]

 

5th October 2013

Noam Chomsky: The Singularity is science fiction

Dr. Noam Chomsky is a famed linguist, philosopher, cognitive scientist and political activist. Most of his career has been spent at the Massachusetts Institute of Technology (MIT), where he is currently Professor Emeritus. He has authored over 100 books. A prominent cultural figure, he was voted the world's top public intellectual in a 2005 poll. In this recent video, Chomsky talks to Nikola Danaylov, creator of Singularity Weblog. They discuss a range of topics including the balance between his academic and political life; artificial intelligence and reverse engineering of the human brain; why in his view both Deep Blue and Watson are little more than PR; the slow but substantial progress of our civilisation; and the technological singularity.

 

 

4th October 2013

Latest videos of the ATLAS, LS3, and WildCat robots

Engineering and robotics firm Boston Dynamics has released three new videos showing the latest progress on its hi-tech machines.

First up is the ATLAS, an early prototype of which was seen in October 2012. Since then, this humanoid robot has undergone a number of tests and design revisions. It is now able to walk over rock-strewn terrain with ease, as well as remaining balanced when hit by an object from the side.

 

 

 

Next is the Legged Squad Support System (LS3). Due to enter service in 2014, this robotic pack mule will assist soldiers with carrying heavy equipment through rugged terrain. It is designed to automatically follow a squad, using advanced computer vision, so does not require a dedicated driver. Each LS3 will carry up to 400 lbs of gear and enough fuel for a 20-mile mission lasting 24 hours. This video shows field testing at Twentynine Palms, California.

 

 

 

 

Finally, here is the WildCat. This machine is a more advanced version of the Cheetah robot seen last year. Although not quite as fast (yet), it can operate without being tethered to a power supply. Here it is shown running outdoors at a maximum speed of 16 mph (compared with the Cheetah's 28 mph).

 

 

 

29th September 2013

World's first mind-controlled bionic leg

This week, the science of bionics helped over 1 million Americans with leg amputations take a giant step forward.

 

 

The Rehabilitation Institute of Chicago (RIC) has revealed the first clinical application of a thought-controlled bionic leg, published in the New England Journal of Medicine. This innovative technology represents a significant milestone in the rapidly growing field of bionics. Until now, only thought-controlled bionic arms were available to amputees.

Levi Hargrove, PhD, the lead scientist of this research at RIC's Center for Bionic Medicine, developed a system to use neural signals to safely improve limb control of a bionic leg.

"This new bionic leg features incredibly intelligent engineering," said Hargrove. "It learns and performs activities unprecedented for any leg amputee – including seamless transitions between sitting, walking, ascending and descending stairs and ramps and repositioning the leg while seated."

This method improves upon prosthetic legs that rely only on robotic sensors and remote controls, which do not allow for intuitive, thought-based control of the prosthetic.

The case study focuses on Zac Vawter, an amputee who had targeted muscle reinnervation surgery in 2009 to redirect nerves from damaged muscle in his amputated limb to healthy hamstring muscle above his knee. When the redirected nerves instruct the muscles to contract, sensors on the patient’s leg detect tiny electrical signals from the muscles. Specially-designed computer software analyses these signals and data from sensors in the robotic leg. It instantaneously decodes the type of movement the patient is trying to perform and then sends those commands down to the robotic leg. Using muscle signals, instead of robotic sensors, makes the system safer and more intuitive.
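
As a rough sketch of that decoding step (this is a generic pattern-recognition pipeline, not RIC's software; the features and movement labels are assumptions):

```python
# Generic EMG intent-decoding sketch: window the muscle signals, extract
# simple time-domain features, then classify the intended movement.
import numpy as np

def emg_features(window: np.ndarray) -> np.ndarray:
    """window: (samples, channels) of EMG. Returns one feature vector."""
    mav = np.mean(np.abs(window), axis=0)                    # mean absolute value
    zero_crossings = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, zero_crossings])

def decode_intent(window, classifier):
    """classifier: any fitted model with .predict(), trained on labels
    such as 'walk', 'stairs-up', 'stairs-down', 'sit', 'reposition'."""
    return classifier.predict([emg_features(window)])[0]
```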

"The bionic leg is a big improvement compared to my regular prosthetic leg,” said Vawter. “It responds quickly and more appropriately, allowing me to interact with my environment in a way that is similar to how I moved before my amputation. For the first time since my injury, the bionic leg allows me to seamlessly walk up and down stairs and even reposition the prosthetic by thinking about the movement I want to perform. This is a huge milestone for me and for all leg amputees."

The US Army’s Telemedicine and Advanced Technology Research Center (TATRC) funded the study with an $8 million grant to improve the control of robotic leg prostheses by adding neural information to the control system. Due to this unusually large grant, RIC was able to accomplish these breakthrough innovations in only four years. It is unknown what the eventual cost of supplying these bionic legs to patients will be. However, TATRC and RIC intend to make them available for in-home testing for both military and civilian populations within the next three to five years.

 

 

19th September 2013

Nearly half of US jobs could be at risk of computerisation within 20 years

A study by the Oxford Martin School shows that nearly half of US jobs could be at risk of computerisation within 20 years. Transport, logistics and office roles are most likely to come under threat.

 


 

The new study, a collaboration between Dr Carl Benedikt Frey (Oxford Martin School) and Dr Michael A. Osborne (Department of Engineering Science, University of Oxford), found that jobs in transportation, logistics, as well as office and administrative support, are at "high risk" of automation. More surprisingly, occupations within the service industry are also highly susceptible, despite recent job growth in this sector.

"We identified several key bottlenecks currently preventing occupations being automated," says Dr. Osborne. "As big data helps to overcome these obstacles, a great number of jobs will be put at risk."

The study examined over 700 detailed occupation types, noting the types of tasks workers perform and the skills required. By weighting these factors, as well as the engineering obstacles currently preventing computerisation, the researchers assessed the degree to which these occupations may be automated in the coming decades.
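
In outline, the scoring works like the toy model below. The published paper trained a probabilistic classifier over O*NET skill variables; this sketch substitutes a simple logistic model, and the weights are invented purely for illustration.

```python
# Toy model: score an occupation's susceptibility to computerisation from its
# "bottleneck" skill requirements. Weights are invented, not the paper's.
import math

WEIGHTS = {"perception_and_manipulation": -2.0,
           "creative_intelligence": -3.0,
           "social_intelligence": -2.5}

def p_automatable(skills, bias=3.0):
    """skills: dict of bottleneck scores in [0, 1]. Returns a probability."""
    z = bias + sum(WEIGHTS[k] * v for k, v in skills.items())
    return 1 / (1 + math.exp(-z))

# Low bottleneck requirements imply high risk, e.g.:
# p_automatable({"perception_and_manipulation": 0.1,
#                "creative_intelligence": 0.1,
#                "social_intelligence": 0.2})   # ~0.88 with these toy weights
```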

 


 

"Our findings imply that as technology races ahead, low-skilled workers will move to tasks that are not susceptible to computerisation – i.e., tasks that require creative and social intelligence," the paper states. "For workers to win the race, however, they will have to acquire creative and social skills."

"While computerisation has been historically confined to routine tasks involving explicit rule-based activities, algorithms for big data are now rapidly entering domains reliant upon pattern recognition and can readily substitute for labour in a wide range of non-routine cognitive tasks. In addition, advanced robots are gaining enhanced senses and dexterity, allowing them to perform a broader scope of manual tasks. This is likely to change the nature of work across industries and occupations."

The low susceptibility of engineering and science occupations to computerisation, on the other hand, is largely due to the high degree of creative intelligence they require. However, even these occupations could be taken over by computers in the longer term.

Dr Frey said the United Kingdom is expected to face a similar challenge to the US. "While our analysis was based on detailed datasets relating to US occupations, the implications are likely to extend to employment in the UK and other developed countries," he said.

Full version of the paper:
http://www.futuretech.ox.ac.uk/files/The_Future_of_Employment_OMS_Working_Paper_1.pdf

 


 

11th September 2013

Robots taking over the economy: sudden rise of interacting machines trading at speeds too fast for humans

Researchers have discovered a "global ecology" of interacting machines that trade on the global markets at speeds too fast for humans, causing periodic outages. These high frequency trading algorithms could lead to increasingly large crashes, as the volume of data in the world continues to grow exponentially.

 


 

Recently, the global financial market experienced a series of computer glitches that abruptly brought operations to a halt. This was so serious that – on one day – it resulted in a third fewer shares being traded in the USA. One reason for these "flash freezes" may be the sudden emergence of mobs of ultrafast robots, which trade on the global markets and operate at speeds beyond human capability, thus overwhelming the system. The appearance of this "ultrafast machine ecology" is documented in a new study published today in Nature Scientific Reports.

The findings suggest that for time scales less than one second, the financial world makes a sudden transition into a cyber jungle inhabited by packs of aggressive trading algorithms. "These algorithms can operate so fast that humans are unable to participate in real time, and instead, an ultrafast ecology of robots rises up to take control," explains Neil Johnson, professor of physics in the College of Arts and Sciences at the University of Miami (UM).

"Our findings show that, in this new world of ultrafast robot algorithms, the behaviour of the market undergoes a fundamental and abrupt transition to another world where conventional market theories no longer apply," Johnson says.

Society's push for ever faster systems that outpace competitors has led to algorithms capable of operating faster than the response time for a human. For instance, the quickest a person can react to potential danger is about one second. Even a chess grandmaster takes around 650 milliseconds to realise that he is in trouble – yet microchips for trading can operate in a fraction of a millisecond (1 millisecond is 0.001 seconds).

In this study, the researchers assembled and analysed a high-throughput, millisecond-resolution price stream of multiple stocks and exchanges. From January 2006 through to February 2011, they found 18,520 extreme events lasting less than 1.5 seconds, including both crashes and spikes.
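
For a sense of what counts as an "extreme event" here, the sketch below flags them in a price series. The study's exact criteria differ; this assumes an event is a run of at least ten consecutive same-direction ticks moving the price by more than 0.8%.

```python
# Hedged sketch of flagging ultrafast crashes/spikes in a tick-level price list.
# Assumed definition (not necessarily the paper's): >= 10 consecutive
# same-direction ticks with a cumulative move of more than 0.8%.

def extreme_events(prices, min_run=10, min_move=0.008):
    """Return (start, end) index pairs of qualifying runs."""
    def direction(i):  # +1 up, -1 down, 0 flat, for the step prices[i-1] -> prices[i]
        return (prices[i] > prices[i - 1]) - (prices[i] < prices[i - 1])

    events, start = [], 0
    for i in range(2, len(prices) + 1):
        # close the current run when the direction changes or the series ends
        if i == len(prices) or direction(i) != direction(i - 1):
            ticks_in_run = (i - 1) - start
            move = abs(prices[i - 1] - prices[start]) / prices[start]
            if ticks_in_run >= min_run and move >= min_move:
                events.append((start, i - 1))
            start = i - 1
    return events
```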

 


 

The team realised that as the duration of these ultrafast extreme events fell below human response times, the number of crashes and spikes increased dramatically. They created a model to understand the behaviour and concluded that the events were the product of ultrafast computer trading and not attributable to other factors, such as regulations or mistaken trades. Johnson, who is head of the inter-disciplinary research group on complexity at UM, compares the situation to an ecological environment.

"As long as you have the normal combination of prey and predators, everything is in balance, but if you introduce predators that are too fast, they create extreme events," Johnson says. "What we see with the new ultrafast computer algorithms is predatory trading. In this case, the predator acts before the prey even knows it's there."

Johnson explains that in order to regulate these ultrafast computer algorithms, we need to understand their collective behaviour. This is a daunting task, but is made easier by the fact that the algorithms that operate below human response times are relatively simple, because simplicity allows faster processing.

"There are relatively few things that an ultrafast algorithm will do," Johnson says. "This means that they are more likely to start adopting the same behaviour, and hence form a cyber crowd or cyber mob which attacks a certain part of the market. This is what gives rise to the extreme events that we observe," he says. "Our math model is able to capture this collective behaviour by modelling how these cyber mobs behave."

In fact, Johnson believes this new understanding of cyber-mobs may have other important applications outside of finance – such as dealing with cyber-attacks and cyber-warfare.

 

3rd September 2013

Affordable robotics in agriculture

Harvest Automation is a robotics startup established by former employees of iRobot – the company behind the famous "Roomba" vacuum cleaner. They recently engineered what they claim are the first practical, scalable, affordable robots for a range of agricultural applications.

 


 

The U.S. agriculture industry's production value is over $300bn each year, with roughly half in livestock and half in crops. While tremendous gains in production efficiency have occurred since the Industrial Revolution, one-third of the developed world's crop production has been left behind. In these sectors, mechanisation solutions offered by today's farming equipment simply can't perform many required tasks, resulting in low labour productivity.

Industry sectors with significant manual labour needs derive approximately $40K of revenue per employee vs. $175K per employee for those sectors that have effective mechanisation. Today, that represents more than $21bn spent annually on inefficient labour in the U.S. and Europe alone. Harvest Automation claims that 40% of this currently manual labour can be performed by its robots. Backed by a team of world-class robotics innovators, the company seeks to resolve acute manual labour problems across multiple industries – starting with agriculture.

In the video below is a demonstration of the company's first robot. Known as "Harvey", or the HV-100, it can move potted plants around in nurseries and greenhouses. Young potted plants are packed close together; as they grow older, they need spacing out, and during cold winters they are packed in together again. For human workers, this is often a strenuous and repetitive task, requiring constant work year-round.

 

 

 

The HV-100 is programmed to identify which size pot to look for, using a 3D Light Detection and Ranging (LIDAR) sensor. It can lift a payload of 22 lb (10 kg) with high placement accuracy, performing up to 200 moves per hour. The machine requires only minimal training and setup, features a quick-swap rechargeable battery, is designed to work on rough terrain and operates in all weather and lighting conditions, 24 hours a day. If a human crosses its path, it will immediately stop to avoid a collision.
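
A simplified version of that sense-and-act cycle might look like the loop below. This is purely illustrative – Harvest Automation's control software is proprietary, and every API name here is a hypothetical stand-in:

```python
# Illustrative pot-spacing loop for an HV-100-style robot. All APIs are
# hypothetical stand-ins; none of this is Harvest Automation's code.

POT_DIAMETER_CM = 30            # operator-configured pot size to search for

def find_pot(scan, diameter_cm, tolerance_cm=3):
    """Stub: pick the nearest LIDAR cluster matching the configured pot size."""
    candidates = [c for c in scan.clusters()
                  if abs(c.width_cm - diameter_cm) < tolerance_cm]
    return min(candidates, key=lambda c: c.distance_cm, default=None)

def spacing_loop(robot):
    while robot.has_work():
        if robot.person_in_path():                   # safety: yield to humans
            robot.stop()
            continue
        pot = find_pot(robot.lidar_scan(), POT_DIAMETER_CM)
        if pot is None:
            robot.search()                           # move along the row and rescan
            continue
        robot.grip(pot)                              # payload up to ~10 kg
        robot.drive_to(robot.next_grid_position())   # configured spacing pattern
        robot.release()
```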

Harvest Automation is selling the HV-100 for $30,000. It enables growers to create a sustainable workforce of robots, working safely alongside people to increase efficiency, reliability and plant quality. The agriculture industry is facing unprecedented labour volatility and tighter federal regulations on migrant workers. Harvest’s robots can perform as much manual labour as required by each grower, creating more capacity for human workers to focus on other tasks. The robots can also increase plant quality by optimising placement in the fields and reducing non-labour production costs including the use of water, pesticides, herbicides and fertilizers.

So far, around 10 companies in the U.S. have purchased fleets of HV-100s. From early 2014, Harvest plans to begin selling in Europe, where the potted plant market is twice as big as in America. Future capabilities of these robots may include plant maintenance, pruning leaves and collecting fruit. These machines could appear in significant numbers by 2016.

 

 

 

28th August 2013

Nissan plans to sell affordable self-driving cars by 2020

Japanese automaker Nissan has announced that it will provide multiple, commercially-viable autonomous drive vehicles by 2020.

Nissan claims that its engineers have been carrying out intensive research on the technology for years, alongside teams from the world's top universities including MIT, Stanford, Oxford, Carnegie Mellon and the University of Tokyo.

Work is already underway in Japan to build a dedicated autonomous driving proving ground, to be completed by the end of fiscal year 2014. Featuring real townscapes – masonry not mock-ups – it will be used to push vehicle testing beyond the limits possible on public roads to ensure the technology is safe.

The company says its autonomous driving will be achieved at realistic prices for consumers. The goal is availability across the model range within two vehicle generations.

"Nissan Motor Company's willingness to question conventional thinking and to drive progress – is what sets us apart," said CEO Carlos Ghosn. "In 2007 I pledged that – by 2010 – Nissan would mass market a zero-emission vehicle. Today, the Nissan LEAF is the best-selling electric vehicle in history. Now I am committing to be ready to introduce a new ground-breaking technology, Autonomous Drive, by 2020, and we are on track to realise it."

 

 

A revolutionary concept like autonomous drive will have implications throughout the design and construction of cars. Collision-avoidance by machines able to react more rapidly and with more complex movements than a human driver will place new demands on the chassis and traction control, for example.

Six million crashes in the US each year cost $160 billion and rank as the leading cause of death for four- to 34-year-olds. 93% of accidents are due to human error, typically inattention. With autonomous drive, companies like Nissan will have the technology to detect and avoid these life-threatening situations.

In the future, autonomous drive also means less input from the driver. US drivers average 48 minutes per day on the road – hundreds of hours per year that could be used more productively. For the aged, or those with disabilities, there is another benefit: true independence and mobility for all.

Last week, a report from Navigant Research claimed that sales of autonomous vehicles will reach almost 100 million annually by 2035.

 

21st August 2013

Autonomous vehicles will reach nearly 100 million in annual sales by 2035

Combinations of advanced driver assistance features – such as adaptive speed control, automatic emergency braking, and lane departure warning – are now being brought together in some 2014 vehicle models, making semi-autonomous driving a reality in many markets for the first time. Increasing production volumes and technology improvements, leading to cost reductions, are now making it feasible to install the multiple sensors necessary for such capabilities.

According to a new report from Navigant Research, sales of autonomous vehicles will grow from less than 8,000 annually in 2020 to more than 95.4 million in 2035, representing 75 percent of all light-duty vehicle sales by that time.
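
Taken at face value, those forecast endpoints imply a compound annual growth rate of roughly 87% between 2020 and 2035:

```python
# Implied compound annual growth rate from the forecast figures above.
start_units, end_units, years = 8_000, 95_400_000, 15
cagr = (end_units / start_units) ** (1 / years) - 1
print(f"{cagr:.0%}")   # -> 87%
```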

 

[Image: self-driving car. Credit: Google]

 

“Fully autonomous vehicles are unlikely to reach the market suddenly,” says David Alexander, senior research analyst with Navigant Research. “Instead, progressively more capable systems that can assume control of certain aspects of driving will be introduced gradually. The first features will most likely be self-parking, traffic jam assistance, and freeway cruising – well-defined situations that lend themselves to control by upgraded versions of today’s onboard systems.”

One of the main barriers to fully automated driving is the legal requirement in many countries that all vehicles must have a driver in control at all times. Some U.S. states and European countries have begun to issue licenses for companies to conduct testing of autonomous driving features on public highways under controlled conditions. However, before full autonomous driving capability becomes available, liability issues must be clarified. Automakers will be reluctant to assume responsibility for not only supplying the vehicles, but also safely operating them, says the report.

The report, “Autonomous Vehicles”, provides a detailed examination of the emerging market for advanced driver assistance systems leading to semi-autonomous and autonomous driving. The report includes profiles of the leading vehicle manufacturers and suppliers, along with analysis of the drivers and inhibitors for sales of these vehicles. Forecasts for revenue and sales volumes, segmented by region, extend through 2035. It also includes a review of the core driver assistance technologies that make self-driving vehicles possible. An Executive Summary is available for free download.

 


 

21st August 2013

Meet Shanice, the new holographic receptionist

There's a friendly new face in reception to greet visitors to Brent Council's registration and nationality service. She's called Shanice, and she's a hologram – or "virtual assistant".

What's more, she's the first of her kind to be used by any council in Britain. In her day-to-day role, she will help point people in the right direction if they've come to register a birth or death, apply for citizenship, or even obtain a marriage licence.

Situated on the ground floor of the Civic Centre, Shanice sits behind a desk just like a real receptionist, but on closer inspection she's actually projected onto a see-through screen.

Unlike a real receptionist, she's also got a 'touch screen' function, so that visitors can key in the reason for their visit and get a tailor-made presentation about where they need to go and what documentation they will need.

Brent's Lead Member for Customers and Citizens, Cllr. James Denselow said: "This is the sort of space age technology you hear about but never really expect to see, especially in council buildings. The best thing is it's going to save us lots of money, without compromising our service. Nowadays we're constantly having to look at innovative ways to cut costs and they don't come more cutting edge than Shanice. I hope people will come down and visit her the next time they're in the Civic Centre, she looks great and she's always very friendly."

A similar form of technology was deployed at New York's airports last year. It is likely that more and more of these holographic people will appear in workplaces in order to save costs and improve customer services. In the future, they will become more interactive and intelligent.

 

 

 

7th August 2013

Paul Wolpe: Kurzweil's Singularity prediction is wrong

Bioethicist Paul Root Wolpe argues that the Singularity envisioned by Ray Kurzweil isn't quite right.

This video is from Big Think.

 

 

 

3rd August 2013

Peter Joseph on Singularity 1 on 1

Peter Joseph is a film-maker and social activist best known as the man behind the Zeitgeist film trilogy and founder of the Zeitgeist movement. His films have become a counter-culture phenomenon on the internet, with tens of millions of views. He has not shied away from controversy and has dared to push a strong vision for the future.

In this video, Peter talks to Nikola Danaylov, creator of Singularity Weblog. They discuss a range of topics including his goals and motivation, projects he is working on, the measure and meaning of progress, resource-based economies, sustainability and central planning, artificial intelligence and the technological singularity.

From the video: "Real success will not be driven by greater and greater technological advancement, or greater lifespans, or any of the materialistic notions that we've put forward. Real success will be when we finally realise exactly what we're a part of in the natural world – and gain complete alignment with that."

 

 

 

9th July 2013

A bio-inspired, 3D printed spider robot

Founded in 2010, Amoeba Robotics Ltd. is a research, engineering and design company based in Hong Kong. Their latest creation is "T8" – a wirelessly controlled, bio-inspired, octopod robot made with high resolution 3D printed parts. It uses a total of 26 motors: 3 in each leg and 2 in the abdomen. It is powered by the Bigfoot™ Inverse Kinematics Engine which performs all of the necessary calculations for smoothly controlling the motions of the robot in real time. The machine is available for pre-order now and will be released in September. For more information, visit robugtix.com

 

 

22nd June 2013

BigBrain: An ultra-high resolution 3-D roadmap of the human brain

International neuroscientists have produced a fully 3D map of a human brain – scanning and digitising thousands of ultrathin slices to determine its structure at extremely high resolution.

 


 

The map is being made freely available to researchers worldwide. It has a spatial resolution of just 20 micrometres (µm), far exceeding the typical 1 mm (1000 µm) from MRI studies. For comparison, a red blood cell is 8 µm wide.
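
To put that in perspective: voxels are volumetric, so shrinking the voxel side length from 1000 µm to 20 µm packs 125,000 times more voxels into the same volume.

```python
# Resolution gain from 1 mm MRI voxels to 20 um BigBrain voxels.
mri_um, bigbrain_um = 1000, 20
linear_gain = mri_um / bigbrain_um     # 50x finer along each axis
volume_gain = linear_gain ** 3         # 125,000x more voxels per unit volume
print(linear_gain, volume_gain)        # 50.0 125000.0
```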

In recent years, major efforts have been getting underway to probe and map the brain, in the hope of conquering physical and mental illnesses, while better understanding the nature of consciousness. In January, the European Commission awarded €1 billion (US$1.3 bn) to the Human Brain Project, intended to create the world's largest experimental facility for brain mapping. In February, Barack Obama announced the Brain Activity Map Project – a decade-long effort to map the activity of every neuron in the human brain. In March, the Human Connectome Project released a major dataset, revealing the complexities of the brain's structure and giving a clearer picture of its role in neural disorders. Yet another major initiative is the Blue Brain Project, founded in Switzerland in 2005, which aims to create a synthetic brain by reverse-engineering the mammalian brain down to the molecular level.

The BigBrain project, seen in the video below, was reported yesterday in the journal Science. Its fine-grained resolution will allow scientists to gain new insights into the basis of cognition, language and emotions. The researchers also stated that they plan to extract measurements of cortical thickness in order to study aging and neurodegenerative disorders; create cortical thickness maps to compare data from in vivo imaging; integrate gene expression data; and generate an even better model with a resolution of 1 micron to capture details of single cell morphology.

The resolution, bandwidth and image reconstruction times of brain scanning technologies have been improving at an exponential rate since the 1970s. With ongoing advances in computer power, this pace of progress is likely to continue in the future. A single neuron model was developed in 2005; a neocortical column with 10,000 neurons was created in 2008; an entire cortical mesocircuit featuring 100 neocortical columns was simulated in 2011. A rat brain with 100 mesocircuits is expected in 2014. If this trend continues, it is reasonable to assume that a fully working, real-time simulation of an entire human brain is possible by the mid-2020s. This would have profound implications – not only for the treatment and understanding of illnesses, but also for the growth of artificial intelligence.

 

 

22nd June 2013

Drone mosquitoes? U.S. companies developing tiny surveillance devices to snoop inside homes

The FBI confirmed this week that drones are carrying out surveillance within the USA, without regulations in place to address privacy concerns. Speaking to Democracy Now!, Heidi Boghosian – National Lawyers Guild executive director and author of the forthcoming book, "Spying on Democracy: Government Surveillance, Corporate Power and Public Resistance" – explains the technologies being developed to expand drone surveillance in the near future. These include drones the size of mosquitoes, capable of entering apartment buildings and remaining airborne inside to spy without detection. You can watch the full interview here.

 

 
