Welcome to FutureTimeline.forum

Mind-Blowing talk by Thomas Reardon, CEO of CTRL-Labs, at 2018 O'Reilly AI Conference, on his company's new neural interface -- kits ship this year!


9 replies to this topic

#1
starspawn0
Member, 1,123 posts

Thomas Reardon YouTube video

The device looks like nothing more than a bracelet, but with it you'll be able to control robot arms, interact in VR, play games without moving your hands, type on a virtual keyboard (just pretend a keyboard is in front of you), imagine moving a sixth finger, and perhaps even type just by imagining the movement (without moving your fingers at all). It's mind-blowing what that will enable!

Possible future uses:

* Imagine you are away on vacation and forgot to unplug the coffee machine. You could connect to your house camera remotely, and if you have a robot arm on wheels in your home (which might be standard in a few years), you could gracefully move it over to the outlet and unplug the coffee machine.

 

* Maybe there's a file on your computer at home, but the machine is switched off and not connected to the internet. You could use the robot arm to turn it on, and even type on the keyboard.
 
* Maybe you forgot to water the plants. That's easy to fix.
 
* You could go shopping remotely. Retail stores might keep five robots with collision-avoidance systems (for clumsy users). You could port in remotely, have a robot pick out exactly what you want, put it in a cart, and then put it in a locker to collect later -- or hand it off to a robot delivery system like Starship's.
 

* Or, say someone tells you about a letter they sent. You could move the arm over to the counter, brush the top mail aside, pick out the letter on the bottom, open it (if you have two bracelets and two arms), and read it.

* You could clean your house without ever getting up from the couch. Just direct the robot to pick things up and put them away.

* You could work remotely by very precisely controlling a robot. People who live far away from a store could get a job stocking shelves and cleaning up. It could all be done over a smartphone with a Wi-Fi connection.

* People at building sites or in hazardous workplaces could easily and dexterously control robots to nail beams, carry pipes, move wiring, put out fires, and more.


  • wjfox, Zaphod, Casey and 3 others like this

#2
funkervogt
Member, 755 posts

Wow! Impressive. Having spent so much time recently focusing on the potential of AI, I find this a good reminder of the potential of human cybernetics. He says that BCIs and "intention capture" could allow a human to control eight arms, but I suspect the human brain is physically optimized for two arms and two legs. With every additional robot limb you add, some precision, coordination, and finesse will be lost. I bet you'd have to enlarge certain parts of the brain to allow control of eight limbs.

 

That in turn makes me think that, even after we've created AGI, human brains might be better at certain types of thinking or at handling certain tasks, owing to the organic, wet structure of our brains. In that case, it might actually make sense for AGIs to keep us around so we can do work in our areas of "comparative advantage." (Note that in the original Matrix script, humans were kept captive to harvest their brains' computational power, not their "energy output.")

 

Maybe an "optimized human" in that scenario would be a huge brain floating in a tank, with connection ports for limbs, sensory organs and data cables. Kind of like Mr. Potato Head.  

 

Less dramatically, I think the tech in that demo has very valuable applications for people with physical disabilities, and for virtual reality. 

 

I have one critique of the presentation: using thoughts/peripheral nerve impulses to dictate a text message might require more of your attention and time than typing it with your thumb on your smartphone's keyboard. So even with this technology it still might be unsafe to text and drive. 


  • Zaphod, Casey, Yuli Ban and 1 other like this

#3
starspawn0
Member, 1,123 posts
A new podcast interview with CTRL-Labs's Thomas Reardon:

YouTube video of podcast

I've mentioned this before, and he says it here: their device can pick out the signals from individual neurons, all in the form factor of a watch!

A while back, I thought about what the biggest uses of this would be, and in addition to the ones above, here is what I think people will find most useful:
 

You could record your whole day, including all your covert hand and finger movements, for efficient searching later.

For instance: say you are giving a lecture, and someone asks for lecture notes. You could work out when you gave that lecture, and then translate your recorded neural signals into "hand-written" slides for anyone who wants them. Remember those notes you wrote in a notebook you've long since thrown away? They're preserved for later searching -- no need to have taken a picture with your phone.

Say you are at a restaurant with friends and want to make a note of your favorite dish for the next time you're there. You could simply move your hand in your pocket, pretending you are writing with a pen, even drawing diagrams -- and it will all be recorded.

Everything you type during the day will also be recorded for efficient and accurate searching, including the files you deleted, messages you forgot about, all the lines you erased and wrote over -- it's all there for searching later on.

All the musical instruments you played, and how you played them, are also there for searching. Apps might offer you advice on your playing -- "Your rhythm is a bit off. I recommend you play the following pieces to improve."

"Where did I leave my laptop computer?" -- Maybe an app can search based on how you pinch your hand around it. You could perhaps search by asking an assistant, "When and where were the last times I pinched my hand like this [pinches as though grabbing a laptop]."

All the credit card slips and documents you signed, all the pictures you drew, all the clay statues you sculpted, all the home repair you did, all the meals you cooked, all the games of baseball you played, the times you fed the dog, the times you watered the plants, whether you unplugged the coffeemaker, the items you inspected in the grocery store, and so on -- it's all there for searching!

Nothing is forgotten!
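If you wanted to play with the "search everything you did" idea, the skeleton is just a timestamped log of decoded events. Here's a minimal sketch -- the decoding itself is assumed to happen upstream, and every name and entry below is made up:

```python
# Toy "lifelog" for decoded motor events (keystrokes, handwriting, gestures):
# record each event with a timestamp and kind, then search the decoded text.
from dataclasses import dataclass, field

@dataclass
class ActivityLog:
    events: list = field(default_factory=list)  # (timestamp, kind, text) tuples

    def record(self, timestamp, kind, text):
        """Append one decoded event to the log."""
        self.events.append((timestamp, kind, text))

    def search(self, query):
        """Return all events whose decoded text mentions the query (case-insensitive)."""
        q = query.lower()
        return [e for e in self.events if q in e[2].lower()]

# Example: the restaurant note from above, written covertly in your pocket.
log = ActivityLog()
log.record("2019-03-02T19:40", "handwriting", "favorite dish: the mushroom risotto")
log.record("2019-03-03T09:12", "keystrokes", "draft email to landlord")
hits = log.search("risotto")
```

The hard part, of course, is everything upstream of `record` -- turning neural signals into text and gestures reliably -- but once that exists, the searching side is almost trivial.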


  • Casey likes this

#4
tomasth
Member, 215 posts
Tele-working needs cheap, strong, robust hardware.
If that is achieved (by China?), it could revolutionize many on-site jobs, outsourcing them to the cheapest worker on Earth.
And it could push other BCIs forward, which can lead to AI capturing the rest of what it takes to do the job.


funkervogt,
Is it still unsafe if most of the driving is autonomous?

The AI just needs to handle what it's certain it can, and the rest is left for people.
Not just cars: any task/job that can be partially autonomous will have this.
  • Alislaws likes this

#5
Alislaws
Democratic Socialist Materialist, 1,997 posts
Location: London

That's a good point. With this sort of telepresence robotics, you could:

 

  • Automate all the tasks on a work site that it is currently practical to automate.
  • Whenever the automated system encounters a problem it cannot resolve, it calls for help.
  • A human operator sitting at home then uses a BCI to take control of one of the nearby robots and resolve the issue.
  • All human operator problem-solving is stored for AI training purposes.
  • The work site has all the advantages of having no humans: the most effective balance between replacing robots and slowing down for safety can be found without ethical issues. The air doesn't need to be breathable; the temperature doesn't need to be comfortable.
  • No on-site injuries, few sick days, and it's easy to scale the number of human engineers up if things get difficult, and down as the AI improves.

So this is like the missing tech we need to bridge the gap between 70% automation and 100% automation. 
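The whole loop is simple enough to sketch in code (everything here is invented for illustration; the autonomous policy and the BCI takeover are stubbed out):

```python
# Sketch of the fallback loop: the robot handles tasks it is confident
# about, escalates the rest to a remote human operator over the BCI link,
# and every human resolution is kept as training data for the policy.

def run_site(tasks, autonomous_policy, human_operator, threshold=0.8):
    """Dispatch each task; escalate low-confidence ones to the human.

    autonomous_policy(task) -> (action, confidence)
    human_operator(task)    -> action  (the remote BCI takeover, stubbed)
    Returns (log of all (task, action) pairs, human demonstrations).
    """
    log, demonstrations = [], []
    for task in tasks:
        action, confidence = autonomous_policy(task)
        if confidence < threshold:
            action = human_operator(task)          # call for help
            demonstrations.append((task, action))  # saved to improve the AI
        log.append((task, action))
    return log, demonstrations
```

As the policy improves on the accumulated demonstrations, fewer tasks fall below the threshold, so the human staffing scales down automatically -- which is exactly the 70%-to-100% bridge.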



#6
tomasth
Member, 215 posts

It could solve Japan's construction worker shortage (or any other worker shortage). They could hire foreigners without having them be in Japan.

And for supervision by a local, there are the paralyzed people now working remotely in waitressing.


  • Alislaws likes this

#7
funkervogt
Member, 755 posts

funkervogt,
Is it still unsafe if most of the driving is autonomous?

No, but if you're being chauffeured in an autonomous car, it also wouldn't be unsafe to type a text message on your phone using your fingers, or to have a spoken phone conversation.



#8
starspawn0
Member, 1,123 posts

For people who like to follow this sort of thing, it's worth pointing out some things about Reardon's intellectual pedigree that were brought up in that last podcast:

 

* He claimed that he took "graduate-level math courses at MIT" when he was 14 or 15 years old. If these were in "CS math" (e.g. graph theory), that's only sort of impressive; but if it was something like algebraic geometry, that would be really impressive.

 

* He took an interest in "the classics" at Columbia University and learned Latin, but eventually abandoned it.

 

* He was a "hacker" (in the good sense) at MIT, while still quite young.  The term originally applied not to people who break into computers, but who were superlative at writing computer code -- they tried to outdo one another in writing the cleanest, most elegant code.   This seems to be where he learned his good coding skills. 

 

* Then he went to work for Microsoft and started their Internet Explorer browser program. He wrote some of the first code to build it, and then managed the project. I get the impression that he was a major player -- one of the top names mentioned on the project.

 

* Then he quit that and studied neuroscience, getting a Ph.D. (at Columbia).

 

* Now he heads CTRL-Labs, which he says has one of the most impressive teams of machine learning and neuroscience talent you'll find anywhere outside of Google and maybe Microsoft. He's talked before about the level of talent. I recall he was once asked about this, since CTRL-Labs is based not in the Bay Area but in the NYC area, and he said that there is more ML talent in NYC than just about anywhere on the planet -- and that a lot of the ML work there is secret, locked away in Wall Street companies.


  • Zaphod and Yuli Ban like this

#9
starspawn0
Member, 1,123 posts
We've heard what this thing can do in terms of reading and decoding motor signals; but I wonder if it can also read sensory signals. If so, there's a lot more you could do with the device:

* You could move your hand over Braille, and the device could read it.

* You could hold an object in your hand, and the device would know what it is based on shape and texture. This, of course, would make searching easier -- "What kind of fabric is this?"

* If you could put a few of these devices on your body, to record sensations, you could basically record everything you felt in a whole day. With neural stimulators, you could relive it later. Maybe it could also record sound and images, and then, for instance, you could relive an intimate moment -- even years later.

Oh, and here is a new article about CTRL-Labs, including a video:

https://variety.com/...mes-1203182938/

#10
starspawn0
Member, 1,123 posts
Here's a nice NPR Future You episode (published just today) on CTRL-Labs:

https://www.npr.org/...p-of-an-armband



