
When will household robots be common?

robots ai humanoid android bipedal artificial intelligence

25 replies to this topic

Poll: When will robots be common in households? (23 members have cast votes)

When will 25% of households in highly developed countries have at least one humanoid robot that carries out household tasks?

  1. 2020 - 2025: 4 votes (17.39%)
  2. 2025 - 2030: 5 votes (21.74%)
  3. 2030 - 2040: 6 votes (26.09%)
  4. 2040 - 2050: 5 votes (21.74%)
  5. 2050 - 2075: 1 vote (4.35%)
  6. 2075 - 2100: 1 vote (4.35%)
  7. After 2100: 0 votes (0.00%)
  8. Household robots won't ever become this popular: 1 vote (4.35%)


#21
funkervogt
Member, 634 posts

Let me highlight some pertinent predictions recently made by robotics pioneer Rodney Brooks:

Prediction: Dexterous robot hands generally available.
NET (no earlier than) 2030
BY 2040 (I hope!)
Despite some impressive lab demonstrations, we have not actually seen any improvement in widely deployed robotic hands or end effectors in the last 40 years.

Prediction: A robot that can navigate around just about any US home, with its steps, its clutter, its narrow pathways between furniture, etc.
Lab demo: NET 2026
Expensive product: NET 2030
Affordable product: NET 2035
What is easy for humans is still very, very hard for robots.

https://rodneybrooks...ed-predictions/



#22
starspawn0
Member, 865 posts

I would take Rodney Brooks seriously about the part where he goes from "lab demo" to "affordable product", because he really understands the ins and outs of product development.  However, I would not take him seriously about the time required to get to that "lab demo", because Machine Learning -- in its modern incarnation (mostly the latest in Deep Learning) -- is not his field. If he were to teach a graduate course on the latest in Deep Learning applied to robotics, he would probably have to spend a lot of time familiarizing himself with all the little details that go into making it work well; and would probably have to rely heavily on his TAs.  It's not just "do backpropagation".  

 

This is not to say he doesn't have a lot of experience with Machine Learning.  He does, and has done revolutionary work in the field.  He could certainly teach a course in "classical" ML theory, explaining VC dimension, SVMs, HMMs, basic backpropagation, graphical models.  But I think he would find his background in the picky-tricky, cutting-edge stuff a bit too shaky.

 

Or, to put it another way:  if he were reviewing an NSF grant proposal on cutting-edge Deep Learning and Reinforcement Learning methods applied to dexterous robotics, where the main innovation is some technical improvements using some esoteric dense math, and given a choice between saying whether he is "very comfortable", "somewhat comfortable", "not comfortable, but can review it", and "very uncomfortable" reviewing the proposal, he would probably not choose "very comfortable".  

 

....

 

So, if you want a more accurate timeline estimate, throw away what he says about that lab demo, get a consensus report from people in DL for getting to the lab demo, and then add on what Brooks says in going from demo to product.
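That recipe is just arithmetic on two numbers. A minimal sketch, where the DL-researcher estimates are made-up illustrative values and only the demo-to-product lag comes from Brooks's NET dates quoted earlier in the thread (lab demo NET 2026, affordable product NET 2035):

```python
# Illustrative only: lab_demo_estimates are made-up numbers standing in for
# a consensus survey of deep-learning researchers, not real data.
lab_demo_estimates = [2024, 2026, 2027]

# Brooks's own demo-to-affordable-product lag, from his NET dates above.
demo_to_product_lag = 2035 - 2026  # 9 years

# Take the median as the "consensus" lab-demo year, then add Brooks's lag.
consensus_demo = sorted(lab_demo_estimates)[len(lab_demo_estimates) // 2]
affordable_product_estimate = consensus_demo + demo_to_product_lag
print(consensus_demo, affordable_product_estimate)  # 2026 2035
```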


  • Casey and Yuli Ban like this

#23
tomasth
Member, 169 posts

They need to integrate proprioception with tactile sensing and vision.
Berkeley had some interesting work on tactile sensing:
https://bair.berkele.../03/21/tactile/

#24
starspawn0
Member, 865 posts
Here's another video worth pointing out, a talk by Andrew Ng from 2011:

[Embedded YouTube video]

Watch the first 5 minutes. He shows how even back then we had the hardware necessary for a high-performing home robot; we just didn't have the software. If we had the software to run that robot, we would have a home robot capable of cleaning your home -- even making the bed, putting away your clothes, making coffee, putting a frozen meal in the microwave, turning it on, taking it out, and even doing some basic types of cooking. It wouldn't be able to walk up stairs; but we already have the hardware for that, too. Where you'd probably need a fancier robot is for tasks that must be completed very quickly and/or delicately -- e.g. more elaborate types of cooking, where you have to stir vigorously and add ingredients before the dish burns. That might take innovations on the hardware side.
  • Casey and Yuli Ban like this

#25
starspawn0
Member, 865 posts
Robots on the run -- After decades of clumsiness, robots are finally learning to walk, run and grasp with grace. Such progress spells the beginning of an age of physically adept artificial intelligence.
 
https://www.nature.c...586-019-00999-w
 

Writing in Science Robotics, Hwangbo et al.2 report intriguing evidence that a data-driven approach to designing robotic software could overcome a long-standing challenge in robotics and artificial-intelligence research called the simulation–reality gap. For decades, roboticists have guided the limbs of robots using software that is built on a foundation of predictive, mathematical models, known as classical control theory. However, this method has proved ineffective when applied to the seemingly simple problem of guiding robotic limbs through the tasks of walking, climbing and grasping objects of various shapes.

A robot typically begins its life in simulation. When its guiding software performs well in the virtual world, that software is placed in a robotic body and then sent into the physical world. There, the robot will inevitably encounter limitless, and difficult to predict, irregularities in the environment. Examples of such issues include surface friction, structural flexibility, vibration, sensor delays and poorly timed actuators — devices that convert energy into movement. Unfortunately, these combined nuisances are impossible to describe fully, in advance, using mathematics. As a result, even a robot that performs beautifully in simulation will stumble and fall after a few encounters with seemingly minor physical obstacles.

Hwangbo et al. have demonstrated a way of closing this performance gap by blending classical control theory with machine-learning techniques. The team began by designing a conventional mathematical model of a medium-dog-sized quadrupedal robot called ANYmal (Fig. 1). Next, they collected data from the actuators that guide the movements of the robot’s limbs. They fed this information into several machine-learning systems known as neural networks to build a second model — one that could automatically predict the idiosyncratic movements of the ANYmal robot’s limbs. Finally, the team inserted the trained neural networks into its first model and ran the hybrid model in simulation on a standard desktop computer.
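The hybrid-model structure described above can be sketched in a few lines. This is a toy single-joint illustration, not Hwangbo et al.'s actual code: the `ActuatorNet` class, its layer sizes, the random placeholder weights, and the unit-inertia dynamics are all assumptions for demonstration.

```python
import numpy as np

def ideal_actuator_torque(position_error):
    # Classical-control side: an idealized actuator that turns a
    # position error directly into torque (simple proportional law).
    return 5.0 * position_error

class ActuatorNet:
    """Tiny stand-in for the learned actuator model: an MLP mapping a
    short history of joint commands/states to predicted torque. The
    random weights here are placeholders; the real network was trained
    on torque data logged from ANYmal's physical actuators."""
    def __init__(self, hist_len=3, hidden=8, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(size=(hidden, hist_len)) * 0.1
        self.W2 = rng.normal(size=(1, hidden)) * 0.1

    def __call__(self, history):
        h = np.tanh(self.W1 @ np.asarray(history))
        return float(self.W2 @ h)

def hybrid_sim_step(pos, vel, cmd_history, actuator, dt=0.01):
    # Hybrid model: analytic rigid-body integration (one joint, unit
    # inertia for simplicity), but the torque comes from the learned
    # actuator model instead of the idealized one.
    torque = actuator(cmd_history)
    vel = vel + dt * torque
    pos = pos + dt * vel
    return pos, vel

net = ActuatorNet()
pos, vel = hybrid_sim_step(0.0, 0.0, [0.1, 0.2, 0.3], net)
```

Swapping `net` for `ideal_actuator_torque`-based torques recovers the purely classical simulation, which is exactly the comparison the article describes.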


  • Zaphod, Casey, Yuli Ban and 1 other like this

#26
Yuli Ban
Born Again Singularitarian
Moderator, 20,318 posts
Location: New Orleans, LA

Here's a roughly six-month-old article from the New York Times, along with my own thoughts on it (also six months old):
 
What Comes After the Roomba? | The promises and limitations of domestic robots
 
Fundamentally, the issue seems to be rooted in the limits of artificial intelligence. The Roomba works because you don't need much AI to vacuum a floor. If the ancient Greeks had had electricity, they could have built a Roomba through purely mechano-analog means.
 
Many of these other robot concepts require algorithms that can easily recognize and understand the world around them, and there have been sixty-plus years of engineers simply asking "Okay, can we make computers navigate 3D space or not?"
 
Let's just take the simplest example given: a self-cleaning litter box. These actually have seen modest commercial success in recent years. The main problem is that they're expensive. Otherwise, we have everything we need for this specific niche product— the litter box needs only to recognize when a cat has gone and will begin cleaning promptly. That's an area where home robotics has taken off.
 
Compare that to the catastrophe that has been "social companion robots". Jibo was doomed from the start. Most social companion robots are doomed at the present moment because they're more general-purpose than these other machines. They're not utility bots in any capacity, yet in order for that generality to work, they require far more capable technology.
 
Static social robots aren't competing with just each other— they're also competing with smart speakers like Home and Echo on top of digital assistants like Siri, Alexa, Cortana, and others. This is why Jibo failed— it was cute, yes, but that's all it was compared to Alexa or Google Home, each of which was much more capable.
 
Pepper is a personal favorite because it's actually a genuine robot. It's a humanoid that rolls, has an actual face, and even hands that can grip things (though weakly). This made it a better option as a "social robot" because it could actually move autonomously. But this also required more algorithms so that it could recognize where it was going.
 
And then there was the fundamental problem with all social robots circa 2018— they're still glorified chatbots. And before someone argues "aren't we all glorified chatbots?", I can't actually hold a reasonably long conversation with Pepper, which is the main reason why I'd want one. Pepper doesn't understand language in any deep way; she only says what she's been programmed to say. You could find her limits within a few minutes. For a social robot, this is a crippling flaw fundamentally rooted in the state of our technology. It's as if you had a car that could only move up to 20 miles per hour, couldn't turn left or right, needed refueling every few miles, and sometimes just didn't start at all. That describes cars in the late 1800s, just as social robots describe the limit of what domestic robots could do in the 2010s.
 
This isn't even going into actual utility robotics— clothes-folding robots take such a long time for a variety of reasons. One major reason is that they're constantly scanning their environment and figuring out how best to fold any specific piece of clothing. They also need more power: more energy-dense batteries or some other means of getting enough juice. Without this, they are worthless— it's cheaper to hire a maid. Hell, it's probably faster to hire a maid, have them drive over, fold your clothes, and then leave before the robot has finished with even a single stack.
 
Robotic lawnmowers: same problem mentioned several times— visual recognition algorithms are too weak. These robots cannot handle variables well, so if there's a snake in the yard, it's getting mowed. If there's a piece of scrap metal, all the same. If there's a large branch, you need a new mower. So you basically need to half-mow the lawn yourself, and by that point, it's just plain easier to do the whole job yourself.
 
And that's where robots fail. When it's easier to do it yourself, why have a robot? With robots, you're constantly picking up after them, setting them in the right path, or fixing their mistakes. Half the time, people own these robots just for the novelty of having them.
 
The lowly Roomba spent years in that stage before it finally truly went mainstream, and its genius is that it's just as easy to get a Roomba to clean your floor as it is to clean it yourself. Turn it on, let it go, that's it.


And remember my friend, future events such as these will affect you in the future.





