
"Rant: Nine reasons why I don't believe in current VR/AR technology."


5 replies to this topic

#1
starspawn0

    Member

  • Members
  • 734 posts
https://threadreader...1382380545.html

Some of these look like they could be solved with Brain-Computer Interfaces, and may be one reason why Facebook is investing in it. A sufficiently good BCI device won't fix all the problems, but could help with:

* Decently fast typing without having to use voice or type mid-air with your hands (which can be exhausting).

* No need to memorize gestures -- just do whatever feels natural, and the semantics will be read off the brain scan.

* It may be possible to predict where a person will move their hands 100 milliseconds or more into the future, which could be used to improve the alignment between the physical and virtual body.

* Better hand-tracking, without the occlusion problems. Maybe something like CTRL-Labs's neural interface would be best for this.

* If the BCI is both read and write, then you could enable people to feel virtual objects. Their hands would still pass right through, but at least they would perceive the object in space (as a bodily sensation). Haptics could also achieve this, but the range of sensations would probably be greater with a read-write BCI.

* Motion sickness might be predictable (based on the BCI scan) before it happens, and then the imagery could be tuned to reduce it. I'm not exactly sure how, but I suspect it might be possible -- e.g., by subtly changing the lighting or colors.


I'm not suggesting this will be trivial -- clearly, it will take a lot of work (a few Ph.D. theses and a few years of HCI research) to figure out how to use BCIs to help with these.
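To make the latency-compensation idea concrete: the simplest version of predicting a hand's position into the future is just extrapolating from recent tracking samples. This is a minimal sketch under assumed numbers (100 Hz sampling, a 100 ms horizon) and a constant-velocity model -- a real system would fuse neural signals with something like a Kalman filter, and nothing here reflects any actual BCI product:

```python
# Sketch: hide ~100 ms of tracking latency by extrapolating hand
# position with a constant-velocity model. Sample rate, horizon,
# and the data below are all made-up illustration values.

def predict_position(p_prev, p_curr, dt, horizon=0.100):
    """Linearly extrapolate a 3-D position `horizon` seconds ahead.

    p_prev, p_curr: (x, y, z) samples taken `dt` seconds apart.
    """
    # Finite-difference velocity estimate from the last two samples.
    velocity = tuple((c - p) / dt for p, c in zip(p_prev, p_curr))
    # Project the current position forward along that velocity.
    return tuple(c + v * horizon for c, v in zip(p_curr, velocity))

# Hand moving at +0.5 m/s along x, sampled at 100 Hz:
prev, curr = (0.00, 1.0, 0.2), (0.005, 1.0, 0.2)
print(predict_position(prev, curr, dt=0.01))  # approximately (0.055, 1.0, 0.2)
```

A BCI's advantage over this kind of pure extrapolation would be catching motion *onsets* (the readiness signal precedes the movement), which no amount of filtering past tracking data can do.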
  • Raklian, Yuli Ban, Kemotx and 1 other like this

#2
Erowind

    Psychonaut, Aspiring Mathematician and Anarchist

  • Members
  • 868 posts
  • Location: In some cafe eating--yes eating--roasted coffee beans and reading semiotext(e)s
Using the HTC Vive with Steam's controllers somewhat works. It's not keyboard-fast, but I've gotten about as fast using my thumbs on the touchpads as I am typing on my phone. Honestly, until typing is at least on par with a keyboard, it's a major limitation. Maybe I'm just stubborn. I know people can type pretty fast on their phones, but even after 2 years on a touch screen my brain still hasn't adjusted, which leaves me pretty clumsy when typing on my phone or a touchpad.
  • Sciencerocks and starspawn0 like this

Current status: slaving away for the math gods of Pythagoras VII.


#3
Alislaws

    Democratic Socialist Materialist

  • Members
  • 1,733 posts
  • Location: London

Using the HTC Vive with Steam's controllers somewhat works. It's not keyboard-fast, but I've gotten about as fast using my thumbs on the touchpads as I am typing on my phone. Honestly, until typing is at least on par with a keyboard, it's a major limitation. Maybe I'm just stubborn. I know people can type pretty fast on their phones, but even after 2 years on a touch screen my brain still hasn't adjusted, which leaves me pretty clumsy when typing on my phone or a touchpad.

I'm terrible with phone touchscreens, even after years!

 

I think you could do a lot with an AR/VR controller that was just a comfy pair of gloves (unhygienic? why?), but you'd basically have to learn a sign language to operate a PC that way, so I really hope BCI comes through for us on this in some kind of reasonable timeframe.

 

 

...may be one reason why Facebook is investing in it

I think Facebook just really wants to use mind reading on you in order to target you with ever more personal and specific ads. Everything else is just them trying to find excuses to get you to give them access to your brain.


  • Erowind likes this

#4
Erowind

    Psychonaut, Aspiring Mathematician and Anarchist

  • Members
  • 868 posts
  • Location: In some cafe eating--yes eating--roasted coffee beans and reading semiotext(e)s
/\ Accurate gestures like sign language would be really cool. I learned how to touch type, so I could definitely learn a digital sign language. We need a universal standard. In fact, I wonder if we couldn't just literally use existing sign language.

Current status: slaving away for the math gods of Pythagoras VII.


#5
Alislaws

    Democratic Socialist Materialist

  • Members
  • 1,733 posts
  • Location: London

/\ Accurate gestures like sign language would be really cool. I learned how to touch type, so I could definitely learn a digital sign language. We need a universal standard. In fact, I wonder if we couldn't just literally use existing sign language.

Plus, everyone learning an existing sign language would be super convenient for deaf people!

 

I wonder if sign language (assuming you had a computer system that could interpret it) has a higher or lower bandwidth than speech or keyboard/mouse in terms of human-computer interaction.
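A rough way to frame that question is to convert each channel's words-per-minute rate into bits per second. All the numbers below are coarse assumptions for illustration (published WPM estimates vary widely, and "10 bits per word" for English is only a ballpark), not measurements:

```python
# Back-of-envelope input "bandwidth" comparison, in bits per second.
# Every rate here is an assumed round number, not a measured one.

BITS_PER_WORD = 10  # very rough information content of an English word

def bits_per_second(words_per_minute):
    """Convert an assumed WPM rate into approximate bits/s."""
    return words_per_minute * BITS_PER_WORD / 60

ASSUMED_RATES_WPM = [
    ("speech", 150),
    ("touch typing", 70),
    ("fluent signing", 130),
    ("phone thumbs", 35),
]

for channel, wpm in ASSUMED_RATES_WPM:
    print(f"{channel:>15}: ~{bits_per_second(wpm):.0f} bits/s")
```

Under these assumptions the channels all land within a factor of a few of each other, which suggests the real bottleneck is language production itself rather than the motor channel.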


  • Erowind likes this

#6
Erowind

    Psychonaut, Aspiring Mathematician and Anarchist

  • Members
  • 868 posts
  • Location: In some cafe eating--yes eating--roasted coffee beans and reading semiotext(e)s

/\ Accurate gestures like sign language would be really cool. I learned how to touch type, so I could definitely learn a digital sign language. We need a universal standard. In fact, I wonder if we couldn't just literally use existing sign language.

Plus, everyone learning an existing sign language would be super convenient for deaf people!
 
I wonder if sign language (assuming you had a computer system that could interpret it) has a higher or lower bandwidth than speech or keyboard/mouse in terms of human-computer interaction.

There are a fair number of deaf people at my community college (10-ish). I know the alphabet and some basic words in American Sign Language. I can't communicate quickly because I often have to spell things out letter by letter. But the other kids who are competent seem to be able to communicate as fast, or almost as fast, as people who speak. I had a crush on a deaf girl once. On one hand it was endearing that we had to type to each other on a phone because it was quicker than trying to sign. On the other hand it was difficult to build a friendship at first, because getting past small-talk levels of complexity was difficult.

Current status: slaving away for the math gods of Pythagoras VII.




