
"Rant: Nine reasons why I don't believe in current VR/AR technology."


9 replies to this topic

#1
starspawn0

    Member

  • 866 posts
https://threadreader...1382380545.html

Some of these look like they could be solved with Brain-Computer Interfaces, and that may be one reason why Facebook is investing in it. A sufficiently good BCI device won't fix all the problems, but it could help with:

* Decently fast typing without having to use voice or type in mid-air with your hands (which can be exhausting).

* No need to memorize gestures -- just do whatever feels natural, and the semantics will be read off the brain scan.

* It may be possible to predict where a person will move their hands 100 milliseconds or more into the future, and this could be used to improve the alignment between the physical and virtual body (see the sketch after this list).

* Better hand-tracking, without the occlusion problems. Maybe something like CTRL-Labs's neural interface would be best for this.

* If the BCI is both read and write, then you could enable people to feel virtual objects. Their hands would still pass right through, but at least they would perceive the object in space (as a bodily sensation). Haptics could also achieve this, but the range of sensations would probably be greater with a read-write BCI.

* Motion sickness might be predictable (based on a BCI scan) before it happens, and then the imagery could be tuned to reduce it. I'm not exactly sure how, but I suspect it might be possible -- e.g., subtly changing the lighting or colors.
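
To make the motion-prediction bullet concrete, here's a minimal sketch of the non-BCI baseline such a system would have to beat: fit a constant-velocity model to the last few tracked samples and extrapolate 100 ms ahead. A BCI-aided predictor would condition on motor-intent signals instead; the function name and data here are illustrative assumptions, not any shipping tracker's API.

```python
import numpy as np

def predict_hand_position(timestamps, positions, horizon=0.1):
    """Extrapolate a tracked hand position `horizon` seconds ahead
    using a constant-velocity fit over the recent samples.

    timestamps: 1-D array of sample times (seconds)
    positions:  (N, 3) array of hand positions (meters)
    """
    t = np.asarray(timestamps, dtype=float)
    p = np.asarray(positions, dtype=float)
    # Least-squares line per axis; the intercept is the position at t[-1].
    A = np.vstack([t - t[-1], np.ones_like(t)]).T
    coeffs, *_ = np.linalg.lstsq(A, p, rcond=None)
    velocity, current = coeffs[0], coeffs[1]
    return current + velocity * horizon

# Hand moving along x at ~1 m/s, predicted 100 ms ahead.
ts = np.array([0.00, 0.02, 0.04, 0.06, 0.08])
ps = np.array([[0.00, 0, 0], [0.02, 0, 0], [0.04, 0, 0],
               [0.06, 0, 0], [0.08, 0, 0]])
print(predict_hand_position(ts, ps))  # ~[0.18, 0, 0]
```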


I'm not suggesting this is going to be trivial -- clearly, it will involve a lot of work (a few Ph.D. theses, and a few years of HCI research) to figure out how to use BCIs to help with these.
  • Raklian, Yuli Ban, Kemotx and 2 others like this

#2
Erowind

    Member

  • 936 posts
Using the HTC Vive with Steam's controllers somewhat works. It's not keyboard-fast, but I've gotten about as fast using my thumbs on the touchpads as I am typing on my phone. Honestly, until typing is at least on par with a keyboard, it's a major limitation. Maybe I'm just stubborn. I know people can type pretty fast on their phones, but even after two years on a touchscreen my brain still hasn't adjusted, so I'm pretty clumsy when typing on my phone or a touchpad.
  • Sciencerocks and starspawn0 like this

Current status: slaving away for the math gods of Pythagoras VII.


#3
Alislaws

    Democratic Socialist Materialist

  • 1,860 posts
  • Location: London

Using the HTC Vive with Steam's controllers somewhat works. It's not keyboard-fast, but I've gotten about as fast using my thumbs on the touchpads as I am typing on my phone. Honestly, until typing is at least on par with a keyboard, it's a major limitation. Maybe I'm just stubborn. I know people can type pretty fast on their phones, but even after two years on a touchscreen my brain still hasn't adjusted, so I'm pretty clumsy when typing on my phone or a touchpad.

I'm terrible with phone touchscreens, even after years!

 

I think you could do a lot with an AR/VR controller that was just a comfy pair of gloves (unhygienic? why?), but you'd basically have to learn a sign language to operate a PC that way, so I really hope BCI comes through for us on this in some kind of reasonable timeframe.

 

 

...may be one reason why Facebook is investing in it

I think Facebook just really wants to use mind reading on you in order to target you with ever more personal and specific ads. Everything else is just trying to find excuses to get you to give them access to your brain.


  • Erowind likes this

#4
Erowind

    Member

  • 936 posts
/\ Accurate gestures like sign language would be really cool. I learned how to touch type, so I could definitely learn a digital sign language. We need a universal standard. In fact, I wonder if we couldn't just literally use existing sign language.

Current status: slaving away for the math gods of Pythagoras VII.


#5
Alislaws

    Democratic Socialist Materialist

  • 1,860 posts
  • Location: London

/\ Accurate gestures like sign language would be really cool. I learned how to touch type, so I could definitely learn a digital sign language. We need a universal standard. In fact, I wonder if we couldn't just literally use existing sign language.

Plus everyone learning an existing sign language would be super convenient for deaf people!

 

I wonder if sign language (assuming you had a computer system that could interpret it) has a higher or lower bandwidth than speech or keyboard/mouse in terms of human-computer interaction.
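
Out of curiosity, here's a back-of-the-envelope comparison. The words-per-minute figures are rough ballpark assumptions (not measurements), and English text is taken at roughly 1 bit of entropy per character, following Shannon's classic estimate:

```python
# Rough input-rate comparison across channels. All wpm figures are
# ballpark assumptions for illustration, not measured values.

CHARS_PER_WORD = 6    # average English word plus trailing space
BITS_PER_CHAR = 1.0   # Shannon's entropy estimate for English text

rates_wpm = {
    "speech": 150,          # conversational speaking rate
    "keyboard": 60,         # competent touch typist
    "fluent signing": 130,  # often said to be comparable to speech
    "fingerspelling": 25,   # letter by letter, much slower
}

for channel, wpm in rates_wpm.items():
    bits_per_sec = wpm * CHARS_PER_WORD * BITS_PER_CHAR / 60
    print(f"{channel:>15}: {wpm:3d} wpm = {bits_per_sec:.1f} bits/s")
```

By this crude measure everything lands in the single-digit to low-double-digit bits-per-second range, which is part of why BCI people talk about raising the ceiling entirely rather than optimizing within it.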


  • Erowind likes this

#6
Erowind

    Member

  • 936 posts

/\ Accurate gestures like sign language would be really cool. I learned how to touch type, so I could definitely learn a digital sign language. We need a universal standard. In fact, I wonder if we couldn't just literally use existing sign language.

Plus everyone learning an existing sign language would be super convenient for deaf people!

I wonder if sign language (assuming you had a computer system that could interpret it) has a higher or lower bandwidth than speech or keyboard/mouse in terms of human-computer interaction.

There are a fair number of deaf people at my community college (around 10). I know the alphabet and some basic words in American Sign Language. I can't communicate quickly because I often have to spell things out letter by letter. But the other students who are fluent seem to be able to communicate as fast, or almost as fast, as people who speak. I had a crush on a deaf girl once. On one hand it was endearing that we had to type to each other on a phone because it was quicker than trying to sign. On the other hand it was difficult to build a friendship at first, because getting past small-talk levels of complexity was hard.
  • Alislaws likes this

Current status: slaving away for the math gods of Pythagoras VII.


#7
starspawn0

    Member

  • 866 posts
Tweets about Mike Ambinder's (Valve) talk today (Friday, March 22, 2019) at GDC 2019 on BCIs and the future of gaming:

https://mobile.twitt...139182897618945

 

#GDC19TALK Ambinder: when we add physiological data to the traditional inputs, we should be able to make new kinds of games.

https://mobile.twitt...139478491226113




#GDC19TALK Ambinder: what if you didn’t have to remember anything to play a game? Wouldn’t that change games?

https://mobile.twitt...139831848747008




#GDC19TALK Ambinder: what happens if you didn't have to use current interfaces at all?

https://mobile.twitt...140312830566400




#GDC19TALK Ambinder: what about all of the data we're missing from the player's experience? Are they sad or happy? Engaged or bored? What if we had access to internal states, emotions, cognition?

https://mobile.twitt...142165798608896




#GDC19TALK Ambinder: what if we could intercept motion intent (a motor signal takes ~100 ms to become movement), 30 ms before they produce movements of the hand? Would competitive players be interested? How far are they willing to go with invasive methods?

[Mentions EEG, MEG, and invasive options.]

https://mobile.twitt...142983100661760
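
[The intent-interception idea can be sketched without any neuroscience: watch a motor-related signal (EMG band power, say) and flag the first rise above a baseline-derived threshold. Everything below -- signal source, threshold rule, timings -- is an assumed toy setup, not Valve's method.]

```python
import numpy as np

def detect_intent_onset(signal, fs, baseline_secs=1.0, k=4.0):
    """Return the time (seconds) where `signal` first exceeds its
    baseline mean + k standard deviations, or None if it never does.

    signal: 1-D array, e.g. band power of a motor-related channel
    fs:     sampling rate in Hz
    """
    n_base = int(baseline_secs * fs)
    base = signal[:n_base]
    threshold = base.mean() + k * base.std()
    above = np.nonzero(signal[n_base:] > threshold)[0]
    return None if above.size == 0 else (n_base + above[0]) / fs

# Synthetic test: quiet baseline, then a ramp starting at t = 1.5 s.
fs = 1000
t = np.arange(0, 2.0, 1 / fs)
sig = np.random.default_rng(0).normal(0, 0.5, t.size)
sig[t >= 1.5] += np.linspace(0, 10, (t >= 1.5).sum())
print(detect_intent_onset(sig, fs))  # ~1.6, shortly after ramp onset
```

[The hard part, as the talk implies, is getting a signal clean enough that the threshold fires tens of milliseconds before the hand moves rather than after.]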
 

#GDC19TALK Ambinder: right now we can already measure learning, surprise, relaxation, affect, engagement. We only need to convince people to wear a helmet...

https://mobile.twitt...144800169611264




#GDC19TALK Ambinder: moment-to-moment insights can tell you how the player experiences events in the game in real time. More objective data would allow us to know if the player is experiencing exactly what we desired.

https://mobile.twitt...145109516279808




#GDC19TALK Ambinder: New data (knowing when somebody is about to quit, which players are toxic, whether the tutorial worked or not)

https://mobile.twitt...145563977535488




#GDC19TALK Ambinder: the middle line [on the slide] is arousal level with varying zombie counts. Is this curve what the game designer actually wanted?

https://mobile.twitt...146198600908800




#GDC19TALK Ambinder: Adaptive enemies will make difficulty levels a thing of the past, because difficulty would be adjusted according to what the player likes, not only win/lose data. Toxic players could be detected and automatically muted.

https://mobile.twitt...146764584484864
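
[The adaptive-difficulty loop he's describing is, at its simplest, a thermostat: hold a measured arousal signal inside a designer-chosen band by nudging spawn intensity up or down. A minimal sketch, where the arousal estimate, target band, and step size are all illustrative assumptions.]

```python
def adapt_difficulty(arousal, difficulty, target=(0.4, 0.7), step=0.1):
    """One tick of a bang-bang difficulty controller.

    arousal:    current arousal estimate in [0, 1], from whatever
                physiological/BCI sensor is available
    difficulty: current difficulty in [0, 1], e.g. zombie spawn rate
    target:     band the designer wants the player to stay inside
    """
    low, high = target
    if arousal < low:         # player bored: push harder
        difficulty += step
    elif arousal > high:      # player overwhelmed: ease off
        difficulty -= step
    return min(1.0, max(0.0, difficulty))

# Simulated session: difficulty chases the target band.
d = 0.5
for a in [0.2, 0.3, 0.5, 0.8, 0.9, 0.6]:
    d = adapt_difficulty(a, d)
    print(f"arousal={a:.1f} -> difficulty={d:.2f}")
```

[A real system would smooth the arousal estimate and rate-limit the changes so the player doesn't notice the hand on the dial.]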




#GDC19TALK Ambinder: how do you feel about the trees, or the heroes, or the healthbar in a game?

https://mobile.twitt...147741832110080




#GDC19TALK Ambinder: knowing your internal cognitive states, games would start to learn how players want to play. They will be adaptive. They will become a game just for you.

https://mobile.twitt...148240769761280




#GDC19TALK Ambinder: the end game of BCI is replacing interfaces completely and having a two-way connection between player intent and the game.

https://mobile.twitt...148924487467008




#GDC19TALK Ambinder: when you replace the interfaces with BCI, this can happen: type with your thoughts, control prosthetics with your brain, improve senses, augment cognition (focused attention), increase short-term memory capacity, learn quicker, or even interface with machines.

https://mobile.twitt...150061114458113




#GDC19TALK Ambinder: the challenges: neuroscience is in its infancy (how do you solve a problem, or feel love?). Working with neurological data now is like listening to the crowd at a football game and trying to work out what's happening on the field: you can only figure out the large events.

[I would say that the challenge is: how do we build better sensors?]

https://mobile.twitt...151790845427712
 

#GDC19TALK Ambinder: we get data from the players for a few hours at the beginning as calibration, to figure out what's really happening, because each player is different.
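
[That per-player calibration step, in its simplest form, is just normalizing each player's signals against their own baseline statistics so a shared decoder sees comparable inputs. A minimal sketch under that assumption; real calibration would also fit decoder weights per player.]

```python
import numpy as np

class PlayerCalibration:
    """Z-score live signals against statistics recorded while the
    player performs a known task at the start of a session."""

    def fit(self, calib):
        # calib: (samples, channels) recorded during calibration
        self.mean = calib.mean(axis=0)
        self.std = calib.std(axis=0) + 1e-8  # avoid divide-by-zero
        return self

    def transform(self, live):
        return (live - self.mean) / self.std

rng = np.random.default_rng(1)
session_start = rng.normal(5.0, 2.0, size=(5000, 8))  # player baseline
live = rng.normal(5.0, 2.0, size=(200, 8))
z = PlayerCalibration().fit(session_start).transform(live)
print(round(z.mean(), 2), round(z.std(), 2))  # ~0.0 and ~1.0
```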


  • Yuli Ban likes this

#8
starspawn0

    Member

  • 866 posts
Some YouTube dude made a video summarizing the contents of these and other Tweets:

YouTube video

And Ars Technica wrote up their take on Ambinder's talk:

https://arstechnica....n-game-rewards/
  • Yuli Ban likes this

#9
Alislaws

    Democratic Socialist Materialist

  • 1,860 posts
  • Location: London

Glad someone is thinking about this. It's a shame Valve very rarely do anything.

 

Still, there are suspicions Valve is planning some kind of VR console or something similar somewhere in the pipeline, so BCI could play into that.



#10
starspawn0

    Member

  • 866 posts
Video of Mike Ambinder's talk from GDC 2019:

https://www.gdcvault...es-One-Possible



