Welcome to FutureTimeline.forum

BCIs & Neurotechnology News and Discussions

cyberkinesis BCI psychotronics transhumanism bionics human enhancement brain computer interface transhuman cyborgs neuroscience

74 replies to this topic

#61
Yuli Ban

    Born Again Singularitarian

  • Moderators
  • 22,221 posts
  • Location: New Orleans, LA

Thinking about BCIs a bit tonight. 

 

Nothing profound or technical. Just a thought or two.

 

Imagine being a writer with access to a brain-computer interface that has already been tuned to your thought patterns and can recognize general concepts and symbols just by recording blood flow and oxygenation in the brain (roughly what fMRI and fNIRS measure), or something along those lines.

 

You might be like me and have an extremely active and vivid imagination.

 

This BCI could theoretically allow you to write a story just by maintaining a train of thoughts. The output might not be what you desire... But that's why you have a BCI, which can allow you to immediately rectify a mistake using the same process. 

 

If it can reconstruct words, it can build images as well (the same ability NLG models have, but operating through different means). 

 

This writer could conceivably construct picture books and, potentially, entire films just with his or her mind. 

 

They could then send it neurally onto the internet, where it could be read by others who neurally scroll through an e-reader. 

 

This is my particular dream above all: hands-free digital feedback. At my fastest, I can type at 86 words per minute; when I'm on a roll but not speeding, I can average roughly 45 to 60 words a minute for a couple hours non-stop.

 

A sufficiently advanced BCI might allow me to "type" the equivalent of 500 words a minute or more. 

To put that into perspective, the lower limit of a "novel" is usually put at 50,000 words. At 500 words a minute, this hypothetical BCI could let a writer finish a novel-length draft in about 100 minutes, a little over an hour and a half.
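The arithmetic behind that estimate, as a quick sketch (the 500 wpm rate is purely hypothetical):

```python
# Back-of-envelope: time to draft a 50,000-word novel at a steady
# words-per-minute rate. 60-86 wpm are ordinary typing speeds; 500 wpm
# is the hypothetical BCI figure from the post above.
NOVEL_WORDS = 50_000

def minutes_to_draft(words_per_minute: float, total_words: int = NOVEL_WORDS) -> float:
    """Minutes needed to produce total_words at a steady rate."""
    return total_words / words_per_minute

for wpm in (60, 86, 500):
    print(f"{wpm:>3} wpm -> {minutes_to_draft(wpm) / 60:.1f} hours")
```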


And remember my friend, future events such as these will affect you in the future.


#62
starspawn0

    Member

  • Members
  • 2,017 posts

I have written about something related to that, but not exactly that, in the past. What I wrote was about "program synthesis": say you instruct a computer to make you a videogame, and describe it in spoken words -- "I want this level of the game to be in outer space on a big planet." That one sentence is wildly under-specified. What color is the planet? How big is "big"? Is it an ice planet, a watery planet, a desert planet, a planet like Earth, or like the Moon -- how would one describe it? How about the sun, or suns? Lots and lots of questions have to be answered. By the time you've answered them all, you might as well have written the program in Java or Python, instead of in natural language.

 

Now, the program can attempt to make guesses about what would make a good game; but then you have lost "creative control".  BCIs offer a way to get around this, to maintain creative control.  For, as you are speaking about that game level, you are also thinking about it; and your thoughts convey all sorts of information about what you are expecting.  With a sufficient amount of training, a computer should be able to make a really good educated guess about what you would find pleasing.

 

So, that was what I wrote in a post a few months ago.  But I think it would also work with writing works of fiction -- both as a means of producing works you would find enjoyable, and also producing works that you share with others.  With a sufficient amount of training data, the computer ought to get pretty good at "filling in the blanks" from what you say, or even just what you think silently in your own mind.
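A toy sketch of that "filling in the blanks" idea. Every name and value here is invented for illustration; the preference table stands in for whatever a trained model would infer from the speaker's brain activity:

```python
# Hypothetical sketch: the spoken spec fixes a few slots; everything
# unspecified is filled from a preference model learned (in this
# scenario) from BCI recordings of the user. All names/values invented.

# What a trained model might have inferred this user tends to imagine:
INFERRED_PREFERENCES = {
    "planet_type": "desert",
    "planet_color": "rust red",
    "planet_radius_km": 8_000,
    "sun_count": 2,
}

def synthesize_level(spoken_spec: dict, preferences: dict) -> dict:
    """Explicit instructions win; the preference model fills the rest."""
    return {**preferences, **spoken_spec}

level = synthesize_level(
    {"setting": "outer space", "planet_size": "big"},  # all the user said
    INFERRED_PREFERENCES,
)
print(level["planet_type"])  # a slot the user never specified -> "desert"
```

The design point is creative control: the user's explicit words always override the guessed slots, so the model only answers the questions the spoken sentence left open.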



#63
starspawn0

    Member

  • Members
  • 2,017 posts
Gabe Newell on Brain-computer Interfaces: ‘We’re way closer to The Matrix than people realize’

https://www.roadtovr...people-realize/

Lots of quotable lines in this piece. Newell says:

“It’s an extinction-level event for every entertainment form that’s not thinking about [BCI]. If you’re in the entertainment business and you’re not thinking about this, you’re going to be thinking a lot more about it in the future.”


I suspect he has been in contact with some bleeding-edge BCI makers; and these aren't all working on EEG (which is limited).

#64
Yuli Ban

    Born Again Singularitarian

  • Moderators
  • 22,221 posts
  • Location: New Orleans, LA

Exactly. Just thinking about this has actually given me cold feet about writing some of my own stories, because I figured I'd get it all done with synthetic media + BCIs anyway (though I realize the tech isn't coming quite soon enough for that).




#65
starspawn0

    Member

  • Members
  • 2,017 posts
Facebook's mind-reading plans just took another step forward

A team of researchers sponsored by the social media giant has reached a new milestone in brain-computer interface technology.

https://www.zdnet.co...r-step-forward/
 

Multiple studies have already succeeded in understanding words and sentences based on human brain signals picked up by electrodes, but the accuracy and speed of the decoding process are still pretty low; UCSF's researchers said that error rates average 60% for 100-word vocabularies.

Using advanced machine learning for speech recognition and language translation, however, the team managed to translate neural activity into English sentences with an error rate of only 3% for vocabularies of up to 300 words. The new milestone follows up on a previous publication from the same team last summer, which detailed how the researchers successfully decoded speech directly from the brain in real-time.
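For reference, figures like the 60% and 3% above are typically word error rates (WER): the word-level edit distance between the decoded sentence and what was actually said, divided by the length of the reference. A minimal implementation:

```python
# Word error rate: Levenshtein edit distance over words (substitutions,
# insertions, deletions), normalized by reference length.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming table for edit distance.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("the quick brown fox", "the quick brown fox"))  # 0.0
print(wer("the quick brown fox", "the quack brown fox"))  # 0.25 (one substitution in four words)
```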


Probably not many of you will believe this, but I'll write it again anyway: there are two main limitations of these methods. One is the number of bits of information you can pull out of the brain each second; the other is the amount of training data. The former is solvable with next-gen wearable, non-invasive BCIs; and the latter is also solvable, once those BCIs arrive.
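A back-of-envelope illustration of the first limitation, using assumed round numbers (neither rate below is a measurement):

```python
# If a device extracts B bits/sec of usable information, and English
# text carries very roughly 10 bits of information per word (a common
# ballpark from Shannon-style entropy estimates), decodable throughput
# is capped accordingly.
def max_words_per_minute(bits_per_second: float, bits_per_word: float = 10.0) -> float:
    """Upper bound on decodable words per minute for a given bit rate."""
    return bits_per_second / bits_per_word * 60

print(max_words_per_minute(1.0))    # ~1 bit/s, the ballpark of EEG spellers
print(max_words_per_minute(100.0))  # a hypothetical next-gen wearable
```

On these assumptions, a 100x improvement in extracted bits per second is exactly a 100x improvement in the ceiling on decoded words per minute, which is why the bit rate is the binding constraint.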

You don't need to UNDERSTAND how the brain works to decode. These works you are seeing in the press on decoding are not breakthroughs in understanding the brain; they're more engineering feats about how to leverage limited data better.

In a very short period of time we will have thousands of times more data to train these machine learning models, and accuracy will improve at an exponential rate. It will be as hard to fathom as the spread and rising death toll of Covid-19. After the fact, people will say, "Well, it was predictable. It wasn't really a surprise. Why should anybody be astonished here in 2022 that we can decode speech from the brain using a wearable, about as accurately as a speech recognition system?"

Basically, what's happening is that there is a "technological overhang": we had the advanced ML and compute needed long before we had the wearable scanners; so, when the scanners come along, it will be like a dam bursting, as the data blasts forth into the ML models, improving the decoding.


#66
tomasth

    Member

  • Members
  • 302 posts

So brain decoding is going the way deep neural nets did in the 2010s? One major improvement (speech decoding playing the role image recognition did), driving many unexpected uses, new capabilities, and accumulating new overhangs?

 

 

 

With massive scanning of animal and human brains, more detailed tracking of cognition will improve the artificial versions. (How far could neuroscience advance if it had 1,000,000 times the scanners and human analyzers to study natural intelligence?)



#67
starspawn0

    Member

  • Members
  • 2,017 posts

It's going to happen even faster than the post-2010 Deep Learning revolution. Deep Learning took off rapidly around 2010 due to:

 

1. More GPUs being thrown at problems.

 

2. Large new datasets like ImageNet becoming available.

 

3. Engineering tweaks that made neural net models work better -- better initialization, for example.

 

BCIs will enable #2 to happen very quickly -- large datasets will be available; and, at the same time, we won't have to wait for #1 and #3 to mature -- they are already mature. I don't see anything stopping the phenomenally fast advancement of this field. Anybody who says otherwise hasn't thought it through.



#68
starspawn0

    Member

  • Members
  • 2,017 posts

The documentary film "I Am Human" is now available for streaming:

 

https://www.digitalt...an-documentary/

 

It was made by Taryn Southern, Elena Gaby (and I think also Bryan Johnson).  Bryan Johnson appears several times in the film.  His company Kernel is mentioned a few times.  

 

....

 

I think BCIs might help with tetraplegia, but I doubt they will help much with Parkinson's, ALS, and Alzheimer's, except to diagnose those diseases very early -- and that won't help people who are already too far gone. Some of these diseases are due to motor-neuron death. The right solution to something like that is going to be something like stem-cell injections, CRISPR therapies of some kind, and the like. BCIs are the wrong tool.

 

For tetraplegia, a better solution might be attempting to regrow neurons and fibers that have been damaged.  So, it's going to be more a genetics-type solution, not a digital electronics one.



#69
starspawn0

    Member

  • Members
  • 2,017 posts
About 47 minutes in (at least in the Google Play version of the film), Bryan Johnson gives a glimpse into Kernel's wearable BCI. At the very end (1 hour, 26 minutes, 17 seconds in), they say they "plan to release the world's first high-resolution, wearable brain interface in 2021". But you'll probably hear more about what it can do before then.

#70
Yuli Ban

    Born Again Singularitarian

  • Moderators
  • 22,221 posts
  • Location: New Orleans, LA

Brain implant and signal decoder have done the impossible and reversed paralysis

Paralysis used to mean a life sentence of immobility with no way out—until now.
 
Back in 2010, Ian Burkhart suffered a devastating injury that left him mostly paralyzed. Although he could still move his shoulders and elbows, he had lost sensation in his hands. That was until Patrick Ganzer at Battelle Memorial Institute developed a brain implant that would turn Burkhart's life around. Connected to a specialized brain-computer interface, the implant does something that has never been done before: it restores both movement and touch in his right hand.




#71
Yuli Ban

    Born Again Singularitarian

  • Moderators
  • 22,221 posts
  • Location: New Orleans, LA

Why are BCIs stuck in a limbo of relevance? People seem to regard them as both ultra-high-tech and overhyped: tools that are nowhere near as useful as they ought to be.

 

Of course, the reason is simple: our toolset is outdated. The two most common methods are EEG headsets and invasive implanted electrodes.

 

EEG, however, is rather indirect. We've had EEG for a century now, and most of the progress in that century has amounted to less noisy readings. EEG really records electrical signals at the scalp, smeared by the skull. Even with AI to clean up the signal, there's only so much resolution you can recover.
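A toy illustration of that ceiling, with synthetic numbers standing in for real EEG: averaging repeated trials only improves the signal-to-noise ratio by the square root of the trial count, so a tenfold cleaner estimate costs a hundredfold more data.

```python
import random, statistics

# Synthetic stand-ins, not real EEG values: a small evoked response
# buried in much larger per-trial noise. Averaging n trials shrinks the
# noise on the estimate by a factor of sqrt(n).
random.seed(0)
TRUE_SIGNAL = 1.0   # evoked response, arbitrary units
NOISE_SD = 5.0      # per-trial noise, dwarfing the signal

def averaged_estimate(n_trials: int) -> float:
    """Mean of n noisy observations of the same underlying signal."""
    return statistics.mean(
        TRUE_SIGNAL + random.gauss(0, NOISE_SD) for _ in range(n_trials)
    )

for n in (1, 100, 10_000):
    print(f"{n:>6} trials -> estimate {averaged_estimate(n):+.2f}")
```

With 10,000 trials the standard error is 5.0 / sqrt(10,000) = 0.05, so the estimate sits close to 1.0; with a single trial it's anywhere within several units of the truth.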

 

Invasive electrodes, meanwhile, require surgery and direct contact with your neurons. Most people aren't willing to go to such lengths, and the procedures tend to be expensive.

Now, I owe EEG tech everything for getting me fascinated with BCIs in the first place: it was the Mindflex toy from 2009 that made me realize this tech was possible at all.

 

But the thing is, you could re-release the Mindflex today and it'd still be just as mind-blowing and futuristic, despite being based on the same principles and comparable in power. Maybe you could use an Emotiv Insight headset this time around? The point is that this is a fairly slow-developing field.

Over the past two decades, whenever some start-up talked about using non-invasive BCIs to do something futuristic (e.g. control a video game, type on a smartphone or PC, use your mood to curate a playlist), it was always based on EEG technology since that was the cheapest to do and easiest to commercialize. But we've largely done everything we can do with EEGs, leading to a sense that there's nothing else exciting to be done with them.

Even news of Chinese scientists managing to record a shockingly high number of characters per second using EEG made little impact. Sure, that's an improvement over previous accomplishments, but it's just eking out better performance.

 

And few people trust sticking wires and electrodes into their brains. This is one of the biggest drawbacks of Neuralink, even though it might prove to be the most accurate of all BCI methods when its time comes. There'd need to be a massive, pro-transhumanist culture change for people to accept invasive BCIs in any large number.

 

People are disillusioned by BCIs. To them, all the good stuff is too far out into the future.

 

Perhaps we need something to shake things up and re-introduce an element of growth. Maybe even future-shock people....

True, plenty of people won't pay attention for a while, simply because BCIs have been fairly disappointing for so long. But it'll catch on soon enough.




#72
starspawn0

    Member

  • Members
  • 2,017 posts

And, like Deep Learning, there are large revenue streams the next-gen BCI companies can plug into, to generate money to keep them rolling along for years; but they're probably not as easy to tap as the ones for Deep Learning; and there will be a lot of competition. 

 

Deep Learning is important for image, speech, and face recognition; and companies are willing to pay very large sums of money to have those capabilities.  But what is the equivalent for BCIs? 

 

The elephant in the room here is "neuromarketing".  I'm guessing it's nowhere near as large yet in value as what DL can offer; but it's still considerable -- and will grow at an exponential rate, with better BCI technology to run experiments.  

 

Yet another controversial use with large pots of money behind it is to be found in authoritarian governments. You can bet that China will develop the technology very quickly, now that it is known to be possible at near consumer-market prices (and, soon, the price of a smartphone). It will be used to monitor factory workers and dissidents once the price falls even lower, down to the $250 range.

 

The U.S. government might also find the tech to have "strategic interest", and pump money into U.S. BCI companies -- not unlike how it routinely pumps money into projects like electric cars and, of course, various projects by "defense contractors". The military applications of this technology are immense; and I could see the DOD, specifically, throwing some of its money at the most innovative BCI companies to get the tech onto the battlefield. The way it would probably work is through some kind of purchase agreement, whereby the USG buys hundreds or thousands of BCIs at full market price. This might take some pitching by the BCI companies.

 

Another source of funding might be DOD (e.g. DARPA) grants.  I don't happen to know of any that fund private corporations like this; but I'm sure they exist.  (All the NSF grants I'm familiar with are for funding of universities, not private entities.)  

 

Addendum: Of course, there is also buy-out potential. Facebook is ahead of other companies in developing BCIs; if those companies want to catch up, they may need to buy a BCI startup.



#73
Yuli Ban

    Born Again Singularitarian

  • Moderators
  • 22,221 posts
  • Location: New Orleans, LA




#74
Erowind

    Anarchist without an adjective

  • Members
  • 1,519 posts

What kind of battery does Neuralink use? Sticking a lithium-ion battery inside someone's head is an exploding head waiting to happen. Implantable computing should ideally be powered by the body itself, or by some other cybernetic implant that can convert caloric intake into electricity. Something like pic related.

 

[attached image: GHN0Xgf.png]

I want a dynamo in my belly



#75
wjfox

    Administrator

  • Administrators
  • 12,572 posts
  • Location: London

Well, here's my opinion on it. An extremely bold and visionary concept, with potential to have massive impact in the future. But I think I'll wait a few decades before trying one for myself. :) It's still just a prototype, and at such an early stage, the risk of something going wrong is too high.

 

But imagine what future versions might look like.

That bit in the livestream where it shows the pig's brain signals makes me think of those old clips you sometimes see from the 1970s, where they show off "the latest computer" and it looks laughably primitive compared to now. That's how this Neuralink device might appear to people in 40 years' time. So imagine a version in, say, 2050 or 2060 that's a million or even a billion times more advanced: it could store your memories or record your dreams, for example, and then perhaps by 2100 you could literally upload your entire consciousness to a new body.






