
The Spreadsheet of the 21st Century



#1
starspawn0
Member · 1,961 posts

I don't know how many of you have ever played around with Google's Colab, but it is a wonder to behold:

https://colab.research.google.com/

If you've suffered through using Matlab and other systems to analyze data (as I have), you will appreciate what a great improvement it is!

You can write little Python programs in your web browser and run them on Google servers, for free. This is especially useful if you want to do data analysis on a Chromebook, say. Using Numpy and Pandas and a few other libraries (e.g. Tensorflow), you can do just about anything -- e.g. import and analyze spreadsheet files, MP3 and MP4 files, and so on. So very convenient...
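To give a flavor of how little code this takes, here is a minimal sketch, assuming a hypothetical spreadsheet called sales.csv uploaded to the Colab session (the file name is made up for illustration):

import pandas as pd

# Load a hypothetical spreadsheet that was uploaded to the session.
df = pd.read_csv("sales.csv")

# One line gets you summary statistics for every numeric column.
print(df.describe())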

(Also, I recommend looking up various Jupyter Notebooks by researchers that you can run on Colab. There are a lot of them out there!)

I've been thinking about analyzing some of my EEG data this way, which can be exported to Colab as a CSV file (basically, a spreadsheet, with one column per EEG channel).

What this has got me wondering is just how much further it can be pushed, and what its impact will be. A few things to wish for:

* Removal of the last little bit of interface plumbing you have to bother with. No need to mount and unmount drives and things (see the first sketch after this list for what that looks like today). Make it as simple and painless as possible, so that you don't have to spend hours on this unimportant stuff.

* Add the ability to write "programs" in natural language, like in Wolfram Alpha. For example, something like:
 

Import my EEG1.csv file. Tell me the average value of each channel over 10 seconds.


And maybe the file has columns "Time step (milliseconds)", "Channel 1", "Channel 2", etc.; and it infers that "10 seconds" means you need to average over 10,000 rows. (A rough Python equivalent appears in the second sketch after this list.)

* Add a data analysis "Assistant" to answer any questions you may have.
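For reference, the "plumbing" in the first item currently looks like this in a Colab Notebook -- this is the standard google.colab mounting call, shown just so you can see what I'd like to disappear:

# Mounting Google Drive in Colab today -- exactly the kind of
# plumbing the first wish above would do away with.
from google.colab import drive
drive.mount('/content/drive')  # prompts you for an authorization code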

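And here is roughly what the natural-language request in the second item would have to compile down to today -- a minimal Python sketch, assuming the hypothetical EEG1.csv layout described above, with one row per millisecond:

import pandas as pd

# Hypothetical file from the example: a "Time step (milliseconds)"
# column plus one column per EEG channel.
df = pd.read_csv("EEG1.csv")

# 10 seconds at one row per millisecond = 10,000 rows.
window = df.head(10_000)

# Average each channel over that window (drop the timestamp column).
print(window.drop(columns=["Time step (milliseconds)"]).mean())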
Anything else? I'm sure if there is, Google will think of it.

I think in the years ahead we will see a dramatic expansion in the use of machine learning and data analysis, now that you don't need large computational resources (a Chromebook will suffice) and don't need to waste lots of time on the technical issues these libraries handle for you. Basically, just about anybody out in the middle of the cornfields or somewhere can now do very high-level data analysis on their laptop. It's like a spreadsheet for the 21st century -- and it goes far, far beyond what was possible before.



#2
starspawn0
Member · 1,961 posts
Incidentally, among the vast, vast number of Colab Notebooks you can run are implementations of Microsoft's DialoGPT chatbot, complete with a decoder. I was playing around with one of these just now. You can find a link to it here:

https://colab.resear...qgrvOQLqumUyOdA

(I saw it on the Machine Learning subreddit.)

If you click on this, it will take you to the Notebook. Click to open it in "Playground" mode. Then, in each of the code cells, click the play icon on the left to run the code -- do this for each of the cells, down to the bottom. The whole process should take just a few minutes. It has to load and configure various libraries, including Huggingface's "Transformers" library, which provides a GPT-2 model and decoder.
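If you're curious what all those cells amount to, the heart of it is probably something like the standard Huggingface pattern for DialoGPT -- this sketch is adapted from their documented usage, not necessarily the exact code in that particular Notebook:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Download Microsoft's pretrained DialoGPT model and its tokenizer.
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

chat_history_ids = None
for _ in range(5):  # a short five-turn chat
    # Encode the user's line, with the end-of-string token appended.
    new_ids = tokenizer.encode(input(">> You: ") + tokenizer.eos_token,
                               return_tensors="pt")
    # Keep the running conversation so the model sees the history.
    bot_input_ids = (torch.cat([chat_history_ids, new_ids], dim=-1)
                     if chat_history_ids is not None else new_ids)
    # Greedy decoding here; the fancier Notebooks swap in better decoders.
    chat_history_ids = model.generate(bot_input_ids, max_length=1000,
                                      pad_token_id=tokenizer.eos_token_id)
    reply = tokenizer.decode(chat_history_ids[:, bot_input_ids.shape[-1]:][0],
                             skip_special_tokens=True)
    print("DialoGPT:", reply)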

When you get to the bottom of the page, the very last cell will bring up a terminal that you can use to talk to DialoGPT. It's very good over a single turn, and does OK over longer exchanges -- but it clearly has its limits. I asked it a few questions about music, its age, and its favorite books, and it did a passable job -- not good enough to pass a Turing Test, but not bad, either.

There are other Notebooks with implementations of DialoGPT that might do a better job. Small tweaks have the potential to make large improvements to the output.

....

If you're interested in using Colab, there is a series of YouTube videos that Google has put out. For example, here is one that discusses how to use a free GPU and/or TPU:

[embedded YouTube video]

All it takes is one click of the mouse.
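Once you've switched to a GPU runtime, it's worth a quick sanity check that your code can actually see the accelerator -- either of these standard calls will do:

# Confirm the free accelerator is visible after switching runtimes.
import tensorflow as tf
print(tf.test.gpu_device_name())   # e.g. '/device:GPU:0' when a GPU is attached

import torch
print(torch.cuda.is_available())   # True when PyTorch can see the GPU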

I think once we enter the era when there are massive-scale neuroimaging datasets to play with, Colab will make it very easy to use that data to build AI models. I've been reading about the data standards that people use, e.g. the BIDS standard, and it looks pretty complicated. But, fortunately, there are libraries that will make the job a lot easier, and these might end up getting pre-installed in Colab -- making it even easier still.

I mean, if you personally had a high-res BCI, what could you do?

You could, for example, scan your own brain as you listen to audio recordings with your eyes closed, while lying down in a dark room; and then, you could save a copy of your brain scan file to Google Drive, along with a copy of the audio track. Next, you could write a little Python program in Colab that loads up these two files and converts them into data arrays (using Pandas and Numpy), temporally aligns them, and then in just a few lines of code, you could train a neural net model to predict your brain responses, given current and previous audio, and previous brain responses. You don't even have to know how to differentiate or how the Backpropagation algorithm works -- all that's handled for you by the libraries. You do have to specify the architecture -- but you could probably find some standard ones to use in the literature (or do some trial-and-error).
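Here's a minimal sketch of what I mean, using Keras. The file names, the assumption of a shared sampling rate, and the window length are all made up for illustration -- the point is just how few lines the whole pipeline takes:

import numpy as np
import pandas as pd
import tensorflow as tf

# Hypothetical files saved to Drive: the brain recording and the audio
# features for the track that was playing, assumed to be CSVs already
# resampled to a shared rate so the rows line up in time.
brain = pd.read_csv("brain_scan.csv").to_numpy(dtype="float32")
audio = pd.read_csv("audio_features.csv").to_numpy(dtype="float32")

n = min(len(brain), len(audio))  # crude alignment: truncate to the shorter file
window = 50                      # feed the model the previous 50 time steps

# Inputs: recent audio plus recent brain activity. Targets: the next brain sample.
X = np.stack([np.concatenate([audio[i - window:i].ravel(),
                              brain[i - window:i].ravel()])
              for i in range(window, n)])
y = brain[window:n]

# A deliberately simple architecture; in practice you'd crib a standard
# one from the literature, as noted above.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(y.shape[1]),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=64)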

I mean, it's just a few hours of work to set this up -- it's that easy with Colab. A high school student without a lot of technical training could do it. That's not an exaggeration!

So, I hope you can see why I'm excited by it!

And of course Colab will also make brain decoding a lot easier to pull off. It will be enormously easier to train little machine learning models to decode your brain -- to tell what you are thinking -- once you have enough data to play with (when next-gen BCIs arrive).
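By "little machine learning models" I mean things as simple as this toy decoding sketch in scikit-learn -- random stand-in arrays are used where the real recordings would go:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Stand-in data: 200 trials x 64 channels of "brain" features, plus a
# binary label for what the person was thinking on each trial.
# (Random numbers here -- you'd load real recordings instead.)
X = np.random.randn(200, 64)
y = np.random.randint(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("decoding accuracy:", clf.score(X_test, y_test))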



