Today we will talk about Neuralink, Elon Musk’s neural engineering company that he created to develop brain-machine interfaces. And your first question likely is, why talk about Neuralink now? There was a recent event, and another one last year as well, why did I not cover that? Well, the launch event from last year indeed promised a great deal. In this series, we often look at research works that are just one year apart, and marvel at the difference scientists have been able to make in that tiny-tiny timeframe. So first, let’s talk about their paper from 2019, which will be incredible, and then, see how far they have come in a year, which, as you will see, is even more incredible.
The promise is to be able to read and write information to and from the brain. To accomplish this, as of 2019, they used this robot to insert the electrodes into your brain tissue. You can see the insertion process here. From the close-up image you might think that this is a huge needle, but in fact, this needle is extremely tiny; you can see a penny for scale here. This is the story of how this rat got equipped with a USB port. As this process is almost like inserting microphones into its brain, we are now able to read the neural signals of this rat. Normally, these are analog signals, which are read and digitized by Neuralink’s implant, and now this brain data is represented as a digital signal. Well, at first, this looks a bit like gibberish.
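To make this digitization step a bit more concrete, here is a toy Python sketch of what an analog-to-digital converter does: it samples a continuous voltage trace at a fixed rate and quantizes each sample into an integer code. The sampling rate, bit depth, voltage range, and the stand-in sine signal are all illustrative assumptions, not Neuralink’s actual specifications.

```python
import math

# Illustrative ADC parameters (assumptions, not Neuralink's real specs).
SAMPLE_RATE_HZ = 20_000
BITS = 10
LEVELS = 2 ** BITS          # 1024 quantization levels
V_MIN, V_MAX = -1.0, 1.0    # assumed input voltage range

def analog_signal(t):
    """Stand-in for a continuous neural voltage trace (a 300 Hz sine)."""
    return 0.5 * math.sin(2 * math.pi * 300 * t)

def digitize(duration_s):
    """Sample the analog signal and quantize each sample to an integer code."""
    n = int(duration_s * SAMPLE_RATE_HZ)
    codes = []
    for i in range(n):
        v = analog_signal(i / SAMPLE_RATE_HZ)
        # Clamp to range, then map [V_MIN, V_MAX] onto [0, LEVELS - 1].
        v = min(max(v, V_MIN), V_MAX)
        codes.append(round((v - V_MIN) / (V_MAX - V_MIN) * (LEVELS - 1)))
    return codes

samples = digitize(0.001)   # one millisecond -> 20 integer samples
print(len(samples), min(samples), max(samples))
```

The stream of integer codes is the “gibberish” you see on screen: it carries the full voltage trace, but it only becomes useful once further processing finds structure in it.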
Do we really have to undergo brain surgery to get a bunch of these squiggly curves? What do these really do for us?
Well, now that they are digitized, we can have the Neuralink chip analyze these signals and look for action potentials in them. These are also referred to as “spikes” because of their shape. That sounds a bit better, but still, what does this do for us? Let’s see. Here we have a person with a mouse in their hand. This is an outward movement with the mouse, and then, reaching back. Simple enough. Now, what you see here below is the activity of an example neuron. When nothing is happening, there is some firing, but not much activity, and now, look! When reaching out, this neuron fires a great deal, and suddenly, when reaching back, again, nearly no activity. This means that this neuron is tuned for an outward motion, and this other one is tuned for the returning motion. And all this is now information that we can read in real time, and the more neurons we can read, the more complex the motion we can decode. Absolutely incredible. However, this is still a little difficult to read, so let’s order the neurons by what kind of motion makes them excited. And there we go! Suddenly, this is a much more organized way to present all this neural activity, and now we can detect what kind of motion the brain is thinking about. This was the reading part, and that’s just the start. What is even cooler is that we can invert this process: read this spiking activity, and just by looking at it, reconstruct the motion the human wishes to perform. With this, brain-machine interfaces can be created for people with all kinds of disabilities where the brain can still think about the movements, but the connection to the rest of the body is severed. Now, these people only have to think about moving, and then, the Neuralink device will read it and perform the cursor movement for them. It really feels like we live in a science fiction world. And all this signal processing is now possible automatically and in real time, and all we need for this is this tiny-tiny chip that takes up just a few square millimeters.
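The two steps described here can be sketched in a few lines of Python: a spike is commonly found as an upward threshold crossing in the voltage trace, and a crude decoder can tell “outward” from “returning” motion by checking which tuned neuron fired most. The threshold, the simulated trace, and the two-neuron setup with its names are illustrative assumptions, not Neuralink’s actual pipeline.

```python
import random

random.seed(0)

def detect_spikes(trace, threshold):
    """Return sample indices where the signal crosses the threshold upward."""
    return [i for i in range(1, len(trace))
            if trace[i - 1] < threshold <= trace[i]]

def decode_direction(spike_counts):
    """Pick the motion whose tuned neuron fired the most.
    spike_counts maps hypothetical neuron names to spike counts."""
    return max(spike_counts, key=spike_counts.get)

# Simulated voltage trace: baseline noise with three injected spikes.
trace = [random.gauss(0.0, 0.1) for _ in range(300)]
for t in (50, 120, 210):
    trace[t] += 1.0

spikes = detect_spikes(trace, threshold=0.5)
print("spike times:", spikes)

# During an outward reach, the outward-tuned neuron dominates,
# so the decoder reports an outward motion.
print(decode_direction({"outward_neuron": 42, "return_neuron": 3}))
```

Real decoders fit a model over many neurons at once rather than taking a per-neuron maximum, but the principle is the same: spike rates, sorted by what each neuron is tuned for, are enough to read out intended motion.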
And don’t forget, that is just version one from 2019. Now, onwards to the 2020 event, where it gets even better.
The Neuralink device has been placed into Gertrude the pig’s brain, and here, you see it in action. We see the raster view here, and luckily, we already know what it means: it lays bare the neural action potentials before our eyes, or in other words, which neuron is spiking and exactly when. Below, in blue, you see these activities summed up for our convenience, and this way, you will not only see, but hear it too, as these neurons are tuned for snout boops. In other words, you will see and hear that the more the snout is stimulated, the more neural activity it will show. Let’s listen. And all this is possible today, and in real time. That was one of the highlights of the 2020 progress update event, but it went further. Much further! Look! This is a pig on a treadmill, and here you see the brain signal readings. This signal, marked with the circle, shows where a joint or limb is about to move, while the other, dimmer-colored signal is the chip’s prediction as to what is about to happen. It takes periodicity into consideration, and predicts higher-frequency movement, like these sharp turns, really well. The two are almost identical, and that means exactly what you think it means – today, we can not only read and write, but even predict what the pig’s brain is about to do. And that was the part where I fell off the chair when I watched this event live. You can also see the real and predicted world-space positions for these body parts as well. Very close.
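Why does periodicity help so much with prediction? Because a signal that repeats, like a limb position on a treadmill, can be forecast from just its last few samples. Here is a toy Python sketch of that idea using a simple second-order autoregressive model fit by least squares; this is a generic textbook technique on a made-up sine-wave “joint angle”, not the model Neuralink uses.

```python
import math

def fit_ar2(series):
    """Least-squares fit of x[t] ~ a*x[t-1] + b*x[t-2] via 2x2 normal equations."""
    xs1 = series[1:-1]   # x[t-1]
    xs2 = series[:-2]    # x[t-2]
    y = series[2:]       # x[t]
    s11 = sum(v * v for v in xs1)
    s22 = sum(v * v for v in xs2)
    s12 = sum(u * v for u, v in zip(xs1, xs2))
    sy1 = sum(u * v for u, v in zip(y, xs1))
    sy2 = sum(u * v for u, v in zip(y, xs2))
    det = s11 * s22 - s12 * s12
    a = (sy1 * s22 - sy2 * s12) / det
    b = (sy2 * s11 - sy1 * s12) / det
    return a, b

# A noise-free periodic "joint angle" signal (illustrative stand-in).
signal = [math.sin(0.2 * t) for t in range(200)]
a, b = fit_ar2(signal)

# One-step-ahead prediction for the next, unseen sample.
pred = a * signal[-1] + b * signal[-2]
true_next = math.sin(0.2 * 200)
print(round(pred, 4), round(true_next, 4))
```

For a pure sinusoid this model is exact (a = 2·cos(ω), b = −1), so the prediction matches the true next value; real neural decoders face noise and many channels, but the same principle explains why rhythmic treadmill movement is especially predictable.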
Now note that there is a vast body of research in brain-machine interfaces, and many of these things were possible in lab conditions; Neuralink’s quest here is to make them accessible to a wider audience within the next decade. If this project further improves at this rate, it could help many paralyzed people around the world live a longer and more meaningful life, and the neural enhancement aspect is also not out of the question. Just thinking about summoning your Tesla might also summon it, which sounds like science fiction, and based on these results, you see that it may even be one of the simplest tasks for a Neuralink chip in the future. And who knows, one day, maybe, with this device, these videos could be beamed into your brain much quicker, and this series would have to be renamed from Two Minute Papers to Two Second Papers, or maybe even Two Microsecond Papers. They might actually fit into two minutes like the title says, now that would truly be a miracle. Huge thanks to the scientists at Neuralink for our discussions about the concepts described in this video and for ensuring that you get accurate information. This is one of the reasons why our coverage of the 2020 event is way too late compared to many mainstream media outlets, which leads to a great deal fewer views for us, but it doesn’t matter. We are not maximizing views here, we are maximizing learning. Note that they are also hiring; if you wish to be a part of their vision and work with them, make sure to apply!