Published on Explorable.com (https://explorable.com)



Future of Sensors and Sensing

By oskar

David Eagleman explains how the blind can see with their tongues and the deaf can hear with their bodies. The technology is already here!

Full Transcription

Speakers: David Eagleman and Scott Novich.

Venue: Being Human 2013. Nourse Theatre, San Francisco, CA, September 28, 2013.

David: I’m going to start by talking about the future of being human by going back to a quotation from 1670, from Blaise Pascal, who pointed out that man is equally incapable of seeing what’s happening at the infinitely small levels from which he emerges as he is incapable of seeing what’s happening at the infinity in which he is engulfed. In other words, we’re all trapped on this infinitely small strip between atoms and galaxies, and we don’t really see most of what’s happening in our reality.

That was 350 years ago, and the march of science has only made that worse, in the following sense: what we’ve realized is that we don’t see most of what’s happening even at our own spatial scales. Let me give you an example of this. Take the electromagnetic radiation that we call visible light. It bounces off of things and strikes photoreceptors in the back of our eyes, which send electrical signals to our brain, and we see. We’re surrounded by this electromagnetic radiation that we love. But it turns out there’s lots more electromagnetic radiation; we could call it all light.

The thing is that this little strip that we think of as the part we can see is one ten trillionth of what’s going on out there. Right now, passing through your body is NPR and CNN and gamma rays and x-rays and microwaves, all kinds of things that are completely invisible to you even though it’s the same stuff.

Now, why is it invisible to you? Because you don’t have specialized biological receptors for picking it up. As far as you’re concerned, it doesn’t even exist. It turns out that not all animals are limited to the same range we are. For example, snakes see in the infrared range; they can see a little bit farther out than we can. Honey bees see in the ultraviolet range, and bees and flowers communicate with each other in this range that we can’t see.

Of course, we build machines in the dashboards of our cars to pick up on radio waves, and we build machines in hospitals to pick up on x-rays and things like that. We can’t see any of that stuff ourselves. So what we’re able to experience, what your human experience is, is delimited by the biological receptors you happen to come to the table with. Isn’t that stunning?

It turns out that every animal, as you look across the animal kingdom, has its own window on reality. The reason is that they have different biological receptors. For the blind world of the tick, the signals that it gets from its ecosystem are heat and the odor of butyric acid. Those are the signals it’s picking up on to figure out what its world is.

For the black ghost knifefish, what it’s picking up on is electrical fields. That’s what it’s getting from its ecosystem. For the blind, echolocating bat, it’s air compression waves. The key is that these are all the signals each one gets, and we have a word for this in science: this little window on your reality, based on what you can pick up on, is called the umwelt, a German word that means the surrounding world.

The really stunning part is that somehow all of us, and presumably animals and so on, imagine that our umwelt is the entire objective reality out there. For example, in The Truman Show, Truman lives in this world that’s constructed on the fly by this intrepid producer. In the movie there’s a part where an interviewer asks the producer, “Why do you think it is that Truman has never come close to discovering the true nature of his world?” The producer says, “We all accept the reality of the world as it’s presented to us.” It’s one of those rare Hollywood lines that nailed it, right?

Here’s the thing. I want to do a consciousness-raiser on this so we can really get what it’s like. I want you to think about what it would be like to be blind. Let’s say you’re born blind; what’s that like? Do you think it’s a blackness, or a hole where vision is supposed to be? If that’s your intuition, it turns out it’s wrong. To understand why it’s wrong, let’s take a thought experiment. Imagine that you are a bloodhound. Your entire world is about scent; it’s about olfaction, smell. You’ve got this really long snout with 200 million olfactory receptors housed in it. You’ve got wet nostrils that attract and trap scent molecules. You have slits in your nostrils that allow them to flare when you breathe in, so you bring in more odor.

You have these long floppy ears so that, as you’re walking along, you’re kicking up scent molecules and smelling them. Your whole world is about smelling.

Let’s say one day you’re following your master, your human master, and you stop in your tracks with a revelation, and you think, “My god, what is it like to have the pitiful little nose of a human? What is it like to take in a feeble little noseful of air? You don’t smell anything.” That must be like a hole where smell is supposed to be, or a blackness, that they’re missing.

Of course, since we all have the experience of being human, we know that we don’t miss it. Why? Because we never experienced it. It’s not part of our umwelt and therefore we didn’t even think of it as something that we are missing that could have been a part of our umwelt. So it goes with the congenitally blind, they don’t have darkness or a hole there, they’re not missing it because they never had it.

Here’s another example: if you’re color blind, you can’t even imagine that other people are seeing hues until you’re told that this is true, and that other people are having an experience you can’t. If you’ve got normal color vision and you feel pretty good about that, just keep in mind that we now know that some fraction of the female population has not just three types of color photoreceptors at the back of the eye, in the retina, but four types, which means that they’re experiencing colors that the rest of us can’t see.

For those of you who are not members of this small population of tetrachromatic females, you’ve just discovered that you’ve got an impoverishment that you didn’t even know about and didn’t even miss.

Now to the context of this talk. What I’m really interested in is, as we move forward, how is our technology going to expand our umwelt, and therefore the experience of being human? I promised myself, as a rule for this talk, that I’m not going to extrapolate into the future. What I’m going to do is take innovations that are present right now, things that are happening right now, and ask, in this context, how we are using our technologies to expand our experience of being a human being.

Many of you probably know that there are hundreds of thousands of people walking around right now with artificial retinas and artificial inner ears. This has been a tremendously successful technology. Essentially, you take a camera, digitize the signal, and stick it on a chip at the back of the retina, which then talks to the optic nerve; or you take a microphone, digitize the signal, and put a multi-electrode array into the inner ear, which communicates with the auditory nerve.
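Both devices share that “digitize the signal” step. As a rough illustration only (the 4-bit depth, the 8 kHz rate, and the function name here are invented for this sketch, not details of any real implant), sampling and quantizing a waveform looks like this:

```python
# Toy illustration of the "digitize the signal" step both implants share:
# sample a continuous waveform and quantize each sample to a small number
# of levels before it is handed to the stimulating chip.
# The 4-bit depth and 8 kHz rate are illustrative choices only.
import math

def digitize(signal, sample_rate=8000, duration=0.001, bits=4):
    """Sample signal(t) at sample_rate and quantize to 2**bits levels."""
    levels = 2 ** bits
    out = []
    for i in range(int(sample_rate * duration)):
        t = i / sample_rate
        s = signal(t)                          # continuous value in [-1, 1]
        q = round((s + 1) / 2 * (levels - 1))  # map to an integer code 0..levels-1
        out.append(q)
    return out

# Usage: one millisecond of a 1 kHz tone becomes eight 4-bit codes.
codes = digitize(lambda t: math.sin(2 * math.pi * 1000 * t))
```

The point of the sketch is just that what reaches the nerve is a stream of discrete codes, not the original sound or light.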

What these have proven is that we can marry our technology to our biology. Here’s the really critical part: as recently as 20 years ago, many or most scientists figured that these wouldn’t work. Why would they think that? It’s because of the way we build these technologies: these chips and electrode arrays actually speak a slightly different language than our natural sensory organs do.

The question is, how is the brain going to figure out this other language? It’s not quite right, and it doesn’t do all the pre-processing that the retina or the inner ear does. You know what? It turned out not to matter. The brain figured out how to understand it, and people can see and hear with these devices.

What’s the secret here? The thing to understand is that your eyes and your ears aren’t doing the seeing in the first place; your brain is doing the seeing, and the thing to appreciate is that it is locked in silence and darkness in the vault of your skull. All the brain ever sees are electrical signals coursing around in giant populations of neurons. That’s all the brain ever experiences. It’s not seeing the light or the dark out there, or the colors. It’s not hearing the conversation. This is all the brain is experiencing, and nothing more.

The brain is so tremendously flexible that what it’s really good at doing is saying, “Okay, I’ve got these data cables coming in, and I don’t know what information is being carried on them.” We call those data cables the optic nerve and the auditory nerve, but the brain doesn’t know what they are; all it sees is this kind of stuff. It’s really good at extracting patterns, figuring out what to do with them, and eventually, amazingly, constructing a direct perceptual experience of the outside world.

This, by the way, is the great unsolved question of neuroscience: how do you ever convert these signals into private subjective experience? In any case, the brain is so good at doing this, and so flexible, that in my next book I suggest this is Mother Nature’s great secret: build this system once, and then you can drop it into any body plan and it will figure out what to do.

What I call this is the MPH model of evolution. I don’t want to get too technical here, but by MPH, what I mean is Mr. Potato Head. The reason I call it this is that it turns out the brain doesn’t care what peripheral devices you plug in. These organs that we know and love, the eyes and ears and fingertips and so on, are plug-and-play peripheral devices. You can put anything you want into this system, and the brain will figure out how to use it. This is really stunning but true. The reason we know this is that when you look across the animal kingdom, you find lots of different peripheral devices.

For example, snakes have heat pits, those dark openings underneath the eyes; that’s how a snake detects the infrared. The black ghost knifefish that I showed you has electro-sensory organs all up and down its body, things that we don’t have; that’s what allows it to detect its umwelt.

This is an animal called the star-nosed mole. This thing in front of it is its nose, which is essentially made up of 20 fingers. The mole burrows through dark tunnels and feels the world with its 20-fingered nose. That’s a weird peripheral device to plug in, but it works just fine.

It turns out cows, as well as birds and insects, have magnetite, which allows them to orient to the earth’s magnetic field. Because they have it, they’re able to operate on that information, whereas we don’t, and so we’re unaware of how the magnetic field is oriented.

What this means is the particular plug and play devices that you come to the table with, this is what shapes your reality. This is what allows you to experience your little section of the world here. The lesson that surfaces here is that there’s nothing really special about these particular organs that we have. These are just what we’ve inherited from a long complex evolutionary history. This is what we happen to come to the table with. It’s not necessarily what we have to stick with. There’s flexibility in this system.

This is what leads to the idea, and really, I think, the proof of principle for the Mr. Potato Head model: sensory substitution. Could you actually feed sensory information into unusual sensory channels, and will the brain figure it out? This must sound crazy and speculative, but I want to point out that the first demonstration of this was published in Nature in 1969.

A scientist named Paul Bach-y-Rita took a camera and fed the video feed into a grid of solenoids on the back of a dental chair. So you sit in the chair, and whatever passes in front of the camera gets poked into your back. If you put a face in front of the camera, then you feel that pattern on your back. He sat blind people in the chair and had them experiment with this for a while, and it turned out that they got pretty good at being able to see, or at being able to report what they were seeing, with their backs. This has had a lot of incarnations since the late ’60s.
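The core of this mapping, a camera frame reduced to one drive level per solenoid, can be sketched roughly like this. The grid size and the function name are illustrative assumptions, not details of the original apparatus:

```python
# Hypothetical sketch of a Bach-y-Rita-style vision-to-touch mapping:
# a camera frame is downsampled to a coarse grid of drive levels,
# one per tactile actuator (solenoid) on the chair's back.
# The 20x20 grid and the names here are illustrative, not the real device.

def frame_to_tactile_grid(frame, rows=20, cols=20):
    """Average pixel brightness over blocks to get one 0..1 drive level per actuator."""
    h, w = len(frame), len(frame[0])
    grid = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # block of pixels feeding this actuator
            block = [frame[y][x]
                     for y in range(r * h // rows, (r + 1) * h // rows)
                     for x in range(c * w // cols, (c + 1) * w // cols)]
            grid[r][c] = sum(block) / len(block) / 255.0
    return grid

# Usage: a 40x40 "frame" with a bright square in the middle becomes a grid
# whose corresponding cells carry high drive levels.
frame = [[255 if 10 <= x < 30 and 10 <= y < 30 else 0 for x in range(40)]
         for y in range(40)]
grid = frame_to_tactile_grid(frame)
```

Whatever shape is in front of the camera, the wearer feels the same coarse pattern of pokes; the brain does the rest.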

Here, for example, is one version of it, called sonic glasses, where you convert the video stream into an auditory stream. As things get closer and farther, it goes [inaudible 00:12:54] like that. It sounds crazy at first; when you put it on, it doesn’t make any sense. After a while, you’re able to walk around and not bump into things, and so on. After a few weeks, people actually have a direct perceptual experience of essentially seeing.

Here’s another incarnation. This is the forehead version, where there’s a small grid of [tactors 00:13:15] on your forehead, and it turns the video stream into feeling on your forehead. The most modern incarnation is called the BrainPort. It’s an electro-grid that sits on your tongue. Why your tongue? Because it’s got high resolution; there are a lot of sensory receptors there. You take the video feed and convert it into what’s happening on the tongue. People come to see with the BrainPort; they actually report that it is vision.

I know that sounds crazy, but remember that all vision ever is is electrical signals coming to your brain, and it doesn’t matter the route by which they get there. People can actually rock-climb wearing the BrainPort, or they can toss a ball into a bucket. It’s a pretty amazing demonstration. I think what sensory substitution shows us is that the Mr. Potato Head theory is right: you can plug in anything you want here.

Now, one of the projects my lab is doing along these lines is that we want to use sensory substitution to solve the problem of deafness. I wanted to tell you about that project, and for that I’d like to introduce my graduate student, Scott Novich, who’s spearheading this work for his thesis.

Scott: Hi, everyone. The application that you see David running behind me is a demonstration of how we envision sound-to-touch sensory substitution working. What happens is, sound comes into the system in real time.

David: The quick brown fox.

Scott: It gets compressed and then mapped to some wearable device, such as a vest. This vest has an array of [tactors 00:14:53] on it, which could be little vibrational motors. Those vibrational motors are represented here by these black boxes, which light up according to how much they’re being stimulated.

David: Jumped over the lazy brown dog. Was that the … Yeah, fine.

Scott: Right.

David: I don’t even know of this expression.

Scott: We started on this system a while back, building all sorts of complex electronics to make it happen. Then we had this epiphany: it’s not 2008 anymore; we actually have all this technology right here in our pockets. What I have here is an actual copy of the same application that’s running, and not only that, we have …

David: In the second half of the talk I’ll do the full monty, so just wait for it, yeah.

Scott: We have the fully working prototype. It actually runs on my Android phone, and I can turn it on. What happens is, I can talk in here. Can you hear it?

David: Can you guys hear the buzzing?

Scott: Maybe. In real time it gets mapped to the vest wirelessly over Bluetooth to David.

David: Great.

Scott: That’s our vest.

David: That’s the vibratory vest. The idea is this: in the coming months we’re going to be testing this with deaf participants, and our expectation is that in the course of about two weeks they should be able to develop a direct perceptual experience of the sound in their environment. It’s the same way that, when you watch a blind person reading braille, they have a series of patterns going over their fingers and a direct perceptual experience of that, and the same way that we read a font and the meaning is immediately right there.
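The sound-to-touch mapping Scott demonstrates, audio compressed into drive levels for an array of vibration motors, might be sketched in miniature like this. The naive DFT, the eight-motor count, and the normalization are illustrative assumptions, not the prototype’s actual signal chain:

```python
# A minimal sketch of a sound-to-touch mapping: one frame of audio is
# reduced to a handful of frequency-band energies, one per vibration
# motor (tactor) on the vest. All parameters here are illustrative.
import math

def audio_to_tactors(samples, n_tactors=8):
    """Map one audio frame to normalized per-motor drive levels."""
    n = len(samples)
    half = n // 2
    # magnitude spectrum via a naive DFT (a real system would use an FFT)
    mags = []
    for k in range(half):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mags.append(math.hypot(re, im))
    # pool the spectrum into one energy per tactor
    band = half // n_tactors
    levels = [sum(mags[t * band:(t + 1) * band]) for t in range(n_tactors)]
    peak = max(levels) or 1.0
    return [lv / peak for lv in levels]  # 0..1 drive levels

# Usage: 64 samples of a pure 1 kHz tone at 8 kHz should light up
# a single band of motors.
tone = [math.sin(2 * math.pi * 1000 * i / 8000) for i in range(64)]
levels = audio_to_tactors(tone)
```

Different sounds then produce different spatial patterns of vibration on the torso, which is all the brain needs to start extracting structure.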

The idea is, if this works well with deaf people, this will really be a game changer, because we can build this for a hundred times less than a cochlear implant costs, and it doesn’t require invasive surgery. We have high hopes for where this is going. In the context of this talk, though, I don’t want to just talk about sensory substitution; what Scott and I are interested in is sensory addition. How can we use this technology to actually expand the human umwelt?

There are a lot of ideas here. Just as one example: imagine I wanted perception of something larger than a single human can normally have. Instead of just sensing what’s immediately here, imagine that we feed in real-time data from the net about weather patterns across the surrounding 200 miles. By tracking those patterns and feeling them all day while I’m walking around, unconsciously, because the brain is really good at extracting patterns without having to pay detailed attention to them, I could feel the surrounding 200 miles, predict the weather better than the weatherman can, and tell you guys what’s happening next. That’s opening up my umwelt into something that humans haven’t had experience with before.

A related idea, one that we’ve actually started moving on, is: what happens if I walk around all day and feed data from the stock market into me? Again, I don’t have to pay any attention to it, but what I’m doing is tapping into the ebbs and flows of the economy of the world, right? That’s an experience that humans simply haven’t had before. It just hasn’t been part of our umwelt.

The idea is it gives us a totally new experience to deal with here, and we’re working on this right now. Here’s a third idea that we’re working on. You guys have all heard of the spidey sense. What we want to build is the tweety sense. Imagine that we’ve been tracking every single tweet that’s gone out of this room all day with the hashtag BeingHuman2013, which we have. Imagine that as I walk around in the vest all day, I am plugged into the consciousness of a thousand people, because what I’m getting is the sentiment of a thousand people who are all experiencing something, and I’m feeling that sentiment as I’m walking around and doing other things. I know when things are ebbing and flowing there.

If we really wanted to blow this up, one of the things we’ve been thinking about a lot is that Twitter has really become the consciousness of the planet, in the sense that the ideas people care about most trend up above the noise floor and rise to the top. Imagine now that you wanted to be plugged in not just to a thousand people but wanted to expand your umwelt to the entire planet and know what’s going on at any time. You could push 500,000 tweets per second through some natural language processing, and all day you’d walk around feeling what’s going on. You’d think, “Oh, something bad just happened in Kenya.” You would know that before anybody else. That’s the idea of the tweety sense: you would become very worldly, in a sensory sense. Who wants to be plugged into the consciousness of the world? Who knows? That’s one of the things we’re experimenting with.
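In miniature, the tweety-sense pipeline is: score each incoming tweet for sentiment, pool the scores over a sliding window, and render the running value on the vest. The tiny word lists, class name, and window size below are hypothetical stand-ins for real natural language processing:

```python
# Illustrative sketch of the "tweety sense": tweets tagged with the event
# hashtag are scored for sentiment and pooled into one running value a
# vest could render. The word lists and window size are hypothetical.
from collections import deque

POSITIVE = {"great", "amazing", "love", "stunning"}
NEGATIVE = {"bad", "awful", "hate", "boring"}

def score(tweet):
    """Crude lexicon sentiment: +1 per positive word, -1 per negative word."""
    words = tweet.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

class TweetySense:
    """Keep a sliding window of recent tweet scores and expose crowd sentiment."""
    def __init__(self, window=100):
        self.scores = deque(maxlen=window)

    def feed(self, tweet):
        self.scores.append(score(tweet))

    def sentiment(self):
        return sum(self.scores) / len(self.scores) if self.scores else 0.0

# Usage: feed a few tweets; the pooled sentiment leans positive.
sense = TweetySense()
for t in ["amazing talk, love it", "this is great", "bad seats"]:
    sense.feed(t)
```

The wearer never reads the tweets; only the pooled ebb and flow reaches the skin, which is exactly the kind of slow statistical signal the talk argues the brain is good at absorbing unconsciously.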

Thank you very much Scott.

That’s the idea with sensory substitution. It turns out ours is just one of many projects happening all over the planet along these lines. One popular thing that many bio-hackers are doing is implanting neodymium magnets into their fingertips; then they suddenly become aware of the electromagnetism around them. There are at least a thousand people who’ve done this so far. Let’s say there’s a power transformer: you can feel the shape of the electromagnetic bubble around it. You’d feel the shape and the size and the strength of it, and some people even describe feeling the color of it, because of the frequency of the electromagnetism. People can do things like repair their electronics just by feeling where there’s power running through the cables, without having to break out their multimeter. That’s one way of expanding the human umwelt.

This, in the lower left, is Neil Harbisson, who was born color blind, so he set up what he calls the eyeborg: a color camera that converts different wavelengths of color into particular sounds that he experiences through bone conduction. Now he goes around the world and really enjoys and experiences color. This is Moon Ribas, who attaches what she calls the speed [borg 00:21:07] to her earlobes so she can do motion detection of everything around her; she can stand in a crowd and really feel what’s happening with the crowd.

Really, there is no limit that we can think of to the number of sensory expansions we can do: 360-degree vision, or infrared, or ultraviolet. Wouldn’t you like to experience that? And not just expansions but additions, things that are absolutely new, that humans have never experienced before. The bottom line is that the human umwelt is on the move. We are now in a position, as we move into the future, of getting to choose our own plug-and-play devices. We’re no longer a natural species, in the sense that we don’t have to wait for Mother Nature’s sensory gifts on her timescales. But nature, just like any good parent, has given us the tools we need to go out and construct our own experiences.

Source URL: https://explorable.com/ideas/future-of-sensors-and-sensing
