BrainPort V100

A wonderful day to you, dear reader!


You come to this page seeing a fancy, science-fiction-looking title, and you probably wonder what on earth it could be. Well, it is one of the sensory substitutions we haven’t discussed yet: it replaces vision by sensing with your tongue. That is not quite the same as replacing it by taste, but I’ll explain that later. We’ve seen multiple applications that substitute vision with touch (albeit most often on the braille side), and some that substitute it with hearing (we’re even writing our own thesis about that), but this one might seem a little less likely to give you the same results.

And partially, that is correct. The BrainPort V100 device is meant as a vision “assistant”: it helps a visually impaired person gain some information about the shapes of objects in front of them and helps with orientation, but it does not have the accuracy of some of the technologies previously discussed on this blog.

The BrainPort V100

How does it work? Not unlike our own thesis project, the device uses a camera mounted on a pair of goggles. The data the camera reads in is processed, and the shapes of the recorded objects are sent to the output. And the output is what really makes this special: a 3×3 cm array of 400 electrodes that are individually pulsed according to the recorded image. The array is connected to the goggles via a wire and is meant to be placed on the tongue of the user. The user can – though not without practice – learn to recognize the signals and use them to gain information about their surroundings. It’s no major inconvenience either: the signal is perceived as a fizzy feeling on the tongue, and, as user feedback suggests, can be quite pleasant.
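To make that pipeline a little more concrete, here is a minimal sketch of the kind of processing involved, assuming a simple grayscale-downsampling scheme. The actual BrainPort software is proprietary, so the function below and its parameters are purely illustrative:

```python
import numpy as np

def frame_to_electrode_pattern(frame, grid_size=20, levels=4):
    """Downsample a grayscale camera frame to a grid_size x grid_size
    array of pulse intensities, one per electrode (20 x 20 = 400).

    frame: 2D numpy array of pixel brightness values (0-255).
    Returns an array of integer stimulation levels (0 = off).
    """
    h, w = frame.shape
    # Crop to a centered square so the aspect ratio matches the array
    side = min(h, w)
    y0, x0 = (h - side) // 2, (w - side) // 2
    square = frame[y0:y0 + side, x0:x0 + side]

    # Average pixel blocks down to one value per electrode
    block = side // grid_size
    trimmed = square[:block * grid_size, :block * grid_size]
    blocks = trimmed.reshape(grid_size, block, grid_size, block)
    means = blocks.mean(axis=(1, 3))

    # Quantize brightness into a few discrete pulse-intensity levels:
    # brighter regions pulse more strongly on the tongue
    return np.round(means / 255 * (levels - 1)).astype(int)
```

Each value in the returned grid would then set the pulse strength of the corresponding electrode on the tongue display.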

The electrode array

As is by definition the case with sensory substitution, gaining vision is traded off against the (temporary) loss of another sense in a certain location. Indeed, it will be difficult at best to taste anything while using the device, and, perhaps more importantly, speech and vision are also mutually exclusive in this solution. This is of course a trade-off the users themselves have to make, and as discussed before, just having the option to use either one is a big step forward already. The device also makes it easy to switch between using it and not using it, which is an advantage in that regard.

And speaking of options, the choice of which assistive device to use is also a decision users can make for themselves, depending on their own personal experience. And there are several quite different options available. Just scroll through our previous posts :).


Koenraad


More info? Go to the site of the producer, read some articles in the media about it, or even have a look at the manual of the device.

Tactile surfaces

Hello my dear reading audience!

We’ve written a post about a braille smartphone before, but there are other types of tactile surfaces, not specifically targeted at visually impaired people, that follow the same reasoning: being able to interact with a device without having to see the screen. The one I’d like to talk about today comes from a rather uncommon source: the Disney Research labs.

Among many projects, they have done research to develop what they call the “TeslaTouch” surface. The goal is to have a tangible touchscreen, not in the sense of having actual buttons pop out of it, like other projects have looked into before, but in the sense of being able to feel different types of surfaces depending on what is on the screen. The technology is based on applying an oscillating electric field to the screen, which changes how much friction you feel when you slide your finger across it. The technical details are written down in their 2010 research paper on the topic.
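As a rough sketch of the principle (not the paper’s actual implementation; the parameter values below are made up for illustration), the drive signal is simply a periodic voltage whose amplitude and frequency determine how the surface feels:

```python
import numpy as np

def electrovibration_signal(freq_hz, amplitude_v, duration_s, sample_rate=44100):
    """Generate a periodic drive voltage for an electrovibration surface.

    The perceived friction roughly tracks the applied voltage: higher
    amplitude feels 'stickier', lower frequency feels 'rougher'.
    All values here are illustrative, not taken from the paper.
    """
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    return amplitude_v * np.sin(2 * np.pi * freq_hz * t)

# Two hypothetical 'textures': a smooth one and a rough one
smooth = electrovibration_signal(freq_hz=400, amplitude_v=40, duration_s=0.1)
rough = electrovibration_signal(freq_hz=80, amplitude_v=100, duration_s=0.1)
```

Varying the frequency changes the perceived roughness, while varying the amplitude changes how strongly the finger drags on the surface.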

The most obvious way to use this type of technology for blind people would be to let them sense when their finger passes over an icon, and to give them a special type of sensation depending on what the icon is. It is not accurate enough to display braille, since you can only create a single electric field across the entire surface, depending on where touch is registered (which also means you get a single friction signal for all fingers you use), but it can give some sense of feedback from the visual contents of the screen through touch. Couple this with a screen reader, and the device gains a lot of extra functionality for the visually impaired.
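Such icon feedback could look something like the following sketch, where each kind of on-screen element is mapped to its own texture parameters. The element names and values are hypothetical:

```python
# Hypothetical mapping from on-screen elements to tactile 'textures'.
TEXTURES = {
    "app_icon":   {"freq_hz": 120, "amplitude_v": 80},
    "text_field": {"freq_hz": 300, "amplitude_v": 40},
    "background": {"freq_hz": 0,   "amplitude_v": 0},   # no sensation
}

def haptic_params_for_touch(x, y, layout):
    """Return the texture parameters for the UI element under the finger.

    layout: list of (x0, y0, x1, y1, element_kind) rectangles.
    Note: because the whole surface carries a single field, the same
    signal is applied no matter how many fingers are touching.
    """
    for x0, y0, x1, y1, kind in layout:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return TEXTURES[kind]
    return TEXTURES["background"]
```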

Also, it is not entirely unthinkable that the accuracy could be improved, since modern capacitive touchscreens already use multiple electrodes. A similar setup might be able to generate multiple local electric fields, which might even enable braille. But that is speculation, of course.
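To illustrate that speculation: with independently drivable zones, rendering a braille cell would amount to raising the field amplitude only over the zones corresponding to raised dots, something like this purely hypothetical sketch:

```python
# Purely speculative: assumes a surface with a 3x2 grid of independently
# drivable field zones per braille cell, which no current product offers.
BRAILLE = {
    "a": [(0, 0)],              # dot 1
    "b": [(0, 0), (1, 0)],      # dots 1-2
    "c": [(0, 0), (0, 1)],      # dots 1-4
}

def braille_zone_amplitudes(letter, on_v=100, off_v=0):
    """Return a 3x2 grid of drive amplitudes for one braille cell."""
    grid = [[off_v, off_v] for _ in range(3)]
    for row, col in BRAILLE[letter]:
        grid[row][col] = on_v
    return grid
```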

Surface using TeslaTouch technology

The big advantage of this type of screen over deformable ones (where buttons pop out, like the braille smartphone) is that there is no mechanical movement involved, which makes it much better in terms of maintenance, developer-friendliness, and power consumption.

Screens like this could be built into regular tablets and smartphones, so you’d have a single device that can be used by both visually impaired and sighted people, which could once more decrease the stigmatization a little. It offers functionality for both, and it can be adapted to react differently depending on the user: people with full sight need less feedback, but can for example still use it to simulate the feel of a keyboard, while visually impaired people can use the functionality in a broader way.

If you can think of other applications for this type of screen, or if you have a preference between this and the formable screens, let us know!

Koenraad