February 13, 2013

Terminator Vision: I Can Haz It?

You've all seen The Terminator film and its sequels and, admit it, you loved them. Not just because of the creepy futuristic storyline but because of the stunts, the camerawork, the casting, and the sheer action of it all. And, of course, the special effects. As an example of the best sci-fi films out there, the Terminator film franchise has grossed nearly $1.5 billion worldwide. Some of the iconic scenes in the movies relate directly to the Terminator itself: that ice-cold stare as a mistaken victim was brutally gunned down in pursuit of the target. But what was it about that scary stare? Surely it was the gleaming infrared light in the robotic eyeball, shielded most of the time by the humanlike exterior. That infrared light enabled the Terminator to view its surroundings:

Composite image. Credits: odysseyart.net and Orion Pictures
In the emerging field of neuroprosthetics, the best-known examples of the technology are cochlear implants for the deaf and retinal implants for the blind. Generally speaking, they work by receiving auditory or visual signals, transforming them into electrical impulses, and transmitting them to the relevant brain areas. Obviously, these tools are extremely useful in restoring hearing or vision to those who weren't born with these senses or who have lost them through injury. In some cases, depending on the nature and extent of the absence or injury, it may be necessary to augment the functions rather than restore them fully.

Now a new study by researchers at Duke University suggests that 'Terminator Vision' could one day be a reality for some, after successful experiments on rats found increased learning and perception skills when prosthetics were fitted into their brains. Eric Thomson, Rafael Carra and Miguel Nicolelis trained a cohort of six rats on a simple visual discrimination task: rats were placed in a circular chamber with three reward ports. On each trial, a visible LED was activated in a particular port, and rats who poked their noses into the correct port were rewarded with a drink of water. After three weeks of training, the rats averaged around 70% correct. They were then fitted with a head-mounted infrared detector as well as implants in the whisker region of the S1 cortex, a touch-sensitive area of the parietal lobe that processes the whisker input rats rely on heavily for spatial navigation.
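To make the structure of the task explicit, here is a toy Python sketch of a single trial: one of three ports lights up, the simulated rat pokes a port, and only a correct poke earns the water reward. The "rat" here is just a biased random chooser, and the 70% figure simply mirrors the reported average; none of this is the authors' code.

import random

PORTS = [0, 1, 2]

def run_trial(p_correct=0.70):
    """One trial: light up a random port and let the simulated rat respond."""
    lit_port = random.choice(PORTS)
    if random.random() < p_correct:
        chosen = lit_port                                    # rat follows the LED cue
    else:
        chosen = random.choice([p for p in PORTS if p != lit_port])
    return chosen == lit_port                                # water only for a correct poke

trials = 1000
hits = sum(run_trial() for _ in range(trials))
print("accuracy over %d trials: %.1f%%" % (trials, 100.0 * hits / trials))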

Bearing in mind that rats are normally blind to infrared light (as are we), the rats were put back into the chamber to see whether they could perform the task as well as before. As for how it works: when a rat oriented towards the infrared light, the detector triggered electrical impulses delivered directly into its S1 cortex, and the stimulation increased as the rat moved closer to the light or turned its head in its direction. And here's where it gets interesting: not only did the rats come to perform the task as well as before, finding the infrared lights with great accuracy, but other interesting behaviour was noticed too. Namely, "they learned to actively forage through the behaviour chamber, sweeping the IR sensor on their heads back and forth to sample their IR world".
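To picture that coupling, here is a minimal Python sketch of how a head-mounted IR reading might be mapped onto a microstimulation rate that rises as the rat nears, or turns towards, the source. The sensor model, threshold and frequency range are illustrative assumptions, not values from the paper.

import math

def ir_intensity(distance_m, head_angle_deg):
    """Toy sensor model: the signal grows as the rat gets closer and fades
    as its head turns away from the source (0 degrees = facing it)."""
    falloff = 1.0 / (1.0 + distance_m ** 2)                  # inverse-square-like decay
    directionality = max(0.0, math.cos(math.radians(head_angle_deg)))
    return falloff * directionality                          # normalised to 0..1

def stim_frequency_hz(intensity, threshold=0.05, min_hz=20.0, max_hz=350.0):
    """Map normalised IR intensity onto a pulse rate delivered to S1:
    silent below threshold, then scaling up as the signal strengthens."""
    if intensity < threshold:
        return 0.0
    scaled = (intensity - threshold) / (1.0 - threshold)
    return min_hz + scaled * (max_hz - min_hz)

# Example: the rat closes in on the source while facing it head-on.
for d in (1.0, 0.5, 0.2, 0.05):
    f = stim_frequency_hz(ir_intensity(d, head_angle_deg=0.0))
    print("distance %.2f m -> stimulation %.1f Hz" % (d, f))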

Read that again: they learned relatively quickly to incorporate the new stimulation into their normal sensory range as a type of "IR vision". And they did this by taking the time to re-orient themselves and make sense of their surroundings. They didn't immediately associate the new stimulation with the task, but initially treated it as just "something new", scratching their faces in response to the electrical microstimulation. Isn't that awesome?!

One possible criticism of this study is a 'training effect': the rats may have had an idea of what to do in the experimental condition because of their previous training with the visible LED. But this is countered by the fact that the rats learned to navigate by normally invisible infrared light purely through their own movement and orientation, not to mention the additional layers of difficulty added to the original task, which the rats still aced (above 93% correct) in the IR condition.

In conclusion, the researchers felt that the rats learned to treat the microstimulation as an external stimulus originating in the surrounding environment rather than within their own bodies, an interesting finding that mirrors how we understand vision in humans. Even though our eyeballs appear to look 'out', vision occurs by light entering 'into' the eye. So even though the rats' brains were being stimulated in (correct) response to invisible infrared light, they appeared to act as if the light was shining at them in order to attract them. It was beyond the scope of this study, however, to determine whether the rats experienced the microstimulation as a separate sense, although the researchers suggest that a potential application of this technology could be in developing motor neuroprostheses - prosthetic limbs with improved reaction times and accuracy thanks to the closed-loop, bidirectional interaction that the technology can offer.

And of course, this opens up a plethora of possibilities for sensory augmentation: the potential to expand our sensory range to perceive forms of light that are normally invisible to human eyes. So it is entirely possible for Terminator Vision to emerge one day.

UPDATE Feb 13th: Coverage of this paper at Scientific American has fallen into the trap of describing the augmented vision as a "sixth sense", oddly proclaiming it as a seventh sense too. Minor but amusing errors.

UPDATE Feb 14th: Wow, now BBC coverage has also fallen into the "sixth sense" trap. Makes you wonder who has actually read the paper.

---------------------------------------------------------------------------------------------------------
Thomson, E.E., Carra, R. & Nicolelis, M.A.L. (2013). Perceiving invisible light through a somatosensory cortical prosthesis. Nature Communications, 4, 1482.
