Tomorrow’s bionic eyes will have ‘Predator’ vision

https://www.engadget.com/2019/08/09/Second-sight-orion-bionic-eyes-predator-vision/

Whether through illness or injury, 36 million people suffer from blindness worldwide, and until just a decade ago those afflicted had little chance of regaining their sight. In 2009, doctors at the University of Manchester implanted the first Argus II bionic eye in a patient. Now, 10 years later, the makers of the Argus II are trialing a more capable artificial-vision system — one that’s implanted directly into the patient’s brain.

It’s called the Orion Visual Cortical Prosthesis System, and it’s been developed by Second Sight Medical Products. Like the Argus II before it, the Orion system consists of a small camera mounted on a pair of glasses to capture images, a video-processing unit to convert what the camera sees into electrical impulses the wearer can interpret, and an implant that stimulates the user’s brain to create a perceived image. However, unlike the Argus II, whose implant attached to the patient’s retina, the Orion’s implant sits directly on the brain itself.
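
To make that data flow concrete, here is a minimal, purely illustrative sketch of the camera-to-implant pipeline described above. The frame size, the simple brightness-threshold encoding, and all of the function names are assumptions for illustration; only the 60-electrode count comes from the article, and Second Sight’s actual processing is not public here.

```python
# Illustrative sketch of the glasses camera -> video processing unit -> implant flow.
# The encoding below is a deliberately crude stand-in, not Second Sight's method.

from typing import List

ELECTRODES = 60  # the Orion array described in the article has 60 electrodes

def capture_frame(width: int = 12, height: int = 5) -> List[List[float]]:
    """Stand-in for the glasses-mounted camera: returns a grayscale frame in [0, 1]."""
    return [[(x * y) % 7 / 6.0 for x in range(width)] for y in range(height)]

def encode_frame(frame: List[List[float]], threshold: float = 0.5) -> List[bool]:
    """Stand-in for the video processing unit: reduces the frame to one
    on/off decision per electrode."""
    flat = [px for row in frame for px in row]
    assert len(flat) == ELECTRODES
    return [px >= threshold for px in flat]

def stimulate(pattern: List[bool]) -> None:
    """Stand-in for the wireless link to the implant: prints which electrodes
    would be pulsed to evoke phosphenes."""
    active = [i for i, on in enumerate(pattern) if on]
    print(f"pulsing {len(active)} of {ELECTRODES} electrodes: {active}")

if __name__ == "__main__":
    stimulate(encode_frame(capture_frame()))
```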

This implant is installed via a small craniotomy in the back of the patient’s head, above the occipital lobe. "They put the electrode array in there between the two halves of the brain against the visual cortex," Second Sight CEO Will McGuire told Engadget. "Then they basically screw the electronics package into the skull, just next to the craniotomy." This electronics package contains a small transmission coil that wirelessly receives data and power from the system’s external parts.

The installation process requires an overnight hospital stay followed by a three- to four-week recovery period before the unit is turned on. At that point the user is fitted with glasses, the various components are connected and "really what you’re hoping to get then is for them to start seeing spots of light, phosphenes, from some of the electronics," McGuire said. "But then there’s quite a bit of work that has to happen."

Those phosphenes are the result of the implant’s 60-electrode array electrically stimulating the visual cortex, and each one needs to be individually tuned to provide the most distinct and discernible spot of light possible. This process takes weeks to months of adjustments to perfect. The next step is establishing a spatial map, ensuring that each electrode is energizing the correct spot on the patient’s brain. This involves having the patient repeatedly tap a specific spot on the surface of a tablet when, say, electrode 32 is energized.
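
A toy version of that spatial-mapping step might look like the sketch below: energize one electrode at a time, record where on the tablet the patient taps, and average the repeated taps into an estimated phosphene location. The data structures and the simple averaging are my assumptions, not Second Sight’s published procedure; the electrode-32 example comes from the text.

```python
# Sketch of building a spatial map from repeated tablet taps per electrode.

from statistics import mean
from typing import Dict, List, Tuple

Tap = Tuple[float, float]  # (x, y) position on the tablet, in arbitrary units

def build_spatial_map(taps_per_electrode: Dict[int, List[Tap]]) -> Dict[int, Tap]:
    """Average each electrode's repeated taps into one estimated phosphene position."""
    spatial_map = {}
    for electrode, taps in taps_per_electrode.items():
        xs, ys = zip(*taps)
        spatial_map[electrode] = (mean(xs), mean(ys))
    return spatial_map

# Example: three repetitions for electrode 32 and two for electrode 7 (made-up data).
taps = {
    32: [(0.61, 0.40), (0.58, 0.44), (0.63, 0.41)],
    7:  [(0.20, 0.75), (0.22, 0.71)],
}
print(build_spatial_map(taps))
```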

"It’s done over and over for each electrode — we really have to train them not to move their eyes, which is the natural response when you see light," Nik Talbot, Second Sight’s senior director, implant and R&D, explained. "As they move their eyes, the brain is expecting to see something different, where in fact, they’re not going to see anything different because they’re taking in everything through the camera. So they have to be taught to keep their eyes looking forward, the same as the camera."

Once the mapping is complete and confirmed accurate, that data is fed into an algorithm that "can be used to convert video into stimulation parameters to replicate what the camera is seeing," he continued. Then it’s a small matter of spending a few more months getting used to the system and learning how to use it most efficiently.
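
The article doesn’t describe how that conversion algorithm works, but a minimal sketch, assuming each electrode simply samples the camera frame at its mapped location and fires when that spot is bright, could look like this. The sampling scheme and threshold are illustrative assumptions only.

```python
# Toy conversion of a camera frame into per-electrode stimulation decisions,
# using the kind of spatial map built during calibration.

from typing import Dict, List, Tuple

def frame_to_stimulation(
    frame: List[List[float]],                      # grayscale frame, values in [0, 1]
    spatial_map: Dict[int, Tuple[float, float]],   # electrode -> (x, y) in [0, 1]^2
    threshold: float = 0.5,
) -> Dict[int, bool]:
    """Decide, for each mapped electrode, whether its patch of the frame is bright."""
    h, w = len(frame), len(frame[0])
    decisions = {}
    for electrode, (x, y) in spatial_map.items():
        col = min(int(x * w), w - 1)
        row = min(int(y * h), h - 1)
        decisions[electrode] = frame[row][col] >= threshold
    return decisions

frame = [[0.1, 0.9, 0.2], [0.8, 0.3, 0.7]]
spatial_map = {32: (0.6, 0.4), 7: (0.2, 0.75)}
print(frame_to_stimulation(frame, spatial_map))
```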

The Orion is undergoing an Early Feasibility Study at UCLA Medical Center and the Baylor College of Medicine in Houston to ensure that the technology is safe for larger trials. Six patients, five men and one woman, were outfitted with the prosthesis in January 2018. Each of them is completely bilaterally blind. Thirteen months after the implants were installed, only one patient had reported a serious adverse effect: a seizure.

"Overall, for that number of subjects at that point, we feel that’s very good and very safe," McGuire remarked. "And I think the physician community would agree with it."

However, the road to FDA approval is a long one, despite the Orion being part of the agency’s Breakthrough Device Program. "The FDA gives that designation to technologies that are meeting a significant unmet clinical need," Talbot explained. "So there’s no other option out there, whether it be a therapy or whether it be some sort of diagnostic tool." Being the first and only implantable artificial-vision system is certainly enough to qualify. The designation also provides the research team with more direct, high-priority interactions with the FDA as it seeks a path toward approval. McGuire hopes to have an agreement finalized with the FDA in the second half of this year. The team can’t yet disclose when it expects the devices to make it to market, however.

What’s most exciting is where the Second Sight team plans to take this technology next. One avenue is packing more electrodes into the array to improve image fidelity, expanding the count from the current 60 to between 150 and 200 channels.

"We think we can make some significant improvements just on the software side," McGuire said. "And then there’s other technologies that are being developed out there that we’re not necessarily developing but we think to play a key role with artificial divisions. And we’ve got partners who are working on some of these right now."

For example, one feature the team is looking into is distance filtering. Because the image input comes from a single camera, the patient has no depth perception. "If we had two cameras perhaps we could give them the option of only seeing objects that are within 10 feet or objects that are greater than 10 feet," Talbot said. "What that would do would clear up the image for them. Right now, they’re picking up things that are near and far. That can distort what they’re seeing and make it more difficult to interpret."
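
As a rough illustration of that idea, the sketch below blanks out everything beyond a cutoff, given a per-pixel depth estimate such as two cameras could provide via stereo. The 10-foot cutoff comes from the quote; the data layout and everything else is assumed.

```python
# Sketch of distance filtering: keep only pixels whose estimated depth is
# within the cutoff, so the encoder only "sees" nearby objects.

from typing import List

def filter_by_distance(
    brightness: List[List[float]],  # grayscale frame, values in [0, 1]
    depth_feet: List[List[float]],  # estimated distance for each pixel, in feet
    max_feet: float = 10.0,
) -> List[List[float]]:
    """Zero out pixels farther than max_feet, keeping only nearby objects."""
    return [
        [b if d <= max_feet else 0.0 for b, d in zip(b_row, d_row)]
        for b_row, d_row in zip(brightness, depth_feet)
    ]

bright = [[0.9, 0.8], [0.7, 0.6]]
depth = [[4.0, 25.0], [8.0, 40.0]]
print(filter_by_distance(bright, depth))  # far pixels (25 ft, 40 ft) are suppressed
```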

The team is also investigating face- and object-recognition features. Because the image that the user sees is still decidedly low-fidelity, incorporating these technologies would enable the system to assist its wearer beyond stimulating their brain. "They could have this object-recognition software tell them in their ear, 'iPhone' or 'coffee cup,'" when the item is in the camera’s field of view, McGuire said.
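
In outline, that add-on could be as simple as the following sketch: some recognizer returns labels for the current frame, and any label on a list of useful objects is spoken into the wearer’s ear. The detector and speech functions here are hypothetical placeholders, not real APIs or anything Second Sight has announced.

```python
# Toy illustration of announcing recognized objects by audio.

from typing import List

ANNOUNCE = {"iPhone", "coffee cup", "door", "chair"}  # assumed list of useful objects

def detect_objects(frame) -> List[str]:
    """Placeholder for an off-the-shelf object-recognition model."""
    return ["coffee cup", "table"]

def speak(text: str) -> None:
    """Placeholder for audio output to the wearer's earpiece."""
    print(f"(spoken) {text}")

for label in detect_objects(frame=None):
    if label in ANNOUNCE:
        speak(label)
```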

Then there’s the Predator vision. Second Sight is looking into integrating a thermal camera, which would enable the user to see in infrared, into its system. "It would be good for them to have that as kind of a mode perhaps, in which they could switch to thermal imaging," McGuire said. "And they can identify where people are in the room, day or night, more easily. They could maybe identify the hot part of a stove or cup of coffee, things like that."

We’re still years away from having the technology behind Geordi La Forge’s visor, but enabling the visually impaired to hunt an elite team of commandos through the South American rainforest is a pretty solid tradeoff.

via Engadget http://www.engadget.com

August 9, 2019 at 11:06AM
