On Tech & Vision Podcast

The Development of Artificial Vision

On Tech & Vision Podcast with Dr. Cal Roberts

Today’s big idea is all about the cutting edge advancements in ocular bionic prosthetics. The Argus II is a device that uses a camera and a chip to stimulate the retina and send signals to the brain. Our guest, Dr. Mark Humayun, developer of Argus II, speaks with Dr. Roberts about the development of this device.

Podcast Transcription

Mullins


Up at MIT where the National Disabled Sports Championships were being held, there were people running with carbon fiber, and woven carbon fiber is an energy storing material.

Roberts

Aimee Mullins is an actor, an athlete and a public speaker.

Mullins

I still had these legs that were like a wood and plastic composite, and rubberized foot.  There was nothing high tech going on with this.

Roberts

She was born with fibular hemimelia.  No fibula bones in her legs.  When she was one year old, both her legs were amputated below the knee.  She learned to walk on the same clunky prosthetics, the wood and plastic composites that people had been using for decades.

Mullins

What a prosthetic is actually defined as by insurance companies is “the thing that can get you from the bed to the toilet.”  As far as aesthetics go, they have a basic shin shape and a foot that generally doesn’t even have any kind of toes.  There’s just sort of a rounded, rubberized thing that’s unisex.  I was a kid that wanted more.  It just drove me crazy that my Barbie doll had a bendy foot and you could bend her knees a little bit.  You could see it in movies.  You could see in RoboCop and The Terminator that there were articulated ankles in the Hollywood prosthetic makeup sphere.

Roberts

Since Aimee was a child, huge developments have been made in prosthetics.  Not just for limbs, but for many neural or bionic prostheses.  Consider the cochlear implant.  Today, prosthetic users are closer to that Hollywood ideal that Aimee saw on TV.

Mullins

There was so much opportunity for increased functionality and also beauty that just wasn’t being spoken to.

Roberts

Aimee didn’t just win that race at MIT, she broke the national record.  She parlayed her win into a spot on the Division I track team at Georgetown, competing against runners with two biological legs, and beyond that into a Paralympic career.  But with that win, she also developed a relationship with Van Phillips, the developer of the Flex-Foot Cheetah, a carbon fiber sprinting leg that you may have seen on athletes like Oscar Pistorius.

Mullins

I was the very first user.  I was the guinea pig on them.  I have been so lucky my whole adult life to have that collaborative experience with everyone who has ever built legs for me.  I put big daydreams in front of the scientists and engineers and the prosthetists of what I would love to be able to do.

Roberts

The Flex-Foot Cheetah legs required reimagining the human leg, really reconsidering what a leg is for.  We asked Aimee about bionic prosthetics, prosthetics that replace a missing body part with something electronically or mechanically powered.  Perhaps something closer to what she saw as a child in the Terminator movies.  While Aimee doesn’t have a bionic prosthetic herself, she was able to describe them for us.

Mullins

They would implant a little chip, basically, in the bottom of your residual limb, and theoretically it will talk to your ankle.  So I would think about flexing my ankle and it would do that.  But those things are still very much in the very early stages, and the successes with them tend to be in upper extremity prosthetics because those aren’t weight-bearing.  It’s definitely on the horizon.

Roberts

This is On Tech & Vision with Dr. Cal Roberts.  I’m Dr. Cal Roberts, and today’s big idea goes beyond the cutting-edge advancements in prosthetics that have benefited Aimee and others like her, to ocular bionic prosthetics.  Like Aimee’s Flex-Foot Cheetah carbon fiber leg, these bionic advances also require a collaboration between developers and early adopters.

On that note, meet Barbara.

Campbell

I first noticed that I was losing my vision in sixth grade, so I was about 12.

Roberts

Barbara Campbell has retinitis pigmentosa, a genetic condition that involves the breakdown of cells in the retina.

Campbell

I was having difficulty completing the standardized test where I had to fill in the little bubbles with the Number Two pencil.  That was the first time it was noticed.  

Roberts

Like Aimee, Barbara chose to be one of the first users of a new assistive device.  In 2013 she received a radical new technology called the Argus II, an artificial retina surgically implanted into the eye that helps users with retinitis pigmentosa to see light.

Campbell

The most exciting thing, I think, was they took letters on the computer screen – they were fairly large, I’d say maybe about 10 or 12 inches.  But that was the first time I was able to see letters since probably about 1993.  The most emotional thing to me was being able to see letters again.  That was such a huge, emotional experience.  I don’t even know how to put it in words.

Roberts

Barbara underwent an extensive training program that helped her make sense of the new visual stimuli she was seeing.  It’s amazing to say seeing because Barbara spent so many years not seeing.  Did Barbara feel she was seeing?

Campbell

Totally.  It made a big difference.  That’s one of the first things I noticed that during the day I would be able to see the crosswalk lines out on the street.  The other very important thing, where I take the bus to go to work, it doesn’t have a bus shelter.  It just had a pole with a bus sign.  I was able to very easily identify the pole for the bus stop.  That was really good.  That was really very good.

Roberts

Our guest today is the developer of the Argus II retinal implant, sometimes referred to as the bionic eye, that Barbara used.  Dr. Mark Humayun is the Director of the University of Southern California Ginsberg Institute for Biomedical Therapeutics and Co-Director of the USC Roski Eye Institute.  You can find his full bio in the show notes.  I asked him to describe the Argus II and how it works.

Humayun

The device has two components.  A wearable component and an implanted component.  The wearable component consists of glasses with a tiny camera in them which captures the image and sends it to a video processing unit which is worn in a pocket or on a belt.  The video processing unit also has a battery, so a power source.  This wearable component is rechargeable, and this is what captures the images and sends it into the implant.  So both power and data are sent in wirelessly.

The implanted part is implanted during an outpatient surgical procedure that lasts anywhere between two to three hours.  The implanted part consists of an antenna which receives the information, then an electronic chip that processes it, and then sends it through a very delicate electrode array, much like Saran Wrap, that stimulates the remaining cells of the retina in the eye.

You can think of it as this.  It wirelessly connects you, the blind person, to the camera, and jump-starts the otherwise blind eye and sends the information to the brain.
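
[Editor’s note: to make the signal path concrete, here is a minimal, purely illustrative sketch of the pipeline described above: a camera frame is reduced to the resolution of the electrode array, one averaged value per electrode.  The frame size, grid size, and function names are our own assumptions, not Argus II firmware.]

```python
# Illustrative sketch only -- the real Argus II processing is proprietary.
# Models the step from camera frame to per-electrode stimulation values:
# the frame is average-pooled down to the electrode grid.

def downsample(frame, rows, cols):
    """Average-pool a grayscale frame (list of lists, 0-255) to rows x cols."""
    h, w = len(frame), len(frame[0])
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # pixel block covered by this electrode
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            block = [frame[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            row.append(sum(block) // len(block))
        grid.append(row)
    return grid

# The original Argus II array had 60 electrodes; a 6 x 10 grid is assumed here.
frame = [[y * 4 for x in range(40)] for y in range(60)]  # fake 60x40 camera frame
stim = downsample(frame, 6, 10)
```

Each of the 60 values in `stim` would then be sent wirelessly to drive one electrode.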

Roberts

You and I have known each other for a very, very long time.  But for those who don’t know you as well as I do, you told me that seeing your grandmother go blind motivated you to go into ophthalmology.  So, tell me about her and tell me about that story.

Humayun

At that time it was very devastating.  We were very, very close.  When I was in medical school she started to lose her vision.  We couldn’t figure out what it was.  She did get laser, but because of her progressive diabetic retinopathy, she ended up going blind.  It really made me focus on ophthalmology and in particular retinas.

Roberts

You weren’t intending on ophthalmology when you went to medical school.

Humayun

I was very interested in the nervous system.  Neurons, how they fire, how they work together.  I was going to be a neurosurgeon.  I had a pretty good start, but it was really seeing what happened to my grandmother and her retinopathy that led to her blindness that made me change my course.

Roberts

Mark became an ophthalmologist, but he couldn’t get the idea of an artificial retina out of his mind.  So, while completing his residency and raising a family, he earned a quick PhD in engineering at UNC Chapel Hill.

So, now being an engineer, let’s talk about computers and let’s talk about how the visual system is similar to a computer.  When I think about it, the visual system is, in many ways, analogous to a computer in that it requires hardware and software.  The hardware includes the eye, the retina, the optic nerve and the visual cortex of the brain.  The software, we’ve come to learn, actually exists throughout the brain to convert these signals into what we describe as sight.

When you were first conceiving this idea for artificial vision, what hardware elements of the visual system did you want to change or replace?

Humayun

Well, Cal, you’ve really hit on a key point that took me many years to really realize and understand.  If you wind the clock back to ’87, the late 80s when we were doing this work, there was a lot of understanding about microelectronics, but how to implant them in the eye was certainly very novel.

I didn’t know what part I wanted to replace.  I knew we had to definitely replace the photoreceptors because light wasn’t being transduced.  But, how much more of the retina we had to replace was an enigma.  It was one of these things – let’s just build it.  Let’s try to stimulate the ganglion cells which is the last outpost of the retina going towards the brain and see where we get.

Cochlear implants had taught us that if you even put somewhat of a rudimentary signal into the ear that the brain can start to use it.  We were really hoping that the brain would help us, but we clearly knew that there was a huge part of software that we still had to do.

Roberts

Were you looking for patients like your grandmother who had diabetic retinopathy?  Were you looking for people with hereditary diseases?  Macular degeneration?  Glaucoma?  Was there a group of patients that you were targeting?

Humayun

You know, we thought that if you’re going to input into the retina you certainly need the optic nerve.  The question becomes how much of the optic nerve you need.  Obviously, if there’s zero optic nerve then it doesn’t make sense to put something in the eye.  It won’t be able to transmit to the brain.  But what level of optic nerve function did you need?

So, we just thought, let’s start with those conditions that have no optic neuropathy.  So, for example, let’s not start with advanced glaucoma or an acute ischemic optic neuropathy or any of those.  Let’s start with a retinal condition, primarily photoreceptor loss.  When you look at it that way, you’re faced with the inherited retinal conditions or the age-related macular degenerations.

Given all these things, we thought let’s go with patients who are blind from photoreceptor loss primarily, and those who have no light perception or barely light perception and see if we can make a difference there.

Roberts

When we talk about photoreceptors we know there are different types with different spectral sensitivities.  Then there are circuits in the retina that send color and others that specialize in luminance contrast.  Mark, how do you stimulate color vision versus spatial contrast vision?  Can you stimulate one versus the other?  How does this all work?

Humayun

You’re absolutely right.  Our electrodes are very big.  They don’t stimulate single cells.  They stimulate groups of cells, and in fact, they very well could have stimulated on and off cells.  And I’m sure they do to some extent.  But I was told when we first got going, look, you’re playing piano with boxing gloves.  You’re not going to be able to access each one of those photoreceptors, or in the analogy, each one of those keys on the piano.

So, we knew we were going to be stimulating groups of neurons, and we were able to do it.  The electronic system stimulates groups of neurons into visual perceptions.  Initially we just got black and white, and I was very worried that if you just get black and white it would be very limiting, because a lot of the depth cues and environmental cues we get come from shades of grey.

So, we were lucky and so far we have been able to get up to ten grey scales by changing the stimulation software algorithms, and that’s good.  Not every patient gets ten grey scales, but we can get them and that’s very important for orientation and mobility.
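
[Editor’s note: the ten grey scales Dr. Humayun mentions can be pictured as a quantization step.  This is a hedged sketch of the idea only; how those levels map to actual stimulation pulse parameters is not public, and the constant and function names here are invented.]

```python
# Illustrative only: mapping 8-bit camera intensities onto ten gray levels,
# the kind of quantization that ten achievable grey scales implies.

GRAY_LEVELS = 10

def quantize(intensity):
    """Map an 8-bit camera intensity (0-255) onto one of ten gray levels (0-9)."""
    return min(intensity * GRAY_LEVELS // 256, GRAY_LEVELS - 1)

levels = [quantize(i) for i in (0, 64, 128, 192, 255)]
```

Each of the ten levels would then correspond to a different stimulation strength, which is what makes shades of grey, and thus depth and environmental cues, recoverable.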

Roberts

And what about color?

Humayun

We’ve been working hard on color.  Of course, going back to what I said earlier, if you’re stimulating with these large electrodes that are stimulating groups of neurons, you are not going to be able to stimulate the red or the green or the blue channel in the retina.  So we have to do it a different way, and we’ve been able to do it by changing the stimulation frequencies.  It’s very exciting to be able to show that we’re beginning to get some color that can add further content information.

Roberts

I thought this was exceptionally interesting.  We know colors are made by light moving in a specific frequency or pattern.  Violet is the highest frequency of light in the visual spectrum and red is the lowest frequency of light we can see.

Humayun

Similarly, we’re able to tune the device to convey the sense of blue by stimulating at a particular frequency.  So then, if the camera is looking at a blue shirt and, let’s say, yellow pants, kind of a lively wardrobe of somebody, it would stimulate the retina at one frequency and the brain would say, hmm, that is a blue shirt.  And for the yellow pants it would do it at a different frequency.  That’s how it would work.

Roberts

The device reads the color of the item and then stimulates the retina at a given frequency.  The patient learns to associate that particular retinal stimulation with a named color.  This way it’s able to communicate that color to the brain.  Really remarkable.
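
[Editor’s note: this color-by-frequency scheme can be sketched as a simple lookup from a detected color to a stimulation frequency.  The actual frequencies are not public; every value and name below is hypothetical and serves only to illustrate the encoding idea.]

```python
# Hypothetical sketch of the scheme described above: each detected color is
# encoded as a distinct stimulation frequency that the patient learns to
# associate with a color name. All frequency values here are made up.

COLOR_TO_FREQ_HZ = {
    "blue": 20,
    "yellow": 35,
    "red": 50,
}

def encode_color(detected_color):
    """Return the stimulation frequency for a camera-detected color, or None."""
    return COLOR_TO_FREQ_HZ.get(detected_color)

shirt = encode_color("blue")    # the brain learns: this frequency means blue
pants = encode_color("yellow")  # a different frequency means yellow
```

The design choice is that color is carried in *when* the electrodes fire, not in *which* cells fire, since the electrodes are too large to target red, green, or blue channels individually.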

I remember your first clinical version.  I think it had about 60 receptors and I think of them as more like pixels in a digital camera.  And later versions increased that number.  So, why is more better?  And is more always better?

Humayun

This is another really key question: what’s the density of the electrodes that are stimulating the neurons?  We started with 16 electrodes, and the reason is that cochlear implants at that time could only stimulate 16 electrodes.  So we sort of reconfigured a cochlear implant and used it to stimulate the retina.

Roberts

You reconfigured the cochlear implant? 

Humayun

We shamelessly did that.  Because they had figured out over so many years how to protect the electronics.  How to do this wireless telemetry link.  So, instead of putting the electrodes into the cochlea, we took those electrodes and put them into the eye and then used a camera as the input.  And so, before you spent tens of millions of dollars, you wanted to get there as quickly as possible.  And this was a medically approved device with a long safety record.  We just had to take it and reconfigure it such that the electrodes could go into the eye and that it could take inputs from the glasses.

What we found is that as you made more electrodes and made them more dense, meaning packed more of them in the same area, the visual acuity got better, and it continues to get better.  So 60 is not the limit.  We’re looking at a 256-electrode array, which will be more than four times as many.  We haven’t hit the ceiling yet on the visual acuity of these devices, but it’s very tricky.  As you start to add more pixels, the currents start to overlap, and you have to control that.  In terms of neural stimulation you have to be very careful what you’re doing.  So more doesn’t necessarily mean better, but it has panned out that way, and we’ll just have to continue to work and see where the ceiling is on this.

Roberts

So, sometimes people think that an inventor has an idea and goes from step A, to B, to C, to D, to E, to F in a straight line and just advances forward.  That doesn’t always happen, or rarely does it happen.  Can you give me a couple examples of things that you tried that actually didn’t work along the way, and why didn’t it work?

Humayun

The very first one was basically, no matter what we did, what we tried, we couldn’t get a response from the retina.  This goes to that software code that we were talking about earlier.  It takes a very specific type of pulse pattern and stimulation.  So we were speaking the wrong language.  We were on a different channel from the retina.  So that took a very long time.  What I thought is, well, we’ll just put this device in and ask the patient, better one or two?  I mean, obviously we refract that way.  We prescribe glasses saying is the image one better or image two.  But if somebody hasn’t seen for 50 years, and the visual input is very crude or blurry, they cannot guide you.

So, we couldn’t use the patient to help us as much.  That’s a very critical example that we couldn’t figure out how to stimulate the retina until quite some time thereafter.

The next challenge was an engineering one.  So, that first one was a neurophysiological one; the next one’s an engineering one.  Nobody builds cell phones to be thrown into warm, corrosive water.  That is not where technology is going.  So, how do you build a tiny package that fits around the eye, that has 60 wires coming out of it, and have it survive in this warm, corrosive environment of the body?

It’s called a hermetic package, because that package has to last decades.  It was very challenging, and that’s not somewhere we could turn to any of the devices that are currently used, like cell phones.  Because they don’t have that constraint.

And then, a surgical one.  I’ll give you one of each flavor just to give you an idea.  How do you attach something to the retina?  The eye is moving.  The retina is extremely delicate.  I still remember my mentor, Dr. Robert Machemer, who invented vitrectomy surgery.  He said, look, young man, the retina is very delicate, the eye is moving.  You plan to do what?  Attach something to it and not have it detach?

So, we had to work out a surgical procedure that ends up attaching the device to the retina without crushing it.  Because, remember, we’re putting it on top of the retina, and if we crush those ganglion cells, the output would be gone.

That’s just an example of neurophysiology, engineering and surgical challenges.  But there were so many, and many many more that we could talk about.

Roberts

Everyone was very excited when you first implanted these Argus implants to see what the patient would see.  So, what do patients see?

Humayun

When you’re testing these devices in blind patients who haven’t seen for up to 50 years, how do you quantify those responses?  There’s a whole area called psychophysics.  Through that you end up taking these subjective responses and putting some numbers to them to make them more objective.

What they first see is very different than what they see with a few months of use of the device.  It’s very interesting.  For example, one electrode may even elicit two or three phosphenes, dots of light.  But as they continue to use it, that becomes a single phosphene.  You ask them, do you still see two or three?  They say no.

Campbell

That was a such a huge, emotional experience.  I don’t even know how to put it into words.

Humayun

They’re able to use the device more effectively to see large letters, objects.  What they tell us is very interesting, and it continues to evolve to a point where they eventually don’t see those dots.  They just see the form and see beyond the dots.

Roberts

Hearing about patient feedback on the Argus II, I’m reminded of Aimee Mullins and her collaborative relationships with the developers of her prosthetics.  This is a continual learning process for you.  You give the patients an experience, the patient then learns from that experience.  They then tell you what they’re experiencing, and then you have to come up with a next step.

Humayun

Exactly.  It’s an iterative process.  We learn from the patient.  We try to enhance the tech, or what I call the eye tech.  And then we launch the next version of the software code to see what we can do.  And we’ve gotten great improvements in visual acuity through this iterative software upgrade, so it absolutely is very important to have the patient input.  It’s getting better.

Roberts

If we look into the future: we talked about how the goal of the hardware is to stimulate the visual cortex.  We said that with the Argus technology we’re going to use the optic nerve in order to transmit the signal to the visual cortex.  Is there a role for bypassing the optic nerve and going directly to the brain?

Humayun

That’s a very good question, because 50-60% of patients do have severe optic neuropathy where putting an implant in the eye, stimulating the retina won’t work.  So, what do we do with those patients who are blind?  An area that has been explored a lot, even before the Argus, by others, is the visual cortex.  To go in and put the electrodes into the primary visual cortex and bypass the entire eyeball, optic nerve and some of the relay stations.

So, the Orion has been implanted in six patients.  What it does is take the image from a camera; the video processing unit and the battery are the same.  The wearable is the same.  But instead of wirelessly transmitting it into the implant around the eye, it now wirelessly transmits it to an implant that’s in the vision center of the brain.  That’s basically how the Orion works.

We’re getting some very interesting early results, but you’re bypassing a lot of information processing that occurs in the eye, so you have to recreate that data at the brain level.  It has its different challenges, but of course, in this particular case, it completely bypasses the eye altogether.

Roberts

And so, as we continue to think big, what is possible?  Or, what might be possible for the development of artificial vision for people?

Humayun

I’m always an optimist.  But, I think more electrodes.  Understanding this neural code better.  Understanding this language of our body, the electrochemical pulses.  How to harness this to really restore sight, from the retina up to the visual cortical level.  These are very exciting times, and we now have the hardware and technology to get there and try to begin to answer these very difficult questions.

Roberts

And similarly, we dream of artificial limbs that could actually run faster than our natural legs.  Could you envision ocular prosthetics that could see better than the human eye?

Humayun

I would be very happy if it could see just as well as the human eye, but we’re not there yet.  There’s some other very interesting things, of course.  The camera could be anywhere in the room.  It doesn’t have to be on the glasses.  Whereas our eye can’t be anywhere in the room.  It has to be in our head.  So that’s an interesting aspect.

Also, the camera can see in infrared or ultraviolet, outside the visible spectrum, if we wanted it to, and that’s something that our human eye can’t do.  And of course, there’s digital zoom.

So there are some features that are different than our human eye, some advantages.  But clearly our human eye is incredibly, exquisitely engineered to give you a very pristine, refined, high-resolution image.  And also to be able to see from dawn to noon sun, to see this incredible spectrum of brightness, whereas cameras struggle a little bit with that.

Roberts

Last question.  Mark, where do you get your passion from?

Humayun

Well, my passion and continued motivation comes from patients.  It started with my grandmother who was obviously very near and dear to me, but since then I often see many a grandmother in my examination chair who is going to go blind or has gone blind.  It always motivates me.  

I have a clinical practice where I end up getting referred a lot of patients who have lost their vision or are about to lose their vision.  So, I really do see that every week, and it’s tremendously motivating to make a difference for patients.  Just because we’ve accomplished this much, which many would think is a fair amount, it’s never enough for me.  I think it’s really important to keep pushing and seeing how much more we can help our patients.

That’s what keeps me going and something that’s near and dear to my heart.

Roberts

Barbara Campbell was one of the first people to have the Argus II implant, though her device eventually stopped working, one of the risks of being an early adopter.  How does she feel about that?

Campbell

I see myself more as a pioneer.  Somebody has to be first and try it and learn the lessons.  And there will be improvements and there will be things that work and things that do not work.  So they will be able to move forward from the lessons learned from my experience.  It’s knowledge you can only grow from here.

Roberts

Where would these developments be without pioneers like Barbara Campbell?  And where will they be in the future?  It’s fun to imagine the bionic advances that are possible with devices like the Argus II.  But let’s not forget, when Oscar Pistorius petitioned to compete in the Olympics on carbon fiber legs, many said it was unfair.  Aimee Mullins had this to say about augmentations that might in the future make athletes super-abled.

Mullins

When we reach that point when a prosthetic makes you truly super-enabled, we as a society will decide upon new rules in the sports that we love that reflect the values that we still want to see in those sports.  Defining what is natural is going to become increasingly difficult, because everyone is using a medical assistive device of some kind, even if it’s just your glasses.

Roberts

But, just as super ability is not Mark Humayun’s goal with the Argus II, neither is it Aimee’s goal.

Mullins

Ultimately, if you ask me what I want out of my prosthetic legs, I would like them to be able to do all the things that a flesh and bone leg can do.  I would love to be able to raise up on my tippy-toes to read something.  I would love to be able to feel sand between my toes or grass under my feet.  I can’t do that.  Yet.  But I think all those things are coming, and I think we’re living in such an exciting time, because even in the last 25 years of my life, with the internet, I’ve gotten to see so many more people throwing their hat in the ring as designers and inventors.  People who don’t necessarily have the medical background that traditionally was expected in prosthetics.  And I find that just very, very exciting.

Roberts

And at Lighthouse Guild, so do we.  Did this episode spark ideas for you?  Let us know at podcasts@lighthouseguild.org.

And if you like this episode, please subscribe, rate and review us on Apple Podcasts or wherever you get your podcasts.

I’m Dr. Cal Roberts. On Tech & Vision is produced by Lighthouse Guild. For more information, please visit www.lighthouseguild.org.

I’m Dr. Cal Roberts. On Tech & Vision is produced at Lighthouse Guild by my colleagues Cathleen Wirts, Jaine Schmidt and Annemarie O’Hearn.  My thanks to Podfly for their production support.