On Tech & Vision with Dr. Cal Roberts Podcast

Today’s big idea is how we can use touch and sound together to create spatial awareness.  Just like learning to play the piano or another instrument, smart cane technology engages both sound and touch to compensate for loss of vision.  Kürşat Ceylan, the founder and CEO of WeWALK, shares how the WeWALK smart cane engages a number of senses to develop new autonomy for users, and how their technology is a jumping-off point for integrating other technologies.

Episode 2: Integrating Touch and Sound

Hsu

Research has found that people who take piano lessons or instrumental lessons develop stronger spatial skills than people who have never done it.  I remember even when I was little, once I knew a piece so well, I could close my eyes.  I could play.

Roberts

Yu-pin Hsu is an occupational therapist and classically trained concert pianist, a graduate of the Manhattan School of Music who for many years taught piano to young people.

Hsu

I do have a memory built through my senses, like touch.  Where I am is spatial.  Where you’re located.  Where things are located.  So you apply that in spatial skills.

Graham

My wife tells me when I have the right note because she hears me practicing.  She’ll say, that is so much better.  That makes me feel good.

Roberts

Bill Graham took up piano after he started losing his vision.  Playing piano engages his sense of touch.

Graham

It’s a mental to physical thing.  You have to take your brain and have it move your finger.

Roberts

And playing engages his ear.  His hearing is becoming more attuned to the music.

Graham

From when I started till now, I hear differently.  And I think maybe I hear other things differently.  I don’t know if my sound is better or that my brain is more attuned to listening to the sound, absorbing the different sounds.  It’s becoming more automatic.  Rhythm and where I’m going.

Roberts

This is Lighthouse Guild’s podcast On Tech & Vision with Dr. Cal Roberts, where I talk with people with big ideas about how technology can make life better for people with vision loss.

I’m Cal Roberts, and today we’ll explore the big idea of using touch and sound together to create spatial awareness through the use of smart canes.  As with the piano, the white cane engages users’ senses of touch and sound together to compensate for lost vision.  We’ll ask how smart canes amplify users’ engagement of these senses.  We’ll also look into how smart canes may be the next conduit for the explosion of assistive technologies and software available for people who are blind.

Were you a science and technology person?

Ceylan

I was.  In fourth grade, when I was ten years old, one of my friends introduced the computer to me.  He was doing coding.  This was maybe the turning point of my life.

Roberts

Our guest is Kürşat Ceylan, entrepreneur and developer of the WeWALK smart cane, a white cane enabled with smart technologies for people who are blind.

Ceylan

He was also blind, and it was from him that I first heard that visually impaired people can also use computers, because he was a really geeky person.

Roberts

I’ll mention here that Kürşat is also blind himself.  Coincidentally, music was instrumental to Kürşat’s engagement with technology.

Do you remember what you enjoyed doing on the computer?

Ceylan

Yes.  I remember the first night of my computer because my friend brought the computer in the evening to me.

Roberts

You heard that right.  His friend, also blind, hauled a desktop computer over to Kürşat’s house some evenings to share the cool, new technology.

Ceylan

And then he showed me how to play the organ, how to play keyboards on the computer.  And as a ten-year-old boy, it looked so enjoyable, playing some music, some songs.

Roberts

For Kürşat, music was how he fell in love with technology.  Soon he began developing his own tools, learning to code and, fast forward, he’s now the Founder and CEO of WeWALK, a smart cane technology for people who are blind.

Let’s talk about the smart cane, and before we do that, let’s talk about the standard cane.  What does a cane do?

Ceylan

First of all, we can detect obstacles in front of us with our white canes.  While I’m walking in the street, when I hit something I can understand, okay, there’s stairs, a curb, etc.  Besides that, it shows that I am blind, and also that I am not shy about being blind.  It means independence for us.  It’s so meaningful for the visually impaired community.

Roberts

Some people I speak with tell me that they actually spend a lot of time tapping with their white cane because they get a lot of information from the sounds of the taps.  Explain that to me.

Ceylan

Yes.  There are different styles.  Tapping is one of them.  When we tap our canes right and left, tick tock, tick tock, we can also hear the echo.  This echo tells us many things.  It gives us many clues.  If there’s something in front of me, sounds might reflect to me.  Also, am I at a crowded place, or not?  Is there water in front of me, or not?  That kind of information can be given with the tapping style.

Roberts

In fact, at the Lighthouse Guild here, when we teach the use of the white cane, we always teach it in conjunction with listening.  You have to listen to the traffic.  You have to hear where you are.  So that hearing and touch go hand in hand.

Ceylan

Yes.  And you are using smell as well.  When I’m walking in the street and I smell coffee, I say it’s time to turn right, because whenever I smell coffee it means I’m at the corner of my house.  Not only hearing, not only touching, but smelling as well.

Roberts

In doing our prep for this show, we learned Kürşat produced and hosted an award-winning radio show in Turkey, Exploration of Emotions That Are Suppressed by Sight.

What are some of the emotions that are suppressed by sight?

Ceylan

It’s easy to rely on sight.  However, as a visually impaired person, I’m not relying on my sight.  It means I have room to rely on my other senses.  While I’m sitting at a restaurant, for example, sometimes I say to my sighted friend, did you hear that last song?  It was really good.  He says to me, oh no, I didn’t notice.  Which song was playing?  Oh, you have a really powerful hearing sense.  However, we have a similar hearing sense, but in the restaurant there are many things to see.  He was looking at the waiter, he was watching the kitchen.  This is why he didn’t focus on the song.  Those kinds of senses are suppressed by sight.

Roberts

You had this moment where you said, I love my white cane but I want it to do more.  Tell me about that moment.

Ceylan

Maybe ten years ago I didn’t think that.  Or fifteen or twenty years ago while I was using the white cane.  But, as I said, technology has advanced so much, and we started to get the benefit from these technologies, such as navigation technology or smart city solutions.

However, using these technologies in the street while we are holding our white canes started to be complicated.  Imagine yourself as a visually impaired person walking on the street.  You are holding your white cane in one hand.  Also, you have to hold your smart phone in order to get navigation or call an Uber.  Your two hands are occupied.

Also, while you’re walking you have to check, what is in front of you?  You have to be so careful.  You also have to listen to your environment.  This is a distracting situation.  I experienced some accidents as well.

It was around three years ago.  I was in New York to give a speech at the United Nations.  While I was trying to find my hotel I was holding my white cane in one hand and my smart phone with the other.  I was checking my GPS direction on my smart phone.  At the same time I had to pull my luggage as well.  I have two hands, however I have three things to carry.  I bumped into a pole.  You can see some scars on my head.  That kind of accident gave us, as a team, an idea to develop these technologies.

Roberts

Wayfinding is an important challenge to solve for people who are blind or visually impaired.  Kürşat’s anecdote shows us that when people with vision loss are out navigating their way through the world, even the distraction of the phone can be dangerous.

Ceylan

And that’s why we developed WeWALK.  There are three main features.  First of all, WeWALK can detect obstacles at head level, such as trees, signs, tables, poles, etc., and it alerts us by vibrating.  The second feature: WeWALK can pair with our smart phones.  It means I don’t need to hold my smart phone while I’m walking.  I can put it in my pocket and then keep managing my smart phone over WeWALK’s touchpad.

And then, as a third feature, this is the feature that I like most.  WeWALK can gain new features with the software updates.  It gives us flexibility to improve this technology more.

Roberts

Think about it.  Just like with applications on your desktop or in your car or on your phone, updates and new tools can be sent directly to the device when they’re ready for the user.

Ceylan

For example, right now visually impaired people can get navigation over their WeWALKs.  However, we have also developed a “what is around me?” feature.  While we are walking in the street, we can hear the names of the stores, restaurants, cafes while we are passing by.

“Take me to home.”

Searching for location.

And also, we have just released our latest feature.  It’s called Public Transportation.  Visually impaired people can get information about the bus timetable.  As a visually impaired person, while I’m waiting at the bus stop, I don’t know which bus is approaching the bus stop.  I have to ask someone else.  Always.  But with this feature, I need just my WeWALK, and I can get bus timetable information over my WeWALK.

Roberts

As you’re improving WeWALK, how much of the technology do you need to develop yourself, or how much can you partner with others or take what others have learned and incorporate that?

Ceylan

Perfect question.  Thank you for asking.  WeWALK is powerful because of its partnerships.  We are partnering with Microsoft, for example.  We are using their cloud technology.  We are partnering with Google and using Google Maps.  Also, smart city concepts, smart city technologies have taken off.  That gives us the opportunity to integrate these smart solutions into smart canes as well.

We are developing our own technology using the Google Maps infrastructure.  On top of the Google Maps infrastructure we are adding our own technology, and we are giving clock-face directions.

Roberts

I find this fascinating.  The device itself is a gateway.  An access point for many different useful technologies adapted for accessibility for people with vision loss.  Again, we’ve come a long way from that shared desktop.

One of the technologies that WeWALK developed on its own is the use of ultrasound for obstacle detection.  You know what other animals use ultrasound to detect obstacles?  Bats.  And porpoises.  Here at Lighthouse Guild, we love understanding how new technologies for people who are blind tap into and expand on our natural senses.

Ultrasound transducers send out beams of sound waves, and when the waves hit an object they reflect back to the transducer.  The same technology is used in cars to detect objects when backing up.  I asked Kürşat why he used ultrasound for obstacle detection.
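The echo-ranging principle described here can be sketched in a few lines of code: the distance to an obstacle is recovered from the round-trip time of the reflected pulse.  This is a minimal illustration of the general technique, not WeWALK’s actual firmware; the function name and the assumed speed of sound are ours.

```python
# Echo ranging: a transducer emits an ultrasonic pulse and times the echo.
# Distance = (speed of sound * round-trip time) / 2, because the pulse
# travels out to the obstacle and back again.
# Illustrative sketch only -- not WeWALK's implementation.

SPEED_OF_SOUND_M_S = 343.0  # in dry air at roughly 20 degrees C

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an obstacle, given the echo's round-trip time in seconds."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2

# A pulse that returns after 6 ms indicates an obstacle about 1.03 m away.
print(round(echo_distance_m(0.006), 2))
```

Dividing by two is the step that trips people up: the measured time covers the path to the obstacle *and* back, so only half of it corresponds to the one-way distance.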

Ceylan

We found out that ultrasound is the best technology to implement in a smart cane, because there were some other technologies as well.  For example, lidar.  That kind of technology, especially light-based technology, detects the obstacle at only one point.  Just one spot, because it’s sending out light and collecting the light back, the reflection of the light.  And light is like a thin string.  Imagine there is a really big obstacle in front of you, but it has a little hole in it.  If the light passes through this hole, your cane won’t detect any obstacle, because light is like a thin string.

Ultrasound, however, has width and covers a larger area.  That’s why even if the obstacle has some holes in it, the ultrasound doesn’t pass through.  It can detect it anyway.  That’s why ultrasound is such an important technology for obstacle detection in a smart cane.

Roberts

In order to work, the WeWALK cane has to convert the data from the reflecting ultrasound waves into information that will be meaningful to the user.  Unlike a medical ultrasound machine, it doesn’t produce an image with those rebounding waves.  It has to communicate with the user in a different way.

And so, the ultrasound information comes back as a beep?  How does it inform the user?

Ceylan

When ultrasound detects an obstacle, WeWALK starts vibrating.  Also, we’ve gotten some insights from our users.  They also want to hear a beep.  That’s why we will release a new software update and our users will hear a beep as well.
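The alert behavior Kürşat describes — always vibrate on a nearby obstacle, with an optional beep arriving via software update — amounts to a small decision rule.  The sketch below is hypothetical; the function name, default threshold, and alert labels are our assumptions, not WeWALK’s API.

```python
# Hypothetical sketch of a smart-cane alert policy: fire a haptic alert
# when a detected obstacle is within a threshold distance, and also beep
# if the user has enabled the audio alert. Names and values are ours.

def alerts(distance_m: float, threshold_m: float = 1.5,
           beep_enabled: bool = False) -> list:
    """Return the list of alerts to fire for an obstacle at distance_m."""
    if distance_m > threshold_m:
        return []                      # nothing close enough to report
    fired = ["vibrate"]                # haptic alert is always on
    if beep_enabled:
        fired.append("beep")           # opt-in audio alert
    return fired

print(alerts(2.0))                      # []
print(alerts(1.0))                      # ['vibrate']
print(alerts(1.0, beep_enabled=True))   # ['vibrate', 'beep']
```

Keeping the beep behind a flag mirrors the user-choice theme that comes up next in the conversation: the device reports only what, and how much, the user asked for.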

Roberts

Sometimes I hear people with vision loss tell me about sensory overload.  They’re getting too much information and they find that confusing.  With WeWALK, how do you determine the right amount of information to give at any moment?

Ceylan

First of all, we are giving freedom to our users.  They can pick when they want to get the information.  The user can tap the touchpad to request the information.  Or, users can pick the frequency of the information over WeWALK’s application.  So, we are giving them freedom.

Roberts

The marketplace will decide if Kürşat and his team have found the right balance of sensory inputs for a user.  But it’s so inspiring to see an entrepreneur working so diligently for people with vision loss.  I asked him where he gets his passion from.

Ceylan

I have many reasons to be passionate, to develop this smart cane technology.  I’m blind.  My brother is blind.  My nephew, my niece are blind.  And I have the power to change something in my hands.

Roberts

What are your goals for WeWALK for the future?

Ceylan

As a big vision, we want to make WeWALK a personal hub, an assistant, for the visually impaired, which provides image recognition, a voice assistant and smart city integration.  I can briefly give an example.

As a visually impaired person, when I step out from my home, I will just tell my WeWALK where I want to be, and WeWALK takes me to my destination step by step without any interruption.  We call it a fully autonomous blind journey.  That’s why we are working to complete this vision day by day, with our new technologies and new partnerships.  However, telling this story in a sentence is easy.  When we break it into parts, it includes many advanced technologies.  For example, it means WeWALK should have a voice assistant.  And it should be integrated with smart city solutions, because I want to hear the traffic lights.  Or I want to get indoor navigation as well.  That means indoor navigation technology that we have to develop.

All these technologies take a great deal of work to develop.

Roberts

That’s a lot of ideas all focused on making a more robust smart cane.

Ceylan

I wake up with the WeWALK idea and I go to bed with the WeWALK idea, adding new features to WeWALK, because I’m sure it will take time.

Roberts

This is Yu-pin Hsu playing Arabesque by Debussy.

Hsu

So, if you can’t depend on the vision, you’re going to have to tune even more of that joint memory.  At the same time, the location, the spatial.

Roberts

As an occupational therapist, Yu-pin is uniquely positioned to describe what’s happening in the body when someone learns piano.

Hsu

It has to be registered in the brain, and on top of that, now the ear.  We all use different senses to learn.

Roberts

The white cane has long been a tried and true assistive device that provides users with sensory information through touch and tapping sounds.  Users comfortable with the traditional cane might find a learning curve with a smart cane, which integrates vibrations from ultrasound obstacle detection and data from cell phone apps.

As with learning to play the piano, learning to use smart canes like WeWALK will engage a number of senses and develop new pathways for users.  One thing is for sure, we want the world Kürşat describes where people with vision loss can navigate their lives with full autonomy.

Are you a developer?  Entrepreneur?  Or a person with vision loss?  Did this episode spark ideas for you?  Let us know at podcasts@lighthouseguild.org

I’m Dr. Cal Roberts. On Tech & Vision is produced at Lighthouse Guild by my colleagues Cathleen Wirts, Jaine Schmidt and Annemarie O’Hearn.  My thanks to Podfly for their production support.  For more information, please visit www.lighthouseguild.org.