Android Developers Blog: #WeArePlay | How Zülal Uses AI to Help the Visually Impaired

Posted by Leticia Lago – Developer Marketing

Zülal was born in Istanbul, Turkey, with low vision and has been a power user of visual aids since the age of four. When she lost her sight completely at the age of ten, she became dependent on technology to see and experience the world around her.

Today Zülal is the founder of FYE, her solution to the problems she found with other visual aids. The app enables people with visual impairments to be inspired by the world around them. With a team of four, she leads the app's technological development and user experience.

Zülal shared her story in our latest film for #WeArePlay, which celebrates the people around the world who build apps and games. She recounted her journey from uploading photos of her parents to a computer as a child to get descriptions of them, to developing her own visual assistive app. Find out what’s next for Zülal and how she’s using AI to help people like herself.

Tell us more about the inspiration behind FYE.

Today, there are approximately 330 million people with moderate to severe visual impairment. Visual aids are transforming their lives, giving them back a sense of independence and connection to the world around them. I am a poet and a composer, and in order to create, I needed this technology so I could see and describe the world around me. Before I developed FYE, the visual aids I relied on were failing me. I wanted to take back control. I didn’t want to sit back and wait to see what technology could do for me; I wanted to harness its power. So I did.

Why was it important for you to set up FYE?

I never wanted to be limited by low vision. I always thought: how can I improve this? How can I improve my life? I want to do everything, because I can. I truly believe that there is nothing I can’t do. There is nothing WE can’t do. For a founder like me to be at the forefront of visual aids illustrates that. We are taking back control of how we experience the world around us.

What's different about FYE?

With our app, I think our audience can really see the world again. It uses a combination of AI and human input to describe the world around our users to them. It includes an AI model trained on a dataset of over 15 million data points, so it captures all the different details that make up everyday visual experiences. The goal was to make descriptions as vivid as if I were describing my surroundings myself. It’s the little details that make a big difference.

What's next for your app?

We already offer personalized AI outputs, so users can create different AI assistants suited to different situations. You can use it while working on the web, browsing, or shopping. I use it a lot for cooking, where the AI can learn and adapt to each situation. We also work with places where people with visual impairments may struggle, like the subway and the airport. We have built AI outputs in collaboration with these spaces so that anyone using our app can navigate them with confidence. I am currently developing From Your Eyes as an organization and re-branding the app as part of the organization under the new name FYE. We are also exploring integrations with smart glasses and watches to bring our app to wearables.

Discover more #WeArePlay Stories and share your favorites.
