BenVision: Navigating with Music
This podcast is about big ideas on how technology is making life better for people with vision loss.

When it comes to navigation technology for people who are blind or visually impaired, many apps rely on voice commands, loud tones or beeps, or haptic feedback. In an effort to create a more natural, seamless experience, the team at BenVision has built a different type of system that allows users to navigate using musical cues instead! For this episode, Dr. Cal spoke with BenVision’s CEO and co-founder, Patrick Burton, along with its Technology Lead, Aaditya Vaze. They shared the inspiration behind BenVision, how they create immersive soundscapes that double as navigation aids, and the exciting future applications this technology could offer. The episode also features BenVision’s other co-founder and Audio Director, Soobin Ha. Soobin described her creative process for designing BenVision’s soundscapes, how she harnesses the power of AI, and her bold vision of what’s to come. Lighthouse Guild volunteer Shanell Matos tested BenVision herself and shares her thoughts on the experience. As you’ll hear, this technology is transformative!

The Big Takeaways

Why Music? Navigation technology that uses voice, tone, or haptics can create an added distraction for some users. But the brain processes music differently. Instead of overloading the senses, music works alongside them for some users, allowing them to single out separate sound cues or take in the entire environment as a whole. Much as different instruments correspond to different characters in “Peter and the Wolf,” BenVision assigns unique musical cues to individual objects.

User Experience: Shanell Matos appreciated how BenVision blends in more subconsciously, allowing her to navigate a space without having to be as actively engaged with the process.
Additional Applications: BenVision began as an augmented reality program, and its creators see the potential for it to grow beyond a navigational tool into an experience for visually impaired and fully sighted people alike. For example, it could be used to create unique soundscapes for museums, theme parks, and more, augmenting the experience in exciting new ways.

The Role of AI: Artificial intelligence already plays a big role in how BenVision works, and its creators see it becoming even more important in the future. BenVision harnesses AI for object detection, and its companion app uses AI to provide instant voice support about the immediate surroundings when needed. Moving forward, AI could be used to instantaneously generate new sound cues or to help users customize their experience at the press of a button.

Tweetables

“We thought that if the human brain can learn echolocation and we have this amazing technology that’s available to us in the modern day, then why can’t we make echolocation a little bit more intuitive and perhaps a little bit more pleasant.” — Patrick Burton, BenVision CEO & Co-Founder

“You can think of it like a bunch of virtual speakers placed at different locations around the user. So like a speaker on a door or a couch or a chair. And then there are sounds coming from all these virtual speakers at the same time.” — Aaditya Vaze, BenVision Technology Lead

“I want to gamify this idea so that the user can actually find some interest and joy by using it, rather than just find it only helpful, but also [to create] some pleasant feeling.” — Soobin Ha, BenVision Audio Director & Co-Founder

“So if there’s a lot of people, there’s a lot of conversations happening, a lot of sounds happening, a lot of movement happening. It’s really difficult to keep up with what everything is doing. Whereas with music, it’s not as difficult to pick out layers.” — Shanell Matos, Lighthouse Guild Volunteer

Contact Us

Contact us at podcasts@lighthouseguild.org with your innovative new technology ideas for people with vision loss.