18 episodes

Dr. Cal Roberts, President and CEO of Lighthouse Guild, the leading provider of exceptional services that inspire people who are visually impaired to attain their goals, interviews inventors, developers and entrepreneurs who have innovative tech ideas and solutions to help improve the lives of people with vision loss.

On Tech & Vision With Dr. Cal Roberts
Lighthouse Guild

    • Technology
    • 5.0 • 28 Ratings


    Tools for Success: Tech Convergence and Co-Designed Products Close Gaps for Children Who are Blind

    This podcast is about big ideas on how technology is making life better for people with vision loss.
    People who are blind or visually impaired know all too well the challenges of living in a sighted world. But today, the capabilities of computer vision and other tech are converging with the needs of people who are blind and low-vision and may help level the playing field for young people with all different sensory abilities. These tools can pave the way for children’s active participation and collaboration in school, in social situations, and eventually, in the workplace, facilitating the important contributions they will make to our world in their adult lives.
    Access to educational materials is a persistent challenge for students and adults who are blind, but Greg Stilson, head of Global Innovation at the American Printing House for the Blind (APH), is trying to change that. Together with partner organizations Dot Inc. and Humanware, APH is on the verge of delivering the “Holy Braille” of braille readers: a dynamic tactile device that renders both braille and tactile graphics in an instant, filling a major gap in the braille textbook market. Extensive user testing has helped make the device as useful as possible for people who are blind. Greg sees a future in which more inclusively designed and accessible video games, augmented reality (AR), and virtual reality (VR) will help children who are blind learn with greater ease and engage more fully with their sighted peers.
    Enter Dr. Cecily Morrison, principal researcher at Microsoft Research in Cambridge, UK. Based on extensive research and co-designing with people who are blind, she and her team developed PeopleLens, smart glasses worn on the forehead that can identify the person whom the user is facing, giving the user a spatial map in their mind of where classmates (as one example) are in space. PeopleLens helps children who are blind overcome social inhibitions and engage with classmates and peers, a skill that will be crucial to their development, and in their lives, as they move into the cooperative workspaces of the future.
     
    The Big Takeaways:
    • Robin Akselrud, an occupational therapist and assistant professor at Long Island University in Brooklyn, and author of the My OT Journey Planner and The My OT Journey Podcast, explains how a baby who is born blind can be inhibited from reaching their first developmental milestones. She describes the stressors these children may face upon entering school and the kinds of interventions occupational therapy offers.
    • Bryce Weiler, disability consultant, sports enthusiast, and co-founder of the Beautiful Lives Project, emphasizes how important rich sensory and life experiences are for children who are blind or low-vision, giving them a chance to flourish and socialize with peers. The Beautiful Lives Project offers opportunities to do exactly that.
    • Greg Stilson, Director of Global Innovation at the American Printing House for the Blind, and his team are developing a dynamic tactile device (DTD) that can switch seamlessly between braille and tactile graphics, the “Holy Braille” of braille devices. The DTD is made possible by developments in pin technology by Dot Inc. and APH; Humanware developed the device’s software. Moving away from piezoelectric pin actuation has reduced the device’s cost significantly, and APH can apply federal funds to lower the price further, making the DTD a viable option for institutions.
    • Cecily Morrison, principal researcher at Microsoft Research in Cambridge, UK, and her team developed PeopleLens, a head-worn pair of smart glasses that lets the wearer know who is in their immediate vicinity. Dr. Morrison and her team tested it in classrooms with school-age children who are blind or visually impaired and found that PeopleLens reduces students’ cognitive load and helps young people overcome social anxiety and inhibitions.

    • 32 min
    Innovations in Intraocular Pressure and Closed-Loop Drug Delivery Systems

    In 2012, Christine Ha won the third season of MasterChef, after having lost her vision in her twenties. Since her win, she has opened two restaurants in Houston, adapting to the challenges the pandemic still poses to restaurateurs in order to meet the needs of her community. In a similarly innovative way, Max Ostermeier, CEO and Founder of Implandata Ophthalmic Products in Hannover, Germany, has reimagined the remote management and care of patients with glaucoma. Max and his team developed the EyeMate system, a microscopic implantable device and microsensor that measures intraocular pressure throughout the day. The EyeMate sends eye pressure data to an external device and uploads it to the patient’s eye doctor’s office for analysis. This game-changing technology allows people with glaucoma to bypass regular trips to the ophthalmologist’s office to measure their eye pressure, key data for maintaining their eye health. We revisit a conversation with Sherrill Jones, who lost her sight to glaucoma, in which she shares how difficult it was to adhere to compliance protocols. Max believes the EyeMate will evolve into part of a closed-loop drug delivery system; that is, when the EyeMate registers high pressure, medication could automatically be released into the patient’s eye, which could improve outcomes significantly. We dig into issues of compliance and closed-loop systems by considering diabetes. We talk to occupational therapist Christina Senechal, who has managed her diabetes for 27 years, and Dr. Carmen Pal, who specializes in internal medicine, endocrinology, diabetes, and metabolism at Lighthouse Guild’s Maxine and John M. Bendheim Center for Diabetes Care.
     
    The Big Takeaways:
    • Innovation and adaptability have been key as Christine Ha, who won MasterChef in 2012 despite being blind, keeps the doors open at two Houston restaurants during the pandemic. To find opportunities through the pandemic, Christine has had to discover new needs that must be met.
    • The pandemic has made it harder for patients with glaucoma, many of whom are older, to get to the eye doctor. Dr. Max Ostermeier and his team have invented a microscopic implantable device, the EyeMate, that measures intraocular pressure throughout the day and sends the data to the patient’s handheld device and their eye doctor’s office for analysis, saving patients a trip.
    • Because it takes many readings throughout the day, this system captures more than a single doctor’s visit can. The abundance of data helps patients use their medication more reliably, and in the future it could be optimized by algorithms.
    • Max envisions that the EyeMate could eventually be paired with a drug delivery implant to close the loop between monitoring and drug delivery, as with diabetes. This would make it easier for patients to stay in compliance with glaucoma protocols.
    • We speak with Sherrill Jones, a patient with glaucoma, about why compliance was a challenge for her. We hear from Christina Senechal about her journey with diabetes, how hard compliance can be, and how new technologies can help, and from Dr. Carmen Pal about how closed-loop drug delivery systems that pair glucose monitoring with insulin delivery help patients stay in compliance.
    Tweetables:
    “There is opportunity and you just have to kind of think outside of the box and figure out what new ways … to think or do things, and what new needs are that need to be met. And that's how you survive.” — Christine Ha, chef and restaurateur, Xin Chao and The Blind Goat, Houston
    “Glaucoma patients … are among the most vulnerable. … They have been asked to stay out of the doctor’s office. … But … they have an eye disease, which, if the pressure is too high, can damage the optic nerve.” — Max Ostermeier, CEO and Founder of Implandata Ophthalmic Products

    • 35 min
    Restoring Vision: Code Breaking and Optogenetics

    The Enigma machines that Germany used to encode messages during World War II were notorious for their complexity. Two Enigma experts — Dr. Tom Perera, a retired neuroscientist and founder of EnigmaMuseum.com, and Dr. Mark Baldwin, an expert on the story of Enigma machines — tell us how the Allies were able to crack the code using input-output mapping.
    The human brain is similarly complex. Until recently, no one knew the code the retina used to communicate with the brain to create sight. Our guest, Dr. Sheila Nirenberg, a neuroscientist at Weill Cornell and Principal and Founder of Bionic Sight, has used input-output mapping to crack the retina’s neural code, enabling her to recreate the electrical signals to the brain that could restore sight in people with retinal degeneration. She has created a set of goggles that convert a camera’s images into the code via pulses of light. And she relies on optogenetics, a relatively new procedure in neuroscience that helps neurons become responsive to light. In her clinical trial, Dr. Nirenberg injects the optogenetic vector into the eye, and trial participants who are completely blind, like Barry Honig, whom we speak with on this program, report being able to see light. In early studies, coupling the effects of the optogenetics with the code-enabled goggles has an even more impressive effect on patients’ vision. Dr. Nirenberg is also using her knowledge of the visual neural code to inform machine learning applications that could further support people who are blind or visually impaired. Clinical trial participants are important partners in the journey of discovery, Dr. Nirenberg says. Barry Honig agrees. He was happy to participate to help ease the burden on future children diagnosed with eye diseases that would otherwise result in blindness, but thanks to these advancements, someday may not.
     
    The Big Takeaways:
    • Dr. Tom Perera and Dr. Mark Baldwin describe the history and workings of the Enigma machine, the complex encoding device that gave Germany the upper hand at the start of World War II, a war in which communications were sent wirelessly, elevating the need for encryption. They then describe the Polish and British efforts to break Enigma, including standard decryption and Alan Turing’s Bombe machine.
    • Like Enigma, the human brain is incredibly complex, and much of the code that makes it run has not yet been deciphered. Our guest, Dr. Sheila Nirenberg, conducted extensive input-output mapping on live retinas. She was able to keep them alive in a dish outside the body for a few hours, during which time she showed them videos. As the retina perceived the films, Dr. Nirenberg mapped the electrical current that pulsed through the ganglion cells. In this way, she learned how the eye sees and deciphered the code that allows our brains to perceive images, a code honed by evolution over millennia.
    • Having cracked the retinal neural code, Dr. Nirenberg held the key to restoring vision in people who are blind from retinal degeneration. She developed goggles embedded with a camera to convert the visual world into the retina’s neural code using pulses of light, but she still had to get those pulses into an unseeing eye.
    • Optogenetics is the key to creating light perception. A relatively new procedure in neuroscience, it uses a genetically modified virus carrying a light-responsive gene from algae, which, when injected into living cells, recombines its DNA with that of the host cells. In Dr. Nirenberg’s case, she injects the optogenetic vector into the patient’s retina. Most patients report the restoration of light perception to varying degrees with the optogenetics alone; coupled with the goggles, the effect on patients’ vision is even more impressive.
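    The input-output mapping idea that links the Enigma story to Dr. Nirenberg's work can be sketched in a few lines: record which output each input produces, and the resulting table becomes a codebook you can use without the original system. The stimuli and "spike patterns" below are invented purely for illustration.

    ```python
    # Toy sketch of input-output mapping: observe (input, output) pairs,
    # then reuse the learned map as a codebook. The stimulus names and
    # spike patterns are hypothetical, not real retinal data.

    def build_io_map(trials):
        """Record which output pattern each input stimulus produced."""
        io_map = {}
        for stimulus, response in trials:
            io_map[stimulus] = response
        return io_map

    def encode(io_map, stimulus):
        """Once the map is known, any known stimulus can be translated
        into the output code directly, without the original system."""
        return io_map[stimulus]

    # Hypothetical trials: (stimulus shown, spike pattern observed)
    trials = [
        ("bright-left", (1, 0, 1, 1)),
        ("bright-right", (0, 1, 1, 0)),
        ("dark", (0, 0, 0, 1)),
    ]
    io_map = build_io_map(trials)
    print(encode(io_map, "bright-left"))  # -> (1, 0, 1, 1)
    ```

    The same table, read in reverse, is what a codebreaker builds: given enough observed outputs, the mapping back to inputs falls out.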

    • 31 min
    Seeing with Sound: Using Audio to Activate the Brain’s Visual Cortex

    Every day, people who are blind or visually impaired use their hearing to compensate for vision loss. But when we lose our vision, can we access our visual cortex via other senses? We call this ability for the brain to change its activity “plasticity,” and brain plasticity is an area of active research. In this episode, we’ll explore how, through sensory substitution, audio feedback can, in some cases, stimulate a user’s visual cortex, allowing a user to — without sight — achieve something close to visual perception.
    Erik Weihenmayer — world-class mountain climber, kayaker, and founder of No Barriers, who lost his vision as a teenager due to retinoschisis — brings us to the summit of Everest by describing what it sounds like. He explains how his hearing helps him navigate his amazing outdoor adventures safely. We also speak with Peter Meijer, the creator of The vOICe, an experimental technology that converts visual information into sound and has been shown to activate users’ visual cortices, especially as users train on the technology and master interpreting the audio feedback. We hear an example of what users of The vOICe hear when it translates a visual image of scissors into audio. Erik Weihenmayer shares his experience with Brainport, a similar sensory substitution technology featured in our episode “Training the Brain: Sensory Substitution.” While research is ongoing in the areas of sensory substitution and brain plasticity, it’s encouraging that some users of The vOICe report that the experience is like seeing. In the spirit of Erik Weihenmayer, one user even uses it to surf.
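    The general principle behind an image-to-sound technology like The vOICe (scan the image column by column, map vertical position to pitch and brightness to loudness) can be sketched as follows. The frequency range and scan scheme here are illustrative assumptions, not The vOICe's actual parameters.

    ```python
    # Minimal sketch of image-to-sound sensory substitution: each column
    # of a grayscale image becomes a chord, where row position sets pitch
    # (top = high) and pixel brightness sets amplitude. Frequencies are
    # invented for illustration.

    def image_to_tones(image, f_low=500.0, f_high=5000.0):
        """image: list of rows (top row first), brightness 0.0-1.0.
        Returns one list of (frequency_hz, amplitude) pairs per column,
        scanned left to right."""
        n_rows = len(image)
        columns = []
        for col in range(len(image[0])):
            tones = []
            for row in range(n_rows):
                frac = 1.0 - row / max(n_rows - 1, 1)  # top -> 1.0
                freq = f_low + frac * (f_high - f_low)
                amp = image[row][col]  # brighter pixel -> louder tone
                if amp > 0:
                    tones.append((round(freq, 1), amp))
            columns.append(tones)
        return columns

    # A tiny 3x3 "image": a bright diagonal line, heard as a falling pitch
    img = [
        [1.0, 0.0, 0.0],
        [0.0, 1.0, 0.0],
        [0.0, 0.0, 1.0],
    ]
    print(image_to_tones(img))
    ```

    A diagonal line thus sweeps from a high tone to a low one as the scan moves left to right, which is the kind of regularity trained users learn to interpret.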
     
    The Big Takeaways:
    • Erik Weihenmayer, despite having lost his vision as a teenager, has become a world-class adventurer. He summited Everest in 2001 and went on to summit the highest peak on each continent. He has also kayaked 277 miles of whitewater rapids on the Colorado River through the Grand Canyon. He explains how his sense of hearing, along with his other senses, technologies, teams, and systems, helps him achieve his goal of living a life with no barriers.
    • Dutch inventor Peter Meijer developed a technology called The vOICe, which converts a two-dimensional image from a camera into audio feedback. Dr. Roberts interviews Dr. Meijer about this technology and gives listeners a chance to hear what The vOICe sounds like. Users who train on the system learn to interpret the sounds to make sense of the original visual image. Research on The vOICe shows that this happens in the brain’s visual cortex. While some users say the experience is more auditory than visual, others report the experience as akin to sight.
    • The vOICe relies on the principles of sensory substitution established by the field’s founder, Paul Bach-y-Rita. We discussed sensory substitution in our episode “Training the Brain: Sensory Substitution,” which featured the Brainport device by WICAB. Erik has used Brainport, and in this episode he describes how it allowed him to catch a ball rolling across a table, an exciting feat for someone who is blind. He adds that sensory substitution takes serious practice to master.
    • The vOICe is still experimental, and more research on sensory substitution is needed. However, neuroscientists studying The vOICe have shown that it stimulates the visual cortex, and some users report visual results. One user recently reported using the technology to surf.
    Tweetables:
    “When there’s a lack of things that the sound bounces off of, like on a summit, the sound vibrations just move out through space infinitely and that’s a really beautiful awe-inspiring sound.” — Erik Weihenmayer, No Barriers
    “She rolled this white tennis ball across. It lit up perfectly. [...] I’m like, ‘Holy cow, that is a tennis ball rolling towards me.’” — Erik Weihenmayer

    • 30 min
    Beyond Self Driving Cars: Technologies for Autonomous Human Navigation

    Today’s big idea is about exciting and emerging technologies that will someday allow people who are blind or visually impaired to navigate fully autonomously. In this episode, you will meet Jason Eichenholz, the Co-Founder and CTO of Luminar, and his manufacturing engineer, Nico Gentry. Luminar’s LIDAR technology is instrumental to the development of self-driving cars, but this same technology could be useful for people who are blind or visually impaired, who also have to navigate autonomously. You’ll hear from Thomas Panek, the President and CEO of Guiding Eyes for the Blind and an avid runner who dreamed of running on his own. He took this unmet need to a Google hackathon, and Ryan Burke, Creative Producer at Google Creative Lab, put together a team to develop a solution that became Project Guideline. Kevin Yoo, Co-Founder of WearWorks Technology, is using inclusive design to develop Wayband, a navigation wristband that communicates directions to users via haptics.
     
    The Big Takeaways:
    • Because LIDAR uses a shorter wavelength than radar or sonar, it produces a far more detailed image, and unlike a camera it also measures the distance to each element in the landscape, making it well suited to self-driving cars. As LIDAR sensors have gotten better and cheaper for self-driving cars, they have also become available for technologies that help people who are blind and visually impaired.
    • Luminar’s Jason Eichenholz and his engineer Nico Gentry, who is visually impaired, dive deep into the broad benefits of LIDAR for self-driving cars and for autonomously navigating people.
    • As an avid runner who is visually impaired, Thomas Panek, President and CEO of Guiding Eyes for the Blind, decided to take matters into his own hands and enlist Google to help build a tool that would allow him to run without a guide, human or canine. Ryan Burke weighs in on how the prototype, Project Guideline, helps people like Thomas run safely.
    • We can’t talk about running safely without talking about GPS. Kevin Yoo of WearWorks Technology has developed a wearable band called Wayband that connects to GPS maps to help pedestrians navigate different paths and terrain more accurately. He is also developing a haptic language that will let users understand nuanced directions without the need for visual or audio feedback.
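    At its core, LIDAR ranging is a time-of-flight calculation: a laser pulse travels to the target and back, and the distance is half the round trip at the speed of light. A minimal sketch, with an illustrative return time:

    ```python
    # Time-of-flight ranging, the principle behind LIDAR distance
    # measurement. The 667 ns example delay is illustrative.

    C = 299_792_458.0  # speed of light in a vacuum, m/s

    def distance_from_round_trip(t_seconds):
        """Distance to the target given the pulse's round-trip time."""
        return C * t_seconds / 2.0

    # A return after roughly 667 nanoseconds puts the target near 100 m:
    print(distance_from_round_trip(667e-9))  # about 99.98 m
    ```

    Repeating this measurement across millions of laser pulses per second, each aimed in a slightly different direction, is what builds the camera-like 3D point cloud described above.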
    Tweetables:
    “The big difference of LIDAR technology over sonar or radar is the wavelength of light. So because the wavelength of light is so much shorter, you're able to get much higher spatial resolution. [...] So what you're able to do is to have [...] camera-like spatial resolution with radar-like range; you're getting the best of both worlds.” — Jason Eichenholz, on LIDAR technology
    “The learning curve to be able to run as fast as my legs could carry was being able to train to those beeping sounds and fine-tuning those sounds with the Google engineering team.” — Thomas Panek
    “It's a compass; it's a vibration compass. And literally, as you rotate, [...] we can literally guide you down the line of a curvy road by creating this Pac-Man-like effect. So what we call these dew points. So as soon as you collect the dew point, it will guide you to the next one.” — Kevin Yoo
    Contact Us:
    Contact us at podcasts@lighthouseguild.org with your innovative new technology ideas for people with vision loss.  
    Pertinent Links:
    Lighthouse Guild
    Jason Eichenholz
    Thomas Panek
    Ryan Burke
    Kevin Yoo

    • 25 min
    Batman Technology: Using Sonar for Human Navigation

    Today’s big idea is sonar, and a somewhat similar technology called LiDAR! Can we use the latest sonar technology for obstacle detection the way bats and other nocturnal creatures do? Exciting advances in sonar sensors now make this possible for people who are blind. However, unlike bats, we won’t need to receive the feedback signals through our ears: advances in haptic technologies and languages make communication through touch possible. Dr. Cal Roberts talks with Dr. Matina Kalcounis-Rueppell of the College of Natural and Applied Science at the University of Alberta, Ben Eynon and Diego Roel of Strap Technologies, Marco Trujillo of Sunu, and Sam Seavey of The Blind Life YouTube channel to find out more.
     
    The Big Takeaways:
    • How does a bat see what it sees? Dr. Kalcounis-Rueppell studies bats and how they use sound to thrive in their nighttime world. Bats use a series of echoes to build a 3D view of their environment, but their world isn’t always so simple: there is rain, there are leaves, and there are other flying creatures that bats need to detect with their sonar. Similarly, people with vision impairment have to use their hearing to navigate complex auditory environments.
    • Strap Technologies uses sonar and LiDAR sensors, strapped across the chest, to help people who are blind detect obstacles. These kinds of sensors have been used to park spacecraft, but recent developments have finally made them small enough for a human to wear in a compact way. Ben and Diego share how it works. Unlike sonar, LiDAR uses pulsed laser light instead of sound waves.
    • Though bats have been honing their echolocation skills for millennia, interpreting information haptically rather than sonically is an adaptation that humans, using technologies like Strap, can make. Haptic information can help us navigate without sight through vibrations, which is great news because it leaves our ears open to process our active world. Ben and Diego suggest that people may one day no longer need a cane to detect obstacles.
    • Ben and Diego are excited about the future. They hope to create haptic feedback quick enough that people who are blind can one day ride a bike or run a race. Infrared or radiation sensors could be added later to detect other hazards in the environment, and the more user feedback they receive, the easier it will be to add those enhancements.
    • Another way to approximate sight is through echolocation. But how easy is it for us to hear echoes, really? For Marco at Sunu, it’s a natural skill we can learn to develop, and the process of learning echolocation could be improved by wearing a Sunu Band.
    • Sam Seavey was diagnosed at age 11 with Stargardt’s disease. He uses his voice and video skills to run a YouTube review channel for people who need assistive tech, and the positive feedback from the community keeps him going. Sam has personally reviewed the Sunu Band, and you can find a link to his review in the show notes!
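    The obstacle-detection pipeline described above (sonar echo in, vibration out) can be sketched as two small steps: estimate distance from the echo's round-trip delay at the speed of sound, then map nearer obstacles to stronger vibration. The four-meter range threshold is an invented assumption, not Strap's specification.

    ```python
    # Sketch of sonar-to-haptics obstacle detection: an ultrasonic ping
    # goes out, its echo comes back, and the delay gives distance; the
    # vibration motor then runs harder the closer the obstacle is.

    SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

    def echo_distance(delay_s):
        """Obstacle distance from the echo's round-trip delay."""
        return SPEED_OF_SOUND * delay_s / 2.0

    def vibration_level(distance_m, max_range_m=4.0):
        """Haptic intensity from 0.0 (nothing in range) to 1.0
        (obstacle very close). max_range_m is a made-up cutoff."""
        if distance_m >= max_range_m:
            return 0.0
        return 1.0 - distance_m / max_range_m

    d = echo_distance(0.01)  # a 10 ms echo -> obstacle at 1.715 m
    print(d, vibration_level(d))
    ```

    This is also why sonar suits close-range wearables while LiDAR suits vehicles: sound covers meters in milliseconds, while light covers kilometers in the same time.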
    Tweetables:
    “They parked spacecraft with these same sensors, and recent developments have really pushed the miniaturization of the components, such that a human being can now wear them in a very compact form factor.” — Ben Eynon
    “He said, ‘I’m walking faster than I have in a long, long time,’ because he started to trust that the haptic vibrations were telling him every obstacle in the way.” — Ben Eynon shares the reaction from a user who is visually impaired testing Strap
    “We're changing our environment around us in ways that also change the acoustic environment.” — Dr. Matina Kalcounis-Rueppell

    • 23 min

Customer Reviews

5.0 out of 5
28 Ratings

Geond,

Eye opening podcast content

This podcast is literally and figuratively an eye-opening experience. Dr. Cal Roberts, with the aid of Irisvision's founder and other professionals, talks us through the biology and neurology of vision loss and low vision conditions. Then we are treated to new, hi-tech inventions that build on virtual reality technology to bring a new world of enhanced visual reality to those who have lost varying degrees of visual ability.

Dr. Cal makes us see (pun intended) that the main goal of these technologies is to bring those who have become socially isolated by their loss of vision back to socialization (the pandemic notwithstanding) with their family and friends. I recommend this podcast highly to all manner of health professionals who are invested in the well-being of their patients, low vision or otherwise.

Krazyc306,

Great content! A must listen

I love this podcast and can’t wait for what’s to come!

