438: Product ideas in the real-world metaverse – with David Rose
Product Mastery Now for Product Managers, Leaders, and Innovators

What product managers need to know about augmented reality

Today we are talking about augmented reality and what product managers and leaders need to know about this rapidly changing field that is becoming part of many digital transformation programs. Our guest has created several products using augmented reality, including a phone-based vision test at Warby Parker, the Neiman Marcus digital mirror that makes trying on and selecting clothes easier, the Salesforce conversational balance table, and much more.

His name is David Rose, and he’s an MIT lecturer, an author, and a serial entrepreneur who offers a unique perspective on the next platform of spatial computing—what he calls SuperSight. This is also the title of his latest book, SuperSight: What Augmented Reality Means for Our Lives, Our Work, and the Way We Imagine the Future.

Summary of some concepts discussed for product managers

[2:13] What is the real-world metaverse?

I’m trying to highlight the difference between virtual reality (VR) and augmented reality (AR). A lot of people, when they think of metaverses, think about Roblox, Minecraft, multiplayer games, etc. Those are all virtual environments where you are sealed off and the real world is obscured from your vision.

By the real-world metaverse, I mean laying information over the existing architecture, city, and water of the places that we go in order to make those places easier to navigate or imagine how they might change in the future. I’m talking about taking all of the internet and spatially anchoring it in the real world.

[3:57] You have created several products related to this real-world metaverse. Tell us about how you get ideas for products.

For me, good ideas for projects or innovation come from a confluence of three things. The first is a user need or an insight about people. I’ll give an example from a boating application: I’m a boater, and I’m regularly disoriented out on the water. That’s the user need component.

The second thing is a technological maturity component. In the case of the boating application, it’s computer vision. Computer vision can now identify things in front of your boat.

The third thing is the viability of the business idea or a way to scale. For me, this usually comes from meeting a go-to-market partner who could help commercialize the technology.

[9:14] Can you tell us more about the boating product example?

I was speaking about my new book at a healthcare conference. We had shown how we could see through the human body for surgery planning. A fisherman approached me afterward and said he wanted to be able to see through the water to avoid hazards and see where to fish. Underwater maps existed, but he wanted to be able to see without using his hands.

I wondered if we could spatially anchor the underwater maps in glasses so you could see the terrain underwater as if you’re in a glass-bottomed boat.

We started off using Unity, a 3D game engine. We got the maps from ArcGIS, Esri’s geographic information system platform. We made a mockup using glasses called Nreal, which are now sold at Verizon stores.
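As a rough illustration of what spatially anchoring the underwater maps can look like in code, here is a minimal Python sketch, not the team’s actual implementation, that turns a small grid of charted depths into a boat-relative seafloor mesh of the kind a 3D engine such as Unity could render for the glass-bottomed-boat effect. The depth values, grid spacing, and function names are assumptions made up for the example.

```python
import numpy as np

def bathymetry_to_mesh(depths_m, cell_size_m, boat_row, boat_col):
    """Convert a 2D grid of water depths (meters below the surface) into
    vertex positions in a boat-relative frame: x = meters east of the boat,
    y = meters up (so seafloor points are negative), z = meters north of
    the boat. A 3D engine could render these vertices as a seafloor mesh
    that stays spatially anchored beneath the vessel."""
    rows, cols = depths_m.shape
    vertices = np.zeros((rows * cols, 3), dtype=np.float32)
    for r in range(rows):
        for c in range(cols):
            x = (c - boat_col) * cell_size_m   # east of the boat
            z = (r - boat_row) * cell_size_m   # north of the boat
            y = -float(depths_m[r, c])         # below the water surface
            vertices[r * cols + c] = (x, y, z)

    # Two triangles per grid cell: the standard heightmap triangulation
    # a game engine expects for a terrain mesh.
    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            triangles += [i, i + cols, i + 1,
                          i + 1, i + cols, i + cols + 1]
    return vertices, np.array(triangles, dtype=np.int32)

if __name__ == "__main__":
    # Hypothetical 4 x 4 grid of depth samples (meters), 10 m apart,
    # with the boat sitting over grid cell (1, 1).
    depths = np.array([[3.0, 3.5, 4.2, 6.0],
                       [2.8, 3.1, 5.0, 7.5],
                       [2.5, 2.9, 4.8, 8.0],
                       [2.0, 2.6, 4.0, 9.1]])
    verts, tris = bathymetry_to_mesh(depths, cell_size_m=10.0,
                                     boat_row=1, boat_col=1)
    print(verts.shape, tris.shape)  # (16, 3) (54,)
```

In a real prototype the depth grid would come from the charting data, and the mesh would be re-anchored as the boat’s GPS position and heading change.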

We started prototyping the magic moment of seeing through the water. It was pretty compelling, but the more we used it, the more we realized that the glasses really weren’t the way to scale this and get it to market.

We talked to a company called Freedom Boat Club that has a boat membership model for new boaters who don’t have a boat or for people who are going to boat in a new place. We figured out that the best way to image the world around you was to put a camera on top of the boat. We built a 360-degree camera system that sits on top of the boat and uses computer vision to see everything around you that might be a hazard, and then it mixes that with cartography, so that the screen that’s sitting next to...
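The episode doesn’t go into implementation detail, but the step of mixing computer vision with cartography can be pictured as a simple fusion of two hazard sources into one boat-relative list for the helm screen. The Python sketch below is a hedged illustration under assumed data shapes and hazard labels, with a flat-earth distance approximation; it is not the product’s actual pipeline.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius; fine at helm-display scale

def chart_hazard_to_relative(boat_lat, boat_lon, hazard_lat, hazard_lon):
    """Convert a charted hazard's lat/lon into (bearing_deg, range_m) from
    the boat, using an equirectangular approximation that is adequate over
    the short distances a helm display cares about."""
    d_lat = math.radians(hazard_lat - boat_lat)
    d_lon = math.radians(hazard_lon - boat_lon) * math.cos(math.radians(boat_lat))
    north_m = d_lat * EARTH_RADIUS_M
    east_m = d_lon * EARTH_RADIUS_M
    range_m = math.hypot(north_m, east_m)
    bearing_deg = math.degrees(math.atan2(east_m, north_m)) % 360
    return bearing_deg, range_m

def fuse_hazards(camera_detections, charted_hazards, boat_lat, boat_lon):
    """Merge live camera detections with charted hazards into one list of
    (label, bearing_deg, range_m, source) records, closest first, so the
    screen can prioritize what to draw."""
    fused = [(label, bearing, rng, "camera")
             for label, bearing, rng in camera_detections]
    for label, lat, lon in charted_hazards:
        bearing, rng = chart_hazard_to_relative(boat_lat, boat_lon, lat, lon)
        fused.append((label, bearing, rng, "chart"))
    return sorted(fused, key=lambda hazard: hazard[2])

if __name__ == "__main__":
    # Hypothetical inputs: two objects the vision system spotted (label,
    # bearing in degrees, range in meters) and one rock ledge from the chart.
    detections = [("buoy", 15.0, 120.0), ("kayak", 250.0, 45.0)]
    chart = [("rock ledge", 41.4540, -70.5995)]
    for hazard in fuse_hazards(detections, chart,
                               boat_lat=41.4530, boat_lon=-70.6000):
        print(hazard)
```

A production system would also have to track hazards over time and reconcile a camera detection with the charted object it corresponds to; the sketch only shows the coordinate-frame merge.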
