7 episodes

Make your data product or decision support app indispensable. Brian reveals the strategies and activities that product, data science and analytics leaders are using to deliver valuable experiences around data. From traditional analytics to ML and AI, Brian and his guests explore how extraordinary value can be created when UX and human-centered design are applied to data science and analytics. Great for data scientists, analytics leaders, CDOs, CAOs, product managers, and designers. Use the hashtag #ExperiencingData. Transcripts available at: https://designingforanalytics.com/podcast-subscribe

Experiencing Data with Brian T. O'Neill
Brian T. O'Neill from Designing for Analytics

    • Business
    • 4.9, 13 Ratings

    040 – Improving Potato Chips and Space Travel: NASA’s Steve Rader on Open Innovation

    Innovation doesn’t just happen out of thin air. It requires a conscious effort and team-wide collaboration.

    At the same time, innovation will be critical for NASA if the organization hopes to remain competitive and successful in the coming years. Enter Steve Rader. Steve has spent the last 31 years at NASA, working in a variety of roles including flight control under the legendary Gene Kranz, software development, and communications architecture. A few years ago, Steve was named Deputy Director for the Center of Excellence for Collaborative Innovation. As Deputy Director, Steve is spearheading the use of open innovation, as well as diversity thinking. In doing so, Steve is helping the organization find more effective ways of approaching and solving problems.

    In this fascinating discussion, Steve and Brian discuss design, divergent thinking, and open innovation plus:

    Why Steve decided to shift away from hands-on engineering and management to the emerging field of open innovation, and why NASA needs this as well as diversity in order to remain competitive.
    The challenge of convincing leadership that diversity of thought matters, and why the idea of innovation often receives pushback.
    How NASA is starting to make room for diversity of thought, and leveraging open innovation to solve challenges and bring new ideas forward.
    Examples of how experts from unrelated fields help discover breakthroughs to complex (and greasy) problems, such as potato chips!
    How the rate of technological change is different today, why innovation is more important than ever, and how crowdsourcing can help streamline problem solving.
    Steve’s thoughts on the type of leader that’s needed to drive diversity at scale, and why that person should be a generalist
    Prioritizing outcomes over outputs, defining problems, and determining what success looks like early on in a project.
    The metrics a team can use to measure whether one is “doing innovation.”

    Resources and Links
    Designingforanalytics.com/theseminar

    Steve Rader’s LinkedIn: https://www.linkedin.com/in/steve-rader-92b7754/

    NASA Solve: nasa.gov/solve

    Steve Rader’s Twitter: https://twitter.com/SteveRader

    NASA Solve Twitter: https://twitter.com/NASAsolve

    Quotes from Today’s Episode
    “The big benefit you get from open innovation is that it brings diversity into the equation […] and forms this collaborative effort that is actually really, really effective.” – Steve

    “When you start talking about innovation, the first thing that almost everyone does is what I call the innovation eye-roll. Because management always likes to bring up that we’re innovative or we need innovation. And it just sounds so hand-wavy, like you say. And in a lot of organizations, it gets lots of lip service, but almost no funding, almost no support. In most organizations, including NASA, you’re trying to get something out the door that pays the bills. Ours isn’t to pay the bills, but it’s to make Congress happy. And, when you’re doing that, that is a really hard, rough space for innovation.” – Steve

    “We’ve run challenges where we’re trying to improve a solar flare algorithm, and we’

    • 1 hr 1 min
    039 – How PEX Fingerprinted 20 Billion Audio and Video Files and Turned It Into a Product to Help Musicians, Artists and Creators Monetize their Work

    Every now and then, I like to insert a music-and-data episode into the show since hey, I’m a musician, and I’m the host

    • 44 min
    038 – (Special Co-Hosted Episode) Brian and Mark Bailey Discuss 10 New Design and UX Considerations for Creating ML and AI-Driven Products and Applications

    Mark Bailey is a leading UX researcher and designer, and host of the Design for AI podcast — a
    program which, similar to Experiencing Data, explores the strategies and considerations around designing data-driven human-centered applications built with machine learning and AI.
    In this episode of Experiencing Data — co-released with the podcast Design for AI — Brian and Mark share the host and guest role, and discuss 10 different UX concepts teams may need to consider when approaching ML-driven data products and AI applications. A great discussion on design and #MLUX ensued, covering:

    Recognizing the barrier of trust and adoption that exists with ML, particularly at non-digital native companies, and how to address it when designing solutions.
    Why designers need to dig beyond surface level knowledge of ML, and develop a comprehensive understanding of the space
    How companies attempt to “separate reality from the movies” with AI and ML, deploying creative strategies to build trust with end users (with specific examples from Apple and Tesla)
    Designing for “undesirable results” (how to gracefully handle the UX when a model produces unexpected predictions)
    The ongoing dance of balancing UX with organizational goals and engineering milestones
    What designers and solution creators need to be planning for and anticipating with AI products and applications
    Accessibility considerations with AI products and applications – and how it can be improved
    Mark’s approach to ethics and community as part of the design process.
    The importance of systems design thinking when collecting data and designing models
    The different model types and deployment considerations that affect a solution’s UX — and what solution designers need to know to stay ahead
    Collaborating, and visualizing — or storyboarding — with developers, to help understand data transformation and improve model design
    The role that designers can play in developing model transparency (i.e. interpretability and explainable AI)
    Thinking about pain points or problems that can be outfitted with decision support or intelligence to make an experience better

    Resources and Links:
    Designing for AI Podcast
    Designing for AI
    Experiencing Data – Episode 35
    Designing for Analytics Seminar
    Seeing Theory
    Measuring U
    Contact Brian
    @DesignforAI
    Quotes from Today’s Episode
    “There’s not always going to be a software application that is the output of a machine learning model or something like that. So, to me, designers need to be thinking about decision support as being the desired outcome, whatever that may be.” – Brian
    “… There are [about] 30 to 40 different types of machine learning models that are the most popular ones right now. Knowing what each one of them is good for, as the designer, really helps to conform the machine learning to the problem instead of vice versa.” – Mark
    “You can be technically right and effectively wrong. All the math part [may be] right, but it can be ineffective if the human adoption piece wasn’t really factored into the

    • 1 hr 1 min
    037 – A VC Perspective on AI and Building New Businesses Using Machine Intelligence featuring Rob May of PJC

    Rob May is a general partner at PJC, a leading venture capital firm. He was previously CEO of Talla, a platform for AI and automation, as well as co-founder and CEO of Backupify. Rob is an angel investor who has invested in numerous companies, and author of InsideAI, which is said to be one of the most widely read AI newsletters on the planet.
    In this episode, Rob and I discuss AI from a VC perspective. We look into the current state of AI, service as a software, and what Rob looks for in his startup investments and portfolio companies. We also investigate why so many companies are struggling to push their AI projects forward to completion, and how this can be improved. Finally, we outline some important things that founders can do to make products based on machine intelligence (machine learning) attractive to investors.
    In our chat, we covered:

    The emergence of “service as a software,” which can be understood as a logical extension of “software eating the world,” and the two hard things to get right (Yes, you read that correctly; Rob will explain what this new SAAS acronym means!)
    How automation can enable workers to complete tasks more efficiently and focus on bigger problems machines aren’t as good at solving
    Why AI will become ubiquitous in business—but not for 10-15 years
    Rob’s Predict, Automate, and Classify (PAC) framework for deploying AI for business value, and how it can help achieve maximum economic impact
    Economic and societal considerations that people should be thinking about when developing AI – and what we aren’t ready for yet as a society
    Dealing with biases and stereotypes in data, and the ethical issues they can create when training models
    How using synthetic data in certain situations can improve AI models and facilitate usage of the technology
    Concepts product managers of AI and ML solutions should be thinking about
    Training, UX and classification issues when designing experiences around AI
    The importance of model-market fit: whether a model satisfies a market demand, and whether it will actually make a difference after being deployed.

    Resources and Links:
    Email Rob@pjc.vc
    PJC
    Talla
    SmartBid
    The PAC Framework for Deploying AI
    Twitter: @robmay 
    Sign up for Rob’s Newsletter
    Quotes from Today’s Episode
    “[Service as a software] is a logical extension of software eating the world. Software eats industry after industry, and now it’s eating industries using machine learning that are primarily human labor focused.” — Rob
    “It doesn’t have to be all digital. You could also think about it in terms of restaurant automation, and some of those things where if you keep the interface the same to the customer—the service you’re providing—you strip it out, and everything behind that, if it’s digital it’s an algorithm and if it’s physical, then you use a robot.” — Rob, on service as a software.
    “[When designing for] AI you really want to find some way to convey to the user that the tool is getting smarter and learning.”— Rob
    “There’s a gap right now between the business use cases of AI and the places it’s getting adopted in organizations.” — Rob
    “The reason that AI’s so interesting is because what you e

    • 48 min
    034 – ML & UX: To Augment or Automate? Plus, Rating Overall Analytics Efficacy with Eric Siegel, Ph.D.

    Eric Siegel, Ph.D. is founder of the Predictive Analytics World and Deep Learning World conference series, executive editor of “The Predictive Analytics Times,” and author of “Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie, or Die.” A former Columbia University professor and host of the Dr. Data Show web series, Siegel is a renowned speaker and educator who has been commissioned for more than 100 keynote addresses across multiple industries. Eric is best known for making the “how” and “why” of predictive analytics (aka machine learning) understandable and captivating to his audiences.
    In our chat, we covered:

    The value of defining business outcomes and end users’ needs prior to starting the technical work of predictive modeling, algorithms, or software design.
    The idea of data prototypes being used before engaging in data science to determine where models could potentially fail—saving time while improving your odds of success.
    The first and most important step of Eric’s five-step analytics deployment plan
    Getting multiple people aligned and coordinated about pragmatic considerations and practical constraints surrounding ML project deployment.
    The score (1-10) Eric gave the data community on its ability to turn data into value
    The difference between decision support and decision automation and what the Central Intelligence Agency’s CDAO thinks about these two methods for using machine learning.
    Understanding how human decisions are informed by quantitative predictions from predictive models, and what’s required to deliver information in a way that aligns with users’ needs.
    How Eric likes to bring agility to machine learning by deploying and scaling models incrementally to mitigate risk
    Where the analytics field currently stands in its overall ability to generate value in the last mile.

    Resources and Links:
    Machine Learning Week
    #experiencingdata
    PredictiveAnalyticsWorld.com
    ThePredictionBook.com
    Dr. Data Show
    Twitter: @predictanalytic
    Quotes from Today’s Episode
    “The greatest pitfall that hinders analytics is not to properly plan for its deployment.” — Brian, quoting Eric
    “You don’t jump to number crunching. You start [by asking], ‘Hey, how is this thing going to actually improve business?’ “ — Eric
    “You can do some preliminary number crunching, but don’t greenlight, trigger, and go ahead with the whole machine learning project until you’ve planned accordingly, and iterated. It’s a collaborative effort to design, target, define scope, and ultimately greenlight and execute on a full-scale machine learning project.” — Eric
    “If you’re listening to this interview, it’s your responsibility.” — Eric, commenting on whose job it is to define the business objective of a project.
    “Yeah, so in terms of if 10 were the highest potential [score], in the sort of ideal world where it was really being used to its fullest potential, I don’t know, I guess I would give us a score of [listen to find out!]. Is that what Tom [Davenport] gave!?” — Eric, when asked to rate the analytics community on its ability to deliver value with data
    “We really need to get past our outputs, and the things that we make, the artifacts and those types of software, whatever it may be, and really

    • 35 min
    026 - Why Tom Davenport Gives a 2 out of 10 Score To the Data Science and Analytics Industry for Value Creation

    Tom Davenport has literally written the book on analytics. Actually, several of them, to be precise. Over the course of his career, Tom has established himself as the authority on analytics and how its role in the modern organization has evolved in recent years. Tom is a distinguished professor at Babson College, a research fellow at the MIT Initiative on the Digital Economy, and a senior advisor at Deloitte Analytics. The discussion was timely as Tom had just written an article about a financial services company that had trained its employees on human-centered design so that they could ensure any use of AI would be customer-driven and valuable. We discussed their journey and:

    Why on a scale of 1-10, the field of analytics has only gone from a one to about a two in ten years’ time
    Why so few analytics projects actually make it into production
    Examples of companies who are using design to turn data into useful applications of AI, decision support and product improvements for customers
    Why shadow IT shouldn’t be a bad word
    AI moonshot projects vs. MVPs and how they relate
    Why journey mapping is incredibly useful and important in analytics and data science work
    How human-centered design and ethnography are the tough work required to turn data into decision support
    Tom’s new book and his thoughts on the future of data science and analytics

    Resources and Links:

    Website:  Tomdavenport.com
    LinkedIn:  Tom Davenport
    Twitter: @tdav
    Designingforanalytics.com/seminar
    Designingforanalytics.com

    Quotes from Today’s Episode
    “If you survey organizations and ask them, ‘Does your company have a data-driven culture?’ they almost always say no. Surveys even show a kind of negative movement over recent years in that regard. And it's because nobody really addresses that issue. They only address the technology side.” — Tom
    “Eventually, I think some fraction of [AI and analytics solutions] get used and are moderately effective, but there is not nearly enough focus on this. A lot of analytics people think their job is to create models, and whether anybody uses it or not is not their responsibility... We don't have enough people who make it their jobs to do that sort of thing.” — Tom
    “I think we need this new specialist, like a data ethnographer, who could sort of understand much more how people interact with data and applications, and how many ways they get screwed up.” — Tom
    “I don't know how you inculcate it or teach it in schools, but I think we all need curiosity about how technology can make us work more effectively. It clearly takes some investment, and time, and effort to do it.” — Tom
    “TD Wealth’s goal was to get [its employees] to experientially understand what data, analytics, technology, and AI are all about, and then to think a lot about how it related to their customers. So they had a lot of time spent with customers, understanding what their needs were to make that match with AI. [...] Most organizations only address the technology and the data sides, so I thought this was very refreshing.” — Tom
    “So we all want to do stuff with data. But as you know, there are a lot of poor solutions that get provided from technical people back to business stakeholders. Sometimes they fall on dea

    • 47 min

Customer Reviews

4.9 out of 5
13 Ratings

enceladusdata ,

Entertaining and informative content!

As a data analyst and consumer of many data-related podcasts, I found Brian’s show to be top notch, as it’s very informative and insightful. If you enjoy designing and building great data products that meet the needs of your business users, then you should check out this show!

JKinghorn ,

Insights in how to think about analytics projects

Many of the blog posts, podcasts, etc. are focused on the technical "how" behind analytics and visualization (e.g. which library or tool to use, streaming or batch). This podcast focuses more on the critical link between the business strategy, user experience and use case for analytics projects and the technical underpinnings. For anyone looking at making their data products more successful, these discussions are tremendously valuable.
