HCI Explained

Curious about how humans interact with technology? HCI Explained breaks down complex Human-Computer Interaction topics into clear, engaging stories. From agency and automation to design ethics and UX psychology, we explore how tech shapes what it means to be human. Whether you're a student, designer, or tech enthusiast, join us weekly to learn how interaction design influences our everyday lives.

  1. Multimodal Interaction: Blending Touch, Voice, and Beyond #HCIExplained

    21/09/2025

    Every day, you’re already engaging in multimodal interaction, whether you’re tapping on a screen, speaking to a virtual assistant, or combining gestures and voice commands in your car. But what does it really mean when technology supports more than one way of communicating? And how does it change the way we design, use, and experience our digital tools? In this episode of HCI Explained, we explore multimodal interaction: the blending of touch, voice, gesture, and other input modes to create more natural, flexible, and powerful ways of working with technology.

    🔹 What we cover in this episode:
    - The basics of multimodal interaction and why it’s a step beyond single-mode systems.
    - How combining modalities (like voice + touch, or gesture + gaze) reduces cognitive load and enhances accessibility.
    - The challenges designers face when blending modalities, including context, consistency, and system complexity.
    - Everyday examples: voice assistants, adaptive car interfaces, VR/AR environments, and smart devices.
    - Emerging frontiers: AI-driven multimodality, where systems learn and adapt to your preferred ways of interacting.

    Multimodal systems aren’t just about convenience; they represent a major shift in human-computer interaction (HCI). They allow more inclusive experiences, support diverse user needs, and create interfaces that adapt to us rather than forcing us to adapt to them. As automation and AI increasingly shape how we live and work, multimodal interaction also raises fascinating design and ethical questions:
    - How can we ensure systems remain intuitive while growing in complexity?
    - What happens to agency when the system “decides” which mode to prioritise?
    - Can multimodal interaction help close accessibility gaps, or will it widen them if not carefully designed?

    By the end of this episode, you’ll have a clearer sense of how multimodal interaction is already shaping your daily tech experiences and how it could transform the future of human-AI collaboration.

    🎧 Tune in and let’s unpack the many voices, touches, and gestures that are redefining interaction. #HCIE #MultimodalInteraction #UX #HCI

    31 min
  2. Embodied Interaction: When the Body Becomes the Interface #HCIExplained

    13/09/2025

    What if your body wasn’t just a tool to operate technology but the interface itself? From swiping on a touchscreen to moving through a VR world, your body is already shaping how you experience the digital. This is the essence of embodied interaction, an HCI perspective that sees the body not just as an input device, but as central to how we think, act, and connect with technology. In this episode of HCI Explained, we dive into the fascinating world of embodied interaction, exploring how movement, posture, and gestures allow us to interact more naturally with digital systems.

    🔎 What you’ll learn in this episode:
    - The roots of embodied interaction: how human cognition is shaped by our bodies, and why this matters for technology design.
    - Everyday examples: touchscreens that respond to swipes, fitness trackers that map movement, VR headsets that sense orientation, and gesture-based controls in gaming.
    - Beyond the basics: how embodied interaction is pushing into advanced areas like full-body tracking, somatic design (using the felt experience of the body in design), and augmented movement through wearables or robotics.
    - Theoretical underpinnings: links between embodiment, perception, and cognition that make these interactions feel natural rather than artificial.
    - Social and cultural contexts: why the meaning of a gesture or movement can shift depending on who uses it and where.
    - Future directions: from brain-computer interfaces to somatic play, embodied interaction could transform not just usability but also creativity, therapy, and social connection.

    💡 Why this matters
    Embodied interaction challenges us to rethink the boundaries between body and computer. It makes interactions more intuitive, immersive, and human-centered, while also raising important questions:
    - How do we design responsibly when tech literally moves our bodies?
    - How do we ensure inclusivity when bodies differ in ability, culture, and context?

    By the end of this episode, you’ll see that your body isn’t just using technology; it’s part of the interface itself.

    🎧 HCI Explained brings you weekly insights into the hidden principles shaping how humans and computers connect.

    45 min
  3. Cognitive Maps: Navigating Digital Information Spaces #HCIExplained

    31/08/2025

    Every time you open an app, scroll through a website, or manage a cluster of browser tabs, your brain is doing more than you realise. It’s not just clicking and scrolling; it’s building a mental map of a hidden digital landscape. These internal representations, called cognitive maps, shape how we navigate, find information, and make sense of complex digital worlds. In this episode of HCI Explained, we explore the fascinating role of cognitive maps in Human-Computer Interaction (HCI). Originally studied in psychology to explain how humans and animals navigate physical spaces, cognitive maps now help us understand how people orient themselves in vast digital environments.

    🔎 What you’ll learn in this episode:
    - How cognitive maps, once tied to navigating cities and physical spaces, now extend to digital platforms.
    - Everyday examples: feeling lost in a sea of tabs, learning the layout of a new app, or remembering where settings are in your phone’s OS.
    - How users build these maps through repetition, patterns, cultural conventions, and landmarks like icons, menus, and breadcrumbs.
    - The difference between good design (which supports mental mapping with clarity and consistency) and poor design (which leads to confusion, frustration, and abandonment).
    - How disorientation in digital spaces mirrors being lost in real ones, and why that costs time, energy, and trust.
    - Design strategies to support user maps: clear landmarks, predictable layouts, and consistency across contexts.
    - Future frontiers: how AR and VR extend physical navigation into virtual environments, how AI may serve as adaptive “guides,” and the ethical questions of whether designers should empower exploration or funnel behaviour.

    💡 Why this matters
    Cognitive maps explain why some interfaces feel intuitive while others leave us struggling. They reveal how design choices directly affect user orientation, memory, and satisfaction.

    Understanding cognitive maps can revolutionise the way we design digital spaces, making them more navigable, humane, and empowering. By the end of this episode, you’ll see that you’re not just “using” apps and websites; you’re exploring invisible landscapes your brain is constantly mapping.

    🎧 HCI Explained is your guide to uncovering the hidden principles that shape our interactions with technology.

    24 min
