Why empathy-driven design and security must go hand in hand. On this episode, we have Asi Guiang, Piolo Justin Cabigao, Kayne Rodrigo, and Ted Mathew Dela Cruz joining us to discuss empathy in innovation and why building secure tech requires a human-centric approach.

Technology is meant to serve people, but what happens when it makes them vulnerable? In this episode, we're exploring the critical connection between empathy and cybersecurity. We'll discuss why understanding a user's fears and needs is the key to building secure and ethical tech. Our guests will share how a human-centric approach to design can protect people from online threats and build trust in the digital world.

How does empathy help you anticipate user vulnerabilities that security protocols might miss? (Generalization)

Empathy helps anticipate user vulnerabilities by forcing you to see the product through the eyes of the person using it, not just the code. It allows you to understand their real-world context, common stressors, and behavioral patterns. For example, a security protocol might enforce a complex password, but empathy recognizes that a tired user will write it down or reuse a similar one. By considering the "human element" (a lack of specialized knowledge, potential for distraction, or the motivation to take shortcuts), empathy reveals vulnerabilities that purely technical audits would overlook, leading to more practical and effective security solutions.

Can you give an example of a product that failed because it lacked empathy in its security design? (Generalization)

A common example is a two-factor authentication (2FA) system that is slow, difficult, or constantly interrupts the user's workflow. While technically secure, a system that lacks empathy for the user's time and convenience often fails at adoption: users disable the feature, fall back to the least secure option (like SMS), or become so frustrated that they avoid the secure system altogether. The failure isn't technical; it's a failure of adoption caused by prioritizing technical rigidity over a smooth user experience, and it ultimately leaves the user vulnerable.

What's one practical step developers can take to include empathy in their security practices? (Generalization)

One practical step is to adopt "persona-based threat modeling." Instead of only modeling threats from sophisticated malicious actors, developers create personas for their actual users (e.g., a time-crunched manager, a non-technical senior) and model threats that arise from user mistakes and common vulnerabilities. This involves asking, "How might this person accidentally expose data?" The approach shifts the focus from purely stopping hackers to leaving fewer opportunities for user error, making the security inherently more resilient and user-friendly. (A short code sketch of this idea appears at the end of these notes.)

How can we train the next generation of tech professionals to prioritize both innovation and user safety? (Generalization)

We can train the next generation by integrating ethics and user-centric security into the core curriculum rather than treating them as add-on courses. Every project, from the start, should include mandatory security and usability reviews. Creating interdisciplinary teams of designers, developers, and security experts during academic and early-career projects helps them learn to speak the same language. This teaches them that security and empathy are not blockers to innovation but foundational requirements for building trustworthy and sustainable technology.
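To make the "persona-based threat modeling" step concrete, here is a minimal sketch in Python. The Persona and UserErrorThreat classes, the trait names, and the sample threats are illustrative assumptions for this sketch only, not a framework described in the episode; the idea is simply to match each persona's traits against the conditions under which a well-meaning mistake becomes likely.

```python
# Minimal, illustrative sketch of persona-based threat modeling.
# All persona names, fields, and threat entries are hypothetical examples.
from dataclasses import dataclass


@dataclass
class Persona:
    """A representative user, described by traits that shape likely mistakes."""
    name: str
    technical_skill: str      # "low", "medium", or "high"
    time_pressure: str        # "low", "medium", or "high"
    shares_devices: bool = False


@dataclass
class UserErrorThreat:
    """A way a well-meaning user might accidentally expose data."""
    description: str
    triggered_by: dict        # persona trait values that make the mistake likely
    mitigation: str


PERSONAS = [
    Persona("Time-crunched manager", technical_skill="medium", time_pressure="high"),
    Persona("Non-technical senior", technical_skill="low", time_pressure="low",
            shares_devices=True),
]

THREATS = [
    UserErrorThreat(
        description="Reuses or writes down a complex mandated password",
        triggered_by={"time_pressure": "high"},
        mitigation="Support passphrases and password-manager integration",
    ),
    UserErrorThreat(
        description="Disables 2FA because prompts interrupt the workflow",
        triggered_by={"time_pressure": "high"},
        mitigation="Offer low-friction factors such as passkeys or push approval",
    ),
    UserErrorThreat(
        description="Approves a phishing prompt on a shared device",
        triggered_by={"technical_skill": "low", "shares_devices": True},
        mitigation="Add plain-language warnings and per-session re-authentication",
    ),
]


def model_threats(persona: Persona, threats: list[UserErrorThreat]) -> list[UserErrorThreat]:
    """Return the threats whose trigger conditions match this persona's traits."""
    return [
        threat for threat in threats
        if all(getattr(persona, trait) == value
               for trait, value in threat.triggered_by.items())
    ]


if __name__ == "__main__":
    for persona in PERSONAS:
        print(f"\n{persona.name}:")
        for threat in model_threats(persona, THREATS):
            print(f"  - Risk: {threat.description}")
            print(f"    Mitigation: {threat.mitigation}")
```

Run as a script, the sketch lists, for each persona, the user-error risks that fit their traits and a candidate mitigation, which is the core of asking "How might this person accidentally expose data?" before any attacker-focused modeling begins.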