Science in Parallel

Krell Institute

Science in Parallel focuses on people in computational science and their interdisciplinary research to solve energy challenges, discover new materials, model medicines and more — using high-performance computing (HPC) and artificial intelligence. Host Sarah Webb interviews researchers about their career paths and motivations. Our conversations cover topics such as integrating emerging hardware, the effects of remote work, the role of creativity in computing and foundation models in science. Our show is for curious, science-oriented listeners who like technology. You don’t need a deep background in science and computing to learn from our guests. Science in Parallel has been shortlisted for the Publisher Podcast Awards: Best Technology Podcast in 2022, Best Science and Medical Podcast in 2023, and both categories in 2024 and 2025. It is produced by the Krell Institute and is a media outreach project of the Department of Energy Computational Science Graduate Fellowship (DOE CSGF) program.

  1. 1 DAY AGO

    S6E8: Youngsoo Choi: Building Reliable Foundation Models

    Foundation models (LLMs or LLM-like tools) are a compelling idea for advancing scientific discovery and democratizing computational science. But there's a big gap between these lofty ideas and the trustworthiness of current models. Youngsoo Choi of Lawrence Livermore National Laboratory and his colleagues are thinking about how to close this chasm. They're engaging with questions such as: What are the essential characteristics that define a foundation model? And how do we make sure that scientists can rely on their results? In this conversation, we discuss a position paper that Youngsoo and his colleagues wrote to outline these questions, propose starting points for consensus-based answers, and describe the challenges in building foundation models that are robust, reliable and generalizable. That paper also describes the Data-Driven Finite Element Method, or DD-FEM, a tool that they've developed for combining the power of AI and large datasets with physics-based simulation.

    You’ll meet: Youngsoo Choi is a staff scientist at Lawrence Livermore National Laboratory (LLNL) and a member of the lab's Center for Applied Scientific Computing (CASC), which focuses on computational science research for national security problems. Youngsoo completed his Ph.D. in computational and mathematical engineering at Stanford University and carried out postdoctoral research at Stanford and Sandia National Laboratories before joining Livermore in 2017.

    31 min
  2. JUL 15

    S6E5: Amanda Randles: A Check-Engine Light for the Heart

    Duke University associate professor Amanda Randles' work to simulate and understand human blood flow and its implications demonstrates how high-performance computing paired with scientific principles can help improve human health. In this conversation, she talks about how she brought together early interests in physics, coding, biomedicine and even political science and policy, and how she followed her enthusiasm for the Human Genome Project. She discusses how supercomputers are pushing the boundaries of what researchers can learn about the circulatory system noninvasively and how that knowledge, paired with data from wearable devices, could lead to new ways to monitor and treat patients. She also talks about her public engagement and science policy work and its importance, both for educating patients and supporting computational science’s future.

    You’ll meet: Amanda Randles is the Alfred Winborne and Victoria Stover Mordecai Associate Professor of Biomedical Sciences at Duke University and director of Duke’s Center for Computational and Digital Health Innovation. Her research using high-performance computing to model the fluid dynamics of blood flow has garnered numerous awards, including one of the inaugural Sony Women in Technology Awards with Nature, the 2024 ISC Jack Dongarra Early Career Award and the 2023 ACM Prize in Computing. Amanda completed her Ph.D. at Harvard University working with Efthimios Kaxiras and Hanspeter Pfister. She was a Department of Energy Computational Science Graduate Fellowship (DOE CSGF) recipient from 2010 to 2013 and a Lawrence Fellow at Lawrence Livermore National Laboratory from 2013 to 2015. Follow Amanda on social media: LinkedIn, BlueSky and Instagram.

    30 min
5 out of 5, 3 ratings
