38 episodes

A podcast where Kevin Libuit and Andrew Page share thoughts on the latest advancements in Bioinformatics.

The Bioinformatics Lab

    • Science

    Ep 35: Bioinformatics in the Cloud

    Summary

    In this episode, Kevin and Andrew discuss cloud computing in the context of bioinformatics. They explore the different layers of cloud technology and how they can be implemented in laboratories. They highlight the benefits of cloud computing, such as quick access to resources, cost savings, and scalability, as well as the challenges of managing costs and ensuring security in a cloud environment. Overall, they emphasize the importance of understanding the specific needs of bioinformatics and choosing the right cloud infrastructure.

    Takeaways

    Cloud computing offers quick access to resources and cost savings compared to on-premises infrastructure.
    The specific needs of bioinformatics, such as high volume and different hardware requirements, should be considered when setting up a cloud infrastructure.
    Managing costs and ensuring security are important challenges in a cloud environment.
    Cloud computing provides scalability and flexibility to meet the evolving needs of bioinformatics.

    • 19 min
    Ep 34: PHA4GE with Dr. Alan Christoffels

    PHA4GE Website: https://pha4ge.org/

    Summary

    Alan Christoffels, the principal investigator for the Public Health Alliance for Genomic Epidemiology (PHA4GE), discusses the challenges and future directions of the consortium. He shares his background as a scientist and director of a program leading international collaborations in pathogen genomics. The conversation highlights the importance of global community building and the need to connect laboratories across borders to address common challenges in public health. PHA4GE's impact on the response to the COVID-19 pandemic is also discussed, along with the value of creating a community of practice and providing opportunities for professionals to contribute. Alan invites individuals from various backgrounds to get involved in PHA4GE and contribute to its mission.

    Takeaways

    PHA4GE is a global consortium that aims to connect laboratories across borders to address common challenges in public health.
    Alan Christoffels emphasizes the importance of global community building and the need to involve individuals from various backgrounds in PHA4GE.
    PHA4GE's impact on the COVID-19 pandemic response highlights the value of creating a community of practice and providing opportunities for professionals to contribute.
    Alan encourages individuals to get involved in PHA4GE and contribute to its mission.

    • 46 min
    Ep 33: Communicating Results in Public Health

    Summary

    In this episode, Kevin Libuit and Andrew Page discuss the challenges of communicating complex genomic information to different audiences in public health. They explore the need for tailored communication strategies for various stakeholders, including bioinformatics scientists, epidemiologists, and the general public. The power of analogies in simplifying complex concepts is highlighted, along with the importance of building bridges between different technical fields. The conversation also emphasizes the need to communicate the limits and nuances of genomic data, and the role of genetic relatedness as a proxy for epidemiological associations. Overall, the episode underscores the ongoing need for effective communication in the field of genomics and public health.

    Takeaways

    Tailored communication strategies are necessary when communicating complex genomic information to different audiences in public health.
    Analogies can be powerful tools for simplifying complex concepts and making them relatable to a wide range of stakeholders.
    Building bridges between different technical fields, such as bioinformatics and epidemiology, is crucial for effective communication.
    Communicating the limits and nuances of genomic data is essential to avoid misinterpretation and ensure proper understanding.

    • 14 min
    Ep 32: Best Practices - Pipeline Development, Part Three

    Summary

    This episode of the Bioinformatics Lab Podcast continues the conversation on public health pipeline best practices, focusing on pipeline functionality, documentation for local installation and remote access, and example usage. The hosts discuss the importance of clearly articulating the function of a pipeline and survey the different pipeline systems available. They emphasize the need for documentation that covers installation and usage and provides example data. The episode concludes with a call to read the best practices document and an announcement of future efforts to promote and assess adherence to these practices.

    Takeaways

    Clearly articulate the function of a pipeline in the field of public health bioinformatics.
    Use pipeline systems like Galaxy and Terra to visually represent and manage workflows.
    Provide clear documentation for local installation and remote access of pipelines.
    Include example usage and data to facilitate understanding and testing of pipelines.

    • 15 min
    Ep 32: Best Practices - Pipeline Development, Part Two

    PHA4GE Ten Best Practices for Public Health Bioinformatics Pipelines:
    https://github.com/pha4ge/public-health-pipeline-best-practices/blob/main/docs/pipeline-best-practices.md

    Summary

    In this episode, Kevin Libuit and Andrew Page discuss the ten best practices for public health pipeline development. They start by emphasizing the use of common file formats and the importance of avoiding reinventing the wheel, highlighting the benefits of standard file formats and the availability of parsers for different languages. They also discuss the implementation of software testing, including automated testing and its integration with Docker containers. They emphasize the need for accessible benchmark or validation data sets and careful consideration of reference data requirements. They also touch on the significance of hiring bioinformaticians and the documentation practices that should be followed.

    Takeaways

    Use common file formats to avoid reinventing the wheel and enable compatibility with other programs.
    Implement software testing, including automated testing, to ensure functionality and identify bugs.
    Provide benchmark or validation data sets to allow users to compare and evaluate the performance of the pipeline.
    Consider the reference data requirements and ensure accessibility to curated databases.
    Hire bioinformaticians with domain expertise to navigate the complexities of pipeline development.
    Follow documentation practices, including communication of authorship, pipeline maintenance statements, and community guidelines for contribution and support.
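    The file-format and testing takeaways above can be illustrated with a short sketch. The `parse_fasta` function and the sample records here are hypothetical, not taken from the episode; the point is that a standard format like FASTA is simple to parse in any language and easy to cover with an automated test that can run in CI.

```python
def parse_fasta(text):
    """Parse FASTA-formatted text into a dict of {record_id: sequence}.

    FASTA is one of the common file formats discussed in the episode:
    because the format is standard, parsers exist for every language,
    and even a hand-rolled one fits in a few lines.
    """
    records = {}
    header = None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith(">"):
            header = line[1:].split()[0]  # record ID is the first token
            records[header] = []
        elif header is not None:
            records[header].append(line)
    return {h: "".join(seq) for h, seq in records.items()}


# A tiny automated test: given known input, the pipeline step must
# produce a known output. In practice this would run under pytest in CI.
def test_parse_fasta():
    example = ">seqA some description\nACGT\nACGT\n>seqB\nTTTT\n"
    assert parse_fasta(example) == {"seqA": "ACGTACGT", "seqB": "TTTT"}


if __name__ == "__main__":
    test_parse_fasta()
    print("all tests passed")
```

    Shipping a small known-answer test like this alongside example data gives users an immediate way to verify that an installed pipeline behaves as documented.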

    • 16 min
    Ep 32: Best Practices - Pipeline Development, Part One

    PHA4GE Ten Best Practices for Public Health Bioinformatics Pipelines:
    https://github.com/pha4ge/public-health-pipeline-best-practices/blob/main/docs/pipeline-best-practices.md

    Summary

    In this episode, Andrew Page and Kevin Libuit discuss best practices for public health bioinformatics pipelines. They highlight the importance of code availability, open source licensing, version control, workflow management systems, and containerized and packaged software. These practices aim to improve transparency, reproducibility, and interoperability in the field of bioinformatics.

    Takeaways

    Code availability and open source licensing are crucial for transparency and collaboration in public health bioinformatics.
    Version control allows for the tracking of software changes and facilitates collaboration.
    Workflow management systems provide standardization and interoperability in pipeline development.
    Containerized and packaged software ensures reproducibility and simplifies software installation.

    • 18 min
