83 episodes

Trusted CI is the NSF Cybersecurity Center of Excellence. The mission of Trusted CI is to lead in the development of an NSF Cybersecurity Ecosystem with the workforce, knowledge, processes, and cyberinfrastructure that enables trustworthy science and NSF’s vision of a nation that is a global leader in research and innovation. More information can be found at trustedci.org.

Trusted CI podcast

    • Technology

    April 2024: SPHERE - Security and Privacy Heterogeneous Environment for Reproducible Experimentation

    Cybersecurity and privacy threats increasingly impact our daily lives, our national infrastructures, and our industry. Recent newsworthy attacks targeted nationally important infrastructure, our government, our researchers, and research facilities. The landscape of what needs to be protected and from what threats is rapidly evolving as new technologies are released and threat actors improve their capabilities through experience and close collaboration. Meanwhile, defenders often work in isolation, use private data and facilities, and produce defenses that are quickly outpaced by new threats. To transform cybersecurity and privacy research into a highly integrated, community-wide effort, researchers need a common, rich, representative research infrastructure that meets the needs across all members of the community, and facilitates reproducible science.

    To meet these needs, USC Information Sciences Institute and Northeastern University have been funded by the NSF mid-scale research infrastructure program to build Security and Privacy Heterogeneous Environment for Reproducible Experimentation (SPHERE). This infrastructure will offer access to an unprecedented variety of hardware, software, and other resources connected by user-configurable network substrate, and protected by a set of security policies uniquely aligned with cybersecurity and privacy research needs. SPHERE will offer six user portals, closely aligned with needs of different user groups. It will support reproducible research through a combination of infrastructure services (easy experiment packaging, sharing and reuse) and community engagement activities (development of realistic experimentation environments and contribution of high-quality research artifacts).


    Speaker Bios:

    Dr. Jelena Mirkovic is Principal Scientist at USC-ISI and Research Associate Professor at USC. She received her MS and PhD from UCLA, and her BSc from the University of Belgrade, Serbia. Jelena's research interests span the fields of networking and cybersecurity, as well as testbed experimentation. Her current research is focused on authentication, use of machine learning for network attack detection, large-scale dataset labeling for security, and user privacy. She is the lead PI on the SPHERE project.

    Mr. David Balenson is Senior Supervising Computer Scientist and Associate Director of the Networking and Cybersecurity Division at USC-ISI. He received his MS and BS in Computer Science from the University of Maryland. His current research interests include cybersecurity and privacy for critical infrastructure and cyber-physical systems including automotive and autonomous vehicles, experimentation and test, technology transition, and multidisciplinary research. He is the Community Outreach Director for SPHERE.

    • 38 min.
    March 2024: Lessons from the ACCORD Project

    The ACCORD cyberinfrastructure project at the University of Virginia (UVA) successfully developed and deployed a community infrastructure providing access to secure research computing resources for users at underserved, minority-serving, and non-PhD-granting institutions. ACCORD's operational model is built around balancing data protection with accessibility. In addition to providing secure research computing resources and services, key outcomes of ACCORD include creation of a set of policies that enable researchers external to UVA to access and use ACCORD. While the ACCORD expedition achieved its technical and operational goals, its broader mission of broadening access to underserved users had limited success. Toward gaining a better understanding of the barriers to researchers accessing ACCORD, our team carried out two community outreach efforts to engage with researchers and computing service leaders to hear their pain points as well as solicit their input for an accessible community infrastructure.

    In this talk, we will describe the ACCORD infrastructure and its operational model. We will also discuss insights from our effort to develop policies to balance accessibility with security. And finally, we will share lessons learned from community outreach efforts to understand institutional and social barriers to access.

    Speaker Bios:

    Ron Hutchins: In the early 1980s, Ron worked at the Georgia Institute of Technology, creating a networking laboratory in the College of Computing and teaching data communications courses there. After moving to the role of Director of Campus Networks in 1991, Ron founded and led the Southern Crossroads network aggregation (SoX) across the Southeast. In 2001, after receiving his PhD in computer networks, he took on the role of Chief Technology Officer for the campus. In August of 2015, Ron moved into the role of Vice President of Information Technology for the University of Virginia, working to build partnerships across the campus. Recently, Ron has moved from VP to research faculty in the Computer Science department at UVA and is participating broadly across networking and research computing in general, including work with the State of California building out the broadband fiber network backbone across the state.

    Tho Nguyen is a computer science and policy expert. He served as project manager for the ACCORD effort from 2019-2021, and continues to support the project's implementation and growth. Nguyen is currently a Senior Program Officer at the National Academies of Sciences, Engineering, and Medicine. From 2015-2021 Nguyen was on the research staff in the Department of Computer Science at the University of Virginia, where he worked on compute-in-memory and developing HPCs for research. Prior to UVA, he was an AAAS Science and Technology Policy Fellow at the National Science Foundation, where he worked primarily on the Cyber Physical Systems program. Nguyen holds a PhD in Systems & Controls (Electrical Engineering) from the University of Washington.

    • 56 min.
    December 2023: Open Science Chain

    The envisioned advantage of sharing research data lies in its potential for reuse. Although many scientific disciplines are embracing data sharing, some face constraints on the data they can share and with whom. It becomes crucial to establish a secure method that efficiently facilitates sharing and verification of data and metadata while upholding privacy restrictions to enable the reuse of scientific data. This presentation highlights our NSF-funded Open Science Chain (OSC) project, accessible at https://www.opensciencechain.org. Developed using blockchain technologies, the OSC project aims to address challenges related to the integrity and provenance of research artifacts. The project establishes an API-based data integrity verification management service for data-driven research platforms and hubs, aiming to minimize data information loss and provide support for managing diverse metadata standards and access controls.
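    The integrity-chain idea behind such a blockchain-based verification service can be sketched in a few lines: each record binds an artifact's content hash and its metadata to the previous record's hash, so any later tampering breaks every subsequent link. This is a simplified illustration of the general technique, not OSC's actual implementation or API:

    ```python
    import hashlib
    import json

    def artifact_record(prev_hash, metadata, data_bytes):
        # Bind the artifact's content hash and metadata to the previous record.
        record = {
            "prev": prev_hash,
            "data_sha256": hashlib.sha256(data_bytes).hexdigest(),
            "metadata": metadata,
        }
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        return record

    def verify_chain(records):
        # Walk the chain from a fixed genesis value, recomputing each hash.
        prev = "0" * 64
        for rec in records:
            body = {k: v for k, v in rec.items() if k != "hash"}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if rec["prev"] != prev or recomputed != rec["hash"]:
                return False
            prev = rec["hash"]
        return True

    r1 = artifact_record("0" * 64, {"title": "dataset-v1"}, b"raw data v1")
    r2 = artifact_record(r1["hash"], {"title": "dataset-v2"}, b"raw data v2")
    chain = [r1, r2]
    assert verify_chain(chain)
    chain[0]["metadata"]["title"] = "tampered"   # any edit invalidates the chain
    assert not verify_chain(chain)
    ```

    Because verification needs only hashes and metadata, a service built this way can attest to data integrity without ever holding the (possibly restricted) data itself.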

    Speaker Bio:
    Subhashini Sivagnanam is the manager of the Cyberinfrastructure Services and Solutions (CISS) group at the San Diego Supercomputer Center/UCSD. Her research interests predominantly lie in distributed computing, cyberinfrastructure development, scientific data management, and reproducible science. She serves as the PI/Co-PI on various NSF/NIH projects related to scientific data integrity and developing cyberinfrastructure software. Furthermore, she oversees the management of UC San Diego’s campus research cluster known as the Triton Shared Computing Cluster.

    • 26 min.
    September 2023: Improving the Privacy and Security of Data for Wastewater-based Epidemiology

    As the use of wastewater for public health surveillance continues to expand, inevitably sample collection will move from centralized wastewater treatment plants to sample collection points within the sewer collection system to isolate individual neighborhoods and communities. Collecting data at this geospatial resolution will help identify variation in select biomarkers within neighborhoods, ultimately making the wastewater-derived data more actionable. However, a challenge in achieving this is the nature of the wastewater collection system, which aggregates and commingles wastewater from various municipalities. Thus, various stakeholders from different cities must collectively provide information to separate wastewater catchments to achieve neighborhood-specific public health information. Data sharing restrictions and the need for anonymity complicate this process.

    This talk presents our approaches to enabling data privacy in wastewater-based epidemiology. Our methodology is built upon a cryptographic technique, Homomorphic Encryption (HE), ensuring privacy. Additionally, we outline a technique to enhance the performance of HE, which could be of independent interest.
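    The property HE provides here is that parties can contribute encrypted values that an aggregator combines without ever seeing individual inputs. The talk does not specify which HE scheme is used; as a sketch for intuition only, here is a toy Paillier cryptosystem, whose additive homomorphism (multiplying ciphertexts adds plaintexts) supports exactly this kind of private aggregation. The tiny primes are for demonstration and are wildly insecure:

    ```python
    import math
    import random

    def keygen(p, q):
        # Toy key generation from two small primes (real use: ~2048-bit primes).
        n = p * q
        lam = math.lcm(p - 1, q - 1)
        g = n + 1                    # standard choice of generator
        mu = pow(lam, -1, n)         # with g = n+1, L(g^lam mod n^2) = lam
        return (n, g), (lam, mu)

    def encrypt(pub, m):
        n, g = pub
        r = random.randrange(1, n)
        while math.gcd(r, n) != 1:   # r must be invertible mod n
            r = random.randrange(1, n)
        return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

    def decrypt(pub, priv, c):
        n, _ = pub
        lam, mu = priv
        ell = (pow(c, lam, n * n) - 1) // n   # L(x) = (x - 1) / n
        return (ell * mu) % n

    pub, priv = keygen(1117, 1223)
    # Two parties encrypt their counts; the aggregator multiplies ciphertexts.
    c_total = (encrypt(pub, 17) * encrypt(pub, 25)) % (pub[0] ** 2)
    assert decrypt(pub, priv, c_total) == 42   # sum recovered, inputs never seen
    ```

    In the wastewater setting, each municipality could encrypt its catchment-level measurements, and only the aggregate over a neighborhood would ever be decrypted.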

    Speaker Bio:
    Ni Trieu is currently an Assistant Professor at Arizona State University (ASU). Her research interests lie in the area of cryptography and security, with a specific focus on secure computation and its applications such as private set intersection, private database queries, and privacy-preserving machine learning. Prior to joining ASU, she was a postdoc at UC Berkeley. She received her Ph.D. degree from Oregon State University.

    • 56 min.
    • video
    August 2023: Leveraging Adaptive Framework for Open Source Data Access Solutions

    More than a decade ago, Clemson University outlined the requirements needed to integrate several campus-wide enterprise applications in a way that would automate the exchange of data between them, and establish the relationships of that data to the unique identities that represented all users within the system, including faculty, staff, students, alumni and applicants. There would be no direct access of data, except through applications that were approved and had established Memorandum of Understanding (MOU) contracts in place. This project was known as the Clemson Vault.

    Within the Identity Management space, solutions for automating the provisioning of identities are offered by several vendors these days. However, mileage and cost vary when you wish to integrate arbitrary university resources, such as mailing lists, disk storage, building card access, and course registrations. Open source solutions, with all of the above requirements, are non-existent.
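    The core pattern such provisioning systems automate is reconciliation: diff a source of record against each target resource (mailing list, storage quota, card-access group) and emit add/remove actions. A deliberately minimal sketch of that pattern (hypothetical illustration, not Adaptive Framework's or any vendor's API):

    ```python
    def reconcile(source_members: set[str], target_members: set[str]) -> dict:
        # Actions needed to make the target match the source of record.
        return {
            "add": sorted(source_members - target_members),
            "remove": sorted(target_members - source_members),
        }

    # Source of record says alice/bob/carol belong; the mailing list has bob/dave.
    actions = reconcile({"alice", "bob", "carol"}, {"bob", "dave"})
    assert actions == {"add": ["alice", "carol"], "remove": ["dave"]}
    ```

    The hard parts in practice — which the blurb alludes to — are not this diff but the per-resource connectors, authorization rules, and MOU-governed policies wrapped around it.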

    At Clemson University, we combined licensed vendor software and in-house apps, scripts, and procedures to create a data integration solution that met the original requirements. This implementation has served us well for many years, but drawbacks in the current design prompted us to begin pulling many of these features out into their own project, where we could collaborate on features and enhancements for the future with institutions outside our own organization. The patterns, interfaces, and source code that emerged from the original vault were extracted, embellished, and migrated into an open source repository known as Adaptive Framework (https://github.com/afw-org/afw).

    Clemson University has been working on this project for several years now, and has recently released this open source framework for building data access solutions that provide web service APIs, data transformation tools, real-time data provisioning, and an authorization architecture. The framework that has emerged offers a built-in scripting language, pre-compiled server-side applications, and an administrative web interface.

    Although it was originally designed for the implementation of an open source identity vault, we envision a broader adoption of this framework for other data-driven needs, such as extending databases with metadata, building policy-based authorization systems, and integrating data repositories with a metadata catalog and varying levels of access control across federated environments.

    Our goal with this project is to gather external support from both commercial and public institutions to help make this framework sustainable moving forward.

    Speaker Bio:
    Jeremy Grieshop is a software engineer (B.S. Miami University, M.S. Clemson University) and has been employed by Clemson University since 2001. His role has been in software development for the Identity Management team and has been directly involved in the software design and implementation of many of the authentication and provisioning software, along with self service tools that are in place at Clemson University today.

    • 46 min.
    July 2023: The Technical Landscape of Ransomware: Threat Models and Defense Models

    Ransomware has become a global problem. Given the reality that ransomware will eventually strike your system, we focus on recovery and not on prevention. The assumption is that the attacker did enter the system and rendered it inoperative to some extent.

    We start by presenting the broad landscape of how ransomware can affect a computer system, suggesting how the IT manager, system designer, and operator might prepare to recover from such an attack.

    We show the ways in which ransomware can (and sometimes cannot) attack each component of the system. For each attack scenario, we describe how the system might be subverted, the ransom act, the impact on operations, the difficulty of accomplishing the attack, the cost to recover, the ease of detection of the attack, and the frequency with which the attack is found in the wild (if at all). We also describe strategies that could be used to recover from these attacks.

    Some of the ransomware scenarios that we describe reflect attacks that are common and well understood. Many of these scenarios have active attacks in the wild. Other scenarios are less common and do not appear to have any active attacks. In many ways, these less common scenarios are the most interesting ones as they pose an opportunity to build defenses ahead of attacks.
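    On "ease of detection": one widely used heuristic is that mass file encryption drives byte-level entropy toward the 8-bits-per-byte maximum, while ordinary documents sit much lower. A minimal sketch of that heuristic (our illustration of a common defender technique, not necessarily a method from the report):

    ```python
    import math
    import os
    from collections import Counter

    def shannon_entropy(data: bytes) -> float:
        # Bits per byte: ~8.0 for encrypted/compressed data, ~4-5 for English text.
        if not data:
            return 0.0
        n = len(data)
        return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

    plaintext = b"The quick brown fox jumps over the lazy dog. " * 50
    ciphertext_like = os.urandom(2048)   # stand-in for an encrypted file

    assert shannon_entropy(plaintext) < 5.0       # readable text: low entropy
    assert shannon_entropy(ciphertext_like) > 7.5  # random bytes: near maximum
    ```

    A monitor that flags files whose entropy jumps sharply after a write can catch in-place encryption early, though compressed formats (zip, jpeg) are legitimately high-entropy and need allow-listing.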

    The Ransomware Report discussed during the presentation is here:

    https://hdl.handle.net/2142/118242

    And, the latest version of our Guide to Securing Scientific Software, is here: 

    https://hdl.handle.net/2142/118240 

    Speaker Bios:

    Barton Miller is the Vilas Distinguished Achievement Professor and the Amar & Belinder Sohi Professor in Computer Sciences at the University of Wisconsin-Madison. He is a co-PI on the Trusted CI NSF Cybersecurity Center of Excellence, where he leads the software assurance effort, and leads the Paradyn Tools project, which is investigating performance and instrumentation technologies for parallel and distributed applications and systems. His research interests include software security, in-depth vulnerability assessment, binary and malicious code analysis and instrumentation, extreme scale systems, and parallel and distributed program measurement and debugging. In 1988, Miller founded the field of fuzz random software testing, which is the foundation of many security and software engineering disciplines. In 1992, Miller (working with his then-student Prof. Jeffrey Hollingsworth) founded the field of dynamic binary code instrumentation and coined the term “dynamic instrumentation”. Miller is a Fellow of the ACM and a recent recipient of the Jean-Claude Laprie Award for dependable computing.

    Miller was the chair of the Institute for Defense Analysis Center for Computing Sciences Program Review Committee, member of the U.S. National Nuclear Safety Administration Los Alamos and Lawrence Livermore National Labs Cyber Security Review Committee (POFMR), member of the Los Alamos National Laboratory Computing, Communications and Networking Division Review Committee, and has been on the U.S. Secret Service Electronic Crimes Task Force (Chicago Area).

    Elisa Heymann is a Senior Scientist on Trusted CI, the NSF Cybersecurity Center of Excellence, at the University of Wisconsin-Madison, and an Associate Professor at the Autonomous University of Barcelona. She co-directs the MIST software vulnerability assessment group at the Autonomous University of Barcelona, Spain.

    She coordinates in-depth vulnerability assessments for NSF Trusted CI, was also in charge of the Grid/Cloud security group at the UAB, and participated in two major European Grid projects: EGI-InSPIRE and the European Middleware Initiative (EMI). Heymann's research interests include software security and resource management for Grid and Cloud environments. Her research is supported by the NSF, the Spanish government, the European Commission, and NATO.

    • 57 min.
