The Path to Stronger Open Source Security Processes



The open source community has benefited from an explosion of new security practices in recent years, but Emily Fox says the true measure of a security practice is how easy it is for maintainers and developers to implement it. “Security should not be yet another job for a developer. It should be something that they don’t even have to think about because security engineers have done such an excellent job of making it a first-class citizen within the ecosystem.” 

In her journey from creative director at an entertainment company to security lead at Red Hat, Fox has learned a lot about making security a natural part of the development process, and she shares those lessons on today’s episode of the Open at Intel podcast. Fox also shares her thoughts on the role AI plays in open source and creating inclusive spaces. 

Listen to the full episode here. This conversation has been edited and condensed for brevity and clarity. 

Fox’s Journey Into Open Source Security

Katherine Druckman: You wear a lot of hats. Will you tell us about them—both your current hat and the other community hats you wear?  

Emily Fox: My current hat is red. I’m the security lead in emerging technologies at Red Hat. I’m also our security community architect in the open source program office (OSPO), in the office of the CTO for Red Hat. Some of the areas that we’re working on right now include refining Sigstore within software supply chains and working on remote attestation of nodes so that we can inject cryptographic identities and bring them all the way from the node up into the workload.  

Outside of Red Hat, I’m also in open source. Primarily, I’m the Technical Oversight Committee (TOC) chair of the Cloud Native Computing Foundation (CNCF). In addition to that, I am the TOC liaison for the TAG Contributor Strategy, a technical advisory group, as well as TAG Environmental Sustainability. Prior to all of these things, I used to be the cochair for KubeCon + CloudNativeCon for Detroit, Amsterdam, and Valencia.  

Katherine Druckman: How did you get into the security field in the first place?  

Emily Fox: I have not always been in technology. Prior to joining the tech ecosystem, I was a creative director for an entertainment company, so I did wedding planning, event planning, set and prop design and made costumes. But eventually, I got tired of working extremely long hours and needed a change. I bounced around from a few different jobs and was fortunate enough to get an opportunity with a Department of Defense contractor. After a six-month rotation, I became a government civilian, and through a lot of different organizational changes, ultimately ended up as the DevOps security lead and the developer security lead for the National Security Agency (NSA).  

Shifting and Expanding Left

Katherine Druckman: Given your positions, you have a unique view of the entire landscape. What are the greatest challenges that you see open source maintainers have in keeping their projects secure? 

Emily Fox: There's a wealth of information out there, but it's hard to discover. While the security ecosystem is great and fantastic, it can do a better job of making it easier for software engineers and the open source community to adopt and practice security. We still have a long way to go. Projects like Sigstore are a step in the right direction, but we can always do more. Security should not be yet another job for a developer. It should be something that they don't even have to think about because security engineers have done such an excellent job of making it a first-class citizen within the ecosystem, within the infrastructure, and within CI/CD systems. It should be clear what the expectations are: When somebody goes to submit a pull request, we run the CI/CD tests against it to make sure it's meeting all the security goals and objectives, and we should be upfront about what those are so that contributors aren't surprised if the build fails or if we kick it out of a pipeline.  

Katherine Druckman: What are some hurdles to making it easier? I love the idea of the ecosystem shifting left and addressing security early on. 

Emily Fox: That is a challenge because when we shift left, we’re moving all of the security practices to where developers are. That’s part of the problem. We’re overwhelming them with, “You’ve fixed this common vulnerability and exposure (CVE). Now there’s another one.” We really should be expanding left. When we’re talking about security holistically for software development, or even organizations that are building software and releasing it or deploying it in their environments, we need to think about it as a sandwich. We need to ensure that we’re empowering developers to know what the security expectations are and then on the other end catching anything that came through.  

Katherine Druckman: As we shift left, what advice do you have for developers to expand their security expertise? 

Emily Fox: One is to be open to it. I’ve met with many maintainers and contributors to open source projects, both within cloud native and outside of the ecosystem, and we need to be mindful that there is more than just the cloud native community out there that’s impacted by some of our security practices. There are some old school open source developers who are not necessarily anti-security, but they firmly believe in, “It’s just open source. There are no guarantees associated with it.” But there are a lot of developers who want to do the right thing. We need to ensure that we’re hearing them, listening to both sides of that discussion, and figuring out how we can get security injected into open source projects—or at least put the guardrails and the gates up before they’re being consumed by other open source projects so that they can be successful in their response.  

I was talking with a community member today who's on the security team for an open source project, and they were lamenting the fact that it's so difficult to manage CVEs, particularly ones that have a high impact and might be under embargo. Right now, organizations and cloud service providers have an excellent opportunity. They're on pre-embargo lists, so they get notifications about critical vulnerabilities before they blow up in the ecosystem. Our open source project maintainers don't have the same opportunity unless the vulnerability originates from their project itself.  

The Hard Conversation Around AI

Katherine Druckman: I have to ask because everyone’s excited about it. Do you have any thoughts about the role AI can play in securing software?  

Emily Fox: It’s been a hot topic this week. I want to take it back a little bit to pre-AI days. When there’s new technical innovation that comes out, we’re usually years behind applying security practices to it. Until the DevOps movement really started to take off, there were a bunch of security professionals in the community who were like, hey, wait a minute. This is going to cause a lot of problems if we allow it to happen. But rather than saying no, they said, “Yes, but how do we make it work?” We’re seeing that with AI as well. Just as automation practices and injecting security scanning into those automation pipelines allowed DevSecOps to take place, we’re starting to have similar conversations with AI.  

Where advancements in software supply chain security allowed us to develop the muscle memory and the reflexes on how we secure software supply chains, we’re now starting to apply those principles and practices to AI systems. How do we ensure that AI models are built with the integrity mechanisms that we expect? How do we know whether the response that’s coming back from a generative AI system is a hallucination? But there’s a lot more to it than just that; there’s also predictive AI. That’s where I see the industry heading as we start looking at how we enable large language models (LLMs) or AI workloads in our infrastructure. It’s more about understanding their special needs so that we can ensure our projects and our products are capable of adapting to them. 

Katherine Druckman: Are you optimistic about all the work being poured into this arena? 

Emily Fox: Yes and no. I have some pessimistic optimism on the topic. We’re in this interesting hype cycle, but we’ve not had the adult conversation of: Just because we can, should we? Even if what we’re doing in AI is critical to the success of technology, no one’s countering that argument with: Do we understand how the processor utilization associated with these workloads is going to impact us? I’ve had a lot of these discussions with folks in TAG Environmental Sustainability as well as CNCF’s AI/ML working group. We’re asking, how do we responsibly allow these workloads to run in our environments while balancing the CPU needs of our organizations so that we can continue to be technical leaders in this space? 

Believing in Each Other’s Potential

Katherine Druckman: What else are you excited about in the open source world? 

Emily Fox: I was in a governing board town hall panel yesterday, and a group of Black engineers who joined the panel session very astutely asked the hard question, “What are you doing to increase Black voices within the cloud native and technical ecosystem?” They’re not the only ones who have been asking that question. TAG Contributor Strategy set up the Deaf & Hard of Hearing Working Group this year. We need to do better in diversity, inclusion, and accessibility within our ecosystem. We’ve made a lot of strides—the cloud native ecosystem is probably one of the most welcoming and diverse. But there are community members who still don’t bring consideration for their peers and what they might be experiencing. In a way, that’s damaging to those who are still trying to break in. For an ecosystem that needs its maintainers, contributors, and leaders, we need to do a better job of supporting them. 

Keeping the Developer Experience Simple

Katherine Druckman: Do you have any thoughts on how to best address DevOps security needs, the needs of the software itself, and automation vs. the needs of the people working with it? 

Emily Fox: It’s very difficult for a new contributor in open source to onboard to some of these projects, particularly older projects with extremely complex code bases. I feel that’s something that we should be bringing into the conversation. This is cool tech, but how do we make it easier? And how do we decrease the barrier to entry for somebody to onboard into it? How do we ensure that we’re understanding what their goals and objectives are and what they’re trying to use this for? When I’m close to a project, I get it, but nobody else is going to be that close. I need to take a step back and consider, from the outside looking in, what the experience is for people. 

To hear more of this conversation and others, subscribe to the Open at Intel podcast: 

About the Author

Katherine Druckman, Open Source Evangelist, Intel 

Katherine Druckman, an Intel open source evangelist, hosts the podcasts Open at Intel, Reality 2.0, and FLOSS Weekly. A security and privacy advocate, software engineer, and former digital director of Linux Journal, she’s a longtime champion of open source and open standards.  

Emily Fox, Security Lead in Emerging Technologies, Red Hat 

Emily Fox is a DevOps enthusiast, security unicorn, and advocate for women in technology. She promotes the cross-pollination of development and security practices. She has worked in security for over 13 years to drive a cultural change where security is unobstructive, natural, and accessible to everyone. Her technical interests include containerization, least privilege, automation, and promoting women in technology. She holds a BS in information systems and an MS in cybersecurity. Serving as chair on the Cloud Native Computing Foundation’s (CNCF) Technical Oversight Committee (TOC) and cochair for KubeCon + CloudNativeCon China 2021, Europe 2022, North America 2022, Europe 2023, and CloudNativeSecurityCon 2023, she is involved in a variety of open source communities and activities.