Growing up in rural Vermont, I had a more isolated upbringing than I would have liked; anyone I could socialize with lived far away. That all changed my junior year, when I took a robotics class.
We had a small high school, with 300 students in total. A few younger students opted to join me, and we started a robotics team led by our teacher, Mark Chabot. I had discovered the VEX Robotics Competition and submitted a grant proposal to our high school’s Alumni Association.
We named our first robot “Cha-bot” after Mr. Chabot, and in our first competition we came in dead last. As luck would have it, we were assigned to the same table as the previous year’s world champions and learned a lot talking with them. Armed with their advice, we rebuilt our robot and came in second place at every competition for the rest of that year. We later qualified for the VEX Robotics World Championships, a rarity for a first-year team.
While Mr. Chabot answered our more complex questions, he mostly enabled us to succeed on our own. He applied for several grants to help fund our team, and even paid out of pocket while we waited for them to come through. I’m so grateful for everything Mr. Chabot did: He gave us access, ability, a classroom, and space. Then he let us learn by doing. When he left for meetings, he’d say, “Just don’t burn the place down.” He’s been running the robotics club for more than 10 years now, and it’s one of the reasons people go to Thetford Academy.
Thanks to robotics, I went to the Governor’s Institutes of Vermont, a summer camp for teens, where I met graduates of Worcester Polytechnic Institute and—much to my surprise—discovered you could go to college for robotics. My grades were a bit of a stretch, but I got in off the waitlist and had a spectacular four years. I learned so much from my peers, discovered a love of software, and graduated with a dual degree in robotics and computer science.
I’m so grateful for everything Mr. Chabot did: He gave us access, ability, a classroom, and space. Then he let us learn by doing.
Discovering how much I already knew
My first vulnerability discovery was actually an accident. It had to do with the Gradle Plugin Portal.
Apparently I wasn’t the first person to find it: Google had reported the same issue just 10 days before me. But it was my first security vulnerability. My proof of concept injected code into a victim’s Gradle build via a supply chain attack: if your build used wildcard versions, I could publish a higher version of a plugin, your build would resolve the jar I had published, and you’d end up running whatever code I wanted. Given how many companies and products rely on Gradle for their builds, it was a pretty serious vulnerability in the Gradle Plugin Portal.
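The attack pattern is easiest to see in a build file. Here’s a minimal sketch with made-up coordinates (`com.example:helper` is hypothetical, not the actual affected plugin):

```groovy
// build.gradle -- hypothetical coordinates, for illustration only.
dependencies {
    // A dynamic ("wildcard") version tells Gradle to resolve the
    // highest matching version available in the repository.
    // If an attacker can publish com.example:helper:1.999, this
    // build silently pulls in the attacker's jar and runs its code.
    implementation 'com.example:helper:1.+'
}
```

Pinning an exact version (and, ideally, verifying artifact checksums) closes this particular door.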
At that point in my career, I was just a software developer with an interest in security. As an avid listener of the Security Now podcast by Steve Gibson and Leo Laporte, I found stories of software security hacks and fixes fascinating, but didn’t think I had enough experience to get involved. I had only taken one security class in college, and while I enjoyed it, I didn’t think that could possibly be enough. At a certain point, I realized the barrier to entry for finding security vulnerabilities is actually far lower than I’d assumed. As a software engineer, I already had the knowledge.
I started by looking into the supply chain of the Java ecosystem. I found that most of the build infrastructure, in both Gradle and Maven, still used HTTP to resolve dependencies instead of HTTPS. I found this vulnerability nearly everywhere I looked, from Minecraft mods to the Kotlin compiler to the NSA and Oracle. Sonatype, the host of the Maven Central Repository, reported that 25% of its traffic was still using HTTP at the time.

I reached out to the owners and maintainers of the most critical parts of the Java ecosystem with a suggestion. Simply redirecting HTTP to HTTPS would still leave software vulnerable (an attacker intercepting the initial HTTP request can simply not follow the redirect), so I proposed that we make any build that still used HTTP break outright. That effectively forced Java developers to move to HTTPS to keep their software working. A wide swath of Java ecosystem players announced that they’d stop supporting HTTP on January 15, 2020. By that deadline, Sonatype found HTTP use had dropped to 20%. The shut-off broke a lot of Maven and Gradle builds. Later, I developed a bot that generated 1,596 pull requests to fix Maven POM files that were still vulnerable. I managed to make a major change in the way the Java ecosystem functioned. It was a formative experience, to say the least.
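For context, the class of fix those pull requests made is a tiny change in a project’s `pom.xml`. The repository coordinates below are hypothetical, but the shape of the fix is exactly this:

```xml
<!-- Before: artifacts are fetched over plain HTTP, so a
     man-in-the-middle can substitute a malicious jar. -->
<repository>
  <id>example-releases</id>
  <url>http://repo.example.com/maven2</url>
</repository>

<!-- After: the same repository, resolved over TLS. -->
<repository>
  <id>example-releases</id>
  <url>https://repo.example.com/maven2</url>
</repository>
```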
At a certain point, I realized the barrier to entry for finding security vulnerabilities is actually far lower than I’d assumed. As a software engineer, I already had the knowledge.
The famous Zoom vulnerability and a difficult loss
Around the same time Zoom went public as a $14 billion company back in 2019, I discovered that malicious users could hijack your webcam through Zoom just by getting you to visit a particular website—even if you had uninstalled Zoom. When I alerted Zoom to this vulnerability, however, they told me that the auto-webcam launch was a feature of their software, not a bug… I disagreed!
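Mechanically, the problem was that the Zoom client installed a web server listening on localhost, and any webpage the victim visited could send it commands. The endpoint and parameters below are a sketch based on the public write-up, not an exact reproduction:

```html
<!-- Sketch of the attack pattern: an invisible image request from
     any webpage could hit the local Zoom web server (which survived
     uninstalling the app) and force the visitor into a meeting with
     their camera on. Port and query parameters are illustrative. -->
<img src="http://localhost:19421/launch?action=join&confno=1234567890"
     style="display:none" />
```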
I genuinely believed this bug had the potential to wreak havoc, so I set about trying to convince Zoom to fix the issue, telling them that otherwise I would publish the vulnerability publicly. I gave them 90 days. They argued that if end-users were worried about their cameras, they could opt out by changing the default setting manually.
After several weeks of back and forth, Zoom had implemented only a partial fix, and I ended up publishing a full disclosure of the vulnerability, which went viral. I included a link to a proof of concept (POC) that demonstrated the vulnerability by launching you into a Zoom call. I spent three days on that call, meeting random strangers who tried out my exploit. On the second day after disclosure, the CEO of Zoom, Eric Yuan, joined the call and told 150 people, live, that they were now treating this as a vulnerability and were committed to fixing it. I never expected so many people to join my call, much less the CEO of Zoom himself, but it’s become one of my favorite stories (and one my partner has heard way, WAY too many times!).
We later discovered that Zoom white-labels their software, and the same vulnerability existed in 13 other products, like RingCentral. Apple stepped up and leveraged their anti-malware capabilities to silently remove the Zoom web server (as well as the white-label variants) from everyone’s Macs, patching the vulnerability fully and completely. I’ve given two talks on it, one at ShmooCon and one at BSides Connecticut. There’s also this fun version that the animator Zheng Yan put together that’s narrated entirely by a llama.
My mother thought the Zoom find was incredible. Even though she was unfamiliar with the software, she told everyone far and wide about it. It was silly, embarrassing, and wonderful, all at the same time. “Have you heard about Zoom? MY SON found a big vulnerability!” That was her. She would have found the llama video especially hilarious.
Unfortunately, my mom passed away in April 2020 and never got the opportunity to see it. It was 12 days from when we found out she had lung cancer to the day she died, and it was right at the beginning of the pandemic, so hospice and hospitals wouldn’t let us in until the very end. Strangely enough, because of social distancing, we had to look for alternative ways to bring people together for a funeral, and once again, we turned to Zoom. We held one of the very first funerals over Zoom, and a reporter from The New York Times reached out to me for an article. It was a story that I could uniquely tell. I had gone through this very personal and distinctive experience, but I was also “The Zoom Guy.”
My mom and I had a very close, complicated relationship. She taught me how to stand up for myself, and a lot about the rights and wrongs of the world. I miss her so much sometimes.
I actually went on leave from June through August of 2021 to focus on my mental health. On top of losing my mom, I’m an extrovert and love socializing, which made the pandemic really tough. I didn’t know what was wrong until a friend helped me identify my depression. I took three months off and was outside as much as possible: I went sailing, saw the Grand Canyon, and spent time with people I love. I thought I would miss my computer, but I was so happy. While I love my field, I’m glad I had the opportunity and the support I needed to take a break. Software, and especially software security, can be stressful, and I really hope that others who feel burnt out or depressed are able to take time for themselves, too.
My mom taught me how to stand up and show up for myself, and a lot about the rights and wrongs in the world. I miss her so much sometimes.
The moral dilemma
My mom always stood up for me, and I sometimes feel that I’ve taken on a similar role in software, going to bat for the end-user as a security researcher. Too often, the security community blames end-users for being uneducated or for making mistakes that get them hacked or compromised, but doesn’t look inward to see how it can improve its own systems.
I have come to realize the impact software security researchers can have, and the extent to which they can apply their skills for the greater good. Dan Kaminsky knew this, too. A famous security researcher, Dan passed away in 2021. In the security community, he had a reputation for research, kindness, and compassion. In the wider industry, he was known for a famous DNS vulnerability he found in 2008. In short, the vulnerability could have allowed anybody to hijack any website on the internet and redirect its traffic to malicious sites unfettered. He worked quietly with a host of vendors to get it fixed, and they did the impossible by working together.
In his talks at Black Hat and DEF CON, Dan spoke about how we, as engineers, should build systems that are secure by default, with proper guardrails to protect naive users. It’s from this angle that I try to communicate vulnerability impact.
There’s a big debate about vulnerability disclosure in the industry. Some argue that fully disclosing a vulnerability’s details gives bad actors the ability to turn it into a working exploit. I choose full disclosure for a variety of reasons. Primarily, disclosure promotes awareness and enables everyone to fix the issue. I could have been in a situation where I was not able to disclose the Zoom vulnerability, but because I did, Apple was able to step in and fix it. If Zoom had tried to quietly fix it without telling anyone, Apple might never have known, and it would have been a disaster.
When I find a vulnerability, I feel a personal obligation to make sure it’s fixed. I’ve been told that you can make money by disclosing vulnerability information privately or by selling it on the gray market, but that’s not how I was raised. The people I look up to in the industry are the same way. They do it because it’s the right thing to do.
I don’t necessarily always know what the right thing to do is. Before the Zoom vulnerability, I felt I didn’t have anyone in my corner. After that, I made a ton of connections that I didn’t have before, and now know who I can reach out to. I really appreciate having a community behind me.
I’ve tried to emulate some of what I see out there, and one of the biggest shining examples is Google Project Zero, and the work from Tavis Ormandy and his team. Their reasoning really resonates with me: They disclose in order to help fix the issues and protect people. It’s sort of like an obligation because it’s a serious niche that nobody else focuses on. So how can I fill that niche and be successful and make sure it gets done? What’s even more challenging is how do I get paid and continue to live a life that I love? It’s a work in progress.
How can I fill that niche and be successful and make sure it gets done? What’s even more challenging is how do I get paid and continue to live a life that I love? It’s a work in progress.
Moving the industry forward from a security perspective
When I started telling people about the security issues I found, they were surprised. Finding hidden back doors, private data that shouldn’t be accessible but is, and the potential for everyone’s information to get compromised? It can be pretty exciting stuff!
I was surprised to find this career path, and even more surprised to realize there are so few of us actively looking at open source software (OSS). I’m guessing a lot of things that get fixed in open source are the result of people accidentally finding vulnerabilities. But anybody who does bug bounty research and security research will attest to the specific high you get when you find a security vulnerability that nobody else knows about—especially ones that have a lot of impact.
For example, I found a vulnerability in this project called JUnit 4, which is a test framework used by a huge chunk of the Java ecosystem. Discovering that vulnerability led to thousands and thousands of pull requests generated by GitHub’s Dependabot. It’s cool to see work you did kickstart the machinery that not only alerts the world about this vulnerability, but also provides an easy fix for it. You can have an impact on tons of people, even with a small vulnerability disclosure.
To help guide my career, I’ve watched a lot of talks from Dan, and really wish I had met him. The DEF CON group held a funeral for him and everyone said he was a really kind person who was more interested in hearing about other people’s work than his own. That is so cool. There are parts of him and how he worked that I aim to emulate.
At the end of 2021 I was awarded the first-ever Dan Kaminsky Fellowship, a fellowship created to commemorate Dan’s life and legacy. This fellowship gives me the opportunity to spend the next year dedicating myself to tackling the widespread vulnerabilities that overwhelm modern maintainers. My goal is to not just provide the fix, but the complete package that includes disclosures so that the solution gets in the hands of those who need it most—making software more secure for millions of users everywhere.