A Dialogue with Caroline Wong, VP of Cobalt Labs

This is promotional content.

ITSPmagazine sat down with Cobalt VP Caroline Wong at RSA 2017 in San Francisco. Wong, a CISSP, sheds light on the benefits of crowdsourced pen testing in today’s InfoSec industry. Watch the interview or read a recap below.


ITSPmagazine: Can you tell me a little bit about what Cobalt Labs does?

Wong: The thing about traditional pen test work is that sometimes it's hard to match up exactly the right skills at exactly the right time. But with a crowdsourced, global pool of freelance talent, we've simply got more to choose from.


ITSPmagazine: Do you publish any of your findings?

Wong: Our pen test metrics report describes the key metrics needed to determine the impact and the ROI of a modern pen test program. I've talked to perhaps four dozen different application security teams over the last few years. Everyone does pen testing, and a lot of people really struggle with metrics and how to structure them. I ask them if they have a single source of record for their pen test findings, because that's a good place to start. A lot of organizations really struggle with that. They have pen test results scattered around in PDFs and in email, and without the data it's difficult to count how many pen test findings were discovered, let alone track which of those actually got fixed.
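
To make that idea concrete, here is a minimal sketch of what a single source of record for pen test findings could look like. The Python below is purely illustrative – the record layout and sample data are hypothetical, not Cobalt's schema – but it is enough to answer the two questions Wong raises: how many findings were discovered, and how many were actually fixed.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical finding record; fields are illustrative, not Cobalt's actual schema.
@dataclass
class Finding:
    test_id: str                   # which pen test discovered it
    title: str
    severity: str                  # "critical", "high", "medium", or "low"
    discovered: date
    fixed: Optional[date] = None   # None until remediation is verified

# Sample data standing in for results that would otherwise be scattered across PDFs and email.
findings = [
    Finding("PT-001", "Stored XSS in comment field", "high", date(2016, 3, 4), date(2016, 3, 20)),
    Finding("PT-001", "Verbose error messages", "low", date(2016, 3, 5)),
    Finding("PT-002", "Insecure direct object reference", "critical", date(2016, 6, 1), date(2016, 6, 9)),
]

discovered = len(findings)
remediated = sum(1 for f in findings if f.fixed is not None)
print(f"{discovered} findings discovered, {remediated} fixed")
```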


ITSPmagazine: Do you assign a particular person or a team of people to a pen test?

Wong: Depending on the specific application technology that's being assessed, we'll put together what we call a purpose-built team of technical domain experts. These are folks whose skill sets are matched to an application technology stack.

One of the cool things about the crowdsourced model is that we have a platform. So we put together this purpose-built team of technical domain experts, and they work together to collaboratively do a security review of a web application, a mobile application, or an API. As the team discovers security bugs and flaws, those go into the crowdsourced pen test platform and get delivered to the organization. The metrics are automatic because the data is already in the platform; you don't have to go seek it out and calculate the results yourself.

At Cobalt we’ve done more than a hundred pen tests to date, and the eBook report we created contains the summary data from all of the pen tests completed in 2016.


ITSPmagazine: Any interesting findings?

Wong: One of the questions that we asked – which, frankly, I think any organization that's doing pen testing ought to ask itself – is: what's the average number of findings in a pen test? For us, it was 13.

Another question is: what does the distribution of severity look like for the findings? Out of all the findings from our 2016 pen tests, we found that 9% were critical, 6% were high, 14% were medium, and the remaining 71% were low.
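
Both of those numbers fall directly out of the data once every finding lives in one system of record. As a hedged illustration – reusing the same kind of hypothetical records as the sketch above, not Cobalt's platform – the average-per-test and severity-distribution metrics are a few lines of aggregation:

```python
from collections import Counter

# Illustrative data only: (pen test id, finding severity) pairs.
findings = [
    ("PT-001", "high"), ("PT-001", "low"), ("PT-001", "medium"),
    ("PT-002", "critical"), ("PT-002", "low"), ("PT-002", "low"),
]

tests = {test_id for test_id, _ in findings}
avg_per_test = len(findings) / len(tests)

severity_counts = Counter(severity for _, severity in findings)
distribution = {sev: count / len(findings) for sev, count in severity_counts.items()}

print(f"average findings per test: {avg_per_test:.1f}")
for severity, share in distribution.items():
    print(f"{severity}: {share:.0%}")
```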



ITSPmagazine: And from the metrics, do you also look at how long findings take to fix and how they get remediated?

Wong: We do recommend that organizations track time to fix. Each of the metrics included in the report has a narrative description of how an organization can practically go and get that kind of data for themselves. I think it's one thing to find security issues – and that's very important – but it's quite another thing to integrate with development processes in order to get those security issues fixed, so that the code and the applications are actually more secure.

One of the ways we've tried to do that with our platform is by integrating with developer bug tracking systems so that the bugs can automatically be put into the developers’ work queues, they can be tracked, and the fix time can actually be captured. That way, when a security person asks the question "What's the time to fix?" they actually have the data.
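
As a rough sketch of what that looks like once timestamps are captured – the record layout below is illustrative and not tied to any particular bug tracker's API – time to fix is simply the gap between when a finding entered the developers' queue and when the fix was verified:

```python
from datetime import date
from statistics import median

# Illustrative records: when each security bug entered the developers' work queue
# and when the fix was verified (None means it is still open).
bugs = [
    {"id": "SEC-101", "opened": date(2016, 5, 2),  "fixed": date(2016, 5, 16)},
    {"id": "SEC-102", "opened": date(2016, 5, 3),  "fixed": date(2016, 6, 1)},
    {"id": "SEC-103", "opened": date(2016, 5, 10), "fixed": None},  # still open
]

fix_times = [(b["fixed"] - b["opened"]).days for b in bugs if b["fixed"] is not None]
print(f"median time to fix: {median(fix_times)} days "
      f"({len(fix_times)} of {len(bugs)} findings closed)")
```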


ITSPmagazine: Let’s talk about diversity in the security industry. What have you seen?

Wong: I am a big believer in diversity. I've seen reports come out from Stanford, Harvard, MIT, and Scientific American. Everyone says diversity promotes better workplace results; diversity is great for the bottom line. I think something like crowdsourced pen testing really supports that, because in order to be a strong pen tester it doesn't matter where you're from, what your gender is, or what you look like. All that matters is your skill set, your experience, and your recent performance.


ITSPmagazine: Let’s touch upon education. Are we helping enough people to become “security aware”?

Wong: I've seen organizations that choose to publish their pen test results internally to teach their development teams, by saying something like "Hey, here's an attack that actually works on our code and here's how to address something like that."

I think that sort of local training is more relevant and makes a bigger impact on a developer, because it communicates that this is something that could actually happen, versus talking about something that's less relevant, maybe not even in the same code base, or that happened to someone else. If someone sees real pen test results from their own code, that's going to have a much greater impact.




About Caroline Wong

Caroline Wong, CISSP, is a strategic leader known for her strong communications skills, cybersecurity knowledge, and experience delivering global programs. Her close and practical information security knowledge stems from broad experience as a Cigital consultant and a Symantec product manager, and from day-to-day leadership roles at eBay and Zynga.
