Here’s What Government Gets Wrong About Bug Bounties


Congress has gone bananas for bug bounties, but they may not always be the right choice.

Bug bounties are hot in government right now, but the craze may be outpacing the contests’ actual usefulness, bug bounty practitioners tell Nextgov.

The Defense Department has launched four of the ethical hacking contests, which were first popularized at major tech firms and offer cash rewards in exchange for spotting and disclosing dangerous computer vulnerabilities.

Those contests—one each at the Pentagon and Army and two at the Air Force—netted more than 500 valid bug reports and more than $400,000 in payouts to hacker participants. Another contest, announced Monday, will challenge hackers to find vulnerabilities in the Pentagon’s travel booking system, which processes more than 25,000 transactions each day.

Lawmakers have sponsored bills mandating bug bounties at the Homeland Security and State departments.

The White House even urged bug bounties for selected applications and computer tools across the government in a December IT modernization report.

The excitement surrounding government bug bounties, however, takes little account of how difficult they are to pull off and how much work must go into them, practitioners say, especially at federal agencies with limited staff and resources that are scrambling to keep up with the software vulnerabilities they already know about.

“[Those agencies] don’t need outsiders pointing out more bugs in exchange for cash if the problem is keeping up with the volume of bug issues they already know about,” said Katie Moussouris, who was chief policy officer at HackerOne when the company helped launch the first Hack the Pentagon bug bounty in 2016.

Bills that mandate additional bug bounties, she said, are “well-meaning, but misdirected,” and would reroute money that would be better spent beefing up agencies’ existing IT security teams.

Who’s Working on Cleanup?

The biggest problems come down to resources and maturity, Moussouris and other bug bounty practitioners say.  

When a major tech firm announces a bug bounty, you can be pretty sure that the company’s own security staff has already cleared out all the low-hanging fruit. That staff also probably knows just the sort of bugs they’re looking for and how to communicate with independent security researchers about those bugs.

Finally, the company’s security team probably has the resources and mandate to get the new vulnerabilities that independent hackers uncover patched as soon as possible.

The Pentagon and military services, which have staff and resources at least on par with major companies, can count on the same, though perhaps with a tad more bureaucracy.

When it comes to civilian government agencies, however, many are struggling to keep up with patching the bugs they already know about, including vulnerabilities that are regularly disclosed in open source software and by major commercial software vendors such as Microsoft, Oracle and Adobe.

Civilian agencies frequently miss deadlines for installing new software patches, according to audits. If agency IT staff can’t keep up with patching the hackable software flaws they know about, it won’t do much good to bombard them with dozens, hundreds or even thousands of new bug reports, practitioners say.

There’s also the issue of the resources to manage the bounty program itself.

If agencies aren’t vetting their software for obvious vulnerabilities, they’re also likely to receive dozens of duplicate reports about the same bug, practitioners said.

Plus, even the best bug bounty will draw a high proportion of bogus, confused or poorly vetted reports. During the Hack the Army competition, for example, which focused on comparatively well-vetted software, participants submitted 416 bug reports but only 118, or about one-fourth, were deemed “unique and actionable,” according to the Army report.

All of that takes time to sort through and reduces the time an agency’s IT staff can spend on its other responsibilities—including patching known vulnerabilities.

“The Defense Department has an enormous workforce that’s responsible for [patching], both internal and contractor,” said Lisa Wiswell, a former top Defense Department cyber adviser who helped organize the Pentagon bug bounty with HackerOne.

“Forgive the example, but who the hell’s at the Department of the Interior to fix their stuff?” Wiswell asked.

And That’s Just For Starters

There are plenty of other possible barriers to government bug bounties.

To begin with, government apps and websites are typically built by contractors rather than by the agency’s own IT staff, and those contracts may be years old. If the contracts don’t include a provision requiring the contractor to participate in a bug bounty or to fix publicly submitted vulnerabilities, that could be a stumbling point.

Contractor-built tech systems are also common in the private sector, but the problem tends to be more prevalent and thornier in government, said Marten Mickos, CEO of HackerOne, the security company that facilitated all the Pentagon bug bounties.

Government, especially on the civilian side, also buys and updates its technology using an arcane and convoluted acquisition system that can make it difficult to make big moves, such as mandating compliance with a bug bounty.

Following the Hack the Pentagon pilot, defense officials shifted to ensure future tech contracts both permitted bug bounties and required agencies to fix vulnerabilities discovered during the process, Wiswell said. That didn’t fix problems with existing contracts, though, said Wiswell, whose official title at the Defense Digital Service was “bureaucracy hacker.”

Another problem is that bug bounties, by their very nature, deal with some gray areas of law and policy. That’s partly because internet security research is largely governed by pre-internet laws that are being interpreted in the internet era, such as the 1986 Computer Fraud and Abuse Act.  

If a bug bounty is written too broadly or carelessly, for example, it might seem to authorize illegal and malicious hacking of sensitive information, Moussouris said.  

This is a concern for private companies, but it really gives government officials heartburn, she said, noting that government agencies not only hold an inordinate amount of sensitive and legally protected information but are also among the most targeted organizations in the world, both by criminal hackers and nation-state intelligence services.

That’s just the beginning of the myriad legal and policy concerns an organization faces when running a bug bounty—concerns that are particularly important for government.

A bug bounty organizer, for example, must ensure that no bounty money goes to a person or organization targeted by U.S. sanctions, must shield her organization from third-party lawsuits that might crop up as a result of something going haywire during the bounty and must create a plan for when a participant breaks the rules.

Finally, there’s the issue of building relationships between a government agency and the ethical hacker community, which might be skeptical about that agency’s mission or policies.

In advance of Hack the Pentagon, Moussouris reached out to a number of prominent ethical hackers to explain the program and build support, she said. After the announcement, she spent a lot of time on social media answering questions and making sure people correctly understood what the program was about.

“These were influential hackers who might have strong opinions if they misunderstood what DOD was actually trying to do,” she said.

Like most security researchers, Moussouris would like to see more government agencies adopt vulnerability disclosure policies, or VDPs—a cousin to bug bounties in which organizations describe how well-meaning researchers can pass along computer vulnerabilities in their public-facing apps and websites but don’t offer cash rewards.

She’s skeptical, however, of the congressional push for full bug bounties, and thinks federal money could be better spent building up agencies’ own ability to test their systems for vulnerabilities and patch the vulnerabilities that are exposed.

“There’s an absolute misunderstanding by members of Congress who say ‘let’s just repeat the success of Hack the Pentagon,’” Moussouris said. “They don’t understand the years of advisory work that happened leading up to Hack the Pentagon…One very unfortunate effect of all the work that went into making Hack the Pentagon successful is that now people think it’s easy and it’s not.”

Hacking DHS and State

The only civilian government division to launch a bug bounty, so far, is the General Services Administration’s IT innovation wing, the Technology Transformation Service, which has a limited web presence and a highly sophisticated IT staff, many of whom have backgrounds in the private sector.

Lawmakers in both chambers, however, have introduced bills mandating a bug bounty at the Homeland Security Department and a House bill would mandate one at the State Department.

The most prominent Senate version of the Homeland Security bounty was included as an amendment, sponsored by Sen. Maggie Hassan, D-N.H., to a bill reauthorizing the full department. The House version is in a stand-alone bill sponsored by Reps. Scott Taylor, R-Va., and Ted Lieu, D-Calif. Lieu also co-sponsored the State Department bounty with Rep. Ted Yoho, R-Fla.

The Homeland Security Department is the lead government agency for civilian cybersecurity and manages numerous governmentwide cyber programs out of its cyber operations office, the National Protection and Programs Directorate. The department is also large and diverse, however, constructed from ad hoc elements after the Sept. 11 terrorist attacks, and some of its divisions are less than prepared for a full bug bounty, practitioners say.  

The overall department got middling grades on its annual information security scorecard and many divisions have been slow to implement the department’s own governmentwide mandates for cybersecurity.

The bills’ wording leaves open the possibility the bug bounty could be limited to the most digitally advanced Homeland Security divisions.

The State Department has received consistently poor cybersecurity marks in audits going back several years. The department also suffered a major email breach in 2014 that was likely launched by Russian government-backed hackers.  

A staffer for Rep. Lieu disputed the notion, Monday, that the State Department isn’t ready for a bug bounty, saying: “Our understanding is that the State Department already has the resources to move forward with our bill’s provisions.” If State officials are concerned that they’re not ready, however, they can “always flag major resource concerns that they hope to address as we move the bill through the committee process,” the staffer said.

The staffer also noted that the bill calls for a vulnerability disclosure policy within six months and a pilot bug bounty within one year to avoid “putting the cart before the horse.”

Ari Schwartz, a former top cyber adviser to President Barack Obama, provided a quote praising the vulnerability disclosure portion of the State Department bounty bill, which Lieu and Yoho included in their press release.

If the bill advances, however, Schwartz would want to work with the sponsors to ensure a bug bounty wasn’t imposed on the State Department before it’s ready, he told Nextgov.

“I do think having a vulnerability disclosure program is a good idea for [the State Department],” said Schwartz, who currently leads the Coalition for Cybersecurity Policy and Law, an advocacy group for cybersecurity companies.

“Whether that leads to large cash rewards immediately the way DoD moved is open for discussion,” he said.

Starting the Discussion

Schwartz noted, however, that the Hack Your State Department bill is “a good discussion point” that highlights concerns about where the department’s security program is now and where it ought to be.

The bill also tells that story in a way that’s easily comprehensible to other lawmakers, which is a useful service even if the bill doesn’t become law or doesn’t become law in its current form, Schwartz noted.

Marten Mickos, the HackerOne CEO, similarly acknowledged that some civilian agencies may not be mature enough for bug bounties, but said he nevertheless supports the legislative push for them.

“We’re not focused on the essence if we get upset by that,” he said. “The essence is lawmakers know they have to set a bar and set a mandate for this and we should support that…I don’t think any action is happening too fast. Whatever we’re doing in cybersecurity we’re doing it too slow.”