Companies should state clearly when they won’t sue ethical hackers for sharing vulnerabilities, draft guidance states.
A task force charged with opening communication lines between manufacturers and researchers who discover hackable vulnerabilities in their systems hopes to release draft guidance for comment in the next couple of months, members said Monday.
Members of the Cybersecurity Vulnerabilities Multistakeholder Process agreed to release their various reports for public comment “sooner rather than later” and discussed timeframes in late 2016 and early 2017 but did not commit to a firm release date.
The process was organized in 2015 by the Commerce Department’s National Telecommunications and Information Administration.
Concern about undiscovered computer vulnerabilities has grown significantly in recent years as the internet has crept into ever-more consumer devices, including cars and medical implants where an unpatched vulnerability could be a matter of life and death.
The massive Mirai botnet attack in October that briefly halted access to sites including Netflix and The New York Times was powered in large part by connected consumer devices that had been manipulated to work on the hackers’ behalf.
“This is not niche anymore,” Allan Friedman, director of NTIA cybersecurity initiatives, said during Monday’s meeting.
Some companies have been slow, however, to work with ethical or “white hat” hackers who discover vulnerabilities in their systems. Those companies have sometimes sued hackers who disclose vulnerabilities publicly, citing the 1998 Digital Millennium Copyright Act, the 1986 Computer Fraud and Abuse Act and various other laws.
A three-year exemption to the DMCA, which allows ethical hacking that bypasses technical copyright protections, went into effect in October.
Working groups released two draft documents during Monday’s NTIA meeting. The first is a set of recommendations and a draft vulnerability disclosure policy for safety critical industries such as car companies and medical device makers. The document urges companies to communicate clearly with security researchers and in a way that accounts for those systems’ unique incentives.
“When human safety is involved, the calculus of when and how to publicly disclose vulnerabilities is likely different than in other industries,” the document states. “Researchers may be more reluctant to disclose if they know a vulnerability has not been (or cannot be) fixed. In contrast, some researchers may be more likely to publicly disclose if they feel like it will motivate a vendor to fix vulnerabilities faster than going through their disclosure program.”
Those manufacturers should clearly state that they will not sue researchers who disclose specific types of vulnerabilities and list any exceptions to that promise, the document states, preferably in plain language of the “we will not take legal action if…” variety.
The companies should also create a clear process for how ethical hackers can share vulnerabilities with them and when they should hear back, the document states.
Companies should ask researchers to not publicly disclose vulnerabilities for a specified or negotiated time period after sharing them with the company, the guidance recommends. If that time period is too long, however, researchers might not participate because they fear a vendor “sitting on a bug,” the guidance states.
Vendors should also specify the particular models and versions of their technology for which they are seeking vulnerability disclosures, the guidance states.
The other draft document outlines various complex disclosure scenarios in which a single vulnerability affects multiple vendors.