Over the long weekend, rather than taking a break, the hacker community was up in arms over proposed rules that would restrict the free and open use of attack tools and software exploits invaluable to their work. And some big voices have backed them, with Facebook, Google and Yahoo executives and researchers squaring off against the human rights activists whose lobbying appeared to have prompted the rules two years ago.
At the heart of the matter is the Wassenaar Arrangement, a set of export controls on arms and dual-use technologies agreed to by dozens of nations. Participating states aren't legally bound by the rules, but they agree to implement controls through their own national legislation. In December 2013, the arrangement was updated to add intrusion software to its list of dual-use technologies: those that can be used for defending against security threats, or for more surreptitious activity, such as surveillance. European participants were quick to draw up plans for their own implementations, but the U.S. had, for reasons unknown, delayed delineating its own plans.
But controversy erupted last week, when the U.S. Department of Commerce's Bureau of Industry and Security (BIS), as part of its plans to implement the 2013 agreement, proposed a license requirement for "the export, reexport, or transfer (in-country) of these cybersecurity items to all destinations, except Canada". In effect, anyone who wanted to take attack code, malware samples or other offensive tools outside of the U.S. would need a license. Such code is regularly used by security researchers to find vulnerabilities in software and company networks and push for fixes that benefit everyone; adding export controls would make life considerably more difficult for the average researcher.
There was further anxiety over BIS's decision not to clarify what kinds of technology the broad term "intrusion software" covers. Under the current wording, any such tool is covered by Wassenaar, whether used by penetration testers probing companies' networks for holes in order to fix them, or by cybercriminals looking to make a quick buck by infecting PCs.
The specific wording from BIS is as follows: “’Software’ ‘specially designed’ or modified to avoid detection by ‘monitoring tools,’ or to defeat ‘protective countermeasures,’ of a computer or network-capable device, and performing any of the following: (a) The extraction of data or information, from a computer or network-capable device, or the modification of system or user data; or (b) The modification of the standard execution path of a program or process in order to allow the execution of externally provided instructions.”
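To see why researchers consider clause (b) so broad, here is a minimal, purely illustrative sketch (a hypothetical example, not drawn from BIS or any real tool): redirecting a program's standard execution path to run externally provided instructions is the everyday mechanism behind debuggers, test stubs and software updaters, not just malware.

```python
# Illustration only: "modification of the standard execution path of a
# program or process in order to allow the execution of externally
# provided instructions" describes even this trivial monkey-patch.

def greet():
    # The program's original, "standard" behaviour.
    return "hello"

def load_external_patch():
    # Stands in for instructions supplied from outside the program,
    # e.g. a downloaded update or a test framework's stub.
    return lambda: "patched"

# Redirect the standard execution path to the externally provided code.
greet = load_external_patch()

print(greet())  # prints "patched"
```

Under a literal reading, benign techniques like this fall within the definition, which is the concern raised about auto-updaters and jailbreaks below.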
But even widely used tools, including Google Chrome, could be affected by the export controls, as they circumvent certain security protections to auto-update, noted Robert Clifton Burns, counsel at Bryan Cave LLP. Apple iPhone jailbreaks could likewise be controlled, he added, given that they exploit the iOS operating system.
Anyone who produces or exports any of the technology covered by the rules would likely have to go through two sets of compliance checks, as many of the tools are already covered under America's encryption export controls. "This rule would essentially double the compliance burden for cybersecurity items with encryption functionality," noted Fried Frank Harris Shriver & Jacobson LLP's Mario Mancuso and Michael Gershberg. "This licensing policy is far stricter than the current export authorization for some cybersecurity items that are now controlled as encryption items."
With so many aspects of the BIS proposal causing concern, security pros from major companies have voiced their disgust in diatribes across blogs and social media. Google researcher Thomas Dullien, also known as Halvar Flake, wrote on his personal blog that the Wassenaar Arrangement would not just criminalise security research, but would also lead to a "balkanization" of an industry that has benefitted from international cooperation. Flake argued the export restrictions would make it more likely that bugs would be handed to local governments rather than shared with the wider community or with software vendors, who would likely work to patch the vulnerabilities. Researchers would no longer be able to collaborate effectively on particularly nasty strains of malware, such as the surveillance tools used against activists, he claimed.
Flake also believes governments would quickly grant licenses to their usual contractors, such as Raytheon and Lockheed Martin, giving those firms an advantage. And because the rules already stymie the widespread dissemination of high-quality encryption, they harm privacy even as they attempt to improve it, he added.
On Twitter, Alec Muffett, from Facebook’s engineering security infrastructure team, criticised ACLU principal technologist and senior policy analyst Chris Soghoian and Privacy International’s deputy director Eric King for their part in the introduction of the 2013 rules.
Yahoo chief information security officer Alex Stamos said he was deeply worried about the overbroad terminology used by BIS, and that the proposals would do nothing to stop "lawful interception" tools and exploits from dealers like VUPEN and Gamma International, which have been implicated in attacks on activists in recent years.
First read on #Wassenaar: it looks to me that these rules are overly broad and that the stated goals are better accomplished technically.
— Alex Stamos (@alexstamos) May 26, 2015
Interested parties have until 20 July to submit comments. Many, including the aforementioned tech titans, are set to lobby hard.
This article was written by Thomas Fox-Brewster from Forbes and was legally licensed through the NewsCred publisher network.