A firestorm has erupted in the computer security community, represented by the Twitter hashtag #wassenaar.
That hashtag refers to the international arms-control pact known as the Wassenaar Arrangement, named for the Dutch town where it was first signed in 1995.
Wassenaar restricts the export and transport of weapons, dangerous chemicals or potentially dangerous technologies.
Wassenaar is updated regularly with new rules, and on May 20, new restrictions were proposed by the US Bureau of Industry and Security (BIS).
These new restrictions are so broad that security pros fear they will be at risk of being jailed for up to 20 years, or fined up to $US1 million, for doing routine things that are a normal part of their jobs today.
And the situation has blown up on Twitter, even turning into a meme like this:
Make a mistake and go to jail?
Prominent cybersecurity researcher Robert Graham summarized the uproar in a blog post. He writes:
The proposed BIS rules go beyond the simpler Wassenaar rules, affecting a large number of cybersecurity products, and cybersecurity research. These rules further restrict anything that may be used to develop a cyberweapon, which therefore make a wide number of innocuous product export-restricted … It’s easy to make mistakes — and a mistake can cost a person 20 years in jail and $US1 million. This will create a huge chilling effect even among those who don’t intend to export anything.
Specifically, the new rules cover three types of technology known as intrusion malware, intrusion exploits, and IP surveillance products. The first two refer to a kind of malware that lets hackers break into networks or software. The last refers to tech that monitors a network, sometimes called spying software.
The problem is that most security researchers need all of these tools to do their jobs, which is to find and fix security holes before the bad guys find them and use them.
Researchers typically keep the holes they find secret, then notify the company responsible for the software so that the holes can be fixed. A hole that is not yet fixed is known as a “zero-day” vulnerability or exploit.
Wassenaar is attempting to close the loophole that allows researchers to sell information about zero-day exploits to foreign buyers instead of responsibly reporting them to the software maker.
One suggestion is that researchers obtain a licence in order to export their zero-day exploit information. This upsets researchers, explains Graham:
“One of the controversial provisions of the export licence is that companies/individuals may have to share their secret 0-days with the NSA in order to get a licence.”
Right now, the NSA reportedly buys information about zero-day exploits from companies and many security researchers do not engage in that kind of business.
But there are other things in the rules that researchers don’t like.
For instance, if a researcher has some of the restricted tools on a laptop and travels internationally, that researcher may have violated the export restrictions and be subject to fines or jail, Graham warns.
The law firm Bryan Cave warns that travelling with apps that automatically update themselves, such as Google Chrome, might violate the new rules, too.
The good news is that these new rules are not yet set in stone. BIS is accepting comments on them until July 20. And it will likely get an outpouring.
One security researcher summed up the feeling in a tweet referencing the controversial Gamma Group, which makes surveillance software reportedly used to watch dissidents, journalists, and activist groups.