On Tuesday, the BBC published an article laying out how ISIS has been using Telegram, a secure messaging app, to distribute propaganda online.
The militant Islamist organisation has been using a new feature in the app — the ability to make public “channels,” akin to a Facebook page — to spread its message to more than 4,500 subscribers (and counting). The Yemeni branch of Al-Qaeda also apparently has a Telegram channel, as do some other terrorist organisations.
ISIS’ shift comes after it has been largely forced off other social networks like Twitter. Of course, there’s no reason why something similar can’t happen to its public broadcasts on Telegram — as the BBC notes, the app “suggests it will take down illegal material that is made publicly available via the app,” and that surely includes militant propaganda.
But there have been multiple reports in the past that Telegram — along with other secure messaging apps — has been frequently utilised by terrorists looking to evade detection by authorities. Telegram CEO Pavel Durov conceded as much last month.
Here’s why that’s not a big deal.
This is the new reality
Telegram is one of many apps and software suites that say they use “strong” encryption. This refers to a way of scrambling messages and data such that they cannot be understood without a valid key or password. Apple uses strong encryption to secure the data on iPhones, and it cannot be decrypted by the company — even if it is supplied with a valid court order.
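To make the idea concrete, here is a minimal, stdlib-only sketch of symmetric encryption using a one-time pad, the simplest scheme with a strong security guarantee. This is illustrative only — the function name is hypothetical and real messaging apps use vetted ciphers such as AES — but it shows the core property: without the key, the ciphertext reveals nothing.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR each byte of the message with the corresponding key byte.
    # With a truly random key used only once, the ciphertext alone
    # carries no information about the message.
    return bytes(p ^ k for p, k in zip(plaintext, key))

message = b"meet at dawn"
key = secrets.token_bytes(len(message))  # random key, same length, used once

ciphertext = encrypt(message, key)
# XOR is its own inverse, so applying the same key again decrypts.
assert encrypt(ciphertext, key) == message
```

Anyone intercepting `ciphertext` without `key` sees only random-looking bytes — which is precisely why authorities, and the companies themselves, cannot read such messages on demand.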
Strong encryption has been increasingly incorporated into tech products after Edward Snowden’s revelations about NSA surveillance provoked global privacy concerns, but the tech existed long before that.
Many authorities are concerned about the rise in encryption products. It means evidence law enforcement previously had access to is “going dark,” and it can be used by terrorists, paedophiles, and other nefarious individuals to hide their communications. It’s precisely this appeal that will have drawn groups like Islamic State to Telegram.
It’s worth noting here that Telegram has been criticised by some cryptography experts, who have concerns about its technical implementation. One professor described it as “like someone had never seen cake but heard it described [and] tried to bake one. With thumbtacks and iron filings.” But regardless of Telegram’s ultimate security, the broader point stands for strong encryption products in general.
So should we be panicking?
The use of encryption products by bad actors is well-documented. But this is inescapable, because it’s not just criminals who use them: strong encryption underpins modern finance, secures our data, and supports government communications. We couldn’t function without it. And it’s impossible to tell which uses are “legitimate” and which facilitate illegal activity, because it’s all, well, encrypted.
Yes, this will be immensely frustrating to law enforcement unable to access certain communications. But there are still workarounds when investigators bump up against encryption.
Michael Hayden, the former director of the NSA, disagrees with the FBI’s current push to undermine encryption. After early efforts in the 1990s to regulate encryption failed, “we were still able to do a whole bunch of other things [to get the info needed],” Hayden said at a panel on Tuesday attended by Motherboard. “Some of the other things were metadata, and bulk collection and so on.”
Encryption is a tool, like any other.
An analogy encryption enthusiasts sometimes make is that of a lock. Locks are fantastically useful for securing banks, homes, and government offices. But criminals can still use them to protect stolen goods, or to stop kidnap victims from escaping. Does this mean we should ban locks, or demand that all locks must correspond to government-approved master keys? Of course not.
In fact, a recent incident involving master keys helped demonstrate one of the key dangers of allowing back-door government access or having third parties hold keys in escrow. The US Transportation Security Administration (TSA) has a set of master keys for “approved” luggage locks. Travellers don’t have to use these locks, but the TSA prefers it — the logic being that agents can gain easy access to luggage when required, without compromising the passenger’s security.
Except, it turns out these master keys have compromised passenger security — a lot. The Washington Post inadvertently published a photo of the set of master keys. From this photo, a security researcher was able to build a set of key designs for a 3D printer, which, when printed, were able to open the corresponding locks. So now the belongings of anyone who secures their luggage with a TSA-approved lock have been put at risk.
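The escrow danger the TSA story illustrates can be sketched in code. In this toy model — hypothetical names, with an XOR wrap standing in for a real cipher — each user’s key is held in escrow encrypted under a single master key, so the escrow holder can recover it on demand. Once that one master key leaks, every user’s key, and so every user’s data, falls with it:

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Toy "key wrapping": XOR the user key with the master key.
    return bytes(d ^ k for d, k in zip(data, key))

master_key = secrets.token_bytes(16)  # the single escrowed secret

# Each user's own key is wrapped under the master key and stored.
user_keys = [secrets.token_bytes(16) for _ in range(3)]
escrow = [xor(k, master_key) for k in user_keys]

# If the master key leaks -- as the TSA keys did via a photo --
# every wrapped key can be unwrapped at once.
leaked = master_key
recovered = [xor(wrapped, leaked) for wrapped in escrow]
assert recovered == user_keys
```

The design choice this highlights: an escrow scheme concentrates the security of every user into one secret, so a single disclosure compromises everyone rather than one person.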
This brings us back to Telegram, and encryption products in general. It’s an alluring idea that we should require government access. But it would be impossible to enforce, software developers outside of Western jurisdictions would totally disregard it, and it would put ordinary people’s data at risk.
Yes, terrorists use encryption, and will continue to do so. But this is our new reality. As security researcher the Grugq puts it: “If your secure communications platform isn’t being used by terrorists and pedophiles, you’re probably doing it wrong.”