British authorities recently revealed that the Westminster Bridge attacker, who killed four people and injured scores of others, may have sent or received messages through the encrypted WhatsApp service minutes before launching the horrific attack. The revelation has rekindled the controversial debate over whether tech companies should implement backdoors that allow governments access to encrypted information on digital devices.

The UK's home secretary, Amber Rudd, threw down the gauntlet on the issue in an interview with the BBC. "We need to make sure that organizations like WhatsApp, and there are plenty of others like that, don't provide a secret place for terrorists to communicate with each other," Rudd said ahead of her upcoming meetings with several technology firms.

Last year, WhatsApp instituted end-to-end encryption across all of its communications. This essentially rendered all messages sent through the app unreadable by anyone other than the sender and recipient.

"No one can see inside that message," WhatsApp announced last year when unveiling the encryption update. "Not cybercriminals. Not hackers. Not oppressive regimes. Not even us."

The current UK/WhatsApp conflict recalls a similar standoff last year, when the FBI demanded that Apple unlock an iPhone recovered from one of the perpetrators of the San Bernardino mass shooting. Apple resisted the order, insisting that creating a backdoor into its iPhones would compromise the security of millions of customers.

"The government suggests this tool could only be used once, on one phone," Apple CEO Tim Cook wrote in a message to the company's customers last year. "But that's simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks."

After several months of legal stoushes, the FBI withdrew its demands, claiming to have unlocked the phone through the services of a "third party."

Some have dubbed this 21st-century encryption battle a redux of the infamous 1990s "Crypto Wars," which centered on the NSA's push to classify strong cryptography as a munition. That classification allowed authorities to tightly regulate the civilian use of cryptographic algorithms.

In 1995, the Electronic Frontier Foundation began a long legal battle, ultimately winning a ruling that cryptographic code is not a weapon but a form of expression, and therefore protected under the First Amendment.

It was a groundbreaking result that has given us largely unfettered access to strong encryption for many years. But the spread of ever-smarter devices, colliding with modern fears of terrorism, has prompted governments to reopen the encryption battle.

The controversy over a company's responsibility to be able to decrypt its users' communications is heating up in the UK. Rudd's latest declarations follow the passage of the Investigatory Powers Act in November 2016, a law widely dubbed the Snooper's Charter.

Among several other contentious provisions, the Act stipulates that communications service providers must be able to remove encryption applied by their services. Notably, the Act only covers providers operating from within the UK and does not extend to foreign companies, and it's unclear how the provision would apply to a WhatsApp-style service. What is clear, though, is that provisions like these pave the way for pressuring companies not to build "uncrackable" services or devices.

While it's fair to argue that governments should be able to protect their citizens through responsible and targeted surveillance, it is hard to find a convincing case for forcing companies to add backdoors to their software or devices. When WhatsApp rolled out end-to-end encryption in 2016, it deliberately took the matter out of its own hands.

This kind of blanket encryption is secure precisely because it has no backdoor: the company itself has no way to crack it. Whatever one's position in the long-standing privacy-versus-security debate, the integrity of such a service cannot be selectively broken without compromising the data of every user.
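To make the point concrete, here is a deliberately naive, hypothetical key-escrow sketch (again in Python with PyNaCl, and not modeled on any real product) in which every message also gets a copy encrypted to a provider-held escrow key. That single escrow key is the "master key capable of opening hundreds of millions of locks" Cook described.

```python
# Hypothetical key-escrow "backdoor" sketch using PyNaCl. Not any real
# product's design; it only shows why a backdoor key is a master key.
from nacl.public import PrivateKey, SealedBox

escrow_key = PrivateKey.generate()             # one key, held by the provider
escrow_box = SealedBox(escrow_key.public_key)

# Imagine every client were required to upload an escrow copy of each message.
copy_from_alice = escrow_box.encrypt(b"Meet at noon")
copy_from_bob = escrow_box.encrypt(b"Running late")

# Whoever obtains that one private key can decrypt every user's copy.
unlock = SealedBox(escrow_key)
print(unlock.decrypt(copy_from_alice))  # b'Meet at noon'
print(unlock.decrypt(copy_from_bob))    # b'Running late'
```

There is no technical way to scope such a key to a single investigation; once it exists, its theft or misuse exposes everyone.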

Apple CEO Tim Cook succinctly summed it up at a conference in 2015 when he proclaimed, "You can't have a backdoor that's only for the good guys."
