U.S. Attorney General William Barr discusses Pensacola Naval Air Station Shooting in Washington, DC.
Michael Brochstein | Barcroft Media | Reuters
President Donald Trump and the nation’s top law enforcement official are facing off against Apple, the most valuable American company.
The fight started because the FBI says it cannot extract data from two iPhones used by Mohammed Saeed Alshamrani, who is suspected of killing three people last month in a shooting at a Navy base in Pensacola, Florida. Attorney General William Barr and Trump want Apple to help by unlocking the phones it manufactured.
Although the current fight is over these two password-protected phones, it’s only the latest skirmish in a long-running battle over whether technology companies should give law enforcement special access to customers’ data.
Barr and other law enforcement officials call it the “going dark” problem and argue that all data should be accessible with a warrant. Apple and techies tend to call the concept a “backdoor” and argue that it would hurt security for everyone who uses that device.
During Barr’s press conference on Monday, he explicitly framed the issue as bigger than just the two Pensacola iPhones: “We call on Apple and other technology companies to help us find a solution so that we can better protect the lives of Americans and prevent future attacks.”
Barr also discussed his goal last summer, months before the Pensacola shooting: “The Department has made clear what we are seeking. We believe that when technology providers deploy encryption in their products, services, and platforms they need to maintain an appropriate mechanism for lawful access.”
Apple is not against helping law enforcement. But it objects to building a general method for breaking encryption, arguing that doing so would have unintended consequences.
“Backdoors can also be exploited by those who threaten our national security and the data security of our customers. Today, law enforcement has access to more data than ever before in history, so Americans do not have to choose between weakening encryption and solving investigations,” an Apple representative said in a statement earlier this week.
Apple’s not the only company in this pickle. Pretty much every single major piece of digital technology uses encryption to protect information from prying eyes. Barr took aim at Facebook last year, for example, for the encryption it uses in WhatsApp.
We’ve seen this fight before
Then-FBI Director James Comey and Apple CEO Tim Cook
The battle between pro-privacy techies and law enforcement officials who want access to encrypted data to investigate crimes has been raging since at least 1993, when the Clinton White House, promoting the Clipper Chip, a device that would have given law enforcement access to encrypted communications, said that encryption “can be used by terrorists, drug dealers, and other criminals.” The chip never took off, and encryption thrived.
Apple clashed with the Justice Department and then-FBI Director James Comey over encryption in a very similar case in 2016, with one major difference.
In that case, the FBI wanted to break into an iPhone used by a mass shooter in San Bernardino, California. The dispute spilled into court, with specific legal arguments on both sides, before the FBI said it had found a third party that could unlock the device and dropped the case, leaving unsettled the question of whether Apple could be compelled to unlock its phones.
So far in the Pensacola case, the Justice Department hasn’t filed for a court order to compel Apple to give it access, and Barr declined to say at a press conference on Monday whether he would seek one.
Instead, this conflict is playing out in the press and in tweets.
As Ron Gula, a former NSA employee and current security technology investor, said of Barr’s request to unlock the iPhones: “They are making a public appeal of it. They are trying to do it to get political points and change policy, which is their job.”
Could it be done?
Because it’s not a court battle yet, Apple hasn’t been compelled to say whether it’s possible to unlock a customer’s iPhone.
But in a filing in the San Bernardino case in 2016, an Apple privacy engineer outlined how Apple would go about building software to unlock the iPhone, including assigning a team of six to 10 Apple engineers and other employees to the project for up to a month. The filing also warned that the software built for the government could become dangerous, and that Apple wouldn’t want it to leave its facilities.
Apple continues to argue that building a backdoor would create a vulnerability for all of its products — if the FBI had a tool to extract information for legitimate reasons, criminals could use that same tool to extract health or financial data from a lost or stolen iPhone, foreign governments could use that tool to spy on Americans, and so on.
Historically, the pro-security technologists have been right: the Clipper Chip was later found to have significant security holes. If the chip had been widely adopted, it would have given hackers several different methods to break into Clipper-equipped devices.
That’s why Apple CEO Tim Cook called the government’s 2016 request “the software equivalent of cancer.” He even threatened to resign in 2016 if Apple didn’t fight the request, the New York Times reported, citing Apple’s former general counsel.
Could the FBI unlock the phones on its own? Barr said that it’s “virtually impossible” to unlock them without the password, even with the help of the FBI crime lab. But the bureau found a third-party vendor in 2016 to unlock the San Bernardino phone, and several companies currently claim they can help law enforcement unlock iPhones, especially older models.
“The terrorist in this case had an iPhone 7, that’s an old phone that has many remote issues with them. Law enforcement should make use of these things,” Gula said. “Today we can decrypt that phone.”
A general view of the Pensacola Naval Air Station following a shooting on December 6, 2019, in Pensacola, Florida. The second shooting on a U.S. naval base in a week left three people dead, plus the suspect, and seven wounded.
Photo by Josh Brasted/Getty Images
In 2016, Cook wrote an open letter to Apple users, posted on the company’s website. He has not yet commented publicly on the 2020 case.
But ever since San Bernardino, Apple has made privacy — including no backdoors — one of its key corporate values as well as a selling point to distinguish its phones from Android phones.
“Apple believes privacy is a fundamental human right,” according to a document sent to shareholders early this month. “Every Apple product is designed from the ground up to protect privacy and security.”
But this stance appears to apply only to the hardware Apple sells. Apple — like other tech companies — provides data from its servers to law enforcement on a regular basis. Apple said it turned over gigabytes of iCloud data related to the Pensacola shooting, and Apple has responded to 127,000 requests made by law enforcement agencies in the United States since 2013, according to statistics on its website.
When Trump entered the encryption discussion, he explicitly linked Apple’s willingness to unlock the phones to trade and the “help” he has given Apple, likely a reference to his warm relationship with Cook. He also framed Apple as protecting criminals. That puts Apple in a tough spot.
Apple is likely to stand its ground going forward, but it doesn’t necessarily want to be known as the encryption company. Even if the Justice Department doesn’t advance this case in the courts, the issue will come up again.