FBI Recruits Apple to Help Unlock Your iPhone

February 19, 2016

By: Ifrah Law

It is a well-known maxim that “bad facts make bad law.”  And as anybody even casually browsing social media this week likely has seen, the incredibly tragic facts surrounding the San Bernardino attack last December have led to a ruling that jeopardizes the privacy rights of all law-abiding Americans.

First, it is important to clearly understand the ruling.  After the horrific attack in San Bernardino on December 2, 2015, the FBI seized and searched many possessions of shooters Syed Rizwan Farook and Tashfeen Malik in its investigation of the attack.  One item seized was Farook’s Apple iPhone 5C.  The iPhone itself was locked and passcode-protected, but the FBI was able to obtain backups from Farook’s iCloud account.  These backups stopped nearly six weeks before the shootings, suggesting that Farook had disabled the automatic backup feature and that his phone may contain additional information helpful to the investigation.

Under past versions of iOS, the iPhone’s operating system, Apple had been able to pull information off of a locked phone in similar situations.  However, Farook’s iPhone—like all newer models—contains security features that make that impossible.  First, the data on the phone is encrypted with a complex key that is hardwired into the device itself.  This prevents the data from being transferred to another computer (a common step in computer forensics known as “imaging”) in a usable format.  Second, the iPhone itself will not run any software that does not contain a digital “signature” from Apple.  This prevents the FBI from loading its own forensic software onto Farook’s iPhone.  And third, operating the iPhone requires a numeric passcode; each incorrect entry locks out the user for an increasing length of time, and the tenth consecutive incorrect entry deletes all data on the phone irretrievably.  This prevents the FBI from trying to unlock the iPhone without a real risk of losing all of its contents.
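To make that third protection concrete, here is a minimal, purely illustrative sketch of an attempt-counting policy like the one described above.  It is not Apple’s code; the class name, delay schedule, and ten-attempt limit are stand-ins for behavior Apple has described but never published.

```python
# Illustrative toy model only: escalating lockout delays on failed passcode
# attempts, with an irreversible wipe after ten consecutive failures.
# All names and delay values are hypothetical.
import time

LOCKOUT_DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600]  # assumed delays in seconds
MAX_FAILED_ATTEMPTS = 10  # tenth consecutive failure triggers the wipe


class ToyPasscodeGuard:
    def __init__(self, correct_passcode: str):
        self._correct = correct_passcode
        self._failures = 0
        self.wiped = False

    def try_passcode(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("device data has been erased")
        if guess == self._correct:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_FAILED_ATTEMPTS:
            self.wiped = True  # data irretrievably deleted
            return False
        delay = LOCKOUT_DELAYS[min(self._failures - 1, len(LOCKOUT_DELAYS) - 1)]
        time.sleep(delay)  # increasing lockout between attempts
        return False
```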

As Apple CEO Tim Cook has explained, this system was created deliberately to ensure the security of its users’ personal data against all threats.  Indeed, even Apple itself cannot access its customers’ encrypted data.  This creates a unique problem for the FBI.  It is well-settled that, pursuant to a valid search warrant, a court can order a third party to assist law enforcement agents with a search by providing physical access to equipment, unlocking a door, providing camera footage, or even giving technical assistance with unlocking or accessing software or devices.  And, as the government has acknowledged, Apple has “routinely” provided such assistance when it has had the ability to access the data on an iPhone.

But while courts have required third parties to unlock doors, they have never required them to reverse-engineer a key.  That is what sets this case apart: to assist the government, Apple would have to create something that not only does not exist, but that it deliberately declined to create in the first instance.

On February 16, Assistant U.S. Attorneys in Los Angeles filed an ex parte motion (that is, without providing Apple with notice or a chance to respond) in federal court seeking to require Apple to create a new piece of software that would (1) disable the auto-erase feature triggered by too many failed passcode attempts and (2) eliminate the delays between failed passcode attempts.  In theory, this software would work only on Farook’s iPhone and no other.  This would allow the FBI to use a computer to simply try all of the possible passcodes in rapid succession in a “brute force” attack on the phone.  That same day, Magistrate Judge Sheri Pym signed what appears to be an unmodified version of the order proposed by the government, ordering Apple to comply or to respond within five business days.
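Some rough arithmetic shows why stripping out the delays and the auto-erase feature makes a brute-force attack practical.  The figure of roughly 80 milliseconds per attempt used below is an assumption drawn from public reporting on the phone’s key-derivation hardware, not from the court filings:

```python
# Back-of-the-envelope estimate of a brute-force attack on a numeric passcode,
# assuming the software-imposed delays and the ten-attempt erase limit are gone.
# The ~80 ms per attempt is an assumed hardware floor, not an established fact.
ATTEMPT_TIME_S = 0.08  # assumed time per passcode attempt

for digits in (4, 6):
    combinations = 10 ** digits  # all numeric passcodes of this length
    worst_case_hours = combinations * ATTEMPT_TIME_S / 3600
    print(f"{digits}-digit passcode: {combinations:,} combinations, "
          f"worst case about {worst_case_hours:.1f} hours")
```

On those assumptions, a four-digit passcode falls in well under an hour and even a six-digit one in roughly a day, which is why Apple characterizes the requested software as far more than a one-phone fix.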

Though Apple has not filed a formal response, CEO Tim Cook already has made waves by publicly stating that Apple will oppose the order.  In a clear and well-written open letter, Cook explains that Apple made the deliberate choice not to build a backdoor into the iPhone because to do so would fatally undermine the encryption measures built in.  He explains that the notion that Apple could create specialized software for Farook’s iPhone only is a myth, and that “[o]nce created, this technique could be used over and over again, on any number of devices.  In the physical world, it would be the equivalent of a master key . . . .”

This has re-ignited the long-standing debate over the proper balance between individual privacy and security (and the debate over whether the two principles truly are opposed to one another).  That debate is all to the good, but it misses a key point: Judge Pym’s order, if it stands, not only short-circuits this debate, it ignores the resolution that Congress already reached on the issue.

Indeed, a 1994 law known as the Communications Assistance for Law Enforcement Act (“CALEA”) appears to prohibit exactly what the government requested here.  Though CALEA preserved the ability of law enforcement to execute wiretaps after changing technology made that more complicated than physically “tapping” a telephone line, it expressly does not require that information service providers or equipment manufacturers do anything to open their consumers to government searches.  But instead of addressing whether that purpose-built law permits the type of onerous and far-reaching order that was granted here, both the government and the court relied only on the All Writs Act—the two-century-old catch-all statute that judges rely on when ordering parties to unlock doors or turn over security footage.

Though judges frequently must weigh in and issue binding decisions on fiercely contested matters of great importance, they rarely do so with so little explanation, or after such short consideration of the matter.  Indeed, when the government sought an identical order this past October in federal court in Brooklyn, N.Y., Magistrate Judge James Orenstein asked for briefs from Apple, the government, and a group of privacy rights organizations and, four months later, has yet to issue an opinion.  Yet Judge Pym granted a similar order, without any stated justification, the same day that it was sought.

An order that is so far-reaching, so under-explained, and so clearly legally incorrect is deeply concerning.  And yet, but for Apple’s choice to publicize its opposition, this unjustified erosion of our privacy could have happened under the radar and without any way to un-ring the bell.  Fortunately, we appear to have avoided that outcome, and we can hope that Apple’s briefing will give the court the additional legal authority—and the additional time—that it will need to revisit its ruling.

Ifrah Law

Ifrah Law is a passionate team of experts that understands the importance of listening to and addressing the specific concerns of clients, whether they are facing the heat of a federal investigation or the ire of a business competitor. The firm has experience in complex cases related to online gambling and sports betting, internet marketing and advertising, and white-collar litigation.
