SAN FRANCISCO — Apple has long positioned the iPhone as a secure device that only its owner can open. That has led to battles with law enforcement officials who want to get information off the devices, including a well-publicized showdown with the F.B.I. in 2016 after Apple refused to help open the locked iPhone of a mass killer.
The F.B.I. eventually paid a third party to get into the phone, circumventing the need for Apple’s help. Since then, law enforcement agencies across the country have increasingly employed that strategy to get into locked iPhones they hope will hold the key to cracking cases.
But now Apple is closing the technological loophole that let authorities hack into iPhones, angering police and other officials and reigniting a debate over whether the government has a right to get into the personal devices that are at the center of modern life.
Apple said it was planning an iPhone software update that would effectively disable the phone’s charging and data port — the opening where users plug in headphones, power cables and adapters — an hour after the phone is locked. In order to transfer data to or from the iPhone using the port, a person would first need to enter the phone’s password. (Phones could still be charged without a password.)
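Apple has not published the internals of the feature, but the policy described above can be modeled in a few lines. The sketch below is purely illustrative; the class and method names are hypothetical, not Apple's code:

```python
from datetime import datetime, timedelta

USB_LOCKOUT = timedelta(hours=1)  # the port locks an hour after the phone does


class PortPolicy:
    """Toy model of the described charging/data-port behavior."""

    def __init__(self):
        self.locked_at = None  # None means the phone is unlocked

    def lock(self, now: datetime):
        self.locked_at = now

    def allows(self, request: str, now: datetime, passcode_ok: bool = False) -> bool:
        if request == "charge":
            return True  # charging works regardless of lock state
        if self.locked_at is None or passcode_ok:
            return True  # phone is unlocked, or the passcode was entered
        # Data transfer is refused once the hour has elapsed
        return now - self.locked_at < USB_LOCKOUT
```

Under this model, a forensic device plugged in days after the phone was last unlocked gets a charging circuit and nothing else, which is exactly the scenario the article describes.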
Such a change would hinder law enforcement officials, who have typically been opening locked iPhones by connecting another device running special software to the port, often days or even months after the smartphone was last unlocked. News of Apple’s planned software update has begun spreading through security blogs and law enforcement circles — and many in investigative agencies are infuriated.
“If we go back to the situation where we again don’t have access, now we know directly all the evidence we’ve lost and all the kids we can’t put into a position of safety,” said Chuck Cohen, who leads an Indiana State Police task force on internet crimes against children. The Indiana State Police said it unlocked 96 iPhones for various cases this year, each time with a warrant, using a $15,000 device it bought in March from a company called Grayshift.
But privacy advocates said Apple would be right to fix a security flaw that has become easier and cheaper to exploit. “This is a really big vulnerability in Apple’s phones,” said Matthew D. Green, a professor of cryptography at Johns Hopkins University. A Grayshift device sitting on a desk at a police station, he said, “could very easily leak out into the world.”
In an email, an Apple spokesman, Fred Sainz, said the company was constantly strengthening security protections and fixed any vulnerability it found in its phones, partly because criminals could exploit the same flaws that law enforcement agencies use. “We have the greatest respect for law enforcement, and we don’t design our security improvements to frustrate their efforts to do their jobs,” he said.
Apple and Google, which make the software in nearly all of the world’s smartphones, began encrypting their mobile software in 2014. Encryption scrambles data to make it unreadable until accessed with a special key, often a password. That frustrated police and prosecutors who could not pull data from smartphones, even with a warrant.
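The principle — data is scrambled with a key derived from the user's password, and is noise without it — can be illustrated with standard-library tools. This is a toy sketch, not the scheme phones actually use (real devices derive keys in dedicated hardware and encrypt with AES); the function names here are invented for illustration:

```python
import hashlib
import hmac
import os


def derive_key(password: str, salt: bytes) -> bytes:
    # PBKDF2 stretches a short passcode into a strong key; many iterations
    # make each guess expensive for an attacker
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)


def xor_stream(key: bytes, data: bytes) -> bytes:
    # Minimal keystream cipher built from HMAC-SHA256 in counter mode,
    # for illustration only; applying it twice with the same key decrypts
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        out.extend(block)
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))


salt = os.urandom(16)
key = derive_key("123456", salt)
ciphertext = xor_stream(key, b"contacts, photos, messages")
```

With the right password the same operation recovers the plaintext; with any other password it yields gibberish, which is why a warrant alone does not help investigators read a locked phone.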
The friction came into public view after the F.B.I. could not access the iPhone of a gunman who, along with his wife, killed 14 people in San Bernardino, Calif., in late 2015. A federal judge ordered Apple to figure out how to open the phone, prompting Timothy D. Cook, Apple’s chief executive, to respond with a blistering 1,100-word letter that said the company refused to compromise its users’ privacy. “The implications of the government’s demands are chilling,” he wrote.
The two sides fought in court for a month. Then the F.B.I. abruptly announced that it had found an undisclosed group to hack into the phone, for which it paid at least $1.3 million. An inspector general’s report this year suggested the F.B.I. should have exhausted more options before it took Apple to court.
Since then, two main companies have helped law enforcement hack into iPhones: Cellebrite, an Israeli forensics firm purchased by Japan’s Sun Corporation in 2007, and Grayshift, which was founded by a former Apple engineer in 2016. Law enforcement officials said they generally send iPhones to Cellebrite to unlock, with each phone costing several thousand dollars to open. In March, Grayshift began selling a $15,000 GrayKey device that the police can use to unlock iPhones themselves.
Apple has closed loopholes in the past. For years, the police used software to break into phones by simply trying every possible passcode. Apple blocked that technique by disabling iPhones after a certain number of wrong passcodes, but the Grayshift and Cellebrite software appears to be able to disable that Apple technology, allowing their devices to test thousands of passcodes, Mr. Green said.
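The contest described above — exhaustive guessing versus an attempt counter — can be sketched briefly. This is a toy model with invented names, not Apple's lockout logic, and the attempt limit shown is an assumption:

```python
MAX_ATTEMPTS = 10  # assumed limit; after this many misses the phone disables itself


class Phone:
    """Toy model of passcode lockout."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed = 0
        self.disabled = False

    def try_passcode(self, guess: str) -> bool:
        if self.disabled:
            return False
        if guess == self._passcode:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            self.disabled = True
        return False


def brute_force(phone: Phone):
    # Try every 4-digit code in order: at most 10,000 guesses, but the
    # lockout counter stops the attack long before the space is exhausted
    for code in range(10_000):
        guess = f"{code:04d}"
        if phone.try_passcode(guess):
            return guess
        if phone.disabled:
            return None
    return None
```

In this model only passcodes that happen to fall within the first few guesses are recoverable; the forensic tools' reported trick is to suppress the counter itself, restoring the full guess budget.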
Cellebrite declined to comment. Grayshift did not respond to requests for comment.
Opening locked iPhones through these methods has become more common, law enforcement officials said. Federal authorities, as well as large state and local police departments, typically have access to the tools, while smaller local agencies enlist the state or federal authorities to help on high-profile cases, they said.
Law enforcement agencies that have purchased a GrayKey device include the Drug Enforcement Administration, which bought an advanced model this year for $30,000, according to public records. Maryland’s state police have one, as do police departments in Portland, Ore., and Rochester, Minn., according to records.
Hillar Moore, the district attorney in Baton Rouge, La., said his office had paid Cellebrite thousands of dollars to unlock iPhones in five cases since 2017, including an investigation into the hazing-related death of a fraternity pledge at Louisiana State University. He said the phones had yielded crucial information, and he was upset that Apple planned to close such a useful investigative avenue.
“They are blatantly protecting criminal activity, and only under the guise of privacy for their clients,” he said.
Michael Sachs, an assistant district attorney in Manhattan, said his office uses workarounds — he declined to specify which — to access locked iPhones several times a week. That has helped solve a series of cases in recent months, including by getting into an iPhone to find videos of a suspect sexually assaulting a child. The man was convicted this year.
In the first 10 months of 2017, the Manhattan district attorney’s office said it had recovered and obtained warrants or consent to search 702 locked smartphones, two-thirds of which were iPhones. Smartphones running Google’s Android software have been generally easier to access, partly because many older devices lack encryption.
The encryption on smartphones applies only to data stored solely on the phone. Companies like Apple and Google regularly give law enforcement officials access to the data that consumers back up on their servers, such as via Apple’s iCloud service. Apple said that since 2013, it has responded to more than 55,000 requests from the United States government seeking information about more than 208,000 devices, accounts or financial identifiers.
The tussle over encrypted iPhones and opening them to help law enforcement is unlikely to simmer down. Federal officials have renewed a push for legislation that would require tech companies like Apple to provide the police with a backdoor into phones, though they were recently found to be overstating the number of devices they could not access.
Apple probably won’t make it any easier for the police if not forced by Congress, given that it has made the privacy and security of iPhones a central selling point. But the company has complied with local laws that conflict with its stated values. In China, for instance, Apple recently began storing its Chinese customers’ data on Chinese-run servers because of a new law there.
Apple’s latest move is part of a longer cat-and-mouse game between tech companies and law enforcement, said Michelle Richardson, an analyst at the Center for Democracy and Technology, which supports protections for online privacy.
“People always expected there would be this back-and-forth — that government would be able to hack into these devices, and then Apple would plug the hole and hackers would find another way in,” she said.