Can the police search your phone?
The answer to that question is getting complicated.
But it’s important to know, because your phone, and the phones of every employee at your company, almost certainly contain company secrets — or provide access to those secrets.
Phones can provide access to passwords, contact lists, emails, phone call metadata, photos, spreadsheets and other company documents, location histories and much more.
Proprietary data — including information that would enable systematic hacking of company servers for sabotage, industrial espionage and worse — is protected from legal exposure by a complex set of well-understood laws and norms in the United States. But that same data is accessible from company phones.
Can the police simply take that information?
Until recently, most professionals would have said no.
Why? Because business and IT professionals tend to believe that smartphones are covered by the Fourth Amendment’s strictures against “unreasonable searches and seizures,” a protection recently reaffirmed by the Supreme Court. And smartphones are also protected by the Fifth Amendment, many would say, because divulging a passcode is akin to being “compelled” to be a “witness” against yourself.
Unfortunately, these beliefs are wrong.
The trouble with passcodes
Apple last year quietly added a new feature to iPhones designed to protect smartphone data from police searches. When you quickly press the on/off button on an iPhone five times, it turns off Touch ID and Face ID.
The thinking behind the so-called cop button is that, because police can compel you to use biometrics, but not a passcode, to unlock your phone, the feature makes it impossible for the legal system to force you to hand over information.
Unfortunately, this belief has now been undermined.
We learned this week that a Florida man named William John Montanez was jailed for six months after claiming that he forgot the passcodes for his two phones.
Montanez was pulled over for a minor traffic infraction. Police wanted to search his car. He refused. The police brought in dogs, which found some marijuana and a gun. (Montanez said the gun was his mother’s.) During the arrest, his phone got a text that said, “OMG, did they find it,” prompting police to get a warrant to search his phones. That’s when Montanez claimed he didn’t remember the passcodes, and the judge sentenced him to up to six months in jail for civil contempt.
As a precedent, this cascading series of events changes what we thought we knew about the security of the data on our phones. What started as an illegal turn ended up with jail time over the inability or unwillingness to divulge what we thought was a constitutionally protected bit of information.
We’ve also learned a lot recently about the vulnerability of location data on a smartphone.
The solution for individual users who want to keep location and other data private is simply to switch off the relevant feature, such as Location History in Google’s Android operating system. Right?
Not really. It turns out Google has been storing location data even after users turn off Location History.
The fiasco stemmed from false information that used to appear on Google’s site. Turning off Location History, the site said, meant that “the places you go are no longer stored.” In fact, they were stored, just not in the user-accessible Location History area.
Google corrected the false language, adding, “Some location data may be saved as part of your activity on other services, like Search and Maps.”
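To get a concrete sense of what this stored data looks like, users can download their own location records via Google Takeout and inspect the exported JSON. The sketch below parses such an export into timestamped coordinates; the field names (`timestampMs`, `latitudeE7`, `longitudeE7`) reflect the commonly documented Takeout layout at the time of writing and may change, so treat this as an illustration rather than a stable API.

```python
import json
from datetime import datetime, timezone

def parse_location_history(raw_json):
    """Parse a Google Takeout Location History export into
    (timestamp, latitude, longitude) tuples.

    Assumes the commonly documented export layout: a top-level
    "locations" list whose entries carry "timestampMs" (string,
    milliseconds since the epoch) and coordinates scaled by 1e7.
    """
    records = json.loads(raw_json)
    points = []
    for loc in records.get("locations", []):
        ts = datetime.fromtimestamp(int(loc["timestampMs"]) / 1000,
                                    tz=timezone.utc)
        lat = loc["latitudeE7"] / 1e7   # stored as degrees * 10^7
        lon = loc["longitudeE7"] / 1e7
        points.append((ts, lat, lon))
    return points

# Minimal synthetic export for illustration (not real user data):
sample = json.dumps({
    "locations": [
        {"timestampMs": "1531440000000",
         "latitudeE7": 437112200,
         "longitudeE7": -703026100}
    ]
})

for ts, lat, lon in parse_location_history(sample):
    print(ts.isoformat(), lat, lon)
```

Even a small export like this shows why such archives interest investigators: each record pins a device to a place at a specific moment.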
Stored data matters.
The FBI recently demanded from Google the data about all people using location services within a 100-acre area in Portland, Maine, as part of an investigation into a series of robberies. The request included the names, addresses, phone numbers, “session” times and duration, log-in IP addresses, email addresses, log files and payment information.
The order also said that Google could not inform users of the FBI’s demand.
Google did not comply with the request. But that didn’t keep the FBI from pushing for it.
In fact, police are evolving their methods, intentions and technologies for searching smartphones.
Police data-harvesting machines
A device called GrayKey, from a company called GrayShift, can unlock any iPhone or iPad.
GrayShift licenses the devices for $15,000 per year and up to 300 phone cracks.
It’s a turnkey system. Each GrayKey has two Lightning cables. Police need only plug in a phone, and eventually the phone’s passcode appears on the phone’s screen, giving full access.
That may be why Apple introduced in the fall a new “USB Restricted Mode” for iPhones. That mode makes it harder for police (or criminals) to crack a phone via the Lightning port.
The mode is activated by default, which is to say that the “switch” in settings for USB Accessories is turned off. With that switch off, the Lightning port won’t connect to anything after an hour of the phone being locked.
Unfortunately for iPhone users, “USB Restricted Mode” is easily defeated with a widely available $39 dongle.
And the U.S. isn’t the only country with police data-harvesting machines.
A world of trouble for smartphone data
Chinese authorities have their own technology for harvesting data from phones, and that technology is now being deployed by police in the field. Police anywhere in the country can demand that anyone hand over a phone to be scanned by such a device, a practice that is reportedly spreading across China.
Chinese authorities have both desktop and handheld scanner devices, which automatically extract and process emails, social posts, videos, photos, call histories, text messages and contact lists to aid them in looking for transgressions.
Some reports suggest that the devices, which are made by both Israeli and Chinese companies, are unable to crack newer iPhones but can access nearly every other kind of phone.
Another factor to be considered is that the protections of the U.S. Constitution end at the border — literally at the border.
As I’ve detailed here in the past, U.S. Customs is a “gray area” for Fifth Amendment constitutional protections.
And once abroad, all bets are off. Even in friendly, pro-privacy nations such as Australia.
The Australian government on Tuesday proposed a law called the Assistance and Access Bill 2018. If it becomes law, the act would require people to unlock their phones for police or face up to ten years in prison (the current maximum is two years).
It would empower police to legally bug or hack phones and computers.
The bill would force carriers, as well as companies such as Apple, Google, Microsoft and Facebook, to give police access to the private encrypted data of their customers if technically possible.
Failure to comply would result in fines of up to $7.3 million and prison time.
Police would need a warrant to crack, bug or hack a phone.
The bill may never become law. But Australia is just one of many nations affected by a new political will to end smartphone privacy when it comes to law enforcement.
If you take anything away from this column, please remember this: The landscape for what’s possible in the realm of police searches of smartphones is changing every day.
In general, smartphones are becoming less protected from police searches, not more protected.
That’s why the assumption of every IT department, every enterprise and every business professional — especially those of us who travel internationally on business — must be that the data on a smartphone is not safe from official scrutiny.
It’s time to rethink company policies, training, procedures and permissions around smartphones.