Zachary McCoy was considered a suspect in a house burglary because his Google location data showed police he had been near the home. Photograph: Samuel Jones/AP

The new warrant: how US police mine Google for your location and search history


Geofence location and keyword warrants are new law enforcement tools that have privacy experts concerned

It was a routine bike ride around the neighborhood that landed Zachary McCoy in the crosshairs of the Gainesville, Florida, police department.

In January 2020, an alarming email from Google landed in McCoy’s inbox. Police were requesting his user data, the company told him, and McCoy had seven days to go to court and block its release.

McCoy later found out the request was part of an investigation into the burglary of a nearby home the year before. The evidence that cast him as a suspect was his location during his bike ride – information the police obtained from Google through what is called a geofence warrant. For simply being in the wrong place at the wrong time, McCoy was being investigated and, as a result, his Google data was at risk of being handed over to the police.

Geofence location warrants and reverse search warrants such as the ones McCoy dealt with are increasingly becoming the tool of choice for law enforcement. Google revealed for the first time in August that it received 11,554 geofence location warrants from law enforcement agencies in 2020, up from 8,396 in 2019 and 982 in 2018.

It’s a concerning trend, argue experts and advocates. They worry the increase signals the start of a new era, one in which law enforcement agencies find ever more creative ways to obtain user information from data-rich tech companies. And they fear agencies and jurisdictions will use this relatively unchecked mechanism in the context of new and controversial laws such as the criminalization of nearly all abortions in Texas.

“As long as the data exists, all it takes is a creative law enforcement officer to say, ‘Hey, we can get a warrant or we can send a subpoena for this particular subset of the data that’s already being harvested’,” Caleb Kenyon, the defense attorney who represented McCoy, told the Guardian. “They’re coming up with everything they can to do their job. That’s all it takes for the next type of [reverse] search warrant to come about.”

Dragnet search warrant

Lawyers such as Kenyon and privacy experts argue that geofence warrants and other broad warrants – such as those asking companies to sift through the keywords people searched for – are akin to a general warrant, which is barred by the fourth amendment right against unreasonable searches and seizures. Unlike conventional search warrants, which are targeted and seek information about people law enforcement has probable cause to believe have committed a specific crime, these warrants don’t have a particular person in mind.

In other words, with a reverse search warrant law enforcement is still looking for a suspect, and it is asking tech companies to provide a list of people to investigate. For geofence warrants, anyone in a certain place at a certain time becomes a suspect and is subject to further investigation, which could mean handing police even more of their user data. For keyword search warrants, another relatively new mechanism for obtaining user information, anyone who searched for a certain phrase or address becomes a suspect.

The latter is potentially more far-reaching than geofence warrants, Kenyon argues, because keyword search warrants are not necessarily geographically or tangibly tied to a specific crime and could make suspects out of people around the world who happened to search for specific terms. “It’s what I would frame more of as a true digital warrant, without any ties or connections or tethers to the physical world,” he said.

Privacy groups argue that tech companies bear some responsibility for law enforcement’s growing access to these types of data, because they keep developing new features that index user information in ways that make it more searchable.

One such feature is Apple’s proposed child sexual abuse material (CSAM) detection function, which would analyze users’ images to detect child sexual abuse imagery.

“From our position, creating more vulnerabilities on our devices that can be abused whether by authoritarian governments or by law enforcement or hackers doesn’t make anyone safer,” said Caitlin Seeley George, the director of campaigns and operations at Fight for the Future, which organized a protest outside Apple stores in 11 cities to pressure the company into abandoning its plans for the feature. “It absolutely fits into the dragnet search and surveillance function of law enforcement because it makes images searchable.

“For communities being disproportionately targeted by law enforcement surveillance based on the color of their skin, their religion, their home country – this is adding more fodder,” she said.

Information vulnerability

The normalization of these mechanisms is particularly worrying as controversial laws such as the Texas abortion ban are being passed, said Albert Fox Cahn, a privacy advocate and the founder of the Surveillance Technology Oversight Project. While Texas’s law doesn’t allow public officials to sue abortion providers or those who help them, it doesn’t prohibit officials from aiding the private citizens who do sue, he pointed out.

“You could use a pretext to get a reverse search warrant targeting an abortion provider’s location, using literally any other law on the books, and then provide that information to activists,” Cahn said.

And it’s getting easier. A company called Hawk Analytics offers services that purport to put together Google geofence warrants in “just a few clicks”. In one webinar hosted exclusively for law enforcement, the company said it would walk attendees through “everything Google” including “what’s available, how to get it, and what to do with it, with an emphasis on the Google geofence reverse location returns”.

A Google spokesperson, Genevieve Park, said the company has challenged many overly broad government requests, though she did not specify how many.

“We use a rigorous process designed to honor our legal obligations while narrowing the scope of data disclosed,” Park said in a statement.

It’s not just major tech players like Google and Facebook that are targets, however. Personal information in the hands of smaller companies that may not have the resources or wherewithal to withstand sweeping warrants is just as vulnerable, Cahn said. “You could subpoena period tracker apps to provide any users who apparently became pregnant during a given time period, for example,” he said.

“This information is flowing to so many different companies and vendors, even if you get one company trying to protect your location data you have so many more points of vulnerability in the commercial market than a decade ago,” Cahn said. “All it takes is one company to give up that information without a fight or more often than not sell it.”

While there is legislation in the works that would impose safeguards on other means of getting hold of vast swaths of sensitive location data, such as cell site simulators and the outright sale of that information, there isn’t currently a publicly known congressional effort to do the same for geofence warrants, according to Jake Laperruque, a senior policy counsel at the Project on Government Oversight. In the meantime, the onus is on state and local jurisdictions not to issue overbroad warrants and on tech companies to fight off those warrants, Laperruque and Cahn argue.

For tech companies that count advertising among their revenue streams – or as a major source of revenue, as is the case for Google – there’s no real technical solution to curbing government requests for their data. “It would be technically impossible to have this data available to advertisers in a way that police couldn’t buy it, subpoena it or take it with a warrant,” Cahn said.

That’s why Apple’s now-postponed plan to launch a feature that scans for CSAM caused such a furor. When the FBI in 2016 asked Apple to unlock the phone of one of the perpetrators of the San Bernardino, California, mass shooting, Apple resisted the request, arguing it couldn’t comply without building a backdoor, which it refused to do. Once Apple begins scanning and indexing the photos of anyone who uses its devices or services, however, there’s little stopping law enforcement from issuing warrants or subpoenas for those images in investigations unrelated to CSAM.

“We can’t ignore that these technologies are sold within legal regimes where, if you create a tool to address one set of crimes, you can’t then refuse when governments force you to use it to identify other sorts of crimes like political dissent and religious expression,” Cahn said.

Though McCoy was one of several people swept up in a broad, dragnet-style warrant, he’s among the fortunate few. The police withdrew the subpoena after Kenyon filed a motion to quash. That Google notified McCoy about the request at all is a relative anomaly: subpoenas and warrants issued to tech companies often come with a non-disclosure clause. Still, as is typical of these notices, McCoy had just days to hire a lawyer who knew what a geofence warrant was and how to fight it – and such lawyers, Kenyon said, are still hard to find in many places. For many without resources, that’s a near impossible ask.

“Every single time one of these warrants is signed, it erodes a little bit of the bedrock of the protections we have under the law,” Kenyon said.
