The Los Angeles police department has been a pioneer in predictive policing, for years touting cutting-edge programs that use historical data and software to predict future crime.
But newly revealed public documents detail how PredPol and Operation Laser, the department’s flagship data-driven programs, validated existing patterns of policing and reinforced decisions to patrol certain people and neighborhoods over others, leading to the over-policing of Black and brown communities in the city.
The documents, which include internal LAPD documents and emails and were released as part of a report by the Stop LAPD Spying coalition, also suggest that pledges to reform the programs amid rising public criticism largely rang hollow.
A new program that took shape after Operation Laser and PredPol were shuttered bears a striking resemblance to the programs it was supposed to reform, Stop LAPD Spying and independent experts who reviewed the documents say.
LAPD’s efforts to rebrand its predictive policing experiments mirror a broader shift in the private surveillance industry, the experts say, as companies increasingly reinvent existing products in response to negative press on predictive policing.
“Rather than re-evaluating their whole business model, they’re just trying to reframe the value of the product,” said Albert Fox Cahn, the founder of the Surveillance Technology Oversight Project (Stop), another anti-police-surveillance advocacy group. “They’re saying: here’s how you can prevent crime by allocating officers and changing patrols and changing who you engage with. And that’s going to result in the exact same outcomes.”
How Operation Laser created a vicious cycle
Launched in 2011, Operation Laser (an acronym for Los Angeles Strategic Extraction and Restoration) got its name from what LAPD hoped it would do: extract “offenders” with the precision of a doctor using laser surgery to remove a tumor.
On its face, using a data-backed approach to remove a “tumor” may seem logical. The problem was, according to critics and experts, that the data the program ran on was malignant.
Operation Laser used historical information such as data on gun-related crimes, arrests, and calls to map out “problem areas” (called “laser zones”) and “points of interest” (called “anchor points”) for officers to focus their efforts on. A newly established group, the crime intelligence detail, worked to create chronic offender bulletins, assigning criminal risk scores to people based on arrest records, gang affiliation, probation and field interviews. Information collected during these policing efforts was again fed into computer software that further helped automate the department’s crime-prediction efforts.
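The documents describe the chronic offender bulletins as point-based, drawing on arrest records, gang affiliation, probation status and field interviews, but this article does not reproduce the exact formula. The sketch below shows how such a tally might work in principle; the weights and field names are illustrative assumptions, not LAPD’s actual values.

```python
# Illustrative sketch only: a simplified point-based "risk score" of the kind
# described above. The inputs and weights are assumptions for illustration,
# not LAPD's actual formula.

def chronic_offender_score(person: dict) -> int:
    """Sum points from criminal-history flags and recorded police contacts."""
    score = 0
    if person.get("gang_affiliation"):        # flagged in a gang database
        score += 5
    if person.get("on_parole_or_probation"):
        score += 5
    if person.get("prior_gun_arrest"):
        score += 5
    if person.get("prior_violent_arrest"):
        score += 5
    # Every recorded stop (field interview card) adds a point, which is how
    # even unproductive stops could raise a person's score over time.
    score += person.get("field_interviews", 0)
    return score

# Example: someone stopped repeatedly but never arrested still accrues points.
print(chronic_offender_score({"field_interviews": 12}))  # -> 12
```

A scheme like this makes the role of field interview cards concrete: every stop recorded by an officer, productive or not, becomes an input that can raise a person’s score.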
Palantir was central to Operation Laser’s success, Craig Uchida, the program’s architect at LAPD, wrote in a 2012 research paper. The software, controversial for aiding US Immigration and Customs Enforcement in surveilling immigrants, made it easier and faster for the department to create chronic offender bulletins and put together information from various sources on people deemed suspicious or inclined to commit a crime, Uchida said.
But the picture of crime in LA the software drew up was based on calls for service, crime reports and information collected by officers, the documents show, creating a vicious loop.
“When police target an area it generates more crime reports, arrests, and stops at that location and the subsequent crime data will lead the algorithm, risk assessment, or data analytic tool to direct police back to the same area,” the Stop LAPD Spying report explains.
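That loop can be illustrated with a toy simulation, not a model of any actual LAPD system: two neighborhoods with identical underlying crime, where recorded incidents depend on patrol presence and next week’s patrols are allocated according to what was recorded. All parameters here are arbitrary assumptions for illustration.

```python
# Toy simulation of the feedback loop described above: more patrols produce
# more recorded incidents, which in turn attract more patrols. Parameters are
# arbitrary assumptions for illustration only.

true_crime_rate = [1.0, 1.0]   # two neighborhoods with identical underlying crime
patrols = [0.6, 0.4]           # neighborhood A starts with slightly more patrols
recorded = [0.0, 0.0]          # cumulative recorded incidents

for week in range(20):
    for i in range(2):
        # Recorded incidents depend on how many officers are there to record them.
        recorded[i] += true_crime_rate[i] * patrols[i]
    total = sum(recorded)
    # Next week's patrols are allocated in proportion to recorded incidents so far.
    patrols = [recorded[i] / total for i in range(2)]

print([round(p, 2) for p in patrols])
# -> [0.6, 0.4]: underlying crime is identical, but the data keeps sending
#    patrols back to the neighborhood that was patrolled more to begin with.
```

The point of the sketch is that the initial disparity never corrects itself, because the system only ever sees what patrols record, not crime itself.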
In 2019, the LAPD inspector general, Mark Smith, said the criteria used in the program to identify people likely to commit violent crimes were inconsistent.
Documents included in the Stop LAPD Spying report, as well as documents that had previously been made public, confirm that Operation Laser in some cases was anything but precise. Relying on information collected in field cards (the interview cards officers are required to fill out when stopping someone) to help identify chronic offenders or areas that needed more patrolling, for example, meant that even random stops could mark a person as a potential suspect or make them subject to more surveillance.
Officers were instructed to fill out the field interview cards with as much information as possible every time they stopped someone. Uchida told Wired in 2017 that he knew “most of the time [the cards] didn’t lead to anything, but it was … data that went into the system, and that’s what I wanted”.
As the Guardian revealed on Sunday, one of the locations that Operation Laser targeted was the Crenshaw district, where the rapper Nipsey Hussle was based. Hussle had long complained about policing in his neighborhood, saying in a 2013 interview that LAPD officers “come hop out, ask you questions, take your name, your address, your cell phone number, your social, when you ain’t done nothing. Just so they know everybody in the hood.”
The documents show LAPD identified the site of Hussle’s clothing store, Crenshaw Boulevard and Slauson Avenue, as an anchor point due to suspected gang-related activity as early as 2016.
The full extent of the operation targeting Hussle and his businesses remains unclear, but the documents show police efforts in the area were intense, and often imprecise. Searching for a robbery suspect described only as a Black man between the ages of 16 and 18, officers stopped 161 people and arrested 10 at the intersection where the store was located in a span of two weeks.
The consequences could be severe. The information of civilians stopped at the intersection would be fed into the data system, even if they hadn’t committed any offenses.
Some encounters turned deadly. In 2016, LAPD shot and killed 31-year-old Keith Bursey Jr at the intersection, after the car Bursey was a passenger in was stopped by gang enforcement police investigating “an odor of marijuana”. The officers shot Bursey in the back as he attempted to flee, one of six Black and Latino men to be shot by police in Laser zones in a six-month period in 2016.
Cliff Dorsey, a public defender and Bursey’s cousin, said treating an entire location as a “gang area” could lead to unjustified contact with police and criminalize people based on their neighborhood affiliation. “It creates a culture of distrust where people don’t feel comfortable talking to the police,” he said. “When there’s no trust, the community doesn’t feel like their humanity is being respected ... It’s this ‘us versus them’ mentality.”
PredPol’s earthquake theory of crime
In addition to running Operation Laser, LAPD contracted with PredPol, a company that grew out of a research project between LAPD and the UCLA professor Jeff Brantingham.
PredPol applied an earthquake prediction model to crime. The underlying theory – which the company once compared to the unproven and controversial broken windows policing strategy – was that like earthquakes and their aftershocks, smaller crimes were gateways to bigger crimes and occurred in similar places. While the mathematics might look complicated for “normal mortal humans”, PredPol said in a 2014 presentation obtained by Motherboard, the model was “based on nearly seven years of detailed academic research into the causes of crime pattern formation”.
But academics say the theory is flawed, and the math the company pitched to police was too simple to effectively predict crime. The model was essentially assessing where arrests had been made and sending police back to those locations, according to those academics.
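The “aftershock” idea corresponds to what the academic literature PredPol grew out of calls a self-exciting point process, in which each recorded event temporarily raises the predicted rate of events nearby. A minimal sketch of that kind of model, with illustrative parameters (mu, k and omega are assumptions, not the company’s actual values), shows why critics say it largely re-ranks places by their recent recorded events:

```python
# Minimal sketch of a self-exciting ("aftershock") intensity model of the kind
# PredPol's research grew out of. The parameters mu, k and omega are
# illustrative assumptions; the real system's values are not public here.
import math

def intensity(t: float, past_events: list,
              mu: float = 0.2, k: float = 0.5, omega: float = 1.0) -> float:
    """Expected event rate at time t (in days) for one map grid cell.

    mu     -- constant background rate for the cell
    k      -- how much each past event boosts the near-term rate
    omega  -- how quickly that boost decays
    """
    boost = sum(k * omega * math.exp(-omega * (t - ti))
                for ti in past_events if ti < t)
    return mu + boost

# A cell with recent recorded events is ranked far above an identical cell
# without them -- the ranking is driven by what was recorded there, not by
# any independent measure of underlying crime.
print(intensity(10.0, past_events=[9.0, 9.5]))  # roughly 0.69
print(intensity(10.0, past_events=[]))          # 0.2, just the background rate
```

In a model of this shape, the highest-scoring grid cells are, by construction, the ones with the most recent recorded incidents, which is the academics’ core objection.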
More than a dozen police departments experimented with PredPol, including in Palo Alto and Mountain View. But by the end of 2019, both Operation Laser and PredPol had garnered intense criticism, with skeptics charging that the systems perpetuated discrimination.
By that time, several police departments had dropped their contracts with PredPol, saying there was little proof it helped reduce crime. After three years of use, the Palo Alto police department “didn’t get any value out of it”, a spokesperson, Janine De la Vega, said at the time.
LAPD initially promised reform, but ultimately shuttered Operation Laser in April 2019 and canceled its contract with PredPol in April 2020. LAPD conceded the data used in Operation Laser “was inconsistent” and needed to be reassessed. PredPol, it said, was terminated because of budgetary constraints due to the pandemic. Still, the police chief, Michel Moore, maintained the underlying principles of the program were valuable.
A bid to establish ‘digital trust’
With the data-driven programs LAPD had promoted for years now gone, a new effort took their place. Days before announcing the end of PredPol, LAPD published information about what it called data-informed community-focused policing (DICFP). The intention of the program, the department said, was to establish a deeper relationship between community members and police and address some of the concerns the public had with previous policing programs, all while working to prevent crime. “The legitimacy of a police department is dependent on a community’s trust in its police officers,” an April 2020 LAPD brochure on the program read.
In the brochure, Moore conceded previous strategies that focused “solely on proactive suppression” left “neighborhoods feeling over-policed, singled out, and unnerved”. To that end, LAPD would be more transparent and its processes more standardized, working to collaborate more closely with local residents.
DICFP has three goals, according to the documents in the report: increase trust, reduce crime, and assist victims of crime.
Program leaders felt regaining that public trust was critical, the documents show. Without it, they argued in internal emails, police could be forced to relinquish predictive policing tools entirely. A year before DICFP was introduced, Sean Malinowski, the product manager for predictive policing at LAPD at the time and now a police consultant, asked Andrew Ferguson, a law professor and author of The Rise of Big Data Policing, for help with establishing “digital trust” between police departments and communities. Malinowski wrote that he worried bad PR could cause the department to “lose good tools” if they didn’t “get out in front of it”.
In its implementation, however, the program bears a striking resemblance to the predictive policing programs it purportedly reformed, several experts who reviewed the brochure argued.
The similarities start with what LAPD now calls “neighborhood engagement areas” or “neighborhoods experiencing crimes and low community engagement”. Like anchor points, those areas are identified based on information such as crime data and calls for service, which include anything from calls about robberies to traffic-related incidents and “non-emergency” calls, according to a daily operations guide.
To address crime in neighborhood engagement areas, according to the brochure, LAPD would use a problem-solving model first introduced under Operation Laser called Sara – an acronym for scanning, analysis, response and assessment. As part of that model, police and stakeholders would use tools such as increased patrolling and surveillance to prevent future crimes.
DICFP’s similarities to Operation Laser extended to its results: at least one anchor point under the previous regime was also selected as a neighborhood engagement area in 2020, and at least one other area of interest was located within what had previously been a Laser zone, the documents show.
Martin Luther King Jr Park in south-west LA – which documents show was an anchor point in 2016 and 2018 – was also identified as a neighborhood engagement area in March 2020 because the parking lot next to it was “where gang members are loitering”. A section in one of the documents that asks for a description of the “crime trend, activity, or quality of life issues” describes complaints of “tailgating activities with barbecue grills and alcohol” as well as overnight parking and encampments. In order to prevent future crime, the document notes, police did sweeps of the park, cited vehicles and dispatched additional gang units and patrols.
Where the document asks the officer to indicate which of the three goals of DICFP the project accomplished, nothing is circled.
Data-driven policing helps automate existing police logic, according to Shakeer Rahman, a Stop LAPD Spying community organizer. “That includes targeting poor people, targeting unhoused people, targeting Black, brown and disabled people. This is now helping to automate those practices and automate the harm, automate the banishment, automate the displacement that policing has always been responsible for.”
A relic of Operation Laser, the crime intelligence detail – which was responsible for the chronic offender bulletins – was combined with another unit and renamed the area crime and community intelligence centers. The centers would determine where to deploy resources using similar sources of information as under Operation Laser, including investigative reports, arrest reports and field interviews, as well as similar tools, such as Palantir and gang databases.
LAPD said in 2020 it would conduct a study on the efficacy of DICFP. The department did not respond to questions on the status of the study.
“I don’t know about you but I’m not building trust with someone who spies on me,” said Tracey Corder, the deputy campaign director at Acre, a group that helps local organizations campaign against racial injustice.
“It sounds like a rebrand,” she continued. “It’s a co-option of organizer demands and organizing wins. We have set the stage and said policing as it exists does not work. All of this has been an effort to not actually change, but rebrand and reuse what they’ve already been doing.”
Cahn, the Surveillance Technology Oversight Project founder, said: “It seems like the worst sort of fear of organizers. Rather than actually addressing any of the substantive harms that come from predictive policing, they’re simply providing this veneer of community engagement.”
LAPD did not reply to repeated and detailed requests for comment.
‘They’re trying to do some whitewashing’
The LAPD was not alone in rebranding its predictive policing efforts. A month after the department introduced DICFP, PredPol changed its name to Geolitica. On the company website, where a banner once said it was “the predictive policing company” that works to “predict critical events”, Geolitica now boasts of “data-driven community policing” that helps public safety teams “be more transparent, accountable, and effective”.
Privacy advocates say LAPD and PredPol’s efforts were part of a larger trend in the predictive policing industry – both in police departments and private companies. In response to public criticism of predictive policing, companies have rebranded existing products or launched new products that promote police accountability and transparency.
“They’re definitely aware of all the negative connotations of predictive policing,” said Brian Hofer, the executive director of the government reform advocacy group Secure Justice and the chair of the Oakland Privacy Commission. “They’re trying to really do some whitewashing by rebranding different verbiage and talking about serving these communities instead.”
But safeguards shouldn’t be left to police or tech companies to implement, Corder argues.
“When you think about the way police respond to any kind of calls for reforms from civilians, it’s always oppositional,” Corder said of police departments that use these purported accountability services. “But now all of a sudden we’re supposed to believe they are fine with oversight coming from tech companies? Anybody should be concerned about that and we should start asking the question of why.”
Sam Levin contributed reporting