Minority Report

On December 3, 2024, the Pasco County Sheriff and four Plaintiffs entered a novel settlement agreement pertaining to the Sheriff’s use of predictive policing analytics, sometimes referred to as Intelligence-Led Policing. You can think of it as Minority Report, of Philip K. Dick (short story) and Tom Cruise (movie) fame.

According to the Brennan Center:

“Predictive policing uses computer systems to analyze large sets of data, including historical crime data, to help decide where to deploy police or to identify individuals who are purportedly more likely to commit or be a victim of a crime.”

While doing nothing to prevent or reduce crime, Pasco’s Sheriff’s Department did succeed in ruining people’s lives, as its ongoing harassment eventually led to evictions and code enforcement citations, further evidence of the mission creep such technologies enable.

“As the case proceeded, IJ uncovered explosive documents in which Sheriff’s Office employees laid out objectives of the program. In one email, a deputy stated that “the goal is to get them to move away or go to prison.” Another deputy, responsible for code enforcement citations, bragged in his annual performance review that his “most significant work related accomplishment” was that he “assisted in getting the people (mainly prolific offenders) from [thirteen] addresses evicted” from their homes. In response, his supervisor praised him for his performance.” (Institute for Justice press release)

Critics like Secure Justice argue that this technology’s reliance on historical data militates against its use, as it cements into place historically racist policing practices. It was because of such concerns that Oakland banned the use of predictive policing analytics.

Pasco County is located in Florida, and like many other troubling parts of that state’s government and local governments, its Sheriff was an enthusiastic participant in helping to build the school-to-prison pipeline. “According to the Sheriff, the program enables law enforcement to “identify at-risk youth who are destined for a life of crime.” Based on student data, including absences and grades, law enforcement records, and records from the Florida Department of Children and Families, the Sheriff assigns a score to every youth labeled “at-risk” by the Pasco County Schools to create a secret list of targeted youth. The Sheriff then uses that secret list to illegally harass and surveil young people in the community, which is particularly troublesome in a school district with stark racial disparities in school discipline.” (Apr. ‘21 Legal Defense Fund press release)

The settlement agreement was noteworthy for a couple of reasons:

  • It’s the first settlement we’re aware of pertaining to predictive policing analytics.

  • Although it went the traditional “no liability” route as most civil settlement agreements do, there are several “admissions” made by the Sheriff that can serve as inspiration for legal challenges in other jurisdictions.

While acknowledging that the Fourth Amendment does not require that a police officer obtain a warrant prior to simply knocking on one’s door, the Sheriff’s practice of repeatedly “checking” on specific individuals, including visits at night, “exceeded that implied license…”

Also, these same “prolific offender checks…directly and substantially interfered with the Plaintiffs’ right of intimate association” protected under the First Amendment.

Finally, while due process protects our individual liberty, such “prolific offender checks…interfered with Plaintiffs’ liberty interests…”

In plain language, these clauses equate to evidence of “over-policing.” We used this same reasoning in Oakland during our successful campaign to ban this harmful (and ineffective) technology. Oakland’s police department is now in its record-setting 22nd year of federal oversight due to systemic racist policing practices. Even today, Oakland’s state-mandated reporting of traffic-stop data reveals a disproportionate impact on Black drivers.

As several US Senators and Members of Congress wrote in their January 24, 2024 letter to US Attorney General Merrick Garland:

“Mounting evidence indicates that predictive policing technologies do not reduce crime. Instead, they worsen the unequal treatment of Americans of color by law enforcement. Predictive policing systems rely on historical data distorted by falsified crime reports and disproportionate arrests of people of color. As a result, they are prone to over-predicting crime rates in Black and Latino neighborhoods while under-predicting crime in white neighborhoods. The continued use of such systems creates a dangerous feedback loop: biased predictions are used to justify disproportionate stops and arrests in minority neighborhoods, which further biases statistics on where crimes are happening.”

In this letter, they argued that the DOJ’s funding of predictive policing analytics via the Edward Byrne Memorial Justice Assistance Grant Program violated Title VI of the Civil Rights Act of 1964, which prohibits federal funding of programs that discriminate. Although President Biden gave lip service to studying the impact of such technology in his October 20, 2023 Executive Order, it’s yet another example of releasing a product into the wild without first considering its impact, and the Biden Administration did nothing to stop funding of such products. Like the Urban Areas Security Initiative (UASI), the JAG program is a primary vehicle for federal funding of local surveillance technologies.

The use of such technologies is also worrisome as SoundThinking, maker of the widely used ShotSpotter gunshot detection technology, has now acquired almost all predictive policing analytics companies of note in the country, including the originator, PredPol. Like Axon and Motorola, SoundThinking is now just one of several police/surveillance tech verticals that local municipal police departments can do business with. ShotSpotter’s market penetration, like Axon’s body-worn cameras and Evidence.com platform, gives SoundThinking an outsized advantage in pushing its dangerous technologies onto local police departments. With Uncle Sam footing the bill, it’s the rare elected official that will decline.

Oakland’s ban on predictive policing analytics arose during the Privacy Advisory Commission’s review of Forensic Logic, a so-called “Google search for cops,” which had acquired Cop Logic, the original developer. Unlike the original ACLU CCOPS model, Oakland’s ordinance is unique in that it mandates that operating manuals and contracts (proposed or operative) be provided to the commission for review. Chair Hofer (also Secure Justice’s ED) noticed a predictive analytics feature that was glossed over during a demonstration provided by Forensic Logic. This technology is now owned by SoundThinking.

Secure Justice applauds the Pasco Plaintiffs for standing up for their civil rights, and the PASCO Coalition for supporting the termination of this dangerous technology and practice.