ALPRs and the Fourth Amendment: What Schmidt v. City of Norfolk Gets Wrong

(by Pegah K. Parsi)

In Schmidt v. City of Norfolk, No. 2:24-cv-621 (E.D. Va. Jan. 27, 2026), a federal court upheld Norfolk’s citywide automated license plate reader (ALPR) program against a Fourth Amendment challenge. The court concluded that the program did not amount to a “search” because it did not provide continuous tracking, contained gaps, and captured only information visible on public roads.

At first glance, the ruling may seem narrow and technical. In reality, it has broad consequences. If the court’s reasoning stands, governments across the country will remain free to collect, store, and analyze large-scale location data about ordinary people without warrants, meaningful oversight, or individualized suspicion.

The decision reflects a deeper problem: courts are struggling to apply Fourth Amendment principles to modern, data-driven surveillance increasingly shaped by artificial intelligence. By relying on analog-era assumptions about observation and exposure, the court misapplies Carpenter v. United States and Leaders of a Beautiful Struggle v. Baltimore Police Department, and underestimates how modern analytics transform scattered data into powerful surveillance.

The result is a decision that is legally fragile and consequential for privacy.

Why This Case Matters Beyond Norfolk

ALPR systems are largely invisible. Cameras mounted on poles, streetlights, or police vehicles record license plates, locations, and timestamps. Each record seems insignificant. Over time, however, those records accumulate into a detailed map of where people live, work, learn, socialize, worship, seek medical care, and engage in political activity. They reveal patterns of association and behavior.

The real question in Schmidt is not whether police can observe a car on a public street. It is whether the government can build a persistent, searchable database of everyone’s movements without judicial oversight. If courts treat such systems as constitutionally harmless because data points are public or infrequent, Fourth Amendment protections risk becoming hollow in an era of mass data collection and artificial intelligence.

The Court’s Core Mistake: Separating Collection from Analysis

The court suggests that data collection is harmless unless officers actively analyze it. At one point, it states that a collection of photos taken in public places reveals nothing about a person’s movements if it is never accessed or analyzed.

That premise is flawed.

The constitutional problem is not observation, but the creation of persistent traceability. The Fourth Amendment concern is not that the government observes what is public, but that it converts fleeting public activity into durable, searchable state knowledge. Mass location collection is best understood as a seizure of data, and the privacy harm arises when the state creates the capacity to trace past movements and infer patterns about people’s lives.

In Carpenter v. United States, the Supreme Court emphasized that the constitutional harm lies in the government’s ability to reconstruct a person’s movements through a searchable dataset. 585 U.S. 296, 312–13 (2018). ALPR systems are designed to create precisely this kind of retrospective record.

Moreover, data collection itself creates concrete risks before any review occurs. Once collected, data can be breached, repurposed, shared, or misused, and individuals may alter their behavior in anticipation of surveillance. By treating analysis rather than collection as the constitutional trigger, the court understates both the logic of Carpenter and the realities of modern data practices.

Requiring Near-Perfect Surveillance Misreads Carpenter

The court emphasizes that Norfolk’s system is not like an ankle monitor and does not provide near-constant tracking. It highlights gaps in time and space and concludes that the system is therefore not constitutionally problematic.

But Carpenter rejected this logic. The Supreme Court warned against treating surveillance as constitutionally significant only when it is perfect or continuous. 585 U.S. at 312–13. Even partial datasets can reveal sensitive patterns, including home and work locations and visits to religious, medical, or political sites. Insight does not grow linearly with data volume; relatively sparse data can yield rich inferences when aggregated over time and combined with other information.

By effectively requiring near-total surveillance before constitutional protections apply, the court resurrects a quantitative threshold the Supreme Court has already rejected.

Misunderstanding Modern Inference

A central feature of the court’s reasoning is that gaps in ALPR coverage prevent meaningful tracking. Because cameras do not capture every moment of movement, the court concludes that the system cannot reveal a person’s life patterns.

This reflects a misunderstanding of modern analytics.

Continuous observation is no longer necessary to reconstruct movement. With repeated captures, road networks, traffic models, and vehicle identification techniques, ALPR systems can infer likely routes, destinations, and routines. In contemporary surveillance systems, inference is not incidental; it is essential.
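
To illustrate the point, the sketch below (with entirely hypothetical plates, camera names, and time windows, and logic simplified far beyond what real analytics platforms do) shows how even a handful of timestamped plate reads per day can be aggregated to suggest a driver’s likely home-side camera, commute route, and a midday visit to a clinic.

    # Hypothetical illustration: inferring routine locations from sparse ALPR reads.
    # All plates, camera names, and time windows are invented for demonstration.
    from collections import Counter
    from datetime import datetime

    # Each record: (plate, camera_location, timestamp) -- only a few reads per day.
    reads = [
        ("ABC1234", "Elm St & 5th Ave", "2025-03-03 07:42"),
        ("ABC1234", "Harbor Pkwy",      "2025-03-03 08:10"),
        ("ABC1234", "Harbor Pkwy",      "2025-03-03 17:35"),
        ("ABC1234", "Elm St & 5th Ave", "2025-03-03 18:05"),
        ("ABC1234", "Elm St & 5th Ave", "2025-03-04 07:48"),
        ("ABC1234", "Harbor Pkwy",      "2025-03-04 08:15"),
        ("ABC1234", "Clinic Rd",        "2025-03-05 12:20"),
        ("ABC1234", "Elm St & 5th Ave", "2025-03-05 21:55"),
    ]

    def infer_routine(reads, plate):
        """Guess home-side and commute-route cameras from time-of-day patterns."""
        morning, evening = Counter(), Counter()
        for p, camera, ts in reads:
            if p != plate:
                continue
            hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
            if 5 <= hour < 10:              # outbound-commute window
                morning[camera] += 1
            elif hour >= 17 or hour < 5:    # return / overnight window
                evening[camera] += 1
        home = evening.most_common(1)[0][0] if evening else None
        # Commute-direction guess: most frequent morning camera other than home-side.
        commute = next((c for c, _ in morning.most_common() if c != home), None)
        return home, commute

    home, commute = infer_routine(reads, "ABC1234")
    print("Likely home-side camera:", home)         # Elm St & 5th Ave
    print("Likely commute-route camera:", commute)  # Harbor Pkwy

Real systems draw on vastly more data and far more sophisticated models; the only point here is that sparse reads, once stored and made queryable, already support this kind of inference.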

This approach also conflicts with Leaders of a Beautiful Struggle, where the Fourth Circuit held that surveillance may violate the Fourth Amendment when it enables deductions about the whole of a person’s movements over time, even if interpretation requires additional steps. 2 F.4th 330 (4th Cir. 2021) (en banc).

Power Asymmetry and Chilling Effects

The court frames the issue largely as a technical question about how much information ALPR systems collect. But Fourth Amendment reasonableness also implicates asymmetries of knowledge and power, as well as chilling effects.

ALPR systems differ from ordinary cameras because they are systematic, identity-linked, retrospective, automated, scalable, and increasingly AI-enabled. They give the state a persistent observational advantage over citizens. As Carpenter explained, people do not expect the government to secretly and continuously monitor and catalogue their movements over time, even in public spaces. 585 U.S. at 314. The privacy harm arises at the moment when the state captures and retains data, because that act alters the balance of power between individuals and government in ways that are difficult to reverse. Those structural consequences should matter in any Fourth Amendment analysis.

The Stakes

If courts require near-total surveillance before recognizing a Fourth Amendment search, governments will be free to build expansive location databases so long as they avoid perfect coverage. That is not a narrow outcome. It is a blueprint for dragnet surveillance.

ALPR systems are not just cameras. They are components of distributed surveillance networks capable of reconstructing the patterns of everyday life.

Suggestions

For Courts and Judges

Courts should recognize that aggregated location surveillance is fundamentally different from isolated public observation. When the government creates an identity-linked database of movements over time, Fourth Amendment scrutiny should apply even if each individual data point was captured in public.

Judges should:

  • apply Carpenter and Beautiful Struggle to modern surveillance systems rather than analogizing to traditional observation,

  • reject thresholds that require near-perfect surveillance before constitutional protections attach, and

  • evaluate surveillance programs based on identity linkage, longitudinal persistence, queryability, analytic capability, scale, automation, and systemic impact, not merely the frequency of data collection.

For Legislators and Policymakers

Legislatures should not assume that existing safeguards adequately protect privacy. Retention limits and internal policies do not address the core issue: the creation of large-scale location databases without individualized suspicion.

Policymakers should consider:

  • warrant requirements for law enforcement access to ALPR data,

  • strict limits on retention, sharing, and secondary use,

  • transparency and public reporting requirements,

  • prohibitions on bulk or suspicionless querying of location data, and

  • regular external audits of law enforcement practices.

These measures would not eliminate ALPR technology. They would align its use with constitutional values and democratic accountability.

For the Public and Civil Society

The Schmidt decision exposes a gap between how surveillance technologies actually operate and how courts sometimes understand them. Civil society organizations, technologists, and scholars play a critical role in closing that gap by building factual records and informing courts and policymakers.

Without sustained attention, tools designed for traffic enforcement or crime prevention can quietly evolve into infrastructure for pervasive monitoring.

Conclusion

Schmidt v. City of Norfolk illustrates a defining challenge for Fourth Amendment law: whether constitutional protections will adapt to data-driven, AI-enabled surveillance or remain tied to outdated assumptions about observation and privacy. If courts continue to require near-total surveillance before recognizing a search, they risk normalizing pervasive monitoring as the constitutional baseline.

Pegah K. Parsi is a privacy attorney and advocate. Besides consulting on law enforcement impacts on privacy, she works on education, research, and health privacy. Pegah previously served two terms as Vice-Chair on the City of San Diego’s Privacy Advisory Board.
