Secure Justice

What To Read/Watch




After hiQ Labs, Is Scraping Public Data Legal? (Guest Blog Post)

Last year, the most important case in the history of web scraping—hiQ Labs, Inc. v. LinkedIn Corp.—settled. After two trips to the 9th Circuit, a remand from the Supreme Court, and nearly six years of motions and posturing, the outcome of the litigation was a permanent injunction against hiQ, a win for LinkedIn, and insolvency for scraper hiQ Labs.

Read Kieran McCarthy's blog post here.



This document provides an overview of how lower courts have applied the Supreme Court’s landmark location-tracking case, Carpenter v. United States, 585 U.S. ___, 138 S. Ct. 2206 (2018), to other digital searches. Below I identify various government electronic surveillance techniques and flag cases in which courts have addressed whether the technique at issue is a search in light of Carpenter. I have tried to identify the major cases, but this document is not comprehensive; it focuses primarily on federal circuit court and state supreme court decisions that decide the issue on federal Fourth Amendment grounds.

I plan to update this document occasionally. The most recent version will be viewable at https://n2t.net/ark:/85779/j4ww8w


How should society respond to police surveillance technologies? This question has been at the center of national debates over facial recognition, predictive policing, and digital tracking technologies. The debate has divided activists, law enforcement officials, and academics, and it will remain a central question for years to come as police surveillance technology grows in scale and scope. Do you trust police to use the technology without regulation? Do you ban surveillance technology as a manifestation of discriminatory carceral power that cannot be reformed? Can you regulate police surveillance with a combination of technocratic rules, policies, audits, and legal reforms? This Article explores the taxonomy of past approaches to policing technologies and—finding them all lacking—offers the “tyrant test” as an alternative.

The tyrant test focuses on power. Because surveillance technology offers government a new power to monitor and control citizens, the response must check that power. The question is how, and the answer is to assume the worst. Power will be abused, and constraints must work backwards from that cynical starting point. The tyrant test requires institutional checks that decenter government power into overlapping community institutions with real authority and enforceable individual rights.

The tyrant test borrows its structure from an existing legal framework also designed to address the rise of a potentially tyrannical power—the United States Constitution and, more specifically, the Fourth Amendment. Fearful of a centralized federal government with privacy invading intentions, the Fourth Amendment—as metaphor and methodology—offers a guide to approaching surveillance; it allows some technologies but only within a self-reinforcing system of structural checks and balances with power centered in opposition to government. The fear of tyrannical power motivated the original Fourth Amendment and still offers lessons for how society should address the growth of powerful, new surveillance technologies.