Watching the Watchers

Privacy, Technology, and the Slow Surrender of the Self

Marc Friedman
PUBLISHED IN Privacy - 9 MINS - Mar 25, 2026

Few rights feel more instinctively human than the right to be left alone. Long before legislatures codified it or courts enforced it, people understood that certain spaces and certain conversations belonged to them alone. The Fourth Amendment to the United States Constitution formalized that intuition in 1791, protecting citizens against unreasonable searches and seizures.

In 1890, Samuel Warren and Louis Brandeis wrote a landmark law review article arguing that privacy was nothing less than "the right to be let alone," a phrase that has echoed through legal scholarship ever since. The Supreme Court has since recognized privacy as implicit in the liberties guaranteed by the Constitution, rooted in the First, Third, Fourth, and Fourteenth Amendments. Internationally, Article 12 of the Universal Declaration of Human Rights declares that no one shall be subjected to arbitrary interference with their privacy. The foundation is solid. The question is whether it is holding.

It may not be holding for long. The technologies bearing down on privacy today are not the stuff of science fiction. They are already here, already deployed, already generating data about you whether you know it or not.

Surveillance cameras number in the hundreds of millions worldwide, blanketing city streets, transit stations, parking lots, retail stores, and apartment lobbies. Drones, once the exclusive province of the military, are now flown by law enforcement agencies, private investigators, real estate companies, and curious neighbors. License plate readers mounted on police cruisers or fixed to highway overpasses track the movements of vehicles across entire metropolitan areas, storing records of where you were, when, and for how long.

One company, Flock Safety, illustrates how quickly these systems can metastasize beyond their original justification. Flock sells cloud-connected license plate reader cameras to police departments and private customers across the country, funnels the readings into its own centralized servers, and allows law enforcement to conduct nationwide searches of the resulting database, giving even a small-town police chief access to a surveillance tool of staggering scope. The system was marketed as a way to catch local criminals. It has become something considerably more expansive.

Records obtained by the technology news outlet 404 Media revealed that local officers have used Flock's database to conduct searches on behalf of Immigration and Customs Enforcement, tracking individuals for deportation purposes. A Texas officer used the system to search nationwide for a woman who had obtained an abortion, legal in the state where it occurred, but not in Texas.

Flock is also planning to link its license plate data directly to commercial data brokers, allowing law enforcement to, in the company's own words, "jump from LPR to person," instantly connecting a plate reading to a detailed personal profile. The company has begun upgrading its cameras from still images to video, with AI-powered natural language search tools that could allow officers to search footage by descriptions of vehicle occupants and bystanders.

Flock has also launched what it calls a "Flock Business Network," inviting private companies to share surveillance data with one another and maintain shared hotlists of vehicles to be flagged on sight. What began as a tool for catching car thieves is rapidly becoming an infrastructure for comprehensive movement tracking, corporate blacklisting, and the outsourcing of government surveillance to private hands.

Facial recognition technology amplifies all of this dramatically. Where a camera once captured an image, a facial recognition system can now identify who is in that image, cross-reference the identity against databases, and flag the person for follow-up. The technology has been deployed in airports, sports stadiums, schools, and shopping malls.

Some police departments use facial recognition to identify suspects from surveillance footage. The error rates, particularly for people with darker skin tones, have been documented as alarmingly high, leading to wrongful arrests. Yet adoption continues. The appeal to law enforcement and security professionals is obvious. The implications for everyone else deserve far more scrutiny than they are receiving.

Beyond the physical infrastructure of surveillance, a quieter and perhaps more insidious data economy has taken shape online. Every search query, every purchase, every like, every location check-in generates a data point. Individually, these fragments mean little.

Aggregated, they create a portrait of your habits, beliefs, relationships, health concerns, and political sympathies that is more detailed than anything a private investigator could compile through weeks of legwork. Companies you have never heard of hold files on you. Brokers buy and sell your personal information the way commodity traders buy and sell soybeans.

This brings us to the data aggregation industry, a sprawling and largely unregulated business that operates in the background of modern life. Companies like Acxiom, one of the oldest and largest data brokers in the world, have been compiling detailed consumer profiles since 1969, drawing on purchase histories, loyalty programs, survey data, and online behavior to build portraits of hundreds of millions of individuals.

Spokeo and LexisNexis gather publicly available records, social media activity, and court filings to assemble profiles that employers, landlords, law enforcement agencies, and private investigators can access for a fee. Gravy Analytics and its subsidiary Venntel have collected and sold precise consumer location data, tracking where people go with a granularity that once would have required a physical tail.

In December 2024, the Federal Trade Commission took enforcement action against both Gravy Analytics and a company called Mobilewalla for what regulators described as the unfair collection and sale of sensitive consumer location data, including visits to medical facilities, places of worship, and other locations that can reveal deeply personal information about a person's life.

The defenders of this industry, and it is a large and profitable one worth an estimated 270 billion dollars globally in 2024, argue that they are simply collecting publicly available information. If you posted something on social media, registered to vote, filed a business license, or appeared in a news article, that information is technically public. There is something to this argument. Transparency has its own virtues, and accountability journalism depends on the ability to compile and analyze public records.

But critics raise a point that the aggregators tend to gloss over: the aggregation problem. Information that seems harmless in isolation becomes something very different when combined. Knowing that someone attended a particular church is not very revealing. Knowing that they also attended a support group meeting, filled a prescription at a certain pharmacy, and searched online for information about a particular medical condition tells a story that person may have shared with no one.

None of those individual facts was secret. Together, they constitute an intimate disclosure that the person never consented to make. The whole, in this case, is not just greater than the sum of its parts. It is categorically different.
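The aggregation problem can be sketched in a few lines of code. Everything below is invented for illustration: the device identifier, the record contents, and the source names are all hypothetical. The point is only that a trivial merge across unrelated sources, keyed on a shared identifier, produces a profile that none of the sources contains on its own.

```python
# Hypothetical sketch of the aggregation problem. Three separately
# innocuous datasets share one identifier; merging them yields an
# intimate profile the subject never disclosed. All data is invented.

locations = {"device-417": ["St. Mark's Church", "Riverside support group"]}
purchases = {"device-417": ["pharmacy: prescription pickup"]}
searches = {"device-417": ["symptoms of condition X"]}


def aggregate(device_id, *sources):
    """Collect every record tied to one identifier across all sources."""
    profile = []
    for source in sources:
        profile.extend(source.get(device_id, []))
    return profile


profile = aggregate("device-417", locations, purchases, searches)
print(profile)
```

Each source, taken alone, is a list of mundane facts. The merged profile implies a health condition, a social circle, and a place of worship, which is the categorical difference the paragraph above describes.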

The harder question is whether most people actually care. Survey data consistently suggests that Americans are concerned about privacy in the abstract. Pew Research has found that roughly 73 percent of Americans feel they have little to no control over what companies do with their data, and that 67 percent say they understand little or nothing about how their information is being used.

Yet their actual behavior tells a different story. People trade away detailed personal information for free apps, loyalty discounts, and the minor convenience of not having to remember passwords. They post photographs of their children online, share their locations in real time, and sign terms of service agreements without reading them. Privacy, it seems, is a value people hold firmly right up until the moment it costs them something.

This gap between stated preferences and actual behavior is not entirely the fault of individual consumers. The system is designed to make privacy-protecting choices difficult and privacy-surrendering choices easy. Opting out of tracking requires navigating menus buried in settings pages. Declining cookies on websites often means clicking through multiple screens of dark patterns engineered to wear down your resistance.

Even when data brokers are legally required to provide opt-out mechanisms, as they are in a handful of states including California, Vermont, Texas, and Oregon, the process can be laborious enough to discourage all but the most determined. The architecture of the modern internet is built on surveillance, and navigating it without contributing data is like trying to drive across the country without using roads built by the government you distrust.

So are we approaching the surveillance state that George Orwell described in 1984, where the government and its agents peer into every corner of private life? The honest answer is that we are not quite there, but the infrastructure to get there is being built at a remarkable pace. What distinguishes our current situation from Orwell's nightmare is not the absence of surveillance capability. It is the absence, so far, of the political will to deploy it fully against the general population.

That is a reassuring fact that carries within it a deeply unsettling conditional. Governments that lack the will to surveil their citizens today may not lack it tomorrow. And if that will ever emerges, the cameras, the databases, the aggregated profiles, and the facial recognition systems will already be in place.

The challenge is not to halt technological development, which would be neither possible nor desirable. Technology creates genuine benefits: safer cities, faster justice, more efficient services. The challenge is to insist on governance that keeps pace with capability.

In my view this means strong data protection laws with real teeth and real enforcement. It means transparency requirements forcing companies to disclose what they collect and how they use it. It means limits on how long data can be retained and strict rules about when government agencies can access commercial databases without a warrant. It means algorithmic accountability, so that decisions affecting people's lives, their insurance rates, their job applications, their interactions with police, cannot be laundered through an opaque system that no one can audit.

None of this requires choosing between security and freedom, between innovation and privacy. It requires choosing leaders and enacting laws that take both seriously. Privacy is not a luxury for people with something to hide. It is the condition under which people think freely, speak honestly, form relationships, make mistakes, and change their minds. Lose it, and you lose something that cannot easily be recovered. The watchers are watching. The question is whether those being watched are paying attention.