
The AI surveillance state isn’t coming. It’s here

The AI surveillance economy is global: ICE buys the data, Brussels rewrites its rules, schools monitor students, and clouds keep everything alive

Photo: Marius Becker/Picture Alliance via Getty Images

The woman at the airport probably doesn’t think of herself as data. But she is.

She’s late for an early morning flight, shoes still half untied, coffee cooling in one hand, face tilted toward a camera she barely notices. The screen flashes green — her name, her gate, her future — and she moves forward without ever showing a boarding pass or a passport. Somewhere in the system, her likeness is matched to a database, logged, and scheduled for deletion after 24 hours. Maybe it will be. Maybe.

At 35,000 feet, the woman in seat 17C opens her laptop. The Wi-Fi portal asks her to verify her identity through the same account that stores her boarding pass. She agrees, of course — who has time to retype a password? Somewhere, an algorithm recognizes her as she fills out an online form. Somewhere else, another system tags the metadata to an advertising ID. The network learns a little more about her, and about the rest of us, too.

Outside the terminal, her world hums with the same quiet choreography. The car that picks her up has already sent her route to the cloud, tracking speed and stops in the name of insurance and safety. Down the street, a doorbell blinks blue as a delivery driver passes — footage automatically uploaded, cross-referenced, stored. Across the street, a new metal pole rises beside the stop sign — the latest addition to the homeowners’ association’s “security upgrade.” The logo is small, the camera is smaller, but the promise is big: Crime down 70%, the flyer said. The stoplights adjust in sync with a traffic model trained on thousands of license-plate scans. And a thousand miles away, in an office park that could be anywhere, a dashboard updates: eight billion social-media posts ingested today, a few hundred flagged for “operational relevance.”

Inside, her home assistant greets her by name. The thermostat has already adjusted itself to match her morning pattern. On her laptop, an alert flickers from the school district: an AI-monitoring service has flagged student activity — just a test, nothing to worry about. The tone is breezy but corporate. Kind. She clicks “acknowledge.”

The rhythm of her day unfolds like choreography. She moves from gate to car to home to screen, each interface one step ahead, smoothing the edges of time. When she forgets her password, the network reminds her. When she forgets a meeting, the network joins for her. When she looks away from the news, the network queues another story she might like.

The new surveillance state isn’t a fortress of cameras. No, it’s a cloud of interfaces — airport kiosks, neighborhood feeds, student dashboards, analytics suites — all running different software but speaking the same language. Some belong to governments, some to companies, and a large number to both. Together, they hum like a nervous system, sensing, classifying, and predicting. What’s remarkable about today’s AI surveillance state isn’t how omnipotent it seems, but how ordinary it all feels.

The quiet economy of being seen

Every day, someone is being recorded by a camera they didn’t buy and watched by a system they can’t see. ICE pays private firms to track faces and names across social media; school districts let algorithms read students’ messages for signs of “risk.” The TSA logs movements at airports, neighborhood cameras track our license plates, cameras on every corner now feed private databases that police can access on demand, and the data flows endlessly upward. We’re all inside the machine now, and it’s learning faster than we can look away.

In 2025, the surveillance state runs on interlocking contracts. Each piece feels benign enough on its own, but together, they form a lattice of observation, a design language written in the grammar of “safety.”

And the machine is global. In Brussels, lawmakers who once sold Europe as the moral counterweight to Silicon Valley are preparing to cut holes in the General Data Protection Regulation — the world’s strictest privacy law. A draft of the Commission’s digital omnibus package would let AI companies process “special categories” of personal data — political opinions, religious beliefs, and health records — all reclassified as “inputs for innovation.” Even pseudonymized data could escape protection altogether. The bloc that turned privacy into export policy is now preparing to dismantle it in the name of competitiveness.

Across the Atlantic, the U.S. has skipped any moral hand-wringing and gone straight to automation. Federal records show that Immigration and Customs Enforcement paid $5.7 million for access to Zignal Labs, an AI platform that scans eight billion social-media posts a day in more than a hundred languages. The tool can recognize images, extract text, and flag “operationally relevant” content for law enforcement. ICE has since planned to expand the human side of that dragnet, budgeting full-time analysts — 12 in Vermont, 16 in California — to monitor major platforms around the clock. The agency also holds contracts for license-plate and cell-phone-location data from brokers such as Vigilant Solutions and Venntel, effectively letting it track movement without a warrant. This is the kind of program once imagined for counter-terrorism, now pointed inward — toward communities that ICE already polices with fear and opacity. The vocabulary is antiseptic — “real-time intelligence,” “curated detection feeds” — but the function is simple: surveillance on autopilot.

The State Department has followed suit: Its “Catch and Revoke” program uses AI to flag visa-holders’ social media posts for signs of political dissent — an algorithmic loyalty test with life-altering consequences. And at the border, the Department of Homeland Security is reportedly planning a fleet of AI-powered surveillance trucks — the Modular Mobile Surveillance System, or M2S2. Each truck would mount radar, heat sensors, and cameras on telescoping masts to patrol remote terrain, feeding data back into CBP’s command centers through the Pentagon’s TAK platform. The program sits inside a Trump administration push to expand DHS’s budget authority to roughly $65 billion, part of a $160 billion immigration-enforcement package. The trucks’ onboard AI can run “autonomous detection and reporting” under any conditions, with mission data retained for at least 15 days and classified as Controlled Unclassified Information — a category that limits access without ever declaring the data secret.

Private industry has taken the same technology and wrapped it in pastel marketing. Flock Safety’s license-plate readers operate in around 5,000 communities, sending 20 billion scans a month into shared databases accessible to both police and neighborhood groups. Even the commute reports back: Connected-car systems quietly transmitted driver behavior to data brokers and insurers until the FTC banned GM from selling location data. TSA’s face-matching kiosks, now in more than 250 airports, keep images for 24 hours for audit purposes. Digital IDs in Apple and Google Wallet are becoming the next step in that evolution — convenience today, biometric default tomorrow.

And in classrooms, AI-powered monitoring software is becoming standard equipment. Tools from companies such as GoGuardian and Lightspeed Systems run silently on tens of millions of school-issued laptops, scanning students’ chats, emails, and searches for what the companies call “signals of harm.” According to Bloomberg and the Associated Press, GoGuardian now covers roughly 25 million students, Lightspeed about 20 million, Securly another 20 million, and Gaggle around 6 million. The Center for Democracy and Technology found that 29% of teachers say schools track students’ personal devices, and 6% report instances where ICE was contacted based on monitored data.

The tech has already drawn scrutiny: Last year, the FTC accused Evolv Technologies of deceiving customers about the accuracy of its AI weapons detection in schools and subways, but the contracts keep coming. And the justification is always the same: safety, efficiency, and prevention. But each justification hides the same trade-off. These systems don’t just watch for danger; they define it. The algorithm decides which expression looks risky, which phrase counts as a threat, and which face matches a pattern. The result is a feedback loop that never sleeps: a country convinced that it’s safer because it is constantly seen.

Parents consent to this structure in terms-of-service agreements; school boards renew it in procurement cycles; regulators defend it as modernization. The surveillance state no longer needs coercion when it can rely on compliance. The cameras don’t blink, the feeds don’t rest, and the data never stops climbing the hierarchy — from homes to servers to agencies to markets — until visibility itself becomes the price of participation.

Surveillance as a service

Surveillance has gone legit — sold by the terabyte, renewed by subscription, and priced like progress. The watchers you can see are only the front end. Behind them sits a market that treats visibility like inventory. Data moves from one ledger to the next: scraped by a contractor, enriched by a broker, hosted by a cloud, queried by software, invoiced by the month. Visibility is the world’s fastest-growing commodity, and the only thing that may be more valuable than being seen is making sure everyone else is.

Perhaps no company embodies that evolution more clearly than Palantir. Its software underwrites the modern surveillance economy — translating chaos into dashboards, turning power into data fluency. The firm’s government business now brings in more than half its revenue, led by the Department of Defense, ICE, and public health agencies that use its platforms to forecast movement and risk. CEO Alex Karp describes it as patriotic work — an extension of the American project itself: “The chance of world survival goes up as America becomes stronger, more dominant.” In the same interview, he dismissed critics entirely — saying that “discussion about human rights… only serves people who want to live in a world that doesn’t work.”

The cloud makes the architecture of surveillance durable. Amazon’s GovCloud, Microsoft’s Azure Government, and Google’s public-sector unit rent out the compute and storage that keep the feeds alive. Body-camera footage flows into Axon’s evidence platform; city command centers sync video, sensors, and 911 audio through Motorola Solutions; agencies bolt on off-the-shelf “AI” to flag faces, plates, and patterns at scale. These aren’t spy tools in the traditional sense; they’re infrastructure contracts, quietly binding federal agencies, police departments, and city councils to private architectures they’ll never fully control. Once the data lives inside those servers — facial images, license-plate histories, predictive-policing outputs — it’s nearly impossible to extricate.

The brokers are the quiet kings. Thomson Reuters’ CLEAR and LexisNexis Risk Solutions sell dossiers built from court records, credit headers, utility files, and scraped web trails; their products are staples in government procurement lists. Venntel and similar firms package precise app-location traces and license them out under “marketing” or “public safety.” Fog Data Science pitched local police a way to search phones’ movement histories without a warrant by buying them on the open market. 

And facial recognition sits at the crossover point between private appetite and public power. Clearview AI scraped billions of images from the social web and pitched apparent matches to investigators; court settlements narrowed who it can sell to, but didn’t erase the template. Retail chains deploy enterprise face-matching to spot “known offenders.” Stadiums screen for banned fans. Airports expand “identity verification.” Each sector claims a narrow use case. The effect is general: Faces become search fields.

Police work has been retooled around queries that would have sounded impossible a decade ago. Geofence warrants pull every device that lingered near a crime scene; keyword warrants pull everyone who searched a particular phrase. Even when courts push back, the habit remains: Investigate the data first, the person second. That logic pairs neatly with the brokers’ catalogs and the clouds’ capacity. It also sells.

Schools and workplaces are steady customers. Gaggle, Securly, and Bark monitor student documents and chats for “indicators of risk.” Proctoring software watches eye movements during exams. On the job, productivity suites ship with keystroke, email, and meeting-behavior analytics; logistics firms instrument drivers down to the second. Each product is framed as care, compliance, or efficiency. Each one normalizes being scored by a system you don’t control.

Lobbying keeps the gears greased. Trade groups argue that access to large, messy datasets is essential for “innovation” and “competitiveness.” Privacy is recast as drag. The pitch lands because it aligns with austerity: Replacing people with software looks like modernization on a spreadsheet. Once you’ve paid to wire the city, the cheapest path is to keep using it. 

And through it all, the money keeps flowing. Every new contract becomes a proof point for the need to see more. Faster. This is how the machine pays for itself. Brokers feed clouds. Clouds feed analytics. Analytics justify new feeds. Procurement cycles turn into pipelines; pilots turn into platforms; “temporary” exceptions become permanent features of the stack. By the time a parent asks who is behind the school’s dashboard, the answer is a supply chain.

Even if people wanted to stop the AI surveillance state, there’s no one switch to flip off. Surveillance has become infrastructure — invisible, indispensable, and everywhere at once. It powers the logistics that move our packages, the algorithms that score our credit, and the systems that promise us safety. The systems we build to understand behavior have begun to define it, flattening lives into probabilities, preferences, and risk scores. Every convenience conceals a transaction; every transaction is a trace.

We built AI to see the world more clearly. Now, the world sees us first.
