When “Privacy” Means Permanent Surveillance
Larry Ellison recently sketched his vision of a world where police officers’ body cameras never really turn off. His words are chilling in their simplicity: “Oracle, I need two minutes to take a bathroom break … The truth is, we don’t really turn it off. What we do is, we record it, so no one can see it … we won’t listen in, unless there’s a court order.”
From the article covering his remarks:
"Ellison’s entire remarks are worth reading because he is pitching a comprehensive surveillance apparatus that touches most parts of being in public. More importantly, every idea he is pitching currently exists in some form, and each has massive privacy, bias, legal, or societal issues that have prevented them from being the game-changing technology that somehow makes us all safer.
[A police officer can say], ‘Oracle, I need two minutes to take a bathroom break,’ and we’ll turn it off. The truth is, we don’t really turn it off. What we do is, we record it, so no one can see it, so no one can get into that recording without a court order. So you get the privacy you requested, but court order, we will—a judge will order, that so-called bathroom break. Something comes up, I’m going to lunch with my friends. ‘Oracle, I need an hour of privacy for lunch with my friends.’ God bless. We won’t listen in, unless there’s a court order. But we transmit the video back to headquarters, and AI is constantly monitoring video.
Public access to body camera footage is also incredibly uneven; public records laws differ in each state about whether that footage can be obtained by journalists and police accountability organizations. Ellison proposes a situation where the footage would be held and analyzed not by a public police department but by Oracle and Oracle’s AI systems. “We won’t listen in, unless there’s a court order” means, of course, that it is listening in, and has all sorts of implications for who can access this sort of footage, when, and under what circumstances."
That pitch is worth sitting with. Not because it’s futuristic—but because every single piece of it already exists. Body cameras, real-time AI monitoring, cloud storage, and court-ordered access are not new. What’s new is the idea of normalizing an uninterrupted feed of human life, routed through a private corporation rather than a public agency.
The False Promise of “Turning It Off”
Ellison’s framing is slippery. “We won’t listen in” is a linguistic trick. In reality, the camera is always on, and the audio and video are always transmitted back to headquarters. The only thing that changes is who can access the footage, under what authority, and how much friction stands in the way.
This is not privacy. It’s surveillance deferred.
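To see how thin that “off switch” really is, here is a minimal sketch in Python. Everything in it is hypothetical (the BodyCamFeed class, the sealed flag, and the court_order parameter are illustrative names, not anything Oracle has described), but it captures the mechanic Ellison outlines: a privacy request doesn’t stop recording, it just marks the footage as off-limits until a court order unlocks it.

```python
from dataclasses import dataclass, field

@dataclass
class Segment:
    """One recorded span of footage; 'sealed' marks a requested privacy break."""
    start: str
    end: str
    sealed: bool = False

@dataclass
class BodyCamFeed:
    """Hypothetical always-on feed: recording never stops, only access changes."""
    segments: list = field(default_factory=list)

    def record(self, start: str, end: str, privacy_requested: bool = False) -> None:
        # A privacy request does not pause capture or transmission;
        # it only flags the segment as sealed.
        self.segments.append(Segment(start, end, sealed=privacy_requested))

    def access(self, segment: Segment, court_order: bool = False) -> str:
        # Sealed footage is still stored at "headquarters"; it is gated
        # behind a court order rather than deleted or never captured.
        if segment.sealed and not court_order:
            raise PermissionError("Sealed segment: court order required.")
        return f"footage[{segment.start}-{segment.end}]"

feed = BodyCamFeed()
feed.record("12:00", "12:02", privacy_requested=True)  # "Oracle, I need two minutes"
feed.record("12:02", "13:00")

print(feed.access(feed.segments[1]))                    # routine, unsealed footage
print(feed.access(feed.segments[0], court_order=True))  # sealed, but still reachable
```

The point of the sketch is the asymmetry: the officer’s request changes nothing about what is captured or stored, only about who can look, and when.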
Shifting Power, Quietly
There’s a deeper structural problem here: who owns the data. Public body cam programs are already fraught—footage access is inconsistent, varying by state law and public records rules. Some communities get timely access to footage; others never see it.
Now imagine moving that footage from public custody into private corporate custody. Ellison’s proposal replaces public accountability, however imperfect, with corporate stewardship of state surveillance infrastructure. When Oracle holds the data, who decides what counts as legitimate access? Who audits the algorithms that “monitor constantly”? Who profits from the storage, analysis, and resale of insights?
Why This Matters
The question isn’t whether AI-enhanced surveillance will work. It already does; it’s out there on the street today. The question is what governance, accountability, and rights frameworks we will insist on, and which we will abdicate, before such systems are embedded in everyday life.
Ellison’s remarks reveal the subtle danger: surveillance expands not through dramatic leaps, but through small promises of safety wrapped in the language of convenience and privacy. A “bathroom break” that isn’t really private. A “lunch hour” where the camera is still rolling. A court order that opens the door after the fact.
The result? Privacy becomes something you request from a corporation, rather than a right guaranteed by democratic institutions.
Default Settings and Their Implications
Ellison’s vision doesn’t just describe new technology; it describes a shift in defaults. A world where surveillance is always on unless you ask for a reprieve. A world where privacy is conditional, revocable, and managed by a corporation rather than guaranteed by public institutions.
That matters because default settings carry enormous power. They shape behavior quietly: most people won’t opt out, most organizations won’t challenge the baseline, and over time, what begins as an “option” hardens into an expectation. Seamlessness becomes the selling point, but also the trap: what feels effortless also erodes agency (for both police and those they are sworn to serve and protect).
When surveillance is the default, friction disappears. Without friction, we lose the pause that forces us to question whether our systems align with human dignity. Wrestling with these defaults now is the only way to prevent them from defining the terms of both our public and private lives.