A Day in the Surveilled Life

I will begin with a hot take: the Francis Ford Coppola movie with the most to say to us in 2022 is not The Godfather, not The Godfather Part II, but The Conversation. Gene Hackman plays Harry Caul, a surveillance expert who overhears something he shouldn’t, and is in turn spied upon. The movie ends with Harry destroying his own apartment—tearing down the walls, ripping up the floorboards—in search of a listening device he knows is there, but never finds. The last, iconic shot shows him sitting in a chair in his underwear, playing the saxophone in the ruins of his home and his privacy.

Almost fifty years later, it’s an image that’s prescient and quaint at the same time. Prescient because the notion that someone powerful is watching and/or listening to us, even in our most mundane, domestic moments, has gone from a tenet of hipster paranoia to an uncontested fact—a multi-billion-dollar postulate of twenty-first century life. The quaint part is that Harry Caul, or any of us, would mind that much. We participate in the process of our own surveillance every day. The devices that record us aren’t buried in our walls (well, sometimes they are, but we’ll get to that later): we strap them to our wrists, put them on our fingers, carry them in our pockets, and install them in our homes. We gravitate toward connectivity, and if we do that without ever really asking ourselves “connected to what?”, it’s not because we’re stupid or gullible—it’s because the answer doesn’t really bother us the way it used to. Harry Caul, despite the irony of being a high-tech surveillance artist himself (and man, was the tech awful back then), was meant to be a kind of Everyman. Today he’d be a crank, a bore, a troll, a Luddite. Whatever it was he thought he was warning you about, you’d be like, “Yeah, all right, I get it, enough already, I know.”

And maybe you do know. I mean, if you’re reading this, you’re already on the internet, so the whole notion of being routinely tracked is probably not a great revelation to you. But let’s start with first principles: in the era of what author Shoshana Zuboff has dubbed “surveillance capitalism,” you are no longer simply a potential customer to be courted or seduced. You are a kind of hive of information about yourself—your own movements, routines, fantasies, relationships, tastes, behaviors—that is constantly extracted from you and sold to others. Think of those others as bettors: anybody who’s ever placed a bet on, say, a sporting event knows the one thing that provides an edge is information. These folks, using the information that you give them, are betting on what you will need next, so that when you go get it, they can be standing there waiting to sell it to you.


It’s counterintuitive, but true, that the place where you are most vulnerable to surveillance is within the walls of your home. If you’re living anything like an average twenty-first century life, your home is basically a panopticon: a minefield where most of the mines were laid by you. Some of them are on your person, like a high-tech watch or a ring that records your sleep patterns—or, God knows, your phone, a kind of self-LoJack that understands (and shares and sells) more about you and your habits than you understand about yourself. As for devices external to you, it’s amusing to think how easily seduced we are by the word “smart.” Smart phones begat smart televisions, smart thermostats, smart refrigerators, smart vacuum cleaners, smart home security systems, smart mattresses, smart Barbies (!), and on and on. Every one of these gadgets is giving something to you—convenience, novelty, enabled indolence, the buzz that always comes with embracing cutting-edge technology—in exchange for something from you, and that something is not the purchase price of the gadget. It is the widening of the pipeline of data that flows not into your home but out of it, sold in the predictive marketplace in order to turn those bets on your future behavior into something closer to a sure thing.

So let’s say you think about all this a little too hard, and you start to develop a little Harry Caul-style paranoia about relaxing in the privacy of your home. To escape the feeling of being watched, you throw your phone in a drawer, walk out your front door, and hit the street. Maybe you own a car, but if you’re thinking of that car in the time-honored American way—as the instrument and symbol of freedom and fetterlessness and escape—think again. Any car with a computer in it—in other words, all of them, unless you’re driving something vintage—has remote data-sharing capabilities that probably haven’t occurred to you. Not only does it disclose your real-time location to anyone willing to pay for that information, it can also be remotely disabled if you haven’t paid your car bill, or your insurance bill, or, indeed, some other bill entirely. Think about what might happen to your insurance rates if your car itself recorded you speeding, rather than some random cop with a radar gun. Because that is happening.

Maybe just take a walk, then. And here is where it gets interesting.


In the outside world, surveillance moves from the realm of at least arguable complicity (i.e., nobody forced you to buy that iPhone) to the truly secret and the involuntary. When you download a system update on your Roomba or your Oura without reading the endless Terms and Conditions, because life is too short—well, at least you clicked something that said, “I Accept.” The ways you’re tracked outside your home ask for no such acceptance, and they involve the one bit of wearable tech you can’t take off: your face.

We’re all familiar with (and pretty accepting of) the presence of security cameras in spaces where there’s a risk of property theft or some other sort of security issue, like airports. But the cameras, like their operators, are growing less passive. The undisputed kings of facial recognition technology are the Chinese; more than half of the world’s nearly one billion surveillance cameras are located there. Chinese state and law enforcement agencies have begun placing cameras not in high-risk or high-crime areas, but wherever people go most regularly, like restaurants, hotels, residential buildings, theaters, karaoke lounges, etc. They are collecting individual voice prints using sound recorders attached to those facial recognition cameras, with a radius of at least 300 feet; then, software combines those voice prints with facial analysis to help locate suspects even faster. The category of “suspects” surely includes political dissidents or protestors of any kind, but the whole rationale of “crime prevention” is only a fig leaf in any case; the goal of these devices is not to record crimes, but to perfect the database. The Chinese, as opposed to PR-conscious American entities like Google or Amazon, are more willing to say the quiet part out loud: their goal is to “leverage the explosion of personal data in order to improve citizens’ behavior.”

Here in the US, the goal is basically the same, only the ideal toward which that “behavior” is bent is not social order, but maximum profit. The instruments of surveillance expand their reach here in different ways: across the US, for instance, budget-strapped communities actually compete for the opportunity to be turned into a “smart city,” wired top to bottom by Google or Cisco—a partnership between business and state involving millions of dollars’ worth of free surveillance equipment bearing the promise of “modernizing” things like traffic flow, parking revenue, and law enforcement. (Complaints about this partnership between oligopoly and state are met with the familiar rhetoric—“if you have nothing to hide, then you have nothing to worry about”—that always accompanies the loss of rights.) If you played Pokémon Go, you were part of a mass social engineering experiment conducted by Niantic, a Google spin-off, in the guise of a “game” to see whether and how they could get large groups of people to travel to specific places at specific times. And then there’s the Metaverse, a monopolistic totalitarian nightmare in which life itself is recast in the form of a horrifying fishbowl and any pretense to privacy is gleefully waived. Its ultimate aim seems to be to make the world of our for-profit surveillance so airtight, we’ll forget that it’s there at all. In the chilling words of the computer scientist Mark Weiser: “The most profound technologies are those that disappear.”

And what of it? Aren’t we all here on this American soil to try to extract a profit from one another? If I’m addicted enough to convenience to buy a refrigerator that reminds me when to buy milk, when that refrigerator connects itself to my home computer and dredges it for information about me, which it then sells to a third party, do I really have a case?

I don’t want to gloss over the fact that we are consistently, systematically lied to about what these surveillance conglomerates are doing and why. Remember the advent of Google Earth? I remember someone forwarding me a link to it when it was brand new. I marveled at the experience of typing in the address of the house I grew up in and seeing it appear, in navigable 3D, on the screen before me. Google Earth begat Google Street View, which got some fun PR out of the cartoonishly wrapped, camera-topped cars it sent to record every street, every house on earth. Turns out those cars weren’t just taking pictures; they were also assembling a vast databank of your and your neighbors’ wireless network names and addresses—and, in some cases, the unencrypted data flowing over those networks. When they were asked about it, they said no, of course they weren’t doing that, but then when they were taken to court over it, they admitted that yeah, they were totally doing that. Vizio at first denied that its “smart TVs” were actually recording and monetizing everything you watched while you sat on the couch. But yeah, they were totally doing that too.

Maybe part of our indifference in the face of revelations like this derives from the fact that we’re not being watched by other people, per se, which would be upsetting and viscerally creepy—we’re being watched by inanimate intelligence, which then algorithmically turns the fruit of that surveillance into profit for other people. Defending your privacy from something inanimate makes you a weirdo, or depending on where you live, a dissident.


I wrote a novel called Sugar Street about a guy who would very much like to register such a dissent: a guy old-fashioned enough to feel violated by the idea of being watched without his knowledge or permission by anyone, for any purpose. He wants to be in charge of the barrier between himself and those who would like to know where he is or what he’s doing. How old-fashioned is that? What this poor nameless guy (well, he has a name, but he’s not about to tell you) would really like is to go off the grid and live independently, beholden to no one. The problem is that “going off the grid” usually involves living in the deep woods somewhere, providing your own food, fuel, etc., and this guy—like me, and I’m guessing like most of you reading this—possesses absolutely zero ability in this area. He’s never gotten food from anywhere but a grocery store and has no survival skills whatsoever. He wouldn’t last a week off the grid. So what he tries instead is to stay on the grid, but live invisibly there. It’s not easy, but it helps that he’s also in possession of a big envelope full of stolen cash…

Which he needs, because there are no more credit cards or electronic transactions of any kind for him. No bank accounts, no checks—a cash-only life. He can’t live anywhere requiring a lease or a rental contract, any place where a bank or a landlord would run a credit check on him. He can’t pay for electricity, gas, water; nothing in his name. No internet, no cable TV, no device with wireless capability of any kind. That’s all relatively easy—mostly a matter of learning to live with less. What he perceives as his greatest threat, though, is facial recognition technology. He may be paranoid and egocentric and obsessive and a tad out of his mind, but he’s not wrong.

People pride themselves, in this digital age, on being an open book. My novel is about a guy who wants to close the book of himself. We all have those moments, I think, even the most gleefully hyper-connected among us—moments when we long for a little more autonomy, even if the exercise of that autonomy is as pointless as sitting at home playing the saxophone in your underwear. But the peculiar thing about the indignation we feel, in those moments when the technology itself crosses some boundary of what we think belongs to us and what doesn’t, is what a short life it has. We get used to everything, and they count on that. The outrage over Google Glass—the nerdish, interactive spectacles that turned you into a kind of walking, Matrix-like data battery—was so pronounced that it actually drove that tech off the market. I’ll bet you a million dollars it’s coming back, and this time we’ll consider any objection to it to be old-fashioned and uncool. Capitalism is very good at the waiting game. No one ever went broke gambling on the “meh” of the American public.

Jonathan Dee is the author of eight novels including The Privileges and Sugar Street, published this month by Grove Press. 
