The Future of AI Is Ubiquitous Surveillance

A photo of a security camera with the title of the article overlaid.

AI companies are running out of data to scrape while quickly polluting the existing sources with synthetic data, and they’re growing desperate. With their backs against the wall, they’ve decided the next frontier is you. Get ready for the great push for wearables.


Introducing the Corporate Surveillance State

Recently, I saw a video by Ordinary Things titled Will A.I. Slop Kill the Internet? In short, it was mostly in line with the types of critiques I’ve been making in this series, but it clued me into something I hadn’t thought about: the next step for AI companies is ubiquitous surveillance.

See, as I’ve said before, tech bros are running out of data to scrape for their models. As a result, their models can’t get better. In fact, as long as synthetic data outpaces genuine “data,” new models will only be worse than their predecessors. That means today’s models are as good as they’re going to get, right?

Well, it turns out that tech bros are willing to break down the entire fabric of society to bring about their tech deity. To do that, they need more organic data, and there’s a huge untapped collection of organic data: day-to-day life.

To get this data, tech companies aren’t going to bring about a surveillance state directly. They’re going to do it the way only someone like the king of privacy violations, Mark Zuckerberg, would: by creating a product that provides some silly service in exchange for you as data.

The product that Ordinary Things references is the Limitless AI Pendant, a necklace that transcribes your daily conversations. This might not seem like something new. After all, every app seems to have a transcription service. You can’t even have a Zoom meeting now without your conversation being transcribed and backed up into a database somewhere forever.

What is new, however, is having a recording device active at all times in broad daylight, which seems way more dystopic. A casual conversation at the grocery store is no longer sacred. Even now, I don’t record my lectures because I see my classroom as a place where I can speak freely. If something like this pendant were to become ubiquitous, it would be the death of candor.

Personally, I remember when constant recording was something people fearmongered about with Alexa, but now companies are pitching the idea of openly subjecting yourself—and more importantly, others without their consent—to constant surveillance through wearables. The future looks bleak.

The Failure of Wearables

In the last couple of decades, wearables have been tried and have failed on many occasions. After all, who remembers Google Glass? And who could forget the first AI pin casualty, the Humane AI Pin? Even today, the only wearables I’ll sport are a Fitbit, which Google sadly killed last year, and noise-cancelling headphones.

It’s unclear why wearables beyond watches and headphones never caught on. If I had to guess, many of them were too bulky, required constant charging, or just looked too stupid. Perhaps that’s why freaks like Elon Musk have moved beyond wearables and straight to implants. Alternatively, it could be that smartphones are good enough for the average person. What more could you want?

Anyway, I’m not sure wearables have gotten any less dorky, but now they have the power of AI. That means they can act as your personal assistant, like a constantly active Siri or Alexa on the go.

In addition, companies are getting more clever with their use of AI. Now, they can offer on-demand transcription and analysis, allowing you to offload your memory and thinking to the device entirely. Surely, visual description services will arrive soon enough as well.

Now that I think about it, I wouldn’t be surprised if some evil version of Tamagotchi made an appearance as an AI-powered wearable. After all, we’re literally dying for companionship with bots. Likewise, it won’t be long before Bumble gives you a wearable assistant to coach you on your dates. Oh wait, I think Cluely is already selling something like that.

My only hope is that Jony Ive is so washed that not even he can convince us that wearable tech is cool. The last thing we need is people getting stupider because they’ve offloaded their entire brains to machines while openly violating the autonomy of everyone around them. It’s bad enough we have everyone recording everything for a viral moment.

Ethics Outside of the Tech Bubble

All of this is sort of funny to me because I conducted an ethnography for my dissertation research. As a part of that ethnography, I performed participant observations in the classroom by writing down what I saw and heard. I also asked a handful of participants to meet me over Zoom, so I could have more explicit transcripts to reference later.

Because of how invasive an ethnography can be, I was required to weigh the risks and benefits of such work. For example, I had to obtain informed consent from every student who was willing to participate. Likewise, because Zoom recordings are slightly more invasive, I offered literal payment (in the form of gift cards) to participants who took part in them.

So to me, the idea of being able to train AI models by harvesting conversations from folks—who otherwise might not even know they’re being recorded—is morally bankrupt at best. Even offering a service in exchange for that data doesn’t begin to approach a fair trade. Not to mention that these companies know you’ll pay a subscription for it, so they’ll harvest your data and collect a bag at the same time. It’s gross.

Until recently, tech bros have been more than happy to scrape the entire internet because “what can you expect, you put it up there for all to see” (which I’m not even sure is true). Now, I’m not sure how they can justify recording conversations, especially private ones. I mean, Limitless advertises their device for meetings, which certainly include confidential discussions. Are they going to officially move the goalposts to “well, it’s for the greater good”? I would like to see that argument.

Anyway, that’s enough of a rant for the day. If there were a moral to the story, it would be this: don’t fall for new wearable tech. It’s just another way of harvesting data for these companies’ insatiable models.

Hopefully, you enjoyed that. If so, there’s plenty more I’ve ranted about in the past.

Likewise, you can take your support a bit further by checking out my list of ways to grow the site. Otherwise, thanks for reading.


Hello! As of late, I’ve been using this little space to drop thoughts and ideas that don’t make their way into the article. For instance, while I would be truly surprised if wearables made a comeback, I do think such a comeback would give us a unique opportunity to really poison these models. After all, in digital spaces, there are a lot of safeguards you can put in place to fight back against poisoning, but the real world is a different ball game.

I could see folks buying these wearables just to leave them in a room with the same song or movie playing. I could also see folks building out their own devices to disrupt the recording of audio and visual data. Perhaps even those noisy stickers will make a comeback to confuse models.
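For the curious, the “noisy sticker” idea generally comes down to adversarial perturbations. Below is a minimal sketch of the classic version of that trick, the Fast Gradient Sign Method, assuming a pretrained PyTorch image classifier. The file name, model choice, and epsilon value are all placeholders of my own, not anything Limitless or the other wearable makers actually use, and real-world patches take quite a bit more work than this.

    # A rough sketch of the idea behind adversarial "noise" (the Fast Gradient
    # Sign Method): nudge an image in the direction that most confuses a
    # classifier. Assumes PyTorch and torchvision; "photo.jpg", the model, and
    # epsilon are placeholders for illustration only.
    import torch
    import torch.nn.functional as F
    from torchvision import models, transforms
    from PIL import Image

    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
    ])

    image = preprocess(Image.open("photo.jpg")).unsqueeze(0)
    image.requires_grad_(True)

    # Treat the model's current top prediction as the label to push away from.
    logits = model(image)
    label = logits.argmax(dim=1)
    loss = F.cross_entropy(logits, label)
    loss.backward()

    # Shift every pixel slightly in the direction that increases the loss.
    epsilon = 0.03  # small enough that a person barely notices the change
    poisoned = (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

Roughly speaking, a printed adversarial patch is the same gradient trick with the perturbation confined to a small region and made robust to printing and camera angles.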

If everything is data to these corrupt corporations, then the least we can do is make it a bit harder for them to get what they want.

Edit 2025-07-05: I don’t usually include specific edits like this, especially dated ones, since this article is going out well after I added it. That said, I conveniently wrapped up this article yesterday when the “We’re in Hell” channel posted their KILLER AI video. A huge section of that video covers AI being pitched as the solution to the big data problem while AI itself creates a need for ever more data. While I focused on consumer surveillance, their video covers a much more insidious form of surveillance used to train weapons of war, mostly thanks to Israel and the US. Isn’t the military industrial complex wonderful, folks?

The Hater's Guide to Generative AI (14 Articles)—Series Navigation

As a self-described hater of generative AI, I figured I might as well group all my related articles into one series. In the earlier entries, I share why I’m skeptical of generative AI as a technology. Later, I share more direct critiques. Feel free to follow along for the ride.

Jeremy Grifski

Jeremy grew up in a small town where he enjoyed playing soccer and video games, practicing taekwondo, and trading Pokémon cards. Once out of the nest, he pursued a Bachelor's in Computer Engineering with a minor in Game Design. After college, he spent about two years writing software for a major engineering company. Then, he earned a master's in Computer Science and Engineering. Most recently, he earned a PhD in Engineering Education and now works as a Lecturer. In his spare time, Jeremy enjoys spending time with his wife and kid, playing Overwatch 2, Lethal Company, and Baldur's Gate 3, reading manga, watching Penguins hockey, and traveling the world.
