The Device We Trust

It started with the weather. Then a timer for the pasta. Then music while we cooked dinner. Before long, we were using it for everything — shopping lists, news briefings, checking our calendars, calling our moms hands-free while we folded laundry.

Alexa learned our habits. She turns on the lights when we get home. She locks the front door at 10 p.m. She knows which brand of paper towels we buy, what shows we watch, and what time we set our alarms. She plays white noise for the baby and audiobooks when we can’t sleep.

It’s genuinely useful. Voice assistants have made life easier for millions of people — especially older adults, people with mobility challenges, people juggling kids and work and everything in between. The convenience is real. Nobody’s pretending otherwise.

But there’s a second conversation happening in our living rooms. One we didn’t start. One we can’t hear. And it never stops.

What It’s Actually Doing

Every voice command we give Alexa is recorded, transmitted, and stored on Amazon’s servers. Not just the command itself — the audio. Our actual voices. Amazon employees and contractors have reviewed these recordings. The company admitted it in 2019 after Bloomberg broke the story: thousands of workers around the world were listening to Alexa recordings from people’s homes, including conversations that users believed were private.

But the voice recordings are only the beginning. Amazon has filed patents for analyzing voice data to detect emotional states, health conditions, and physical characteristics. The pitch of our voices. The pace of our speech. A cough pattern that might indicate illness. A tremor that might suggest a neurological condition. Our voices are biometric data, and Alexa is capturing them every time we speak.

Then there’s the Roomba. In 2022, Amazon tried to acquire iRobot — the company that makes Roomba — for $1.7 billion. Facing antitrust opposition from EU regulators, Amazon abandoned the deal in January 2024. But the attempted acquisition revealed what Big Tech actually wants: home mapping data. Think about what a Roomba does. It maps our homes. It knows the dimensions of every room, where our furniture sits, how our space is laid out. It knows which rooms are large and which are small. It knows if we have a nursery. And the story didn’t end there. In December 2025, iRobot filed for Chapter 11 bankruptcy and agreed to be acquired by Shenzhen Picea Robotics — the Chinese manufacturer that had been building the Roombas all along. American home mapping data, potentially headed to a Chinese company. The surveillance didn’t stop when Amazon walked away. It just found a new buyer.

And then there’s fall detection. Newer Alexa devices can use ultrasonic sensors and microphone arrays to detect when someone falls. For elderly users living alone, this could be lifesaving. The device listens for the sound of impact, monitors for movement afterward, and can call emergency services automatically. It’s a genuinely important safety feature.

But here’s the paradox: the same always-on microphone that detects a fall also detects everything else. The sensor array that monitors whether grandma is moving around the house is the same sensor array that builds a behavioral profile of everyone in the home. The life-saving feature and the surveillance feature are the same feature. We can’t separate them — not the way the system is currently built.

Amazon knows when we wake up, what we eat, what we watch, who we call, and what our floor plans look like. And they didn’t break in. We invited them.

Who’s Profiting

Amazon didn’t build Alexa out of generosity. The Echo was sold at cost — sometimes below cost — because the device itself was never the product. We were. Every interaction with Alexa feeds Amazon’s advertising and retail ecosystem. Ask Alexa to order batteries and she defaults to Amazon’s store. Ask for a recipe and she surfaces sponsored content. The assistant isn’t neutral. It’s a salesperson that lives in our kitchens.

The voice data feeds Amazon’s machine learning models, which power its ad targeting across the web. The shopping data informs product recommendations. The smart-home data — when we turn lights on and off, when we lock the door, when we’re home and when we’re not — builds a behavioral profile that has value far beyond selling us paper towels.

Amazon’s attempt to buy iRobot failed, but the playbook is clear: combine spatial data with behavioral data. Ring doorbell footage from outside, Alexa data from inside — Amazon has already assembled a surveillance infrastructure that covers the approach to our homes, the interior of our homes, and the daily patterns of everyone who lives there. And now iRobot’s mapping data — the layout of millions of American homes — is on its way to Chinese ownership through bankruptcy court.

No single government agency has that kind of access. No law enforcement body could get a warrant that broad. But Amazon built it one convenience feature at a time, and millions of people opted in because each piece — the doorbell, the speaker, the vacuum — seemed harmless on its own.

In 2023, the FTC fined Amazon $25 million for violating children’s privacy through Alexa — retaining kids’ voice recordings and geolocation data even after parents requested deletion. A $25 million fine against a company worth $1.5 trillion. That’s not a penalty. That’s a rounding error.

The mic that saves our lives is the mic that sells us wheelchairs. That’s not a conspiracy — it’s a business model. And until we separate the service from the surveillance, every new convenience comes with an invisible cost.

I’m not saying fall detection is bad. I’m saying it shouldn’t require handing Amazon a complete behavioral map of our homes and our health. We can build devices that detect falls without monetizing the data. We can build voice assistants that answer our questions without recording the conversation. The technology exists. What’s missing is the business incentive to use it — because right now, surveillance pays better than service.

What Real Privacy Looks Like

The answer isn’t unplugging everything and going back to light switches and paper maps. The answer is building technology that works for us instead of on us.

  • Local-processing voice assistants — projects like Home Assistant and Open Voice OS (OVOS) have proven that voice control can work entirely on our own hardware. Our voice commands get processed on a chip in our houses, not on a server in Virginia. No recording leaves our homes. No corporation stores our audio. The lights still turn on. The timer still goes off. The only difference is that nobody else is listening.
  • Right to disconnect — every smart device sold in America should be required to function in an offline mode. If we buy a speaker, it should play music without phoning home. If we buy a vacuum, it should clean our floors without uploading our floor plan. The baseline functionality we paid for should never depend on surrendering our data.
  • Devices that work offline by default — cloud connectivity should be opt-in, not opt-out. The device works out of the box with zero data transmission. If we want cloud features — remote access, cross-device sync, voice history — we can enable them with a clear, plain-English explanation of what data will be collected and where it will go. And “no thanks” doesn’t brick the device.
  • Separation of safety and surveillance — fall detection, smoke alarms, medical alerts — these are critical services. They should be built on architectures that keep the safety function without the data harvesting. A device that calls 911 when we fall doesn’t need to know what we watched on TV last night. These functions can and should be decoupled.
  • Meaningful enforcement — when a company worth $1.5 trillion gets fined $25 million for violating children’s privacy, the incentive structure is broken. Penalties need to be proportional to revenue. Executives need personal liability. And consumers need a private right of action — the ability to sue when their data is misused, without waiting for a federal agency to act on their behalf.
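The "offline by default, cloud opt-in" principle above is simple enough to sketch in a few lines. This is a hypothetical illustration, not any vendor's actual API: the `SmartSpeaker` class, its feature names, and its methods are all invented for the example. The point it demonstrates is the architecture: baseline functions work with zero cloud features enabled, consent is required to turn anything on, and declining never disables what we paid for.

```python
from dataclasses import dataclass, field

@dataclass
class SmartSpeaker:
    """Hypothetical device model: every cloud feature starts disabled."""
    cloud_features: dict = field(default_factory=lambda: {
        "voice_history": False,     # opt-in only
        "remote_access": False,
        "cross_device_sync": False,
    })

    def enable_cloud_feature(self, name: str, consented: bool) -> bool:
        """Turn on a cloud feature only with explicit user consent."""
        if name in self.cloud_features and consented:
            self.cloud_features[name] = True
            return True
        return False  # "no thanks" changes nothing

    def play_music(self) -> str:
        # Baseline function: works regardless of cloud opt-in status.
        return "playing from local library"

    def set_timer(self, seconds: int) -> str:
        # Another baseline function with no network dependency.
        return f"timer set for {seconds}s"

speaker = SmartSpeaker()
print(speaker.play_music())    # works with zero data transmission
print(speaker.set_timer(480))
declined = speaker.enable_cloud_feature("voice_history", consented=False)
print(declined)                # declining does not brick the device
```

The design choice worth noticing is the default in the constructor: nothing transmits until `enable_cloud_feature` is called with `consented=True`, which is the inverse of how most shipping devices behave today.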

Our living rooms should be the most private spaces we have. The place where we talk to our families, relax after work, live our lives without an audience. Technology should make that space more comfortable — not more surveilled. We have the tools to build it right. We just need the rules to demand it.