A tiny cylindrical gadget sits between a framed family photo and a houseplant on a bookshelf in a suburban American living room. When you say its name, the blue ring on top illuminates. Occasionally it illuminates when you don't. One of the more significant privacy cases presently making its way through the federal courts revolves around that distinction: the difference between when Alexa is supposed to be listening and when it appears to be.
The class action lawsuit against Amazon over its Alexa voice assistant has been building since June 2021, when plaintiffs first alleged that Amazon Echo devices were recording and storing user conversations even when no one had spoken the wake word. The case, filed in Washington federal court, has withstood several attempts by Amazon to have it dismissed or narrowed, and turns on these so-called “false wakes”—instances where the device activates without the trigger it is designed to require. In July 2025, U.S. District Judge Robert Lasnik granted class certification to roughly 1.2 million registered Alexa device users, allowing the nationwide class to pursue damages and injunctive relief under Washington’s Consumer Protection Act. After four years of sluggish progress, the lawsuit had reached a pivotal point.
The class certification ruling was not a complete win for the plaintiffs. Judge Lasnik declined to certify a class of users who had never formally registered their devices, finding that their claims would require too much individualized inquiry—specifically, into whether each unregistered user had a reasonable expectation of privacy in the vicinity of someone else’s Alexa device. The judge’s caution is understandable given the genuine complexity of that question. Still, the decision means the 1.2 million registered users who were certified will proceed as a group rather than as individual claimants, significantly raising the stakes for Amazon and the pressure on the company to settle.
Amazon Alexa Class Action Lawsuit: 1.2 Million Users, Secret Recordings, and a Federal Judge Who Let It Proceed
| Category | Details |
|---|---|
| Company | Amazon.com, Inc. / Amazon.com Services LLC |
| Headquarters | Seattle, Washington, USA |
| Product at Issue | Amazon Echo / Alexa voice assistant devices |
| Primary Case | Kaeli Garner, et al. v. Amazon.com Inc., et al. |
| Case Number | 2:21-cv-00750 |
| Court | U.S. District Court, Western District of Washington |
| Original Filing Date | June 2021 |
| Presiding Judge | U.S. District Judge Robert Lasnik |
| Class Certification | Granted July 7, 2025 — approximately 1.2 million registered Alexa device users |
| Core Allegation | Devices recorded and stored conversations without wake word activation (“false wakes”) |
| Additional Claims | Biometric voiceprint collection (Illinois BIPA); Children’s voice data retention |
| Laws Cited | Washington Consumer Protection Act; Federal Wiretap Act; Illinois Biometric Information Privacy Act (BIPA) |
| FTC Settlement (2023) | Amazon paid $30 million to settle FTC allegations re: children’s voice data and Ring camera privacy |
| Amazon’s Defense | Claims recordings were never exploited commercially or subject to human review |
| Canadian Action | Separate class action filed in British Columbia — all Canadian Alexa users from launch through July 2023 |
| Relief Sought | Injunctive relief, damages, end to alleged recording practices |

The central pillar of Amazon’s defense is that any recordings captured through false wakes were never exploited commercially and were never reviewed by employees. The company pressed precisely those arguments in a November 2022 motion to dismiss and again at class certification. Judge Lasnik rejected them, concluding that class certification cannot be defeated by mere conjecture about individualized issues and that Amazon’s claims addressed only “limited facets” of the allegedly unfair and deceptive conduct. On the question of whether recording someone’s private conversations without consent constitutes a legal harm, the judge was not convinced that the absence of commercial exploitation is the whole story. That is pointed language from a federal court.
A different but connected set of allegations coexists with the Alexa lawsuit. Amazon reached a $30 million settlement with the Federal Trade Commission in 2023 over allegations of two separate privacy violations: first, that the company had violated the Children’s Online Privacy Protection Act by keeping children’s voice recordings and location data for far longer than it had disclosed or that parents had given their consent; and second, that its Ring home camera subsidiary had permitted employees and contractors to access customers’ private video footage without authorization. Amazon was not required to acknowledge any wrongdoing as part of the settlement. However, it came with a clear message from regulators about what constitutes appropriate data practices, and the Alexa voice data lawsuit indicates that this message did not fully address the issue.
It’s worth taking a moment to notice a pattern in the comment sections of legal news websites reporting on this case. Hundreds of people describe the same experiences: Alexa lighting up without prompting; conversations about products that show up hours later as recommendations in their Amazon shopping feeds. One person found recordings of private conversations between herself and her partner saved in her account without her knowledge. Another woke in the middle of the night to find her device talking, apparently responding to a trigger no one had spoken. These reports are anecdotal, and courts weigh evidence very differently than comment sections do, but their volume and consistency are noteworthy. It is difficult to shake the impression that something more than a software bug is at issue here.
The case’s Illinois component adds another layer. In a separate lawsuit, Amazon is accused of violating Illinois’s Biometric Information Privacy Act by collecting users’ voiceprints—essentially, digital voice signatures—without the express written consent the statute requires. BIPA litigation has proven one of the more effective tools available to consumers facing biometric data collection, exposing a number of large corporations to significant liability; in 2021, Facebook paid $650 million to resolve a BIPA class action over its facial-recognition photo-tagging feature. Amazon is surely watching that precedent closely.
Charney Lawyers filed a separate class action in British Columbia on behalf of all Canadian Alexa users from the product’s launch through July 19, 2023. They allege that Amazon collected far more personal data than it disclosed, kept it indefinitely even after users attempted to delete it, and used it to train algorithms and artificial intelligence systems for profit. The Canadian action adds international scope to a case that is, at bottom, about what people understood they were agreeing to when they installed a listening device in their homes—and whether any reasonable person, reading the product description in a holiday sales advertisement, would have predicted that their conversations might end up as training data for a technology company’s machine learning models.
Whether the U.S. case will settle or go to trial remains an open question. Watching this lawsuit develop over years, it is evident that the central issue—who owns the audio of your everyday life—will not be resolved amicably or cheaply. Amazon has placed Alexa in tens of millions of homes. It is now up to the courts to decide what obligations accompany that kind of presence.
