Want $20 for each Apple device that might have eavesdropped on you? The Cupertino-based tech giant is ready to shell out up to $95 million to settle claims that Siri was a bit too eager to listen in on private conversations.
Apple hasn’t admitted wrongdoing, but agreed to compensate U.S. users up to $100 per household, according to court documents filed December 13, 2024, in Oakland, California.
The settlement addresses allegations that its voice assistant recorded conversations without the “Hey Siri” wake word and potentially stored and shared this data with advertisers—something Apple has denied on previous occasions.
Here’s a quick guide to claiming your share of the $95 million settlement.
How to get paid
To qualify for the settlement, you’ll need to be a U.S. resident who owned one or more qualifying devices between September 17, 2014, and December 31, 2024. The process requires submitting a claim by May 15, 2025, and verifying under oath that Siri activated without your permission.
The website for submitting claims is not yet active. Users will need to watch for news of its launch and file through the official page once it goes live—which should happen within 45 days.
The settlement covers a wide range of Apple devices, including iPhone 6 and newer models, iPads released since 2014, all generations of the Apple Watch, the HomePod and HomePod Mini, as well as MacBooks and iMacs manufactured since 2014.
Under the settlement terms, users can receive $20 per qualifying device, with a maximum payout of $100 per household for up to five devices. The final payment could increase if fewer claims are filed than expected. The legal team representing the plaintiffs will receive approximately $30 million from the settlement fund.
The claims process begins with the launch of the official settlement website, expected by February 2025. Users should gather their device serial numbers or proof of purchase beforehand. Once the site launches, claimants can complete the online form, submit any requested documentation, select their preferred payment method, and submit their claim before the May 15 deadline.
Hey Siri, stop listening
The lawsuit stems from a 2019 exposé by The Guardian, which revealed that Apple contractors regularly accessed private Siri recordings. According to the claims, contractors reported hearing medical appointments, business deals, and intimate moments—recordings that were also allegedly shared with advertisers.
Lead plaintiff Fumiko Lopez’s experience highlights the potential privacy breach. As reported by the BBC, shortly after discussing Air Jordan shoes at home, she and her daughter noticed targeted advertisements for the exact models they mentioned. Another plaintiff reported seeing ads for specific medical treatments shortly after discussing them with their doctor.
“Apple has at all times denied and continues to deny any and all alleged wrongdoing and liability,” the court filing states. The company maintains that Siri data collection serves only to improve the service and remains anonymized.
Besides the $95 million payment, the settlement also requires Apple to confirm the permanent deletion of all Siri audio recordings collected before October 2019.
This settlement arrives amid growing concerns about AI-powered voice assistants, and AI in general. Similar lawsuits have targeted other tech giants, with Google facing a parallel class action suit, also in California.
“Plaintiffs in the lawsuit allege that Google Assistant can activate and record communications even when a user does not intentionally trigger Google Assistant with a hot word, like ‘Okay Google,’ or manually activate Google Assistant on their device,” the official site for the class action lawsuit reads.
Amazon agreed in 2023 to pay $25 million for similar privacy violations tied to its Alexa devices, with the FTC’s statement noting that its “complaint alleges that Amazon retained children’s voice recordings indefinitely by default” in violation of the Children’s Online Privacy Protection Act.
Of course, all of these companies have previously claimed to respect and protect their users’ privacy. This is especially important considering that all of them are developing their own generative AI models to improve their user experience—and that requires vast amounts of data.
If you want to be extra careful and protect your privacy, you can prevent Siri from automatically activating—or stop using AI assistants altogether. Not ideal, but that’s the world we live in.
Edited by Andrew Hayward