Jose Saucedo arrived for a routine physical examination and, like most people after a checkup, thought little of it when he left. It wasn’t until he was browsing his medical records on the patient portal that he noticed something odd. According to the documentation, he had been informed that his visit was being recorded, and he had given his consent. He hadn’t. Not that he remembered. Not in any way resembling a real conversation about it.
What followed was a lawsuit that, depending on how the courts resolve it, could change how clinics and hospitals around the nation think about using AI tools in exam rooms. The complaint, filed in San Diego Superior Court in November 2025, names Sharp HealthCare, one of the largest health systems in Southern California, along with several of its affiliated medical groups. The AI tool at the heart of the case is Abridge, a platform from a Pennsylvania-based company that records patient-provider conversations, creates a transcript, and uses that audio to draft clinical notes for doctors to review and sign. More than 200 major health systems, including Johns Hopkins and the Department of Veterans Affairs, currently use Abridge’s technology. That is a lot of exam rooms. And this case suggests that at least some of those systems did not work out the consent piece before turning on the microphones.
The mechanics of what allegedly happened are worth examining, because they show how quickly these tools were adopted and how slowly the paperwork caught up. According to the complaint, a clinician’s microphone-enabled device recorded the audio of Saucedo’s visit without his knowledge and sent it to Abridge’s cloud system, where it was processed and used to create a clinical note. When Saucedo later contacted Sharp to ask that the recording be deleted, he was told the vendor retains audio for about 30 days and the file could not be deleted immediately. Instead, Sharp allegedly offered to edit or remove the AI-generated note, which is not the same thing as erasing the recording.
Key Information: Sharp HealthCare AI Scribe Lawsuit
| Field | Details |
|---|---|
| Defendant | Sharp HealthCare (San Diego, California) |
| Affiliated Defendants | Sharp Rees-Stealy Medical Group, SharpCare Medical Group, Sharp Community Medical Group |
| Plaintiff | Jose Saucedo |
| AI Tool Involved | Abridge — ambient clinical documentation platform |
| Abridge Developer | Pennsylvania-based; led by UPMC cardiologist Shiv Rao, MD |
| Lawsuit Filed | November 26, 2025, San Diego Superior Court |
| Alleged Violations | California Invasion of Privacy Act (CIPA); Confidentiality of Medical Information Act (CMIA) |
| Sharp-Abridge Partnership Announced | April 2025 |
| Incident Date | July 2025 — routine physical exam at Sharp Rees-Stealy clinic |
| Potential Class Size | More than 100,000 Sharp patients |
| Relief Sought | Damages, medical record corrections, court order blocking AI tool use without proper consent |
| Abridge Deployment | Used at 200+ health systems including VA, Johns Hopkins, University of Illinois Chicago |

On this point, California’s privacy law is clear. Under the California Invasion of Privacy Act, all parties must consent before a confidential conversation is recorded. Not implied consent. Not consent tucked away in the intake paperwork. Everyone involved. The complaint claims that Sharp not only failed to obtain that consent but also added boilerplate language to patient charts stating that patients had been informed and had consented. The plaintiff calls this language blatantly false. That detail may be more damaging than the recording itself. Failing to obtain consent is a compliance violation. Documenting consent that never occurred is something else entirely.
It’s hard to ignore the timing: Sharp announced its partnership with Abridge in April 2025, three months before Saucedo’s visit. By any reasonable reading, the rollout moved faster than the consent infrastructure required to support it. Sharp is not alone in that pattern. AI documentation tools have been embraced enthusiastically across the healthcare industry, driven by real problems: physician burnout, documentation burdens, and the amount of time clinicians spend staring at screens instead of patients. Companies like Abridge make a strong case, and when the technology is deployed correctly, the results appear significant. UW Health in Madison, Wisconsin, began piloting the technology in 2024 and has since released an open-source implementation guide covering precisely the consent procedures Saucedo’s case seems to have lacked. A family medicine doctor there described pausing the recording mid-visit when a patient had reservations, calling the experience empowering for the patient. It isn’t hard. But someone has to build the workflow before the device goes into the pocket.
The lawsuit seeks class-action certification covering any Californian whose medical visit with a Sharp provider was recorded without consent on or after April 1, 2025. The potential class may exceed 100,000 patients. CIPA’s statutory damages of $5,000 per violation make the math sobering for health systems, and frankly for any customer-facing company using voice-enabled AI. Law firms tracking what is being called “digital wiretapping litigation” note that the legal theories in this case (unauthorized recording, improper third-party access, false consent documentation, and inadequate deletion workflows) are not unique to healthcare. Retailers, financial services firms, and hospitality brands using call-center AI, chatbot summarization, or conversation analytics face the same structural risks. But the stakes are felt most keenly in the exam room.
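For a back-of-the-envelope sense of that exposure, the statutory figures multiply out as follows. This is illustrative arithmetic only: it assumes, hypothetically, one recorded visit per class member at the class size alleged in the complaint, and says nothing about certification, proof, or how violations would actually be counted.

```python
# Illustrative only: CIPA statutory damages (Cal. Penal Code § 637.2)
# are $5,000 per violation. Assumed: one violation per class member.
CIPA_STATUTORY_DAMAGES = 5_000   # dollars per violation
alleged_class_size = 100_000     # "more than 100,000 Sharp patients"

minimum_exposure = CIPA_STATUTORY_DAMAGES * alleged_class_size
print(f"${minimum_exposure:,}")  # → $500,000,000
```

Half a billion dollars as a floor, before any trebling of actual damages, is why litigators describe this wave of cases as structurally threatening well beyond healthcare.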
According to David Simon, an associate professor of law at Northeastern University School of Law, cases like this one must navigate existing privacy and consumer protection laws because states and the federal government have been slow to develop AI-specific disclosure requirements for medicine. That gap cuts both ways. In the absence of clear federal rules, health systems are left to interpret California’s all-party consent law as best they can. Some take great care. Others seem to move quickly and bet that the technology’s benefits will outweigh the administrative burden of actually telling patients their conversations are being recorded.
Sharp has declined to comment on the specifics of the lawsuit, saying only that patient safety and privacy remain organizational priorities. Abridge has not publicly addressed the allegations. Whether the class will be certified, and how the courts will ultimately rule, remain open questions. But the lawsuit has already forced into public view a question that should have been asked before the first microphone was turned on. In a doctor’s exam room, where people say things they say nowhere else, the expectation of privacy is not a technicality. It’s the whole point.
