Source Article – National Law Review | August 21, 2025 “HIPAA Compliance Risks with AI Scribes in Health Care: What Digital Health Leaders Need to Know”
With the workforce shortages facing healthcare, tools such as AI scribes, which streamline encounter documentation and reduce clinician burnout, are growing more attractive all the time.
AI scribes are proving to save clinicians valuable time in documenting patient encounters, but any vendor selected for implementation must be vetted thoroughly to ensure that the protected health information (PHI) recorded is accurate, secure, and handled in a HIPAA-compliant manner. This article in the National Law Review outlines potential HIPAA risks associated with using AI scribes:
- Training AI on PHI Without Proper Authorization – If the AI tool is trained using prior encounter notes containing PHI without patient authorization, there is a potential risk of a HIPAA violation.
- Improper Business Associate Agreements (BAAs) – The BAA or service contract must clearly define the data being accessed, how it is being used and stored.
- Lack of Security Safeguards – AI scribes are expected to be targets of security breaches due to the nature of the data they capture, putting the healthcare entity at risk of fines, lawsuits, and reputational damage.
- Model Hallucinations and Misdirected Outputs – AI scribes come with the risk of fabricating clinical information or even attributing information to the wrong patient. Scribe information should always be reviewed by the clinician for accuracy.
- De-Identification Fallacies – Scribe vendors often fail to strictly follow either of the two permissible methods of de-identification under HIPAA at 45 C.F.R. § 164.514: Expert Determination and Safe Harbor (see the sketch after this list).
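
To make the de-identification point concrete, below is a minimal, hypothetical Python sketch of Safe Harbor-style redaction applied to a scribe note before any secondary use. The `redact_note` helper and its regex patterns are illustrative assumptions, not part of the article or any vendor's product, and they cover only a handful of the 18 identifier categories that the Safe Harbor method (45 C.F.R. § 164.514(b)(2)) requires removing; a production approach would need far broader coverage or Expert Determination.

```python
import re

# Hypothetical, minimal illustration of Safe Harbor-style redaction.
# The actual Safe Harbor method requires removal of 18 categories of
# identifiers; this sketch handles only a few obvious patterns and is
# NOT sufficient for real HIPAA de-identification.
REDACTION_PATTERNS = {
    "[PHONE]": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "[SSN]": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "[DATE]": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "[MRN]": re.compile(r"\bMRN[:\s]*\d+\b", re.IGNORECASE),
}

def redact_note(note_text: str) -> str:
    """Replace a handful of identifier patterns with placeholder tokens."""
    redacted = note_text
    for placeholder, pattern in REDACTION_PATTERNS.items():
        redacted = pattern.sub(placeholder, redacted)
    return redacted

if __name__ == "__main__":
    sample = (
        "Patient seen on 03/14/2025, MRN 1048293. "
        "Follow-up call to 555-867-5309; reach caregiver at jane.doe@example.com."
    )
    print(redact_note(sample))
```

The point of the sketch is the gap it exposes: pattern-based scrubbing like this is exactly the kind of partial measure the article warns about, since it neither satisfies Safe Harbor's full identifier list nor substitutes for an expert's statistical determination.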
For further details on HIPAA compliance and mitigating risk related to AI scribe implementation, see the article linked above.