Hidden Trade-Offs: What Gets Lost in The Conversation About AI in Clinical Documentation
- Dr. Annabelle Frazier

Your last client walks out. You sit down to chart, and you know you're looking at another hour of typing tonight. Or tomorrow morning before your first appointment. Or both.
Enter AI-powered documentation: software that listens to your session, generates structured notes, and deposits them into your EHR while you're already seeing the next client. Sounds perfect. And it seems like therapy and psychology practice owners agree. A recent report indicated that 65% of psychology practices intended to integrate AI tools in the next year.
But before you sign up for that ambient listening scribe, it's worth asking what you're actually getting… and what you might be giving up.
AI in Documentation: What's The Appeal?
Let's be honest: the admin burden is real. Therapists spend entire evenings and weekends on notes and other documentation. The promise of saving five or more hours per week (some vendors claim as many as thirty!) isn't just compelling marketing copy. For some clinicians, it's the difference between a manageable schedule and burnout.
Beyond the time savings, AI tools promise several appealing features. You stay present with clients instead of half-listening while scribbling notes. Your documentation comes out consistent and structured. Your EHR gets populated almost instantly, and some tools even promise superior clinical quality. It all feels like finally getting your evenings back.
For someone drowning in paperwork while trying to do good work, that relief is compelling. More than compelling. Necessary-feeling.
What Gets Missed in The Conversation
Here's the piece that keeps getting glossed over in the marketing materials: writing your own clinical notes is not just administrative work. It's where you actually process the session.
When you sit down to write your session note, you're translating what happened into specific clinical terms. You're deciding what mattered. You're noticing patterns. You're integrating what you heard and saw into your clinical understanding of that person. That act of translation is clinical thinking. It's how you develop your case formulation. It's how you catch things you might have missed in the moment.
When AI generates your notes, you become someone reviewing and editing rather than someone thinking. You check for typos. You maybe rework a sentence or add a detail. But the deep work of reflecting on what you heard and what it means? That gets skipped.
It's not that AI is necessarily bad technology. It's that you're trading clinical reflection for efficiency. And clinical reflection is kind of central to being a therapist.
Recording Changes the Room
There's another piece worth naming. Therapy works partly because it's a human-only space. Private. Confidential. Just you and your client.
Recording introduces a third presence into that relationship. Most clients will consent. Some will refuse, while others will consent but feel the shift in what they're willing to say. When someone is processing trauma or shame or fear, that subtle change in safety matters. They might talk less freely. They might edit what they say. They might decide not to go as deep.
And then there's the data question: where does that audio actually go? How long does it stay? Is it used to train the model? If the vendor gets acquired, what happens to your clients' recorded sessions? Without explicit contractual language spelling this out, you're making an assumption about your data security.
It sounds like a technical detail. But it's not. It's about what you're promising your clients when you say this space is confidential and asking them to allow an AI tool in the room.
What These Tools Are, and What They Do
The variety of AI tools for documentation has grown immensely over the past couple of years. Ambient listening tools, like Freed, Suki, Berries, or Abridge, run continuously, transcribe your session, and auto-generate a note you review before signing.
Dictation-based tools (AutoNotes, TherapyNotes Fuel), on the other hand, let you summarize your session verbally after it ends. The AI structures that summary into a note. It's faster than typing but still requires you to remember key details. Then, there are EHR-integrated systems, such as the SimplePractice Note Taker, which live in your existing platform and use recording or dictation.
Finally, purpose-built specialty platforms (e.g., Twofold, Supanote, Mentalyc, Upheal) have emerged, and these market themselves as designed specifically for mental health providers. Each claims to have strong privacy standards, clinical language recognition, and handy templates.
The critical differences are in security, accuracy, and consent handling. Some require proper Business Associate Agreements. Others skirt around them. Some delete audio immediately; others keep it longer. These details matter because they're your liability if something goes wrong.
That All Sounds Great. So, What’s The Problem?
Oh, there are many. Let me tell you.
First, there are the HIPAA issues. Your AI tool records Protected Health Information. That means it needs actual security: end-to-end encryption, access controls, secure storage, and a signed Business Associate Agreement with the vendor. Many vendors won't do this. Some avoid answering the question entirely. That's a huge red flag. Even if a vendor claims HIPAA compliance, you're responsible for making sure it's real. An audit or licensing board complaint will hold you accountable for your choice, not the vendor.
But even if security can be verified, you’re often left with accuracy issues.
AI documentation tools train on thousands of therapy notes. Sounds good until you realize those notes carry the same blind spots as the clinicians who wrote them: cultural stereotypes, missing context, lack of nuance, and poor clinical writing (because let's face it, most clinicians haven't been taught how to write clinical records well).
Recall your clinical internship and the writing you and your peers did at that time. Would you copy another intern's notes to save time? That's essentially how AI learned to write its notes, and it treats those notes as the model of what a good note is. And don't even get me started on what AI considers a quality assessment or intake.
Practitioners and researchers report that AI-generated notes frequently misinterpret tone, include interventions you never used, omit clinically relevant details, and produce verbose summaries that don't actually save time. The editing requirement eats up most of your supposed time savings.
And biases also get baked in. Your AI tool was trained on data written by thousands of clinicians. That data reflects their biases. If you use a model trained on a particular set of clinical notes, you're likely reproducing patterns embedded in those notes. A note that mischaracterizes a client due to cultural bias is your documentation error, not the vendor's. And if your client reviews their records and sees bias, that becomes a discrimination claim against you.
Perhaps the scariest part of the trend, to me, is that you can't know where your data goes. Your session gets recorded. The note gets generated. The audio disappears (or so you're told). The transcript disappears (or so you're told). But what about the processing data? The training feedback? The parameter adjustments? How long do those persist? What happens if the company is acquired? If law enforcement (or some other federal authority) requests the data for some reason? If the company decides to build a therapy chatbot based on you and your work? Without explicit contractual language spelling all this out, you're essentially guessing. I know – no one actually reads the whole contract or the terms of service. But in healthcare practice, guesses usually go badly.
If an AI note is inaccurate, incomplete, or misleading, it's still your clinical record. Your liability. Your licensing risk. Courts and licensing boards assess the whole record, and "the AI wrote it" won't save you in a complaint or lawsuit. You're responsible for everything the tool generates when you sign your name.
There Are Alternatives to AI-Generated Documentation
If the last few paragraphs got you seriously considering reading those terms of service, you can stop now. You're not stuck between burnout and compliance risk. Real alternatives exist.
You could…
Use one of the many existing clinical record templates offered by reputable clinical documentation training and coaching providers. Most of these folks have good advice to offer, along with templates that minimize audit risk while increasing efficiency.
Create your own documentation template. It can be as simple as an outline for each section of a progress note, and a list of the interventions you most often use. Or, you can go all out, creating checkboxes (yes, they’re fine to use!), and short narrative sections for relevant elements. Regardless of the level of complexity, templates cut decision fatigue without requiring AI. You still write, but you're not starting from scratch every time.
Use human dictation instead. Record yourself summarizing your session immediately afterward while it's fresh. You keep full control of interpretation. Your clinical thinking stays yours. Speech-to-text tools (like Nuance Dragon software) have been used as accessibility accommodations in counseling agencies for years and have an established record of safety.
Or, You Could Just Address The Root Cause of Your Documentation Struggle
I know – easier said than done. But still… If documentation is such an unsustainable burden, the root cause might be one (or more) of the following three issues:
Caseload and scheduling madness – you have too many clients, or your schedule has no time built in for documentation between sessions. If yours is a time management and scheduling issue, consider whether you can schedule differently, or whether you can reduce the number of clients you see. Even a small change (like going from 60-minute to 55-minute sessions) can make a big difference.
Writing skill deficits, and so much fear – most therapists were never actually taught to write efficient, high-quality clinical notes (or any other documentation, for that matter!). You might not know what’s relevant, what to skip, or how to structure for clarity. So, you might find yourself writing extremely long notes (pages upon pages) or struggling to write anything at all. You might even theoretically know what should be in a note but feel paralyzed by anxiety every time you sit down to write one. These problems are best remedied with good training and deliberate practice, guided by a supportive consultant or supervisor. Yes, you should have had that during your internship. No, it's not too late.
A (misplaced) disdain for documentation – you may have been taught that documentation cannot be a core part of your actual therapy work. You may have been told that billable documentation requires you to “fake it”: somehow, only CBT interventions (for example) belong in billable documentation, and if, god forbid, you practice in a more psychodynamic style, you must translate the work into CBT interventions or whatever else is in the intervention bank. None of it feels authentic to what happened in the session, and taking the time to write it feels worse than wasting time – it takes too long, it doesn’t help you think, and it’s a lie. But that idea is a myth. You absolutely can write high-quality, efficient, psychodynamic billable records, and you really should, if that’s the work you’re doing. Whoever told you that your kind of work didn’t “count” just didn’t know how to write about your work. You could learn how to write records that actually reflect what you do. A bit of training can get you there.
It doesn’t have to be this way.
AI documentation promises time savings. For some clinicians in some situations, it may even deliver. But that time often comes at a cost to clinical reflection, data security, and the integrity of the therapeutic relationship. Amazingly, it’s a trade-off many good therapists are willing to make. The thing is, you don't have to accept these costs as necessary trade-offs.
Not every record has to be time-consuming. Some perfectly reasonable billable notes might be four sentences long. Yes, four sentences. I’ve seen a few good ones that were even shorter. If you’re sitting there wondering how anyone can document an entire session in four sentences, you might be over-writing (see the three core issues above).
The answer isn't handing your work over to an algorithm. It's re-shaping the way you think about (and operate) your practice, so documenting the work you did feels like a meaningful part of the work itself. A tool for processing and planning, not a box to check; an important post-session moment to breathe and reflect, and not just another administrative burden.