Burnout · 8 min read

The Two-Hour Tax: A Family Doctor's Honest Take on AI Notes

ActiveScribe Team

Notes from one of our early pilot physicians, edited and published with permission. — The ActiveScribe Team

There's a number that gets quoted in every burnout study, and it's the number that finally made me pay attention to AI scribes. For every hour I spend in front of a patient, I used to spend roughly two hours documenting that visit. Some of that was during the day, between rooms, with a patient already waiting. A lot of it was at night, on my couch, after my kids went to bed. The medical literature has a nickname for that — "pajama time" — which I find both accurate and infuriating.

I want to write honestly about what changed when I started using ActiveScribe in clinic, and what didn't change, because the honest version is more useful than the marketing version.

What changed: the room

The first thing that changed wasn't documentation. It was eye contact.

I had spent eleven years training myself to type on a laptop while a patient told me their symptoms. I was good at it — I could type a chief complaint while nodding at the right moments — but I knew, every visit, that I was performing two jobs badly instead of one job well. The day I stopped typing and just listened, the visit got better in ways I hadn't planned for. Patients told me more. I caught things I would have missed. I left the room knowing what was going on.

That alone would have been worth the cost.

What changed: the note

ActiveScribe transcribes the visit in real time and produces a SOAP note that's ready for me to review by the time the patient is in the hallway. The note is in my style — not because it was trained on my data, but because I uploaded a few of my old notes when I onboarded, and the system uses them as exemplars to match my voice. My SOAPs sound like I wrote them. The abbreviations are mine. The section headers are mine. The verbosity (or lack of it) is mine.

The notes also have something I didn't expect: citation badges. Every claim in the generated note has a small marker I can click to jump to the exact part of the transcript where the patient said it. If a note says "patient denies suicidal ideation," I can verify in two seconds that the patient actually said that, in those words, at that moment in the visit. That feature alone moved me from "I trust this with caveats" to "I trust this."

What didn't change: my responsibility

Here's the part that gets glossed over in every AI scribe pitch: I still review every note. ActiveScribe writes the draft fast, but I'm still the doctor. The College of Physicians and Surgeons doesn't care that an AI generated my chart. If I sign it, it's mine. If it's wrong, that's on me.

What ActiveScribe changed is the kind of work I do on the chart. Before, I was constructing the note from memory, which meant the bottleneck was my recall. Now I'm reviewing a draft against a transcript, which means the bottleneck is my attention. Reviewing is genuinely faster than constructing — for me, by a factor of about four. But it's not free, and it shouldn't be.

I've seen colleagues skim AI-generated notes and click "approve" without reading. That's malpractice with extra steps. The tool doesn't absolve you of judgment. It frees up the time you used to spend on mechanics, so you can spend it on judgment.

What didn't change: the visits where AI doesn't help

Some visits don't benefit from AI scribing at all. A complex psychiatric assessment with a patient in crisis is not the moment to be thinking about whether your microphone is picking up the conversation. A pelvic exam isn't a moment when anyone wants a recording running. A hostile patient who's asked me to put away "anything that's listening" gets a closed laptop and a pen.

The right framing isn't "AI scribe replaces note-taking." It's "AI scribe is a tool I reach for when it helps and put down when it doesn't."

The honest math

I haven't measured pajama time scientifically, but it has dropped to roughly 20% of what it was. I get home and I'm a parent again. That's the change that matters. Burnout isn't only about documentation — it's also about staffing, panel size, billing complexity, on-call, and the slow grind of running a practice in a system that wasn't designed for the work we actually do. AI scribes don't fix any of that.

But they fix the documentation part. And the documentation part was, for me, the difference between a job I could do for thirty more years and a job I was going to leave.

What I'd tell a colleague who's skeptical

Try it for two weeks. Pick the visits where it's appropriate — routine follow-ups, well-baby visits, chronic disease management. Not the hard ones, not at first. Read every note carefully, not because the tool is going to lie to you, but because you need to develop a feel for how it can be wrong, the same way you developed a feel for how lab values can be wrong.

If it works for you, you'll know within ten visits. If it doesn't, walk away. The cost of trying is one weekend of onboarding. The cost of not trying — if it would have worked for you — is measured in years of your life.

Try it for two weeks

Join the waitlist for early access to ActiveScribe.
