Artificial Intelligence in Clinical Documentation and Billing: Innovation Requires Oversight

Mar 31, 2026

Written By Elizabeth Cifers

Artificial intelligence has officially entered the exam room. It listens, drafts, suggests, and in some cases, nudges coding decisions before anyone enters the codes for the claim.

Let’s be clear: healthcare documentation is too complex, too regulated, and too financially consequential to be treated as a beta test.

In retina, the patient encounter comprises layered elements, including the patient's complaint, exam findings, diagnostic testing, and evolving and resolving pathology. Treatment decisions can change based on diagnostic test findings, subtle changes in vision, and the physician’s experience and knowledge.

AI offers operational promise, but it also introduces measurable compliance risk. The question is not whether AI will influence documentation and billing workflows (it already does), but whether practices are implementing it with appropriate oversight or simply hoping it behaves.

Hope is not a compliance strategy.

AI Is Already Writing the Chart

Ambient AI scribes are creating visit notes in real time, while coding platforms are analyzing documentation even before the ink is metaphorically dry. Some systems now suggest CPT levels and diagnosis codes as the physician is still dictating.

When it works well, AI can:

  • Reduce documentation fatigue
  • Improve note structure
  • Accelerate charge capture
  • Highlight relevant prior data
  • Provide coding prompts

In a retina clinic, where physician time is the limiting factor, operational efficiency matters.

But completeness should never be mistaken for correctness.

AI can generate highly polished documentation that is slightly off in nuance. In retina, “slightly off” matters. Trace subretinal fluid versus worsening edema. Trace subretinal fluid versus clinically significant recurrence. Observation versus treatment. Those distinctions are not academic; they directly impact medical necessity and reimbursement.

And here’s the part that doesn’t change just because the software is impressive: the physician is still the author of the chart.

CMS Has Already Weighed In

CMS has already addressed AI in documentation workflows, and the message is refreshingly straightforward.

In its July 2025 Medicare Learning Network (MLN905364) guidance on signature requirements, CMS states that when a scribe, including AI technology, is used, the physician must sign the entry to authenticate the care provided or ordered. There is no requirement to identify who or what transcribed the note.

In other words, even if AI drafts it, the physician still owns it.

CMS has also emphasized that technology-enabled care, including AI-supported tools, requires safeguards: privacy protections, human oversight, and monitoring for accuracy and safety. Assistance is permitted, but the physician remains responsible.

Now, here is where it gets interesting.

CMS has issued parallel guidance to Medicare Advantage plans on the use of algorithms and AI in coverage determinations. Plans may use algorithms to support decision-making, but those tools cannot be the sole basis for denying coverage. Determinations must still be grounded in the individual patient’s clinical circumstances, applicable Medicare coverage criteria, and documented medical necessity.

Translation: Payers do not get to say “the algorithm denied it” any more than physicians get to say “the algorithm wrote it.”

Both sides are held to the same core standards:

  • Individualized clinical consideration
  • Compliance with Medicare rules
  • Human oversight of automated tools
  • Accountability for the final decision

There is no double standard here. Physicians must ensure that documentation supports medical necessity before billing. Payers must ensure that coverage decisions reflect Medicare rules and patient-specific facts before denying.

Technology may assist the process, but it does not replace professional judgment.

The symmetry matters.

When CMS makes clear that algorithms cannot independently justify a denial, it reinforces the broader principle already familiar to physicians: automated output does not equal compliant decision-making.

Which brings us back to the practical reality in your clinic.

If AI drafts the note, you sign it.

If AI suggests the code, you validate it.

If the claim is submitted, it must be supported.

On both sides of the claim, CMS has been consistent: human accountability remains firmly in place.

Oversight Is the Strategy

AI is not a “turn it on and walk away” solution. Its performance depends on workflow design, physician editing habits, and whether anyone is actually reviewing the output beyond appreciating how polished it sounds.

Validation must be intentional: periodic audits of AI-generated notes, spot-checks of suggested coding levels, monitoring of denial trends, and review of documentation for inflated risk language or unsupported medical necessity.

And here is the part that should feel familiar.

One of the long-standing criticisms of EHRs has been cloned documentation: notes that look impressive but read the same from patient to patient. AI can reproduce that problem at a faster pace. Patients are variable, so the documentation should be, too.

Longer notes do not necessarily mean stronger notes. Excess detail can obscure the clinical reasoning that supports code selection. Clarity remains more defensible than volume.

Before signing the chart, ask one simple question: Does this note clearly reflect what was unique about this patient, on this day, and why this decision was made?

If the answer requires interpretation, start editing.

Oversight is ultimately a leadership responsibility. Technology will move faster than regulations. The practices that benefit most from AI are not those that trust it blindly but those that supervise it deliberately.

A Measured Path Forward

AI can meaningfully reduce administrative friction and support physician efficiency when implemented deliberately. Without structure, it can just as easily introduce documentation drift, coding ambiguity, and audit exposure.

In many ways, AI brings us back to fundamentals:

  • Clear documentation.
  • Intentional coding.
  • Active oversight.
  • Ongoing audit awareness.

Treat AI as an operational partner, not a clinical decision-maker, and definitely not a compliance shield. Even in an AI-assisted environment, one principle remains unchanged, and it is neither new nor negotiable: If it is billed, the record must clearly support it.

AI is not going away, so oversight cannot either. If your practice is adopting ambient AI, automated coding prompts, or AI-assisted documentation tools, now is the time to evaluate structure, validation processes, and compliance safeguards.

Elizabeth works exclusively with retina practices to assess AI documentation workflows, review coding integrity, and build oversight frameworks that safeguard reimbursement and professional accountability.

Innovation is welcome. Unsupervised innovation is costly.

If you would like a strategic review of your AI documentation and billing processes, contact ECC.
