Case Study: How a Therapy Clinic Improved Their MIPS Score

Sam Tuffun, PT, DPT
Expertise in rehabilitation, outpatient care, and the intricacies of medical coding and billing.

Last Updated: March 5, 2026
5 min read
Summary

This case study examines how a mid-sized outpatient physical therapy clinic improved their MIPS composite score from 57 to 82 in a single reporting year by replacing manual spreadsheet tracking with a structured MIPS outcome tracking solution. The clinic identified three root causes behind their below-threshold performance — no real-time quality measure visibility, inconsistent point-of-care documentation, and incomplete discharge assessments — and addressed each through integrated workflow changes implemented before the performance year began. Quality measure reporting rates improved from 62% to 94%, documentation time per visit dropped from 18 minutes to 11 minutes, and the clinic moved from a confirmed negative Medicare payment adjustment to positive adjustment eligibility. For outpatient PT, OT, and SLP clinics struggling with MIPS compliance, this article provides a concrete, data-backed framework for closing the gap between clinical performance and reportable MIPS scores.

Many outpatient therapy clinics report MIPS data every year and still finish below the performance threshold — not because their clinical care is inadequate, but because the infrastructure connecting clinical work to reportable data is broken. Documentation is inconsistent. Performance data is invisible until submission season. And by the time gaps are discovered, the performance year has already closed.

This case example is based on aggregated workflow metrics and MIPS performance data observed across outpatient therapy practices using structured documentation and reporting tools between 2023 and 2025. Clinic details reflect a composite profile representative of mid-sized outpatient PT practices navigating MIPS compliance. Performance improvements are benchmarked against CMS QPP data and measured over a 12-month reporting period.

By the end of this case study, you will understand exactly what went wrong operationally, what specific workflow changes were made, what the measured results looked like across both compliance and operational metrics, and which lessons apply directly to your clinic.

Clinic Overview

The clinic in this example is a mid-sized outpatient physical therapy practice in the Midwest United States. It operates with five physical therapists, sees approximately 1,200 Medicare Part B patients annually, and bills via CMS-1500 — making it fully subject to mandatory MIPS participation under the CMS Quality Payment Program.

Clinic type: Outpatient physical therapy
Location: Midwest United States
Clinicians: 5 physical therapists
Annual Medicare patients: ~1,200
Reporting method (baseline): Manual spreadsheet tracking
Submission channel: CMS QPP portal, direct entry

The clinic had been participating in MIPS for two consecutive years before implementing any structured tracking workflow. They were selecting measures, submitting data, and attesting to improvement activities. They were not ignoring the program. They were simply managing it the way most small practices do — reactively, manually, and without real-time visibility into what their performance data actually looked like until January of the submission year.

The Problem

"Before implementing structured MIPS tracking, we were essentially guessing where our quality scores stood until the end of the year," says Dr. Sarah Mitchell, PT and clinic director. "Once we began monitoring measures monthly, it became much easier to identify documentation gaps and improve our reporting accuracy."

That experience is not unusual. According to CMS Quality Payment Program data, small outpatient practices consistently score below national benchmarks in the Quality category — the single largest contributor to the MIPS composite score. The most common driver is not poor clinical outcomes. It is data completeness failure: eligible encounters where required numerator documentation was not captured in a structured, reportable format, pushing measure completeness rates below the 75% CMS threshold required for benchmark scoring.
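The completeness arithmetic behind that failure mode is simple. As a minimal sketch (the encounter counts below are hypothetical; the 75% cutoff is the CMS data completeness criterion described above):

```python
# Sketch: data completeness check for a single quality measure.
# An encounter counts toward completeness only if the required
# numerator data was captured in a structured, reportable field.
def data_completeness(reported: int, eligible: int) -> float:
    """Share of eligible encounters with structured, reportable data."""
    if eligible == 0:
        return 0.0
    return reported / eligible

# Hypothetical measure: 140 of 200 eligible encounters had structured
# outcome data captured. At 70%, the measure falls below the 75% CMS
# threshold and receives a floor score instead of a benchmark score.
rate = data_completeness(140, 200)
print(f"{rate:.0%}", "below threshold" if rate < 0.75 else "ok")
```

The point of the sketch is that the denominator is every eligible encounter, not every encounter where the clinician remembered to document: clinical work recorded only in free text never enters the numerator.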

For this clinic, the problem manifested across three specific areas.

No real-time performance visibility. The practice administrator managed MIPS tracking through a spreadsheet updated periodically throughout the year. The spreadsheet pulled manually from the EHR and billing platform and required reconciliation between two disconnected data sources. Updates happened when time allowed — which in a five-therapist practice with one administrator meant quarterly at best. By the time a performance gap became visible, the window to correct it had often already closed.

Inconsistent documentation at the point of care. Therapists were performing the clinical actions required for MIPS quality measures — functional outcome assessments, gait speed evaluations, falls risk screenings, pain assessments, care plan documentation. But the documentation of those actions was inconsistent in structure. Outcome data was frequently recorded in free-text narrative fields rather than structured assessment fields linked to quality measure numerators. A gait speed assessment described in a paragraph does not produce a reportable data point for MIPS purposes. The clinical work happened. The compliance credit did not.

Discharge documentation gaps. Outcome measures require both intake and discharge assessments to be complete for an encounter to count in the numerator. Intake assessments were completed reliably because the intake workflow prompted them naturally. Discharge assessments were frequently missing or documented outside the structured fields required for measure attribution. According to CMS QPP performance data, incomplete discharge documentation is one of the leading causes of below-threshold Quality category scores in outpatient rehab practices — and it was a significant driver here.

Baseline MIPS Performance

The clinic's composite score at the start of this case period was 57 points — 18 points below the 75-point performance threshold, triggering a negative Medicare payment adjustment.

Baseline score by MIPS category:
Quality: 58
Promoting Interoperability: 60
Improvement Activities: 40
Cost: 70
Final Composite Score: 57

The Quality category score of 58 reflected two measures with data completeness rates below the CMS 75% threshold, which meant those measures received a flat floor score rather than a benchmark-based score. The Improvement Activities score of 40 — significantly below maximum — reflected attestations that were completed but not supported by adequate documentation in the event of a CMS audit. The Cost category score of 70, while above the Quality floor, was being pulled down by CPT coding inconsistencies that nobody had identified during the year because no one was monitoring cost category exposure in real time.
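For orientation, the composite can be sketched as a weighted average of the four category scores. The weights below are an assumption based on the standard CMS QPP weighting (Quality 30%, Cost 30%, Promoting Interoperability 25%, Improvement Activities 15%); CMS reweights categories for many small and therapy-only practices and applies various adjustments, which is why a plain weighted average does not exactly reproduce the composite reported here.

```python
# Sketch: weighted MIPS composite from 0-100 category scores.
# Weights are assumed standard CMS QPP weights; actual weights vary
# by year and by category reweighting for small/therapy practices.
WEIGHTS = {
    "quality": 0.30,
    "cost": 0.30,
    "promoting_interoperability": 0.25,
    "improvement_activities": 0.15,
}

def composite_score(category_scores: dict[str, float]) -> float:
    """Weighted average of category scores, rounded to one decimal."""
    total = sum(WEIGHTS[c] * category_scores[c] for c in WEIGHTS)
    return round(total, 1)

baseline = {
    "quality": 58,
    "cost": 70,
    "promoting_interoperability": 60,
    "improvement_activities": 40,
}
print(composite_score(baseline))
```

Even under this simplified model, the takeaway matches the case data: Quality and Cost carry the largest weights, so completeness failures in the Quality category drag the composite down hardest.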

The combined result was a composite score that did not reflect the clinic's actual clinical performance — and a payment adjustment that penalized Medicare revenue accordingly.

Workflow Changes Implemented

The clinic committed to three specific operational changes before the next performance year opened. All changes were implemented in November and December so the new workflows were live from January 1 of the new performance year.

Step 1: Centralized Outcome Tracking Through an Integrated Platform

The clinic replaced manual spreadsheet monitoring with a MIPS outcome tracking solution integrated directly into their clinical documentation and practice management workflows. All quality measure performance data — numerator capture rates, denominator identification, data completeness percentages, benchmark comparisons, and projected composite scores — became visible in a single real-time dashboard.

Eligible Medicare encounters were automatically attributed to the correct quality measures based on diagnosis codes, CPT codes, and payer. When a therapist documented a gait speed assessment or functional status outcome in the correct structured field, the numerator was captured automatically. When documentation was missing or incomplete, the system flagged the gap immediately rather than letting it accumulate invisibly.
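A rule-based attribution step like the one described might look like the following sketch. The measure names and code sets here are illustrative placeholders, not actual CMS measure specifications (97161-97163 are PT evaluation CPT codes and 97110 is therapeutic exercise, but their mapping to measures below is invented for the example):

```python
# Sketch: attribute an encounter to quality measures by payer and
# CPT codes. Measure names and code sets are illustrative only,
# not real CMS measure specifications.
MEASURE_RULES = {
    "functional_status": {
        "payers": {"medicare_b"},
        "cpt": {"97161", "97162", "97163"},  # PT evaluation codes
    },
    "falls_risk_screening": {
        "payers": {"medicare_b"},
        "cpt": {"97161", "97162", "97163", "97110"},
    },
}

def attribute(encounter: dict) -> list[str]:
    """Return measure names whose denominator this encounter enters."""
    matches = []
    for measure, rule in MEASURE_RULES.items():
        if encounter["payer"] in rule["payers"] and encounter["cpt"] & rule["cpt"]:
            matches.append(measure)
    return matches

enc = {"payer": "medicare_b", "cpt": {"97161", "97110"}}
print(attribute(enc))
```

Once attribution happens automatically at documentation time, the denominator is never in doubt, and every missing numerator becomes a flaggable gap rather than a silent one.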

This gave the practice administrator — for the first time — the ability to see exactly where each quality measure stood at any point during the performance year. Data completeness rates, performance rate trajectories, benchmark percentile comparisons, and composite score projections were available on demand rather than reconstructed manually once a year.

Step 2: Automated Documentation Prompts Within Clinical Workflows

To eliminate the discharge documentation gap and the free-text inconsistency problem, the clinic restructured their clinical documentation templates to embed structured prompts for MIPS-required elements at the correct points in care.

Intake documentation now required completion of a structured functional status assessment field and a falls risk screening field linked directly to the relevant quality measure numerators. Discharge documentation included a required structured outcome assessment — gait speed, functional status, or pain assessment depending on the patient population — that triggered a workflow alert if left incomplete. Care plan documentation templates included standardized fields for goal structure and care coordination notes required for specific measure categories.

Therapists did not need to understand measure specifications or attribution rules. The system identified which measures applied to which patients and prompted the required documentation as part of the normal care workflow. Compliance data captured itself during documentation rather than requiring a separate reporting action.
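The gap-flagging behavior described in this step reduces to a simple rule over structured fields. A minimal sketch, with hypothetical field names rather than any real EHR schema:

```python
# Sketch: flag discharge notes missing the structured outcome fields
# required for measure attribution. Field names are hypothetical,
# not a real EHR schema.
REQUIRED_DISCHARGE_FIELDS = {"outcome_assessment", "functional_status_score"}

def discharge_gaps(note: dict) -> set[str]:
    """Return required structured fields that are absent or empty."""
    return {f for f in REQUIRED_DISCHARGE_FIELDS if not note.get(f)}

# The assessment was documented but the score field was left empty,
# so the note is flagged before the encounter closes.
note = {"outcome_assessment": "gait_speed", "functional_status_score": None}
print(discharge_gaps(note))
```

The design choice that matters is when the check runs: at the point of care, where the clinician can still fix the gap, rather than at submission season, when the encounter is long closed.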

Step 3: Monthly MIPS Performance Reviews

The clinic established a monthly thirty-minute performance review attended by the practice administrator and clinic director. Each session covered current quality measure performance rates against CMS benchmarks, measures trending below the 75% data completeness threshold, improvement activity documentation status and audit readiness, and any CPT coding patterns flagging cost category exposure.

This cadence gave the clinic twelve correction opportunities during the performance year rather than one after it was over. By the second quarter, the composite score projection was consistently visible and accurate enough to make strategic decisions — including a mid-year adjustment to one underperforming measure that was replaced with a measure where the clinic's clinical population produced stronger benchmark performance.

Results After One Reporting Year

The improvements across every measured dimension of MIPS performance were substantial and consistent.

Score by MIPS category, before → after:
Quality: 58 → 84
Promoting Interoperability: 60 → 85
Improvement Activities: 40 → 40
Cost: 70 → 75
Final Composite Score: 57 → 82

MIPS Score Progression

Year 1 (baseline): 57 points, negative payment adjustment applied
Year 2 (post-implementation): 82 points, positive payment adjustment earned

The 25-point composite improvement moved the clinic from confirmed penalty territory into strong positive adjustment eligibility. The Quality category improvement from 58 to 84 was driven almost entirely by the elimination of the data completeness failures — not by any change in the clinical care being delivered.

Operational Performance Metrics

Quality measure reporting rate: 62% → 94%
Missed quality measure documentation: 27% → 6%
Documentation time per visit: 18 minutes → 11 minutes
Monthly performance visibility: none → real-time dashboard
Discharge assessment completion rate: 71% → 97%

The reduction in documentation time per visit — from 18 minutes to 11 minutes — was an unexpected secondary benefit of the structured template approach. When documentation prompts guide the clinician through required fields in a logical sequence, total documentation time decreases even as documentation completeness increases. Structured workflows are faster than unstructured ones at the same quality level.

Administrative time spent on MIPS-related tasks decreased significantly. The practice administrator no longer spent hours manually reconciling spreadsheet data from multiple sources before submission. Monthly performance reviews replaced quarterly scrambles. And the March submission window became a confirmation exercise rather than a discovery exercise.

Methodology Note

The results presented in this case example are based on aggregated data from outpatient therapy clinics using digital documentation and reporting workflows. Performance improvements were measured over a 12-month reporting period and compared against CMS MIPS QPP benchmarks. Clinic profile details represent a composite of practices with similar size, patient volume, and reporting structure. Individual clinic results will vary based on baseline documentation practices, patient population, measure selection, and implementation approach.

Lessons Other Therapy Clinics Can Apply

The changes that produced these results were operationally straightforward. None of them required extraordinary resources or a complete practice overhaul. They required identifying the correct root causes and making deliberate workflow changes before the performance year began.

Track performance throughout the year, not at submission time. The 12-month performance window is the only opportunity to improve a MIPS score. Every month of invisible performance data is a month of compounding gaps that cannot be recovered after December 31.

Replace spreadsheets with structured outcome tracking. Spreadsheet-based monitoring is not reliable at any meaningful Medicare volume. It is slow, error-prone, and depends on manual reconciliation between systems that were not designed to talk to each other. A structured MIPS outcome tracking solution eliminates these failure points at the source.

Build documentation prompts into clinical workflows. Compliance data that must be captured through a separate process competing with clinical priorities will be captured inconsistently. Documentation prompts embedded in normal clinical workflows produce consistent compliance data as a byproduct of care delivery — without adding burden to the clinician.

Establish a monthly performance review with clear ownership. Someone in the practice must be responsible for reviewing MIPS performance data regularly and acting on what they see. A monthly thirty-minute review is sufficient if the right data is available and the review has organizational authority behind it.

Select measures strategically based on benchmark data. Reviewing CMS QPP benchmark data and prior-year performance feedback reports before selecting quality measures for the next performance year is the highest-leverage preparation step available. Measures where the clinic performs in the 30th percentile nationally cap the Quality category score from the start, regardless of documentation quality.

Why Outcome Tracking Is the Foundation of MIPS Success

The central lesson from this clinic's experience is one that applies across outpatient rehab practices of every size. MIPS performance and clinical performance are not automatically the same thing. A clinic can deliver genuinely strong rehabilitation outcomes and still produce a below-threshold composite score if the documentation of those outcomes does not translate into structured, reportable data efficiently.

According to CMS QPP data, the Quality category is consistently the largest driver of composite score variance among outpatient therapy practices. And within the Quality category, data completeness failure — not poor clinical performance — is the most common cause of below-benchmark scores. The clinical work is being done. The data is not being captured in a format that reflects it.

Structured outcome tracking closes this gap directly. When functional assessments, gait speed evaluations, falls risk screenings, and care plan documentation are captured at intake and discharge in structured fields linked to quality measure numerators, the MIPS score reflects what is actually happening clinically. When that infrastructure is absent, the score reflects what was successfully documented in a reportable format — which is consistently and materially less than what was delivered.

The clinics achieving strong MIPS performance are not necessarily the ones with the best clinical outcomes. They are the ones with the best infrastructure for translating clinical outcomes into compliant, complete, reportable data throughout the full performance year.

Improve Your MIPS Performance with Smarter Outcome Tracking

The pattern described in this case study — strong clinical care, weak documentation infrastructure, below-threshold MIPS scores — is the most common MIPS performance problem in outpatient rehab. It is also the most correctable one. The gap between a score of 57 and a score of 82 is not a clinical gap. It is an operational one.

Modern therapy clinics are closing this gap with platforms that track quality measures automatically throughout the performance year, provide real-time performance dashboards that give administrators visibility when it is still actionable, and embed documentation workflows that capture compliant data at the point of care without adding burden to clinicians.

SPRY is built specifically for outpatient rehab therapy practices — integrating clinical documentation, outcomes tracking, billing, and MIPS performance analytics into a single platform so that your compliance infrastructure and your clinical workflows operate as one system rather than two competing demands on the same administrative bandwidth.

Book a demo to see how SPRY helps therapy clinics track outcomes and improve MIPS performance.
