The Unexpected Reason Measurement-Based Care Fails

It's not a platform failure, clinician resistance, or bad training. When Measurement-Based Care (MBC) falls short, it's often a quiet "drift." Learn what clinics miss: how administrative load, lack of workflow integration, and unclear responsibility for navigation cause valuable data to become "flotsam and jetsam" instead of guiding treatment decisions.

1. How Clinics Actually Arrive at Measurement-Based Care

When mental health clinics adopt Measurement-Based Care, it is usually not because someone got excited about dashboards or thought, "You know what would be fun? More spreadsheets."

Most clinics arrive at Measurement-Based Care for much more ordinary reasons.

Someone in leadership believes it might help improve client care. Or funders begin asking for more data-driven practices because they believe it will help improve client care. Sometimes it is both.

So the clinic begins measuring.

Questionnaires are assigned.
Measures are collected.
Data begins to accumulate.

And then something unexpected happens.

The clinic does not just have client data anymore. It now has data about the client data.

What measures were assigned?
When were they supposed to be completed?
Which ones are missing?
Which clients did not finish them?
Which clinicians forgot to review them?

Suddenly the data is breeding.

You have client data, data about the client data, and if you are anything like me at one point in my career, a constellation of Post-it notes reminding you about the data and the data’s data.
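The bookkeeping questions above are, in effect, a small data model. As a minimal sketch of what a clinic's "data about the data" might look like (every name and field here is hypothetical, illustrating the idea rather than any real platform):

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record of one assigned measure; field names are
# illustrative, not taken from any real MBC platform.
@dataclass
class Assignment:
    client: str
    clinician: str
    measure: str          # e.g. "PHQ-9"
    due: date
    completed: bool = False
    reviewed: bool = False

def overdue(assignments, today):
    """Measures past their due date that the client never finished."""
    return [a for a in assignments if not a.completed and a.due < today]

def unreviewed(assignments):
    """Completed measures no clinician has looked at yet."""
    return [a for a in assignments if a.completed and not a.reviewed]

roster = [
    Assignment("client-01", "dr-lee", "PHQ-9", date(2024, 3, 1), completed=True),
    Assignment("client-02", "dr-lee", "GAD-7", date(2024, 3, 1)),
]
today = date(2024, 3, 8)
print([a.client for a in overdue(roster, today)])   # client-02 never finished
print([a.measure for a in unreviewed(roster)])      # PHQ-9 awaiting review
```

Even this toy version makes the point: each question on the list is a query someone has to run, and every query answered becomes another item someone has to act on.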

At some point, someone in the clinic discovers a platform that promises to help manage all of this.

And it does.

For a while.

Measures get organized. Reports appear. Workflows become clearer. Staff attend training to help everyone learn how to use the new system.

For a moment, it feels like the clinic has finally gotten its arms around the tidal wave of information.

And then something subtle begins to happen.

The platform is still there.
The measures are still being assigned.
The reports are still technically available.

But the system no longer seems to be doing what everyone hoped it would do.

At this point, people start looking for explanations.

Maybe the platform is not good enough.
Maybe clinicians are not using it correctly.
Maybe the training was not sufficient.

But the real problem is usually none of those things.

What is missing is something quieter.

Complex systems rarely collapse suddenly.

They drift.

2. What the Research Actually Shows

Researchers have been studying outcome monitoring in mental health care for decades.

Do things go as planned?

Boswell and colleagues examined the implementation of routine outcome monitoring in clinical practice and found that introducing outcome measures and training clinicians to use them is only the beginning. Sustaining routine outcome monitoring requires systems that support its ongoing use in practice [1].

Researchers studying measurement systems have emphasised that outcome data improves care only when it functions as part of a measurement feedback system, where information is delivered to clinicians in time to influence treatment decisions [2].

Psychotherapy research has also shown another pattern.

Clinicians are not always able to detect when treatment is going off course without structured feedback systems. Studies of outcome monitoring have shown that clinicians often underestimate deterioration in treatment progress unless systematic feedback is provided [3].

And when feedback systems are implemented correctly, outcomes can improve.

Randomised trials have shown that providing clinicians with systematic feedback about client progress can significantly improve treatment outcomes, particularly for clients who are not responding as expected [4].

Taken together, this research suggests something important.

Measurement-based care can work.

But it works only when the information generated by the system actually reaches the people making decisions about care.

3. The Ship at Sea

Once a ship is out at sea, invisible currents below the surface begin to catch it. A clinic navigating the day-to-day waters of providing mental health care is no different.

Different parts of the crew are responsible for different stations.

Clinicians are making treatment decisions.

Administrative staff are keeping the flow of information moving so care can be documented, measured, and billed.

The entire crew keeps the ship going, but navigation does not happen somewhere abstract in the system. It happens in specific moments, when someone is deciding what to do next.

For measurement to influence care, outcome monitoring has to be woven directly into those moments of decision making [1].

Clinicians are managing complex caseloads, documentation requirements, and the emotional labour of care. Administrative staff are simultaneously tracking missing questionnaires, sending reminders, reconciling incomplete records, and making sure the information required for billing and reporting actually moves through the system.

Meanwhile, the current continues to move beneath the ship.

When outcome data appears naturally within that workflow, it becomes part of the system that helps the clinic stay oriented.

But if clinicians or staff must step outside their workflow to search for reports, interpret unfamiliar dashboards, or reconcile incomplete data, those tools quickly fall out of use.

Not because anyone resists them.

Because when every station on the ship is already busy, instruments that are difficult to reach rarely guide the course.

For feedback systems to function as clinical tools, they must be:

• accessible
• interpretable
• available at the moment decisions are made

Otherwise, they remain present in the system but gradually shift roles.

Instead of helping the clinic stay oriented, they become flotsam and jetsam drifting through the system.

4. When the Instruments Stop Guiding the Course

If you look more closely at what is happening inside the ship, you begin to notice something interesting.

The instruments are still there.
The crew is still working.

But the ship has quietly begun to drift.

Evidence-based practices often weaken over time when responsibility for maintaining the system is unclear [5].

Implementation requires more than initial training.

It requires leadership attention, role clarity, and ongoing system management.

Without these structures, systems slowly lose coherence as staff turnover, workflows evolve, and priorities shift.

Nothing dramatic breaks.

The ship does not hit a storm.

But the currents keep moving beneath the surface.

And if no one is watching the navigation instruments closely enough to correct the course, the ship gradually drifts away from where it intended to go.

5. The Hidden Load on the Crew

Another obstacle on the map appears when we look more closely at the daily work happening across the ship.

Clinicians are already managing heavy caseloads, documentation requirements, and the emotional labour of providing care.

Administrative staff are managing appointment flow, billing requirements, missing paperwork, and regulatory reporting.

When measurement systems add additional administrative friction to those existing workflows, adoption declines [1].

In many cases the problem is not the measures themselves.

It is the administrative work surrounding them.

Assignment rules.
Incomplete data.
Report interpretation.
Unclear clinical relevance.

If these processes are not actively maintained, the system becomes another administrative demand competing with clinical work.

6. Who Is Watching the Navigation Instruments?

If you keep tracing the problem deeper into the workings of the ship, a quieter question begins to emerge.

The measures exist.
The platform exists.
The reports exist.

So why does the system still drift?

Because instruments alone do not guide a vessel.

Someone has to be responsible for navigation.

Measurement-based care produces signals about client progress and treatment outcomes. But those signals only matter if someone is responsible for noticing them.

In many clinics, no one is.

Staff change roles.
Workflows evolve.
Documentation requirements shift.

Over time the system slowly drifts.

Not because the crew is careless.

Because navigation is a job.

And in many measurement-based care implementations, that job was never clearly assigned.

7. Measurement-Based Care Does Not Fail in Storms

When people talk about measurement-based care failing, the explanation is often dramatic.

The platform was wrong.
The clinicians resisted.
The training was insufficient.

But most systems do not collapse in storms.

They drift.

The measures are still assigned.
The platform still runs.
Reports can still be generated.

But the instruments stop guiding decisions.

Measurement-based care does not work simply because measures exist.

It works when the information generated by the system remains connected to the decisions people make every day about treatment, documentation, and care.

Like any vessel at sea, clinics need more than instruments.

They need navigation.