Article by Lisa A. Eramo, MA. This article was originally published on the Journal of AHIMA website on June 1, 2017 and is republished here with permission.
In the months leading up to the ICD-10 go-live, many hospitals implemented computer-assisted coding (CAC) in the hopes that it would offset anticipated productivity losses and boost coding accuracy. Now that the industry has passed the one-year ICD-10 milestone, an important question remains: Did CAC fulfill its promise? If so, are hospitals continuing to see a return on investment (ROI)?
“Our experience has been very good. CAC definitely gives you an advantage by scanning the entire chart for you,” says Monica Pinette, RHIA, CCS, CDIP, CPC, director of HIM revenue cycle at Saint Francis Hospital and Medical Center in Hartford, CT.
Saint Francis went live with CAC shortly before October 1, 2015. Within six months, coders had returned to—and even exceeded—baseline productivity, all while maintaining a 95 percent accuracy rate. As a result, Pinette increased hourly productivity standards as follows:
- 5 inpatient records (previously 2.14)
- 5 outpatient surgeries (previously 3)
- 5 observation cases (previously 3)
- 100 labs (previously 50)
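Taken together, those changes roughly double expected throughput. A quick back-of-the-envelope calculation, using only the before-and-after Saint Francis standards listed above, shows the size of each increase:

```python
# Illustrative arithmetic only: before/after hourly coding standards
# taken from the Saint Francis figures above.
standards = {
    "inpatient records":    (2.14, 5),
    "outpatient surgeries": (3, 5),
    "observation cases":    (3, 5),
    "labs":                 (50, 100),
}

for record_type, (before, after) in standards.items():
    gain = (after - before) / before * 100
    print(f"{record_type}: {before} -> {after} per hour (+{gain:.0f}%)")
```

The inpatient standard alone represents an increase of about 134 percent.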
Seth Jeremy Katz, MPH, RHIA, associate administrator of information management at Truman Medical Centers in Kansas City, MO, says he has also had a positive experience with CAC. Truman Medical Centers submitted an initial CAC request for proposals (RFP) in September 2011 and didn’t go live until August 2014.
An unexpected benefit of CAC has been the metadata that helps auditors and educators understand each coder’s thought process, says Katz. “It gives you a different level of insight and helps you understand what type of education or support they might need,” he adds.
Not every hospital, however, has been so lucky with CAC. “I’ve talked to hospitals that went through the whole RFP and implementation, and it went horribly. They’ve had to rip it out,” says Katz. “There are times when either the vendor over-promised or the hospital didn’t prepare well enough, and it has gone badly. All of that up-front work is so vital.”
In particular, hospitals using CAC must ensure that interfaces work properly so the technology reads all documents relevant for coding. Documents must also comply with a consistent format dictated by the CAC vendor. Something as simple as a colon rather than a semicolon after a header on a template can cause the CAC to skip a document.
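To make the colon-versus-semicolon point concrete, here is a minimal sketch of the kind of strict, template-driven header matching such an engine might perform. The pattern and the sample notes are hypothetical illustrations, not any vendor's actual parsing rules:

```python
import re

# Hypothetical rule: the engine only recognizes section headers that
# end in a colon. A template change as small as a semicolon breaks it.
SECTION_HEADER = re.compile(r"^(?P<name>[A-Za-z ]+):\s*$", re.MULTILINE)

def readable_sections(document: str) -> list[str]:
    """Return the section headers the engine would recognize."""
    return [m.group("name") for m in SECTION_HEADER.finditer(document)]

good = "Discharge Diagnosis:\nAcute systolic heart failure\n"
bad  = "Discharge Diagnosis;\nAcute systolic heart failure\n"  # semicolon

print(readable_sections(good))  # ['Discharge Diagnosis']
print(readable_sections(bad))   # [] -- the engine skips the document
```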
Setting Realistic Expectations
CAC isn’t a “one-time and done” implementation, says Kelly Canter, BA, RHIT, CCS, coding development manager of CAC at Optum360. “Even after 15 years of ICD-9, we were still doing continuous improvement—always refining and adapting,” she adds.
Katz agrees. “If someone thinks they’re going to purchase CAC, install it, and magically have perfect coding and productivity through the roof … then they’re probably going to be in for a shock,” he adds. “It’s a piece of software like any other. How you build it, train staff on it, and use it—that’s going to determine how successful you’ll be.”
Experts agree that hospitals with successful CAC implementations have these four themes in common:
1. Quality clinical documentation improvement (CDI) programs.
CDI specialists obtain specificity on the front end so the record contains all the information the CAC engine needs to assign the correct—and most specific—code, says Garri L. Garrison, vice president of performance improvement at 3M.
2. Ongoing communication with the CAC vendor.
Hospitals should notify their CAC vendor of any new document types or changes to existing documents, including template changes. These changes can alter the way in which the CAC engine reads the document—or whether it even reads it at all, says Garrison.
3. Workflow changes to accommodate the CAC software.
Best practice is to use the auto-suggested codes and review all supporting evidence to validate the codes, says Garrison. “Once you adapt to this workflow, you can really get some good productivity gains,” she says. Coding professionals who manually review the entire record in the same way they did before CAC aren’t taking full advantage of the technology, she adds.
4. Internal monitoring.
Hospitals using CAC should monitor the following metrics, says Garrison (a brief sketch of how two of them might be computed follows the list):
- Case-mix index (CMI). According to a 3M blind comparison of randomly selected 3M clients and non-clients, hospitals using CAC saw a positive CMI shift of between 0.0102 and 0.2069, resulting in a potential financial impact in the thousands—and even millions—of dollars. “The 3M clients in the study did see a CMI increase within the range reported, but whether a hospital sees the same results all depends on the quality of coding before CAC is implemented,” says Garrison.
- Coder productivity and accuracy. Do these metrics remain consistent—or even improve—over time?
- Discharge-not-final-billed (DNFB). Does the DNFB remain consistent—or even improve—over time?
- Volume and type of accepted codes. How often do coders accept the auto-suggested codes versus coding the record manually?
- Volume and type of unspecified codes. How often does the CAC pick up an unspecified code? Though this isn’t a CAC software issue per se, it’s something hospitals should monitor and address through their CDI program.
- Volume and types of deleted and rejected codes. For example, does the CAC frequently pull active diagnoses from the past or family history sections or from a nurse’s note? If so, the engine may need tuning to read the appropriate documents and/or segments of a document. Or are coders deleting codes inappropriately (e.g., reporting two codes instead of a CAC-assigned combination code)? In this case, coder education may be required.
- Volume of documents flowing into the CAC. Hospitals shouldn’t see a sudden change or sharp decline, for example, of standard document types.
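As a small illustration of the monitoring above, the sketch below computes an auto-suggested-code acceptance rate and a simple before-and-after CMI shift. The audit records, DRG weights, and code actions are invented for the example; they do not come from 3M or any vendor's reporting:

```python
# Hypothetical audit trail: each entry is one code the CAC engine
# suggested and what the coder did with it. All values are invented.
suggested_codes = [
    {"code": "I50.21", "action": "accepted"},
    {"code": "E11.9",  "action": "accepted"},
    {"code": "Z80.0",  "action": "deleted"},   # e.g., pulled from family history
    {"code": "N18.30", "action": "edited"},    # coder refined the code
]

accepted = sum(1 for c in suggested_codes if c["action"] == "accepted")
acceptance_rate = accepted / len(suggested_codes)
print(f"Auto-suggested code acceptance rate: {acceptance_rate:.0%}")

# CMI is the average DRG relative weight across discharges; comparing
# it before and after CAC shows the kind of shift 3M reported.
pre_cac_weights  = [1.0, 1.2, 0.9, 2.3]   # invented DRG relative weights
post_cac_weights = [1.1, 1.3, 0.9, 2.4]

cmi_pre  = sum(pre_cac_weights) / len(pre_cac_weights)
cmi_post = sum(post_cac_weights) / len(post_cac_weights)
print(f"CMI shift: {cmi_post - cmi_pre:+.4f}")
```

In practice these figures would come from the CAC system’s own audit reporting, but the calculations themselves are this simple.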
Looking Ahead
If your hospital hasn’t yet embarked on the CAC journey, it may want to consider doing so. That’s because CAC engines are undergoing continual improvements, including the ability to help with manual abstraction of quality indicators and other regulatory-based reporting requirements, says Garrison. “Now that we have basically crossed the chasm to ICD-10, you’re going to see these engines do a lot more,” she says.
Thinking about taking the next steps? Following are four questions to consider:
- Do any of the CAC vendor’s clients use the same EHR as your hospital? If so, can you talk to them? “It’s not about what a salesperson says during a sales demo,” says Katz. “You need to talk to references and the actual product team.”
- What are the potential limitations of the CAC software? For example, at Truman Medical Centers, coding professionals must manually code all EKGs because their CAC software only reads documents that include signatures with provider credentials—something their third-party EKG system can’t provide (see the sketch after this list).
- What types of internal resources are available to help answer questions, resolve interface challenges, and fine-tune the software?
- How does the CAC vendor handle scanned documents? How easily can the CAC engine read this information?
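To illustrate the Truman Medical Centers limitation mentioned above, here is a hypothetical version of an intake rule that only passes documents signed with provider credentials. The regular expression and sample notes are invented for the example, not taken from any actual CAC product:

```python
import re

# Hypothetical intake rule: only process documents whose signature
# line includes provider credentials (MD, DO, NP, PA).
CREDENTIALED_SIGNATURE = re.compile(r"Signed by .+,\s*(MD|DO|NP|PA)\b")

def cac_will_read(document_text: str) -> bool:
    """Return True if the document passes the signature-credential check."""
    return bool(CREDENTIALED_SIGNATURE.search(document_text))

note = "Assessment and plan...\nSigned by Jane Smith, MD"
ekg  = "Sinus rhythm, normal intervals.\nSigned by Jane Smith"  # no credentials

print(cac_will_read(note))  # True
print(cac_will_read(ekg))   # False -- the record falls to manual coding
```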
The bottom line is that CAC is only as good as the documentation on which it’s based, says Canter. It also requires savvy coders who can validate the auto-suggested codes. “CAC is computer-assisted coding. It will never outperform a human,” she adds.
One comment
Having been a Product Manager for NLP and CAC, I couldn’t agree more that consistency in document format is key to getting the most out of your CAC product. Assessing your documents (and documentation quality) before implementing CAC is vital. Also, you should consider your CAC vendor an extension of your department. When creating new templates for your EMR, consult with your vendor; they should provide you with formatting recommendations. And work with them to improve the CAC output. It is a continuous process, but you will reach your return on investment and reap the benefits that CAC has to offer.