Six ways to reduce workers’ comp costs using AI
My colleagues and I have been having some heated debates lately about how exactly AI can turn the tide on rising medical costs in workers’ compensation claims. All of us are tired of hearing about the promise of data science and AI without anyone discussing their methods or how they overcame the challenges of understanding their data. After some serious brainstorming, we have come up with a list of ways AI can actually help reduce medical costs.
This is definitely not a comprehensive list but hopefully is a good start. Comments are welcome.
1. Get to the bottom of what happened
If we don’t know where we are, we can’t know which way to go. The traditional way of classifying claims by nature of injury, cause of accident and body part is broken: the codes present in the data are often incorrect or out of date. Diagnosis codes in the billing data are a much more accurate way to classify the injury, but a workers’ comp claim can have hundreds of ICD codes attached to it. And now that the floodgates have opened with the transition to ICD-10, there are 80,000+ codes to navigate. AI can help here by using a variety of factors (cost, span of service, frequency of occurrence, etc.) and techniques (cluster analysis, classification algorithms) to:
- Sift through and analyze the bill lines to identify the primary diagnosis for each claim
- Detect comorbidities — a big driver of costs
- Group the identified primary diagnosis codes into diagnosis categories based on WC-specific costs and clinical judgment
The primary advantage of doing this is three-fold:
- Cost driver identification — the factors, including diagnosis, that contribute to the total costs of a book of business can be isolated and tracked
- Case-mix adjustment — you can adjust for differences in the mix of diagnoses handled to do a variety of apples-to-apples downstream analyses (provider selection, regional comparisons, etc.)
- Anomaly detection — more on this in the next point.
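To make the first step concrete, here is a minimal sketch of picking a primary diagnosis per claim by weighting paid dollars and frequency across bill lines. The 0.7/0.3 weights and the field names are illustrative assumptions, not a production scoring scheme, which would also use span of service and clinical rules.

```python
from collections import defaultdict

def primary_diagnosis(bill_lines):
    """Pick a primary ICD code per claim by combining each code's share of
    paid dollars with its share of bill-line occurrences (weights illustrative)."""
    paid = defaultdict(lambda: defaultdict(float))
    counts = defaultdict(lambda: defaultdict(int))
    for line in bill_lines:
        paid[line["claim_id"]][line["icd"]] += line["paid"]
        counts[line["claim_id"]][line["icd"]] += 1
    result = {}
    for claim, by_icd in paid.items():
        total_paid = sum(by_icd.values()) or 1.0
        total_lines = sum(counts[claim].values())
        def score(icd):
            # blend dollar share and frequency share into one score
            return 0.7 * by_icd[icd] / total_paid + 0.3 * counts[claim][icd] / total_lines
        result[claim] = max(by_icd, key=score)
    return result

lines = [
    {"claim_id": "C1", "icd": "S83.511A", "paid": 4200.0},  # knee sprain
    {"claim_id": "C1", "icd": "M54.5", "paid": 150.0},      # low back pain
    {"claim_id": "C1", "icd": "S83.511A", "paid": 900.0},
]
print(primary_diagnosis(lines))  # → {'C1': 'S83.511A'}
```

The identified primary codes would then feed a clustering or rules layer that rolls them up into WC-specific diagnosis categories.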
2. Detect anomalies in procedures and drugs administered
Utilization review has typically relied on fee schedules and, more recently, practice guidelines. These methods generally do not alert the payer when a procedure or drug is unusual for a particular type of claim. Using diagnosis grouping and a large enough dataset, we can now calculate the probability distribution of procedures and drugs administered within each diagnosis group. With the right analytical methods it is possible to identify outliers, in particular claims that may not have unusually high costs but stand out when compared to other claims with a similar diagnosis and injured worker characteristics. Given the drugs and procedures administered in a claim and the group the claim belongs to, you can check whether each drug or procedure is typical for that group or an anomaly worth highlighting.
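The check described above can be sketched as follows: build the empirical distribution of procedure codes per diagnosis group, then flag codes that are rare or unseen for a new claim’s group. The group names, procedure codes and the 5% rarity threshold are illustrative assumptions.

```python
from collections import Counter, defaultdict

def build_procedure_profiles(claims):
    """Empirical probability of each procedure code within a diagnosis group."""
    counts = defaultdict(Counter)
    for c in claims:
        counts[c["dx_group"]].update(c["procedures"])
    return {g: {p: n / sum(cnt.values()) for p, n in cnt.items()}
            for g, cnt in counts.items()}

def flag_unusual(claim, profiles, threshold=0.05):
    """Flag procedures that are rare (or never seen) for the claim's group."""
    profile = profiles.get(claim["dx_group"], {})
    return [p for p in claim["procedures"] if profile.get(p, 0.0) < threshold]

history = [
    {"dx_group": "knee_sprain", "procedures": ["97110", "97110", "73721"]},
    {"dx_group": "knee_sprain", "procedures": ["97110", "97140"]},
]
profiles = build_procedure_profiles(history)
new_claim = {"dx_group": "knee_sprain", "procedures": ["97110", "J0585"]}
print(flag_unusual(new_claim, profiles))  # → ['J0585'] (never seen in this group)
```

In practice the history would span millions of bill lines, and the threshold would be tuned per group so that common-but-infrequent care is not over-flagged.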
3. Select the “optimal” provider for a claim
Many payers contract with large provider networks and then realize that provider performance varies and that outcomes need to be analyzed to identify the top performers. Claims adjusters may have opinions about which providers are good and which are not, but an analysis of the data will often reveal qualities of the providers that are not apparent to the adjusters. Accurate predictive models that rank providers and identify top performers can deliver striking results: directing injured workers to the top-performing providers in their area can reduce total claim costs by 15–45%. A few things to watch out for:
- Ensure that the predictive model takes a comprehensive view of costs, not medical costs alone; otherwise it becomes a game of whack-a-mole where squeezing down medical costs leads to an explosion of other costs.
- Too often, providers are ranked by where they fall in a distribution (e.g., top quartile). This approach doesn’t fully work, because the difference between one bucket and the next cannot be evaluated statistically. It’s important to segment providers based on statistically significant differences in performance.
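One way to segment on significance rather than quantiles is to sort providers by mean risk-adjusted cost and open a new tier only when a provider differs significantly from the current tier’s anchor. This sketch uses Welch’s t statistic with a rough |t| > 2 cutoff; the cutoff and the idea of anchoring on the tier’s first provider are simplifying assumptions (a production approach would use proper multiple-comparison corrections).

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two samples of risk-adjusted claim costs."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def segment_providers(costs_by_provider, t_cutoff=2.0):
    """Group providers into tiers separated by statistically significant gaps."""
    ranked = sorted(costs_by_provider.items(), key=lambda kv: sum(kv[1]) / len(kv[1]))
    tiers, anchor = [], None
    for name, costs in ranked:
        if anchor is None or abs(welch_t(anchor, costs)) > t_cutoff:
            tiers.append([name])   # significant gap: start a new tier
            anchor = costs
        else:
            tiers[-1].append(name)  # indistinguishable from tier anchor

    return tiers

costs = {
    "Clinic A": [10, 11, 12, 10, 11],
    "Clinic B": [10.5, 11.5, 11, 10, 12],
    "Clinic C": [30, 31, 29, 32, 30],
}
print(segment_providers(costs))  # → [['Clinic A', 'Clinic B'], ['Clinic C']]
```

Note that A and B land in the same tier because their cost difference is statistical noise, while a quartile cut could easily have split them.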
4. Avoid unnecessary work
70–80% of workers’ comp claims are simple and don’t require specialist help (e.g., referrals or nurse case management). It is a cardinal sin to unnecessarily increase the cost of these minor claims with expensive cost containment methods that should be directed only where they will produce a clinical or financial benefit. Machine learning models, such as decision trees, can score the complexity of each claim and direct only the claims that will benefit to the medical experts.
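A hand-written toy tree shows the shape of such triage logic. The rules and fields here are invented for illustration and are not clinically validated; a real system would learn the splits from historical outcomes (e.g., with a trained decision-tree or gradient-boosting classifier).

```python
def triage(claim):
    """Toy decision tree routing a claim to fast-track handling or to a
    nurse case manager. Thresholds and fields are illustrative only."""
    if claim["comorbidities"] > 0:          # comorbidities drive cost and duration
        return "refer_to_nurse"
    if claim["dx_group"] in {"spine", "head"}:  # historically complex groups
        return "refer_to_nurse"
    if claim["lost_time_days"] > 7:         # extended lost time signals complexity
        return "refer_to_nurse"
    return "fast_track"

print(triage({"comorbidities": 0, "dx_group": "knee_sprain", "lost_time_days": 2}))
# → fast_track
```

The point is not the specific rules but that a scored, automated gate keeps expensive interventions away from the 70–80% of claims that never needed them.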
5. Enable early intervention in catastrophic claims
For claims that are deemed high risk, the best way to reduce costs is early intervention. Predictive models can calculate the expected cost trajectory of a claim and then detect when the claim is moving out of the expected range. Models can send alerts as soon as a deviation from the norm occurs, so the operations team can take action.
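The deviation check reduces to comparing cumulative paid costs against an expected trajectory for the claim’s diagnosis group. The weekly granularity, the expected curve and the ±25% band below are illustrative assumptions; a real model would derive the band from the group’s cost variance.

```python
def alert_on_deviation(weekly_paid, expected_cumulative, band=0.25):
    """Return the first week a claim's cumulative cost leaves the expected
    band (±25% of the group's expected trajectory), or None if it never does."""
    cum = 0.0
    for week, (paid, expected) in enumerate(zip(weekly_paid, expected_cumulative)):
        cum += paid
        if expected and abs(cum - expected) / expected > band:
            return week  # trigger an alert for the operations team
    return None

expected = [100, 200, 300, 400]          # expected cumulative cost by week
print(alert_on_deviation([100, 100, 100, 300], expected))  # → 3
print(alert_on_deviation([100, 100, 100, 100], expected))  # → None
```

Firing the alert at the first out-of-band week, rather than at claim review time, is what makes the intervention early.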
6. Prevent litigation
All things being equal, the presence of an applicant attorney in a claim can push up all costs: indemnity, medical and expenses. AI models can detect the likelihood of litigation, and we have been able to predict future litigation status with a high level of accuracy (92%+) using day-one data. Combining this detection algorithm with a tight process for handling “at-risk” claims can dramatically reduce litigation: one of our customers saw applicant attorney involvement drop from 13% of all claims to 4%. Claims at risk of becoming high cost trigger more involvement from senior claims handlers, helping contain the damage before the claim escalates to legal action.
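A day-one litigation score is typically a classifier over intake features. This sketch uses a logistic form with hand-set weights; the features, weights and bias are invented for illustration, and a production model would learn them from historical claims (and would need far richer features to approach the accuracy cited above).

```python
import math

# Hypothetical day-one features and weights, for illustration only.
WEIGHTS = {
    "reporting_lag_days": 0.08,      # late reporting correlates with disputes
    "prior_claims": 0.4,             # claimant's prior claim count
    "soft_tissue_injury": 0.9,       # 1 if the injury is soft tissue, else 0
    "employer_disputes_claim": 1.6,  # 1 if the employer contests, else 0
}
BIAS = -3.0

def litigation_risk(features):
    """Logistic score in (0, 1) for the chance an applicant attorney
    becomes involved, given day-one intake features."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

risk = litigation_risk({"reporting_lag_days": 10, "prior_claims": 1,
                        "soft_tissue_injury": 1, "employer_disputes_claim": 1})
```

Claims whose score crosses a tuned threshold would be routed to the senior-handler process described above.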