On February 6, the US government issued a memo to Medicare Advantage insurers stating that AI cannot be used as the sole basis for denying claims. Machine-learning algorithms may assist in coverage determinations, but they cannot make the decisions on their own.
The memo comes in response to lawsuits against health insurers such as UnitedHealthcare and Humana, which have been accused of using AI to wrongly deny coverage. Patients allege that the AI model nH Predict has a 90% error rate, highlighting a dangerous aspect of the technology that is drawing increased scrutiny.
The Centers for Medicare & Medicaid Services expressed concern that algorithms could exacerbate discrimination and bias, and urged insurers to ensure their models comply with anti-discrimination requirements. Several states, including New York and California, have also warned insurance companies to verify the fairness of their algorithms.
In light of this memo and the warnings from state governments, patients who have suffered a fall and broken an arm, or sustained any other injury requiring medical attention, should not assume a denial of coverage is final: insurers may not deny a claim based solely on an AI decision. Patients who believe their insurance company has wrongfully denied coverage through biased or discriminatory practices should seek legal advice.