The FDA issued new guidance on the regulation of mobile health software and clinical decision support (CDS) tools that use artificial intelligence (AI). The agency specified that it will oversee products that help doctors make decisions about treating serious or critical conditions but whose rationale doctors cannot independently evaluate.
The new guidelines aim to balance safety with innovation at a time of increased investment in AI and digital health. The 21st Century Cures Act, passed in 2016, gave the FDA authority to decide which products are subject to its review. Technology that falls under these regulations includes, for example, a CDS tool that identifies a hospitalized patient with type 1 diabetes as being at high risk of severe heart problems after surgery and then recommends a treatment without explaining its rationale. If such a tool provides an inappropriate recommendation, it could cause serious harm.
In the past two years, the agency has approved at least 33 products that use AI to detect diabetic retinopathy, stroke onset, wrist fractures, heart rhythm abnormalities, and other conditions.