
Health Equity Begins with Increasing Diversity in Clinical Research: AI Can Help


Two children had the same clinical problem: a vessel in their brain malfunctioned. The first child had an aneurysm that ruptured in his brain, causing a hemorrhagic stroke (blood on the brain). The second was unable to talk, respond, or understand questions due to a vessel that spontaneously tore inside his brain (intracranial dissection) causing an ischemic stroke. Both were hospitalized and in a coma.

Physicians were consulted on the patients’ behalf; two very different medical plans resulted. One patient received thorough medical testing and work-up. The other did not. One important difference in their cases? One child was White, and the other was Black.

Although the US has one of the most advanced healthcare systems in the world, stories like these are the norm. People of color, LGBTQ+ patients, and those in underserved urban and rural communities are prevented from accessing, trusting in, and receiving optimal clinical care. When we fail to address the biases that lead to these disparities, the consequences extend beyond individual care: bias also undermines the development of drugs optimized for all patients. Eliminating bias in clinical research will take a comprehensive approach, and we need to enlist every tool at our disposal, including AI and machine learning, to do it.

Most clinical trial sponsors aren’t doing enough to dismantle biases that permeate drug development, even though enrolling more diverse patients in trials results in drugs with better tolerability and efficacy, and more predictable effect for wider patient populations. Bias has potentially dangerous consequences; the commonly used chemotherapy drug 5-fluorouracil was found to have adverse effects, including hematological toxicities, at higher rates among African American patients than White patients, a fact not revealed in the program’s clinical trials, which had limited patient diversity.

Typical research site-selection practices create a substantial barrier to diverse participation. Industry sponsors repeatedly conduct research at the same large sites with the same investigators, most of whom do not provide care to underserved populations and are often not easily accessible to diverse communities. These sites and investigators tend to be in communities where patients are better insured and generally healthier; 50% of clinical trials are conducted in only 2% of zip codes, with research being conducted among patients who are largely White, affluent, and male.

In addition, federally sponsored research is conducted at major medical centers that do not often engage community-based clinicians. As a result, these physicians are less likely to be aware of clinical research, even when trials are being conducted at nearby sites, and their patients are most often shut out of trials.

It’s critical that we address these examples of institutionalized bias and racism in how we approach clinical trials for drugs and devices. While it’s a daunting task, the path forward is surprisingly clear, and AI can be instrumental in the next step of overcoming unconscious biases inherent in clinical trial processes.

Making trials easier to participate in is critical. Instead of defaulting to previously used sites, sponsors can use AI programs such as Trial Pathfinder, developed by researchers at Stanford to address diversity in oncology trials, and other platforms that are designed to locate patients with a particular disease and identify convenient sites based on adjacency to patient communities, access to transit, and other accessibility factors. Once patients and providers are identified, they can then be asked to participate.

Distance, travel and childcare expenses, and demands on time can make it impossible for many patients to take part in trials; to address this, the non-profit Digital Medicine Society has launched an AI-based initiative designed to increase diversity in trial participation. Participation can also be enabled through AI-powered remote and decentralized trial approaches that allow patients to take part from home or their own doctor's office.

Despite AI’s promise, those who conduct clinical trials with the aid of AI must take steps to ensure the quality and accessibility of their data. AI platforms must be intentionally trained toward inclusiveness by researchers, developers, engineers, scientists, and investigators, who must employ a critical eye to datasets, processes, and platform features. When investigators don’t recognize and address biases in datasets, this leads to wrong assumptions and misinformed or insufficient safety profiles for drugs in development. The algorithm that misidentified patients who could benefit from high-risk care management programs is a cautionary example: it was trained on parameters introduced by researchers who didn’t take patients’ race, geography, or culture into account.

We all win when diverse patients are included, not only in clinical trials but also in the care continuum. Stakeholders in the clinical research ecosystem — the biomedical industry, policymakers, government agencies, contract research organizations, and patient advocates — must now take steps to support the development and long-term sustainability of an infrastructure that unites clinical research with clinical care.

This can be achieved by leveraging AI to identify investigators and patients in communities not typically represented in research, and talking about clinical trials as care options. Those using AI must also be critical about the quality and equity of the datasets and processes used to drive computer-assisted decision making. These steps enable researchers to be intentional about dismantling biases that permeate the therapy development process, working with communities that have the greatest need, and creating drugs that are safer and more effective for the patients who need them most.

