FDA launches AI tool for its medical researchers and investigators

The FDA has released Elsa, an AI tool designed to help FDA employees such as researchers and investigators work more efficiently.

The US Food and Drug Administration (FDA) has launched Elsa, a generative artificial intelligence (AI) tool designed to help employees – from scientific reviewers to investigators – work more efficiently. “This innovative tool modernizes agency functions and leverages AI capabilities to better serve the American people,” the FDA’s press release said.

“Following a very successful pilot program with FDA’s scientific reviewers, I set an aggressive timeline to scale AI agency-wide by June 30,” said FDA Commissioner Dr Marty Makary. “Today’s rollout of Elsa is ahead of schedule and under budget, thanks to the collaboration of our in-house experts across the centers.”

An LLM designed for efficiency

Elsa is an AI tool powered by a large language model (LLM), designed to assist with reading, writing, and summarizing.

It can summarize adverse events to support safety profile assessments, perform faster label comparisons, and generate code to help develop databases for nonclinical applications, among other things, the FDA said. The agency is already using Elsa to accelerate clinical protocol reviews, shorten the time needed for scientific evaluations, and identify high-priority inspection targets.

The FDA said Elsa was built within a high-security GovCloud environment and offers a secure platform for FDA employees to access internal documents while ensuring all information remains within the agency. “The models do not train on data submitted by regulated industry, safeguarding the sensitive research and data handled by FDA staff,” the FDA said in its press release.

As the tool matures, the agency plans to integrate AI into more of its processes, including data processing and additional generative-AI functions.

The FDA said it was able to launch Elsa ahead of schedule because leaders and technologists across the agency collaborated, “demonstrating the FDA’s ability to transform its operations through AI.”

Referring to the tool’s evolving nature, FDA Chief AI Officer Jeremy Walsh said: “As we learn how employees are using the tool, our development team will be able to add capabilities and grow with the needs of employees and the agency.”

FDA use of AI

The FDA published draft guidance earlier this year entitled Considerations for the Use of Artificial Intelligence to Support Regulatory Decision Making for Drug and Biological Products. This guidance provides recommendations to industry on the use of AI to produce information or data intended to support regulatory decision-making regarding the safety, effectiveness, or quality of drugs and biological products.

The Center for Drug Evaluation and Research’s (CDER’s) AI Council was established in 2024 to provide oversight, coordination, and consolidation of CDER activities around AI use. 

Acknowledging that changing external and federal environments for AI have brought new governance needs, the CDER AI Council said it will consolidate and continue the important work started by the CDER AI Steering Committee, AI Policy Working Group, and CDER AI Community of Practice. Each of these groups will help craft regulatory submissions incorporating AI and broaden the scope and impact of AI use in drug development, the FDA said.