Tuesday, March 22, 2011

To improve drug safety, FDA should check its mixed-up files - The Boston Globe

FOLLOWING A number of high-profile drug safety scandals in recent years — such as Vioxx and Avandia — the pharmaceutical industry and the Food and Drug Administration are spending a great deal of energy trying to uncover, as early as possible, rare side effects that make a drug too risky. Sometimes, the agency asks companies to go back to the drawing board to redesign their studies or expand clinical trials to answer specific concerns about safety. But in some cases, the expanded data already exists in the FDA bureaucracy.

There is a vast amount of clinical information sitting in the agency’s archives that could be analyzed and mined to flag worrisome side effects. Multiple years’ worth of clinical trial data from studies of diabetes, high blood pressure, heart disease, Alzheimer’s disease, multiple sclerosis, cancer, and many other conditions could be used to answer important questions about drug safety and efficacy. But most of that data is unsearchable, existing in the form of paper submissions or unwieldy electronic files that can’t be downloaded and analyzed.

As recently as 2007, only 37 percent of drug applications were submitted to the FDA in electronic format. The figure has improved — it was around 70 percent last year — but that still means nearly a third of applications arrive as binders full of paper. In the era of the iPad, this is unfathomable.

Even among electronic submissions, a lot of the data arrives in forms that are not easily searchable because companies submit the information in custom formats. Patients’ gender, for example, might be categorized as "male" or "female," "M" or "F," or "1" or "0," depending on the study. The result, as described by the FDA itself, is an "informatics tower of Babel," in which an extremely valuable resource for public health — millions of data points from a multitude of clinical studies — is virtually impossible to use.
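
For readers inclined to think in code, a minimal sketch of the harmonization step such pooling requires is below; the study names, field labels, and code maps are hypothetical illustrations, not the FDA's actual submission standard.

# Minimal sketch (Python) of the "tower of Babel" problem described above.
# The per-study code maps and field names are hypothetical, not the FDA's
# actual standard; the point is that pooling requires one canonical coding.

# Each legacy study encodes the same "sex" field its own way.
STUDY_SEX_CODES = {
    "study_a": {"male": "M", "female": "F"},
    "study_b": {"M": "M", "F": "F"},
    "study_c": {"1": "M", "0": "F"},
}

def normalize_sex(study_id: str, raw_value: str) -> str:
    """Map a study-specific code to one canonical value ('M' or 'F')."""
    code_map = STUDY_SEX_CODES[study_id]
    try:
        return code_map[str(raw_value).strip()]
    except KeyError:
        raise ValueError(f"Unrecognized sex code {raw_value!r} in {study_id}")

# Records from different trials can be pooled only after this normalization.
records = [("study_a", "female"), ("study_b", "M"), ("study_c", "0")]
pooled = [(study, normalize_sex(study, value)) for study, value in records]
print(pooled)  # [('study_a', 'F'), ('study_b', 'M'), ('study_c', 'F')]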

Illustrating the absurdity of the situation, Jesse Goodman, chief scientist and deputy commissioner at the FDA, noted last November that someone researching what kind of yoga mat to buy has a wealth of information at their fingertips. “Your ability to access data on how to do that right away is now incredible,” he said. And yet, he added, “If you want to understand what has happened in billions of dollars of medical product development activity, your ability to access that data is quite limited.”

The FDA’s first priority should be to develop a standardized database into which data from past and present clinical trials can be deposited for further analysis. The agency has slowly begun to work toward this goal, and the effort is bearing fruit: last summer it completed a database containing information from 11 older clinical trials of Alzheimer’s disease, allowing those trials to be analyzed in bulk. That database is now open for mining by researchers around the world, who can look for patterns of safety and efficacy in a way that was previously impossible.

To make such databases useful, it is also imperative that industry transition to a standardized reporting format for all electronic data submissions. Barely half of submissions currently use the FDA’s preferred format. The agency must make that format mandatory going forward, especially if it is going to demand so much more data from companies.

To meet the public’s demand for greater drug safety oversight, asking industry to pay for more comprehensive and costly studies may be inevitable. But the FDA should be equally aggressive in ensuring it has done everything in its power to get many of those pressing safety questions answered with the data it already has in its possession.
