Drug Problems: Did Bad Blood Tests Turn a Clinical Trial into a Mistrial?
March 15, 2016
Two recent studies dismiss concerns that a clinical trial of the blood thinner Xarelto might have been compromised by unreliable blood-testing devices.
However, the first of those studies, by the group that led the original clinical trial, sidestepped potential evidence of trouble: the authors did not include inconsistencies found in data gathered during the trial.
The second, by a European regulatory agency, described the inconsistencies: When readings taken by the suspect testing devices—the measurements used in the trial—were compared to parallel measurements taken by a central laboratory, “discrepancies of potential clinical relevance were rather frequently observed,” a report by the European Medicines Agency (EMA) said.
The discrepancies occurred “approximately in 35% of the estimations,” the report said.
In the end, the EMA said the discrepancies did not change its assessment of Xarelto’s risks and benefits.
The questions behind the two studies basically boil down to this: Was a clinical trial that spanned almost four years and involved more than 14,000 subjects in 45 countries fatally flawed? If a key testing device used in the trial delivered unreliable readings, can the trial results and conclusions be trusted? And, by reanalyzing data gathered during the trial, is it possible to determine with any precision what the results would have shown if the blood tests were done differently?
Some observers say the post-mortems haven’t resolved concerns about the Xarelto trial, and they say it may be impossible to do so at this point.
“It is too late to get any kind of definitive answer to the question whether or not this device and its poor functioning interfere with the validity” of the clinical trial results, Sidney Wolfe, a doctor and consumer advocate at Public Citizen’s Health Research Group, said. “The kind of information necessary to definitively answer that question . . . aren’t available right now.”
“They should not have used a device that was this faulty in this trial,” Wolfe added.
The New York Times quoted Dr. Steven Nissen, a cardiologist at the Cleveland Clinic who served on a Food and Drug Administration advisory panel that endorsed Xarelto in 2011. Nissen, one of two advisory committee members who voted against the drug, expressed doubt that any after-the-fact analysis would give doctors and patients answers, the Times said.
“Given the fact that the device was inaccurate, there is no way anybody can tell you what would have happened in the trial,” Nissen told the newspaper.
If Wolfe and Nissen are right, the FDA is in an awkward spot.
The two recent studies reassessed a clinical trial called ROCKET AF, on the basis of which the FDA and the European Union more than four years ago approved Xarelto for use in patients with a potentially fatal or stroke-inducing heart condition called atrial fibrillation. The drug, which has also been approved for other uses, has been prescribed for millions of people and generates billions of dollars of sales annually for Johnson & Johnson and Bayer.
The two companies sponsored the clinical trial, and the Duke Clinical Research Institute (DCRI) coordinated it for them. The Duke-based committee that designed and oversaw the trial was co-chaired by DCRI founder and former Duke University researcher Robert Califf, who in 2011 helped make the case for the FDA to approve Xarelto. In the past, Dr. Califf was paid by subsidiaries of Johnson & Johnson and Bayer, according to financial disclosures on the DCRI website.
Today, Califf heads the FDA.
(Citing Califf’s oversight of the Xarelto trial and his ties to the pharmaceutical industry, the Project On Government Oversight’s advocacy team urged the Senate not to confirm him.)
Xarelto has generated a relatively high number of reports to the FDA about adverse events in patients taking the drug, according to the Institute for Safe Medication Practices, which tracks such reports. The companies that market the drug have been defending themselves against product liability suits filed by patients and their families.
In the first of the reassessments, the Duke-based executive committee of ROCKET AF was essentially reviewing its own work. In the second, the European Medicines Agency was examining issues it apparently overlooked when it reviewed the same clinical trial in 2011 and endorsed Xarelto for patients with atrial fibrillation.
The FDA has said it is reviewing data and has not announced conclusions.
Like its European counterpart, the FDA could have studied the performance of the testing devices used in ROCKET AF and the discrepant results years ago, before approving Xarelto. Asked repeatedly about the matter since November, FDA spokeswoman Katie Conover has declined to say what efforts, if any, the agency made to do that.
Questions about the Xarelto clinical trial came to light last fall, when POGO reported that ROCKET AF relied on a type of blood-testing device with a history of delivering false readings. Before the trial began, the device was the subject of two FDA warning letters. In December 2014, years after the trial ended, it became the subject of an FDA recall notice.
The trial compared Xarelto to warfarin, a generic blood thinner in use since the 1950s. Trial subjects on warfarin were supposed to undergo blood tests at least every four weeks or so to determine if their clotting rate was in the desired range, according to a briefing paper Johnson & Johnson Pharmaceutical Research and Development prepared for an FDA advisory committee. If their blood was clotting too quickly—in other words, if it was too thick—their dosing should have been increased to prevent clots from forming and causing strokes and other blockages. If their blood was clotting too slowly—if it was too thin—their dosing should have been reduced to avoid bleeding, including bleeding in the brain.
According to the 2014 recall notice, the testing devices had a tendency to produce falsely low readings. In theory, those could have prompted doctors to give patients too much warfarin, making them more susceptible to bleeding and skewing the comparison of the old drug to the experimental drug.
Under the trial protocol, the goal in ROCKET AF was to keep warfarin subjects’ clotting rates, known as INRs, in the not-too-fast, not-too-slow range of 2 to 3.
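The dosing logic described above amounts to a simple decision rule keyed to the INR target range. A minimal sketch (the function name and its parameters are illustrative; they are not taken from the trial protocol):

```python
def warfarin_dose_adjustment(inr, low=2.0, high=3.0):
    """Illustrative decision rule for adjusting warfarin based on an INR reading.

    An INR below the target range means blood is clotting too quickly (too
    thick); above the range, too slowly (too thin). ROCKET AF's target range
    was 2 to 3.
    """
    if inr < low:
        return "increase dose"  # blood too thick: more warfarin to prevent clots
    if inr > high:
        return "decrease dose"  # blood too thin: less warfarin to avoid bleeding
    return "no change"          # INR within the target range
```

The hazard of a falsely low reading follows directly from this rule: a device reporting, say, 2 when the true INR is 10 would leave the dose unchanged, or even raise it, when it should be cut.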
In a recent court filing in litigation against the drug companies, plaintiffs allege that, during the clinical trial, doctors raised concerns that testing devices were providing inaccurate readings. For example, according to another filing by plaintiffs, one patient’s INR was measured by one of the devices as 2 but later found by a different test to be 10, well outside the acceptable range. In that case, the trial site handling the patient was instructed to turn in the device for replacement, the court filing said.
Defendants in the case have accused plaintiffs of cherry-picking information from confidential documents.
Duke researcher Manesh R. Patel and co-authors presented the ROCKET AF executive committee’s reassessment of the clinical trial in a letter to the editor of The New England Journal of Medicine.
“These results are consistent with the overall trial findings and indicate that possible malfunction” of the testing devices “did not have any significant clinical effect on the primary efficacy and safety outcomes in the trial,” they wrote.
Their reassessment did not mention a set of data that could have provided additional perspective. At two points in the trial, weeks 12 and 24, blood analyses were performed by laboratory tests as well as via the suspect devices, making it possible to compare the readings, the European report showed.
Instead of making such comparisons, the executive committee focused on the fact that the recall notice said the testing device could provide inaccurate results for patients with specific medical conditions. The executive committee studied how those patients fared in the clinical trial compared to other patients.
In response to emails seeking comment, Patel answered only one of POGO’s questions, saying that Califf was not involved in the ROCKET AF executive committee’s reanalysis of the trial. One of Patel’s co-authors, Keith A.A. Fox, declined to comment. “I regret that, in view of the legal challenges, I will not be able to help,” Fox, a professor of cardiology at the University of Edinburgh, said by email.
Califf did not respond to questions sent by email.
Conover, the FDA spokeswoman, has said by email that Califf “is recused from” the FDA’s review.
The second reassessment, the one by European regulator EMA, relies heavily on analyses it requested from Bayer, one of the sponsors of the clinical trial and marketers of Xarelto. Those analyses were based on an assumption that the problem was limited to patients with the listed conditions, the EMA report said. The EMA questioned the “relevance” of the approach Bayer had taken and asked the company to provide additional analyses. Specifically, the EMA asked Bayer to compare measurements taken by device and by lab at weeks 12 and 24.
Those measurements often differed, according to the EMA report.
In a comparison of measurements taken at week 12, devices indicated that 2,801 INR readings were in the desired range. However, lab tests indicated that 965 of the 2,801, or 34.5 percent of them, were too high.
Similarly, in the week 12 comparison, devices indicated that 1,944 INR readings were too low, but lab tests indicated that 705 of the 1,944, or 36.3 percent of those, were too high or in the desired range.
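The percentages in the two week-12 comparisons follow directly from the counts reported above; a quick check of the arithmetic:

```python
# Week-12 comparison counts as reported in the EMA report (cited above)
in_range_by_device = 2801  # readings the device placed in the 2-to-3 target range
too_high_by_lab = 965      # of those, readings the lab found were too high

too_low_by_device = 1944   # readings the device placed below the target range
higher_by_lab = 705        # of those, readings the lab found too high or in range

print(round(too_high_by_lab / in_range_by_device * 100, 1))  # 34.5
print(round(higher_by_lab / too_low_by_device * 100, 1))     # 36.3
```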
The EMA report said the data could raise “some concern” about “the possibilities of inappropriate dosing.”
Looking at the numbers a variety of ways, the EMA described a muddled picture. “The pattern observed could support a conclusion that inaccurate device estimations can have led to a somewhat increased bleeding tendency among the warfarin treated patients, however it could also theoretically suggest an advantage for treatment with rivaroxaban in such situations,” the EMA said, referring to Xarelto by its generic name.
The EMA noted differences in the comparisons captured at weeks 12 and 24. How much those two snapshots in time reveal about the years-long trial as a whole is unknown and perhaps unknowable.
In a news release, the EMA said it concluded that “a defect with the international normalised ratio (INR) device used in the ROCKET study does not change its conclusions on the overall safety or benefit-risk balance of Xarelto.”
“EMA’s Committee for Medicinal Products for Human Use (CHMP) concluded that any incorrect measurements obtained with the defective device would have had only a marginal effect on the study results,” the European regulator said.
Further analyses “are not expected to provide additional information of substantial value,” the regulator said.
Frits R. Rosendaal, who heads a department that specializes in the blood coagulation system at Leiden University Medical Center in the Netherlands, told POGO by email that data presented in the EMA report looked “worrisome.” Rosendaal, who developed a quantitative method used in the study of blood thinning, said “a lot more” calculations would be needed “to reach a reassuring conclusion” about the reliability of the ROCKET AF results.
Yale University medical professor Harlan Krumholz told POGO that the two recent studies have not settled whether defective testing devices affected the results. He has called on Bayer to release the trial data so others can examine it, arguing that crowdsourcing could help.
However, based on a statement from Bayer, the crowd seems unlikely to get that chance.
Bayer has said that, in the interest of transparency, it has committed to data-sharing principles—for medicines approved since January 1, 2014.
Because the FDA and European Union approved Xarelto before that date, the ROCKET AF data “are not in the current scope of clinical trial data sharing,” Bayer spokeswoman Astrid Kranz said by email.
David Hilzenrath is the Chief Investigative Reporter for the Project On Government Oversight.