There's an Andrew Pollack story in the Business Day section of today's New York Times that provides an excellent model of responsible news reporting in the drug-development business, while also illustrating how hard it can be to make a properly sourced and hedged story feel compulsively readable.
The story, "New Novartis Drug Shows Striking Efficacy in Treating Heart Failure," is one of a number of stories out today about a study being published in the New England Journal of Medicine and presented at the European Society of Cardiology congress in Barcelona, Spain. The study revealed long-awaited data about an experimental Novartis compound, code-named LCZ696, that was designed to lower the risk of hospitalization and death in a large group of patients with heart failure.
In people with this condition, the heart can't pump enough blood to the body's organs, leading to fatigue, shortness of breath, fluid retention, and other nasty problems. Two of my own grandparents died of heart failure, which affects an estimated 5 to 6 million Americans at any given time.
Pollack's article doesn't go into detail about how the Novartis drug was developed or how it works. But it's unusual for its careful attention to the quantitative details of the study.
It also calls attention to the fact that in the pharmaceutical industry today—when new blockbuster drugs come along so rarely—a relatively small increase in effectiveness can be hailed as a major victory.
What's so "striking" about LCZ696? According to Pollack's summary, a group of 8,400 heart failure patients in 47 countries were randomly assigned to receive either LCZ696 or enalapril, an ACE inhibitor that's a standard treatment for heart failure. During the 27 months of the trial, 26.5 percent of the patients receiving enalapril died or were hospitalized for worsening heart failure. Among the patients receiving LCZ696, the figure was 21.8 percent.
Here's the section of Pollack's story that I was most pleased to see—since these are the kinds of details you don't usually read in a story about a new drug. The difference in death or hospitalization rates—26.5 percent versus 21.8 percent—"represents a 20 percent relative reduction in risk using a statistical measure called the hazard ratio," Pollack writes. He continues: "There was a 20 percent relative risk reduction for cardiovascular death alone, and a 21 percent reduction for first hospitalization for heart failure. About 17 percent of patients getting LCZ696 died from any cause, compared to 19.8 percent of those in the control group, a relative risk reduction of 16 percent. About 32 patients had to be treated with LCZ696 to prevent one death."
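The arithmetic behind those figures is worth spelling out. Here's a rough sketch in Python, using only the rounded percentages quoted above. One caveat: the study's headline 20 percent figure comes from the hazard ratio, a time-to-event measure, and its number needed to treat of 32 presumably comes from unrounded data, so these back-of-envelope recalculations land near, but not exactly on, the published numbers.

```python
def absolute_risk_reduction(control_rate, treatment_rate):
    """Difference in event rates, both expressed as fractions (e.g. 0.265)."""
    return control_rate - treatment_rate

def relative_risk_reduction(control_rate, treatment_rate):
    """Absolute risk reduction as a fraction of the control group's rate."""
    return (control_rate - treatment_rate) / control_rate

def number_needed_to_treat(control_rate, treatment_rate):
    """How many patients must receive the drug to prevent one event."""
    return 1 / absolute_risk_reduction(control_rate, treatment_rate)

# Primary endpoint: death or hospitalization for worsening heart failure.
# Crude relative risk reduction from the rounded rates is about 17.7 percent;
# the published 20 percent figure is based on the hazard ratio instead.
print(relative_risk_reduction(0.265, 0.218))  # ≈ 0.177

# Death from any cause: 19.8 percent (control) vs. about 17 percent (LCZ696).
# These rounded figures give an NNT near 36; the published figure is 32.
print(number_needed_to_treat(0.198, 0.17))    # ≈ 35.7
```

The gap between the crude 17.7 percent and the reported 20 percent is a useful reminder that "relative risk reduction" can mean different things depending on the statistical machinery behind it.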
When you put it that way, the new drug doesn't sound all that revolutionary. Pollack's focus on relative risk reduction and the hazard ratio is entirely appropriate, but it's almost always missing in mainstream news reports on new drugs, perhaps because it's such a downer.
After all, the new drug didn't cure heart failure—not by a long shot. All you can really say is that it was a little better than enalapril at keeping heart failure patients from dying or being hospitalized.
The most interesting figure, to me, was that last one: researchers had to give the drug to 32 patients to save one life. So LCZ696 isn't exactly a miracle drug. Yes, it's more effective than enalapril, which was approved by the FDA almost 30 years ago. It's probably enough of an improvement that cardiologists will want to prescribe it, even at what's likely to be a high initial cost (assuming regulators at the FDA and regulatory agencies in other countries approve it). And that's probably sufficient to bring a big payday for Novartis—indeed, FierceBiotech's headline on the NEJM study says LCZ696 is kindling megablockbuster projections. “It really is an advance, and we haven’t had one in a long time because the bar really is high,” Marc Pfeffer, a cardiologist at Brigham & Women's Hospital in Boston, told Pollack.
What makes reporting on the pharmaceutical industry so tough is that advances sound a lot less world-changing when you frame them in terms of relative risk reduction or hazard ratio. It makes adjectives like "remarkable" and "astonishing," which both turned up in Reuters' story on the study, seem a lot less warranted, even when they're coming from experts. Yet it's the only responsible, moderate, measured, accurate way to talk about these kinds of studies.
A side effect is that stories can end up feeling a bit like Wall Street analysts' reports (it's no accident that Pollack's story turned up in the Business Day section). But there will be plenty of time later for more colorful stories about the Novartis drug's potential impact on real people. Meanwhile, Pollack's circumspect treatment is a credit to him and his paper, and a good lesson for other science and medical reporters.