A couple of recent reports may be of interest to the translational research crowd. Nature Reviews Drug Discovery reported "Trends in Disease Focus of Drug Development" using data from Phase II-IV studies registered with ClinicalTrials.gov from Sept 2005 through Sept 2007 (as a reminder, the ICMJE requires trial registration as a condition of publication in journals abiding by its policies).
Six therapeutic areas accounted for two-thirds of all protocols: oncology, central nervous system disorders, cardiology, infectious diseases, endocrinology, and respiratory diseases. Of these six, respiratory diseases, endocrinology, and oncology showed growth in the number of trials registered during this period. Among all disease categories, relative growth was greatest for rheumatology (157.6%).
On this side of the pond, Science published a Policy Forum article on the "Life Cycle of Translational Research for Medical Interventions," confirming the time and effort needed for drug discovery efforts like those noted above to translate into accepted, effective interventions. Indeed, the authors start off with the stark observation that:
Of 101 very promising claims of new discoveries with clear clinical potential that were made in major basic science journals between 1979 and 1983, only five resulted in interventions with licensed clinical use by 2003 and only one had extensive clinical use.
Matched by a thoughtful conclusion:
Successful translation is demanding and takes a lot of effort and time even under the best circumstances; making unrealistic promises for quick discoveries and cures may damage the credibility of science in the eyes of the public.
The authors sought out, for each specific intervention, the first highly cited clinical study showing its effectiveness (i.e., a study that received over 1,000 citations in the literature in 1990-2004, per the Web of Science). Of the 49 articles in this category, 32 reported the investigated intervention as effective and were selected for further examination (if more than one highly cited article covered the same intervention, only the first report was selected). By the end of 2006, the effectiveness of 19 interventions had been replicated by subsequent studies (14) or had remained unchallenged (5), whereas the other 13 had been either contradicted (5) or found to have had initially stronger effects (8) when larger or better controlled subsequent studies were performed.
To estimate the “translation lag” for each intervention to reach this milestone (publication cited at least 1,000 times), the authors went back to identify the year in which the earliest journal publication on preparation, isolation, or synthesis appeared or the earliest patent was awarded (whichever occurred first).
Using these milestones, “the median translation lag was 24 years (interquartile range, 14 to 44 years) between first description and earliest highly cited article.” Repeating the analysis with the time of discovery defined as the first publication or awarded patent for any agent in the wider intervention class showed an even greater lag: a median of 27 years (interquartile range, 21 to 50). Not surprisingly, among 18 nonrefuted interventions, the median translation lag was shorter: 16.5 years (range 4 to 50 years), or 22 years (range 6 to 50 years) when considering the wider class.
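To make the lag arithmetic concrete, here is a minimal sketch of the authors' measure — years from first description (earliest publication or patent) to the first highly cited article — computed over hypothetical milestone years. The agent names and dates below are invented for illustration; the Science article's per-intervention data are not reproduced in this post.

```python
from statistics import median, quantiles

# Hypothetical (year_first_description, year_highly_cited_article) pairs.
# These are illustrative values only, NOT the actual study data.
milestones = {
    "agent_A": (1960, 1998),
    "agent_B": (1975, 1992),
    "agent_C": (1950, 2000),
    "agent_D": (1980, 1995),
    "agent_E": (1968, 1990),
}

# Translation lag = years from first description (earliest publication
# or awarded patent) to the first article cited at least 1,000 times.
lags = sorted(end - start for start, end in milestones.values())

# Quartile boundaries, from which the interquartile range is read off.
q1, med, q3 = quantiles(lags, n=4)

print(f"lags (years): {lags}")
print(f"median lag: {median(lags)} years (IQR {q1} to {q3})")
```

Note that `statistics.quantiles` defaults to the "exclusive" method for estimating quartiles; the Science authors do not state which quantile convention they used, so the IQR computed here should be read only as an illustration of the summary statistics being reported.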
The authors note that
The fastest successful translation occurred for indinavir (as part of triple antiretroviral therapy) and abciximab, both of which took only 4 years from their patenting to the publication of a highly cited randomized trial. Both of these fast successes involved multidisciplinary work spanning molecular to clinical research on protease inhibitors and integrins, respectively.
On the other hand, they note that rather than continue to study highly cited claims that were eventually refuted (e.g., flavonoids, vitamin E, estrogen), time, money, and effort would be better invested in novel agents and technologies for treating common diseases, since “it is unlikely that genuine major benefits from interventions already known for a long time have gone unnoticed.”
Of course, a recent article in Circulation notes that, in addition to being more likely to report positive results, industry-sponsored cardiovascular clinical trials are more likely to be cited in the literature than studies performed by not-for-profit organizations (e.g., the NIH). The HeartWire summary notes that
The for-profit organizations have better media-outreach programs in place, ensuring wider press coverage that ultimately increases the citation rate. These companies also invest more resources to increase the number of secondary publications for a trial, leading to more exposure of the initial study.
It would be interesting to compare the translation lag of industry versus publicly funded research.