Research Productivity among NIGMS-funded PIs

Update: Jeremy Berg’s analyses of productivity among NIGMS-funded PIs have been covered in Nature News, with some additional commentary and a new composite figure.

NIGMS Director Jeremy Berg continues to anticipate the sorts of questions NIH-funded investigators would like to have answered. In his latest Feedback Loop post, he analyzes 2007-2010 publications linked to NIGMS funding, and the impact factors of the journals in which they appeared, as a function of annual direct costs.

So what do the data show? Among the 2,938 investigators who held at least one NIGMS R01 or P01 grant in Fiscal Year 2006:

Median number of grant-linked publications (2007-2010): 6
Median journal impact factor: 5.5
Median annual direct costs of funding received: $220K

Please note (at the original post) the wide ranges in each category, though, and that the funding totals are tallied per PI rather than per award. PI productivity peaks at about $700K in annual direct costs. Anyone surprised? Jeremy notes this supports the NIGMS threshold of $750K in defining (and limiting additional funding to) “well-funded laboratories.”
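
To make the shape of the analysis concrete, here is a minimal sketch under made-up numbers (nothing below is drawn from the NIGMS data set) of how per-PI summary medians and a productivity-versus-funding curve of the sort Jeremy describes could be computed:

```python
# A minimal sketch, NOT Jeremy Berg's actual analysis: given a per-PI table of
# grant-linked publication counts, journal impact factors, and annual direct
# costs, compute summary medians like those quoted above and a rough
# productivity-versus-funding curve. Every number below is made up.
from collections import defaultdict
from statistics import mean, median

# Hypothetical per-PI records: (publications 2007-2010, median journal impact
# factor, annual direct costs in $K). None of these values come from NIGMS.
pis = [
    (4, 3.2, 180), (6, 5.5, 220), (9, 7.1, 410),
    (2, 2.8, 150), (14, 6.0, 700), (11, 5.8, 950),
]

pubs, impact_factors, direct_costs = zip(*pis)
print("Median grant-linked publications:", median(pubs))
print("Median journal impact factor:    ", median(impact_factors))
print("Median annual direct costs ($K): ", median(direct_costs))

# Bin PIs by funding level ($250K-wide bins) and average their output per bin;
# with the real data, this is the curve that flattens out around $700K/year.
pubs_by_bin = defaultdict(list)
for n_pubs, _, dc in pis:
    pubs_by_bin[(dc // 250) * 250].append(n_pubs)
for lo in sorted(pubs_by_bin):
    print(f"${lo}-{lo + 249}K: mean {mean(pubs_by_bin[lo]):.1f} publications per PI")
```

The $250K bin width is arbitrary; the point is only that these are per-PI aggregates, not per-award ones.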

Anyone have specific sorts of peer review/grant funding/research productivity data they would like to see analyzed?

One hopes OER has recognized the scientific community’s interest in these types of data … perhaps they could cover NIH-wide and IC-specific trends on their RePORT site down the road.

8 Comments »

  1. Physician scientist said

    I think this is a fascinating analysis. I wonder how huge grants like the CTSAs would compare. My suspicion is that the impact/dollar spent is substantially lower than that of a single R01.

    It also shows that high paper numbers are the exception rather than the rule. This is good to show graduate students.

  2. BB said

    $750K = well-funded? Gee, then poverty-stricken labs like mine should feel real good about our progress (/sarcasm).

  3. […] 1, 2010 at 1:04 am · Filed under Grantsmanship, NIH Advice, Research News Ask and ye shall receive … NIH-wide data on the correlation between individual review criterion […]

  4. Rob said

    I find RePORTER’s publication data to be extremely incomplete. Even though we put the appropriate grant numbers in the acknowledgments section of every paper, RePORTER shows only one paper from the last 5 years under one of our grants.

    In reality, it is more like 6 to 7 papers per year on that grant for the past decade.

    Are other folks finding the ‘results’ data out of sync?

  5. DrugMonkey said

    Rob, it is *very* likely that these data are being pulled from the PubMed Central system. If I am not mistaken, you still need to do a lot of manual linking of your publications to the grant number. This is not an automatic process by any means. Try logging into PubMed Central and see if those data match up to what you are seeing in RePORTER. (And, you know, RePORT back to us!) A quick way to run this sort of cross-check against PubMed’s grant-number field is sketched after the comment thread.

    • Rob said

      You’re correct. Everything is properly listed in PMC (just one year delayed).

  6. harristodd said

    Hey — I was just wondering about this, too. I’m totally up to date in PubMed Central, but my RePORTER “Results” section is under-populated. Am I missing something?

  7. writedit said

    Jeremy’s analyses have made it into Nature, with additional commentary and a new composite figure.
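
Following up on Rob, DrugMonkey, and harristodd above: for anyone who wants to compare their RePORTER “Results” tab against PubMed directly, here is a minimal sketch using NCBI’s E-utilities esearch endpoint and PubMed’s Grant Number ([gr]) search field. The grant number in it is hypothetical, and this is only an independent cross-check, not how RePORTER itself builds its listing.

```python
# A minimal cross-checking sketch, not RePORTER's own pipeline: count the PubMed
# records that acknowledge a given grant, via NCBI's E-utilities esearch endpoint
# and PubMed's Grant Number ([gr]) search field. The grant number is hypothetical.
import json
import urllib.parse
import urllib.request

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_ids_for_grant(grant, max_ids=200):
    """Return PubMed IDs whose grant-support field matches `grant`, e.g. 'GM012345'."""
    params = urllib.parse.urlencode({
        "db": "pubmed",
        "term": grant + "[gr]",   # [gr] = Grant Number search field in PubMed
        "retmode": "json",
        "retmax": max_ids,
    })
    with urllib.request.urlopen(ESEARCH + "?" + params) as resp:
        result = json.load(resp)["esearchresult"]
    print(grant + ":", result["count"], "PubMed records list this grant")
    return result["idlist"]

if __name__ == "__main__":
    # Hypothetical grant number; compare the returned PMIDs against what
    # RePORTER shows under the project's Results tab.
    pubmed_ids_for_grant("GM012345")
```

If the PMIDs returned here outnumber what RePORTER shows, the gap most likely lies in the grant-publication linking step DrugMonkey describes rather than in PubMed itself.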
