The Science of Funding
John McGowan fosters teamwork on the Web to target promising medical research investments.
The federal government, and consequently the taxpayer, is the largest funder of medical research in the world, with annual expenditures topping $30 billion. So it's no surprise that the officials who oversee this funding are under immense pressure to choose the most promising experiments. That's why the National Institutes of Health is turning to a collaborative, Web-based tool to determine whether research investments are resulting in patents, influential papers and other significant contributions to science and industry.
Program officers and administrators across NIH are starting to depend on an online application developed largely under the supervision of John J. McGowan, deputy director for science management at the National Institute of Allergy and Infectious Diseases. The agency supports research into diseases such as malaria and AIDS, as well as food allergies, diabetes and asthma.
"Internally, it basically opens to the average program officer data that they've never had before," says McGowan, noting the system also makes it easier for him to answer budget questions from Congress.
McGowan, who joined the agency in 1986 to help oversee AIDS research funding, began using digital media to streamline the peer review process during the early 1990s. By the late 1990s, under his leadership, all grant paperwork across NIH had been scanned into a searchable electronic record accessible through an in-house database.
Internet access expedited research evaluations, but he recognized that the data by itself was not very helpful without the ability to combine it with background information. The thinking was "we have a lot of information, but we're not mining it for all the value that's there," he says. The new online tool "provides a front end and makes it very easy for the average scientist to ask questions of the portfolio."
Today, more than 1,000 people at institutes across the Health and Human Services Department are using the electronic Scientific Portfolio Assistant (eSPA) to track the impact of grants and contracts.
While work on the system began in 2007, during the George W. Bush administration, federal officials consider it part of President Obama's open government initiative to increase civic participation, public-private partnerships and transparency.
"It helps meet the administration's goal, which is one that we've had for a long time," McGowan says. "And this helps increase the accountability tied to results."
Before eSPA, NIH institutes spent roughly $250,000 on consultants to generate a snapshot of research productivity once a year. Now government personnel have the tools to find the answers themselves on an ongoing basis and to allocate grants based on the results. The eSPA software generates progress reports on groups of related projects by pulling in context from other systems, such as patent records, clinical trial data and citations of published papers.
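To give a rough sense of the kind of roll-up eSPA performs, here is a minimal sketch in Python. The grant identifiers, field names and figures are invented for illustration; they are not drawn from the actual system, which joins far larger patent, clinical trial and bibliographic records.

```python
# Hypothetical sketch of the kind of integration a portfolio tool performs:
# joining a grant record with publication and patent counts pulled from
# separate sources. All data below is invented for illustration.

grants = {
    "AI-000123": {"title": "Malaria vaccine candidate", "awarded": 2_500_000},
    "AI-000456": {"title": "Food allergy immunotherapy", "awarded": 1_800_000},
}

# Stand-ins for external sources such as bibliographic and patent databases.
publications_by_grant = {"AI-000123": 14, "AI-000456": 6}
patents_by_grant = {"AI-000123": 2, "AI-000456": 0}


def portfolio_report(grants, pubs, patents):
    """Assemble a one-line progress summary per grant from the joined sources."""
    rows = []
    for grant_id, info in grants.items():
        rows.append({
            "grant": grant_id,
            "title": info["title"],
            "awarded": info["awarded"],
            "publications": pubs.get(grant_id, 0),
            "patents": patents.get(grant_id, 0),
        })
    return rows


for row in portfolio_report(grants, publications_by_grant, patents_by_grant):
    print(row)
```

Grouping the joined records by program area or fiscal year would turn the same idea into the portfolio-level progress reports described above.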
The system isn't accessible to the public, but NIH officials say it allows them to quickly churn out reports that are then disclosed in testimony on Capitol Hill, in annual budget requests and on the agency's websites.
More than 53,000 research grant applications flowed through NIH in fiscal 2009. Of those, the agency funded more than 12,000 new awards. In total, it supported more than 45,000 research grants last year, all with the help of eSPA.
Last year, agency officials used eSPA to determine whether they had made a sound decision in granting money to a group of relatively inexperienced researchers. They tracked data on the scientists' publication rate, which is a common indicator of research productivity.
NIH had bypassed peer review, the traditional grant selection process, to award them funding for research that aligned with the agency's research priorities. The eSPA analysis confirmed their productivity matched that of veteran researchers selected through peer review. NIH continues to fund a limited number of projects through what is now known as the select-pay process.
"Using these metrics we can see how much money they've put in and how much they've produced based on publication records," McGowan says.
Not only does eSPA enable officials to pinpoint high-impact research, it also lowers consulting costs. Updating a 2007 evaluation of Type 1 diabetes research through eSPA, for example, cost about $50,000, saving NIH about $200,000 in consulting fees, according to officials.
And NIH got the update instantly, whereas a contractor would need a year and a half to produce a similar assessment, they say.
Without eSPA, a chain of specialists and Web surfers would be required to gather the kind of intelligence the system culls. NIH would need a program officer to retrieve a scientist's project summary; screen the material with a subject matter expert; then, after the scientist completes the project, ask another expert to assess the results; and finally compare that evaluation with performance data in the U.S. Patent and Trademark Office patent database, the National Library of Medicine's bibliographic database and other science-related databases.
"Since we've integrated all of those into one system, now you just need to know eSPA," says Kevin Wright, who tracks such successes as head of program evaluation for the National Institute of Allergy and Infectious Diseases.