5 Things We've Learned From Predicting Famine With Analytics
Lessons for those trying to understand why analytics matter.
For more than 25 years, starving children have depended on the warnings of a federal interagency group that helps international aid organizations by predicting where famines are emerging in remote regions.
The Famine Early Warning System is an interagency network of federal agencies and the United Nations that began in 1985. It uses scientific data to target about $1.5 billion in food aid from the U.S. Agency for International Development (USAID) to those who need it most. Participating federal agencies include the U.S. Geological Survey, the National Oceanic and Atmospheric Administration, NASA, and the Department of Agriculture's Animal and Plant Health Inspection Service. The network was created after photos of undetected mass starvation in Ethiopia and Sudan in the mid-1980s shocked the world.
USAID program manager Gary Eilerts described the network's operations at a recent event at the Partnership for Public Service, which released a report, co-sponsored with the IBM Center, on the lessons learned from early analytics programs in government. He said such a network must include all the people who can answer operationally diverse questions about an area's hydrology, geography, climate, food security and more. The report notes that the network relies "on a mix of social and physical science data to determine, more precisely than ever before, which parts of the population in which regions of which countries, would suffer most from environmental shocks, usually drought."
The analytics process filters enormous amounts of data into defensible analyses to provide early warnings to political leaders so they can respond in the early stages of a famine. According to the report, it is “a successful blend of Earth observations [from satellites], hydrological modeling, food economics, weather and climate modeling, and much more.”
Analytics Isn't New. While there has been a certain amount of hype around big data and analytics in recent years, the reality is that they have been critical to the missions of numerous federal programs for many years. For example, the Centers for Disease Control and Prevention uses them to plot foodborne illness outbreaks via DNA markers; the military has analyzed millions of pieces of biometric data in Afghanistan and Iraq to pinpoint "bad guys"; the Animal and Plant Health Inspection Service uses them to predict where to find harmful pests when inspecting millions of shipping containers; and the Veterans Health Administration uses predictive analytic tools to assess the care needs of its patients and keep them out of the hospital.
However, today's senior managers are tempted to begin analytics programs before determining the mission-essential questions they want the data to answer. Older analytics efforts, by contrast, often grew out of the discoveries of line employees who made connections and saw patterns in data after receiving new software or hardware that helped them make sense of what they were studying.
Five Lessons from Veterans of Analytic Programs. The report examines programs that have been in operation long enough to show how analytics can advance and evolve into a sustainable component of a program's operations. It highlights these older efforts and distills five lessons for those who are new to an analytics approach and concerned that it could be another "flavor of the month."
Lesson 1: Collaborate with other agencies to collect data and share analytics expertise. The fastest way to get up to speed with a data and analytics initiative is to adopt and adapt what works at other agencies. The report recommends acquiring and sharing data and services from other agencies, where possible, through memoranda of understanding. These agreements can kick-start the data collection, analysis and modeling needed for an effective initiative.
Lesson 2: Develop data to determine return on investment for analytics programs. Another important lesson in the report is that the long-term programs that have survived expended analytic effort not just on mission improvement but also on demonstrating return on investment; reporting better outcomes alone was not sufficient. For example, CDC's PulseNet initiative identifies foodborne illness outbreaks. It costs $10 million a year to support the network of 87 federal and state labs around the country that do the work, but CDC calculates that the program prevents $219 million in hospitalizations and other costs.
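To make the arithmetic behind that claim explicit (a back-of-the-envelope illustration based on the figures above, not a calculation the report itself walks through):

\[
\text{benefit-cost ratio} \approx \frac{\$219 \text{ million in prevented costs}}{\$10 \text{ million in annual spending}} \approx 22:1
\]

Even a rough ratio like this gives program leaders a concrete, defensible number to cite when the budget is questioned.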
Lesson 3: Give agency leaders clear, concise analysis and results they can use to support data-driven programs. Presentation is important, and distilling the key points out of complex data is an art. The report found that "Data visualization – charts, graphs, maps and models – make analytical findings easier and faster to comprehend." After all, if agency leaders are to support the use of analytics, they have to understand the results and see how they help achieve the mission. Dr. Stephan Fihn says the VA has shifted the use of its healthcare information from monitoring performance over time to serving as a real-time decision-support aid for medical personnel on the front lines.
Lesson 4: Encourage data use and spark insights by enabling employees to easily see, combine and analyze data. The older, grassroots-driven projects show that senior managers should not overlook the payoff that comes from enabling frontline employees to see and use data organized for their needs, not just the needs of senior leaders: "Letting intended users test-drive analytics tools and muck around in the data itself enables discoveries that can save time, ease adoption and ensure success."
Lesson 5: Leaders and managers should demand and use data, and provide employees with targeted on-the-job training. Making analytics a standard operating procedure in agencies is hard, but in the cases highlighted in the report it has been done, and it has "stuck" over the years to become part of how these agencies do business. What are some possible steps? One is to create a critical mass of analytic talent; another is to designate a "data evangelist" to help people see how data and analytics can be embedded in ways that make their jobs and missions more effective. The one caution the report offers is that there is no "silver bullet."