Get BUFF
The Army muscles through the data fog of war.
As the military transforms from a hulking force, built to fight large-scale tank and infantry battles against superpowers that no longer exist, into the kind of "light" and "agile" units that military planners say can deploy anywhere in the world on short notice, the appetite for real-time information about what's happening on the battlefield has increased dramatically.
Consider this: The wars in Iraq and Afghanistan demonstrated the military's penchant for all manner of battlefield sensors, from devices that detect movement on the ground to high-flying aerial drones outfitted with video cameras. Those sensors report data back to military commanders, who plot troop movements and attempt to predict what their adversaries will do. Those planners are accustomed to receiving hundreds of reports every hour; in the future, they are expected to juggle upwards of 200,000 in the same span.
As troop units get smaller, so will the groups of analysts trying to make sense of that swelling wave of sensor data. And when it's that massive, there's no way one human being, or even a group of them, can ride the wave without drowning.
Thus, the rise of the machines. If you think of analysis as a pyramid, then human beings sit on the pointy end, atop the network of computers, databases and software tools they have to manage. In the future, the pyramid will flip over, and machines will do the heavy lifting of data crunching and might even predict wartime scenarios for human analysts, who will become the trickle-down beneficiaries of computerized intelligence.
That's the vision, at least. And it's a long way off. But a group of forward-looking computer researchers wants to get there faster. They work at the Army's Battle Command Battle Lab at Fort Huachuca, Ariz., one of 13 war research centers devoted to puzzling out problems that, to most eyes, look impossibly futuristic, like the stuff of science fiction.
Jason Denno, the Battle Lab's deputy director and a self-described "deviant" thinker, is leading the BrUte Force Fusion Program, a daring and potentially fruitless attempt to conquer the data deluge by wrestling it to the ground. (The muscular approach prompted the program's acronym, "BUFF," which was Denno's idea.)
BUFF aims to achieve what sensor-minded scientists call level II fusion. In the fusion hierarchy, level I occurs when a sensor detects the movement of a single object in the battle space, such as a tank or a soldier. Level II, however, integrates data from multiple sensors; instead of seeing one tank or soldier, it describes entire groupings of tanks or troops.
Level III makes an evolutionary leap, actually describing what the moving units are likely to do next. For military planners, that's the nirvana-like state of total battlefield awareness. But there's a problem. Even level II fusion is so complex that most scientists believe it will take a quarter-century to build systems capable of managing the massive sensor traffic that comes with it.
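To make the jump concrete, here is a minimal sketch of what moving from level I to level II amounts to in code. The sensor reports, the distance threshold and the simple grouping rule are hypothetical placeholders, not anything drawn from BUFF itself; the point is only that individual detections get clustered into descriptions of formations.

    # A minimal sketch of level I -> level II fusion. The report fields and
    # the 2 km grouping radius below are invented for illustration; they are
    # not part of the BUFF program.
    from dataclasses import dataclass

    @dataclass
    class Detection:          # level I: one sensor sees one moving object
        sensor_id: str
        kind: str             # e.g. "tank", "soldier"
        x: float              # position in km on a notional grid
        y: float

    def fuse_level_two(detections, radius_km=2.0):
        """Group nearby level I detections into level II clusters (formations)."""
        groups = []
        for det in detections:
            placed = False
            for group in groups:
                # join an existing group if any member is within radius_km
                if any(((det.x - d.x) ** 2 + (det.y - d.y) ** 2) ** 0.5 <= radius_km
                       for d in group):
                    group.append(det)
                    placed = True
                    break
            if not placed:
                groups.append([det])
        return groups

    reports = [
        Detection("uav-1", "tank", 10.0, 10.0),
        Detection("uav-1", "tank", 10.8, 9.5),
        Detection("ground-3", "tank", 11.2, 10.4),
        Detection("uav-2", "soldier", 40.0, 5.0),
    ]

    for i, group in enumerate(fuse_level_two(reports), 1):
        kinds = {d.kind for d in group}
        print(f"group {i}: {len(group)} detections ({', '.join(sorted(kinds))})")

A real fusion engine would weigh sensor error, timing and terrain; this toy version asks only whether detections sit near one another.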
Denno is not one of those scientists. He hopes that, with BUFF, the military could produce level II fusion tools within the next few years. "We have a sea of information at this point," Denno says. Buried in it, somewhere, could be the telltale clue that lets analysts know whether they're looking at a few tanks moving across the desert, or the point of a much larger force, backed up by devastating artillery power.
As Denno sees it, analysts searching for those intelligence nuggets have been going about things the wrong way. Traditionally, they have looked at the most current set of facts and assessed what was happening at that moment. When they have to make another assessment, perhaps a few minutes or a few days later, they look only at the freshest data obtained since the last set.
The problem, Denno says, is that analysts' views frequently are based only on the most recent data; they might not account for historical trends. BUFF, however, would base each assessment on all the data collected to date. Of course, that means the data set balloons with each new assessment. The mound of data becomes a mountain, and then a mountain range.
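The difference is easy to see in a toy example. In the sketch below, the report format and the "assessment" itself, a simple tally of sightings by type, are invented for illustration and are not BUFF's actual logic; one function judges only the newest batch of reports, the other folds each batch into an ever-growing history before judging.

    # A toy contrast between the two assessment styles Denno describes.
    from collections import Counter

    def snapshot_assessment(new_reports):
        """Traditional approach: judge only the reports received since last time."""
        return Counter(r["kind"] for r in new_reports)

    def cumulative_assessment(history, new_reports):
        """BUFF-style approach: fold new reports into the full history, then judge."""
        history.extend(new_reports)          # the data set only ever grows
        return Counter(r["kind"] for r in history)

    history = []
    batches = [
        [{"kind": "tank"}, {"kind": "tank"}],
        [{"kind": "artillery"}],
        [{"kind": "tank"}],
    ]

    for t, batch in enumerate(batches, 1):
        print(f"t={t} snapshot:   {dict(snapshot_assessment(batch))}")
        print(f"t={t} cumulative: {dict(cumulative_assessment(history, batch))}")

By the third pass, the snapshot view sees a lone tank, while the cumulative view still remembers the artillery sighting from the pass before.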
BUFF would manage the heap by taking advantage of the tremendous leaps in modern computing's ability to store data and move it around in massive swaths among processors that can slice, dice and sort it. That's the idea, at least. Denno admits the data sets could grow so large they would crash even the buffest system.
Still, because of fairly recent technological advances, the BUFF vision appears more realistic, says Benn Stratton, national director of the defense and civilian agencies division of SGI Federal, a high-performance computing firm working on the project with the Fort Huachuca battle lab. In what passes for the old days, less than a decade ago, analysts had to fetch data in chunks from storage computers, the equivalent of going to a library to check out books rather than having them at your fingertips electronically, Stratton says. In a perfect world, BUFF would read the whole library in seconds, process the information, and then tell analysts what they need to know.
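A rough sketch of that sledgehammer idea, under the assumption of a data set held in memory and split across worker processes, might look like the following. The reports and the per-chunk work are placeholders; the actual BUFF hardware and software would be far more elaborate.

    # Brute-force summary of an in-memory data set: split it across processors
    # rather than fetching it piecemeal from storage. Data is a placeholder.
    from multiprocessing import Pool
    from collections import Counter

    def crunch_chunk(chunk):
        """Each worker 'slices, dices and sorts' its share of the reports."""
        return Counter(kind for kind in chunk)

    def brute_force_summary(reports, workers=4):
        size = max(1, len(reports) // workers)
        chunks = [reports[i:i + size] for i in range(0, len(reports), size)]
        with Pool(workers) as pool:
            partials = pool.map(crunch_chunk, chunks)
        total = Counter()
        for part in partials:
            total.update(part)
        return total

    if __name__ == "__main__":
        reports = ["tank", "soldier", "tank", "artillery"] * 250_000  # a million rows
        print(brute_force_summary(reports))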
Denno apparently recognizes that he has proposed a radical solution to the problem of dealing with overwhelming amounts of data. Rather than trying to handle it discretely, with finesse, he wants to swallow it whole. That ambition was the inspiration for the name BrUte Force Fusion.
"Sometimes, you just need a sledgehammer," Denno says of his all-or-nothing concept. "We were the first ones to use a sledgehammer instead of a pickle fork."