If you’re like me, you may have thought that the U.S. military adopted network-centric warfare in the current conflicts so that it could leverage its technological advantage. This widespread application of information technology as a unifying doctrine for warfighting was the culmination of a debate that began in January 1998, when the journal Proceedings published “Network-Centric Warfare: Its Origin and Future” by Vice Admiral Arthur K. Cebrowski and John J. Garstka.

If the only thing you ever read was General McChrystal’s It Takes a Network, published in Foreign Policy, you might think that the commanders in theater adopted network-centric warfare not because of years-long deliberation within the DoD, but because al Qaeda adopted it first.

General McChrystal describes the enemy as being able to “… leverage sophisticated technology that connects remote valleys and severe mountains instantaneously — and allows them to project their message worldwide, unhindered by time or filters. They are both deeply embedded in Afghanistan’s complex society and impressively agile. And just like their allies in al Qaeda, this new Taliban is more network than army, more a community of interest than a corporate structure.”

Early in his command in Iraq, McChrystal drew a diagram illustrating the bottleneck that prevented the free flow of data among the highly compartmentalized structures of U.S. forces. This bottleneck contrasted sharply with al Qaeda’s easy exchange of information, which enabled it to maintain a lateral structure, adopt successful tactics quickly, and operate independently.

“The sketch from that evening — early in a war against an enemy that would only grow more complex, capable, and vicious — was the first step in what became one of the central missions in our effort: building the network. What was hazy then soon became our mantra: It takes a network to defeat a network.”

This is an inspiring story, depicting the flexibility and ingenuity of our military, determined to complete its mission under difficult circumstances. McChrystal’s article is well written, and I strongly recommend it.

However, it is not correct to say that U.S. network-centric warfare efforts began with McChrystal’s diagram. The award-winning Noah Shachtman, writing in “How Technology Almost Lost the War: In Iraq, the Critical Networks Are Social — Not Electronic” for Wired.com, reports that the principles of network-centric warfare were adopted and applied as early as the 2003 invasion of Iraq. In fact, they worked pretty well during the invasion; the problems occurred during the occupation and rebuilding.

Shachtman writes, “… Cebrowski and Garstka weren’t really writing about network-centric warfare at all. They were writing about a single, network-enabled process: killing.” In counter-insurgency, killing is not the same thing as warfare. So the DoD’s application of network-enabled killing was great for using Special Operations teams to target and eventually destroy SCUD missiles. Not so great for nation building.

Under McChrystal’s and Petraeus’s leadership, the U.S.-led forces in Iraq and Afghanistan altered their internal culture around intelligence distribution and built social networks with the locals. Shachtman argues persuasively that the social networks with the inhabitants are more significant than the electronic ones. Even Garstka admitted to Shachtman, “You have your social networks and technological networks. You need to have both.”

Just as McChrystal did, we need to change our attitudes in order to properly exploit the advantages of networks. This applies not only to the military but also to the community supplying it. Contractors and subcontractors need to overcome their traditional hostility toward competitors and network with each other in order to provide the best possible solutions. Nowhere is this more apparent than in the issue of interoperability, which is essential for true network-centric warfare.