The Need-To-Know Fallacy

There are many gems in the book Team of Teams: New Rules of Engagement for a Complex World, written by General Stanley McChrystal and co-authors, based on McChrystal’s experience leading the Joint Special Operations Task Force fighting Al-Qaeda in Afghanistan and Iraq. One that stuck with me is a small section called The Need-To-Know Fallacy.

Most of us who have worked in government have encountered people who routinely discourage or block information-sharing in the name of exercising the Need-To-Know principle. This has never sat well with me, but I had always assumed the problem was probably just me: that “growing up” in academia and then in a Silicon Valley start-up biased my thinking towards a culture of (unusually) open information sharing that simply doesn’t work elsewhere. Luckily, McChrystal has now given me the language and a mental model to understand this fallacy we call Need To Know. I can’t explain the concept better than the book does, so I will just quote the relevant paragraphs (some emphasis is mine):

“The habit of constraining information derives in part from modern security concerns, but also from the inured preference for clearly defined, mechanistic processes — whether factory floors or corporate org charts — in which people need to know only their own piece of the puzzle to do their job. …  But as technology has grown more sophisticated and processes more dispersed, the way component parts of a process come together has become far less intuitive, and in many cases impossible for a cadre of managers to predict fully.

In military, governmental, and corporate sectors, an increased concern for secrecy has caused further sequestering of information. We have secrets, and secrets need to be guarded. In the wrong hands, information may do great damage, as the recent Snowden and WikiLeaks scandals have shown. In the absence of a compelling reason to do otherwise, it makes sense to confine information by the borders of its relevance.

The problem is that the logic of “need to know” depends on the assumption that somebody — some manager or algorithm or bureaucracy — actually knows who does and does not need to know which material. In order to say definitively that a SEAL ground force does not need awareness of a particular intelligence source, or that an intel analyst does not need to know precisely what happened on any given mission, the commander must be able to say with confidence that those pieces of knowledge have no bearing on what those teams are attempting to do, nor on the situations the analyst may encounter. Our experience showed us this was never the case. More than once in Iraq we were close to mounting capture/kill operations only to learn at the last hour that the targets were working undercover for another coalition entity. The organizational structures we had developed in the name of secrecy and efficiency actively prevented us from talking to each other and assembling a full picture.

Effective prediction — as we have discussed — has become increasingly difficult, and in many situations impossible. Continuing to function under the illusion that we can understand and foresee exactly what will be relevant to whom is hubris. It might feel safe, but it is the opposite. Functioning safely in an interdependent environment requires that every team possess a holistic understanding of the interaction between all the moving parts. Everyone has to see the system in its entirety for the plan to work.”

The above makes a compelling case for the unrestrained sharing of information as the way to achieve organisation-level coherence and shared consciousness. But what do we do to prevent the Snowdens and WikiLeaks of the world? The answer is simple: information flows should be unrestrained within an organisation, but access to and use of all sensitive information must be logged, auditable, and subject to security analytics. In other words, the good old principle of Trust but Verify.
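
To make the idea concrete, here is a minimal sketch (in Python) of what Trust but Verify could look like in practice: reads of sensitive documents are never blocked, but every access is appended to a tamper-evident, hash-chained audit log that security analytics can later inspect. All class and function names here (AuditLog, DocumentStore, etc.) are hypothetical illustrations, not a reference to any real system.

```python
import hashlib
import json
import time


class AuditLog:
    """Append-only, hash-chained log: each entry's hash covers the
    previous entry's hash, so retroactive edits break the chain."""

    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # genesis value

    def record(self, user, document_id, action):
        entry = {
            "ts": time.time(),
            "user": user,
            "document": document_id,
            "action": action,
            "prev_hash": self._last_hash,
        }
        entry["hash"] = self._digest(entry)
        self.entries.append(entry)
        self._last_hash = entry["hash"]

    def verify_chain(self):
        """Recompute every hash to detect tampering."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev_hash"] != prev or e["hash"] != self._digest(body):
                return False
            prev = e["hash"]
        return True

    @staticmethod
    def _digest(body):
        return hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()


class DocumentStore:
    """Open access: reads and writes always succeed, but are always logged."""

    def __init__(self, audit_log):
        self._docs = {}
        self._audit = audit_log

    def put(self, user, document_id, content):
        self._docs[document_id] = content
        self._audit.record(user, document_id, "write")

    def get(self, user, document_id):
        self._audit.record(user, document_id, "read")  # verify ...
        return self._docs.get(document_id)             # ... but trust


if __name__ == "__main__":
    log = AuditLog()
    store = DocumentStore(log)
    store.put("analyst_a", "intel-042", "source details")
    store.get("ground_team_b", "intel-042")  # allowed, and logged
    # Security analytics would scan log.entries for anomalies,
    # e.g. one account suddenly bulk-reading thousands of documents.
    print(len(log.entries), "audit entries; chain intact:", log.verify_chain())
```

The hash chain is the “verify” half of the bargain: because each entry’s hash covers the previous entry, any retroactive tampering with the log breaks the chain and is detectable, which is what lets the organisation afford to extend trust up front.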

PS: The Need-To-Know Fallacy as described above is also related to one of Ed Catmull’s principles for managing a creative culture: “A company’s communication structure should not mirror its organisational structure. Everybody should be able to talk to anybody.” (See also this blog piece.)

PPS: Government is far from the most secretive of the organisations I have worked for. The number of non-disclosure agreements I have had to sign to get onto projects, and my subsequent assessment of the value of the information I came to possess through those projects, speak to the general paranoia around keeping corporate secrets. Interestingly, in my experience the most traditional industries (e.g. steel manufacturing, oil refining, investment banking) also turn out to be the most secretive, perhaps because of the intense competition within them and the relative lack of moat or competitive advantage the companies have over each other. Internet companies, in contrast, tend to have a quite open sharing culture, at least with respect to information as opposed to data. As I pointed out in a separate blog piece, and contrary to popular belief, these companies tend to use the exact same AI/ML algorithms, most of which are published in the literature; they derive their competitive advantage from running those algorithms on their proprietary data collections, with the company owning the best and largest data collection in its industry segment winning. That is not a bad setup, and perhaps there is a way for government and the more traditional industries to more openly share technical know-how, i.e. information, without openly sharing their data.
