You may have been reading about information “silos” or “walls,” meaning that we tend to consume information that reinforces what we already believe, and ignore everything else.
I write about bias as one of the problem contexts surrounding analytics processes, but I’ve never found the image of a barrier to be a good representation of self-reinforcing information consumption. After all, the information we don’t consume still exists, just as available as what we do consume. There is no intrinsic barrier at all, at least not one associated with the information itself.
The notion of an information compartment or barrier has several problems. First, it’s difficult to live in a true information silo – the unpleasant and contradictory filters in, at least occasionally. If there is a wall, it leaks.
Second, we don’t so much ignore facts we don’t like, as we convert them to a more palatable form so they can be integrated, or summarily rejected. In short, we engage in information caricature, possibly as a defense mechanism. Conveniently, at least when we’re dealing with unpleasant facts, transferring information is a messy and imprecise business, difficult to do well even when we are motivated to be accurate. And we really can’t think about information without thinking about its movement and transfer. Data, rather like Berkeley’s falling tree, is only meaningful when processed by humans – when it is assembled by someone, transmitted, and then received by someone else. By the time a nugget of information has been processed through this tripartite gauntlet, any resemblance to the original may be close to coincidental. (I like the communications-class exercise of passing a fact around in a circle to see how quickly it becomes distorted, to the point it is nearly unrecognizable. We shouldn’t think data systems will save us from this natural distortion, for people load those systems with data, then transform the data, and other people read and interpret the results.)
In my experience, our ability to morph the contradictory or disagreeable until it’s consistent with our existing beliefs is far greater than our ability to fully ignore what we don’t care to see. If I had a dollar for each time an analyst (myself included) read something into a data outcome that wasn’t there, I would not need to be writing this paragraph. People rationalize the most amazing things, particularly when the environment doesn’t fully support objective analysis.
Objectivity leads me to a third point: information transfer is social – it is, after all, how we normally interact with others. In communication, what we seek most of all is camaraderie, rather than objectivity. Unless I want to start a fight with each statement I make, what I present is likely to be aligned with the beliefs of those in my work or social community. We communicate to bind together, and we naturally craft our communications to be agreeable to our friends and colleagues. It should be no wonder that the information we obtain from casual or even professional communications can bear little resemblance to an original fact, and is instead pleasingly consistent with what we think we already understand. Not only does communication bind people together, it binds and coalesces the facts we share.
We might do better to speak of the information communities that bind us and our shared facts together, rather than the negative image of an information wall or silo. The wall/silo image implies that only our individual weakness prevents us from breaking through to a fuller and more realistic picture. I see the binding together of people, and their perceived facts, as more helpful to understanding many conclusions drawn from information. That binding is powerful, for very few of us really want to stand alone, and so it can be very difficult to disrupt, even when it’s to our advantage to do so. In my view, understanding the origins of information communities and how they shape the conclusions we draw from data is a key feature, perhaps the key feature, of good information analysis.