Bounded Awareness in the context of an Epistemology of Practice

When Polanyi (2009) describes his vision of tacit knowing, he emphasizes its hierarchical nature through the terms proximal and distal: the former comprises the components, the latter the greater whole. The concept survives in the common vernacular as a familiar saying:

The whole is greater than the sum of its parts.

He presents a vision of the universe as stratified, built of these proximal and distal pairs in a massive relational tree. Polanyi does us the great favor of illuminating his intent with examples that help us picture what he describes, but the core idea is, as suggested at the start of the course, profoundly simple.

That’s one element of it, at any rate; more than a few exist, as others have already demonstrated by the focus taken in their blogs. For now, however, this is the one I will deal with.

As I was perusing articles in an effort to find a focal point for this first post, I happened across an interesting turn of phrase in the Cook and Brown (1999) article.

The first contention of this paper is that each of the four categories of knowledge inherent in the explicit/tacit and individual/group distinctions is a distinct form of knowledge on equal standing with the other three…

For context, there are two main arguments at the heart of that piece:

  1. That knowledge and knowing are distinct things, but that each acts in support of the other.
  2. That knowledge comprises four categories formed by two mutually exclusive pairs: explicit/tacit and individual/group.

Together, these support Cook and Brown’s central ideas: the “epistemology of practice” (intended as a mirror of the epistemology of possession, knowing paired with knowledge), and the “generative dance” in which knowledge is used as a tool of knowing within a social and physical context, creating new knowledge as a result (1999).

Essentially, it is a modification and expansion of Polanyi’s conception of tacit knowing (as a great deal of writing on the subject is), one which I think rests on an interesting concept and takes a general enough approach to apply in a broad range of contexts.

Consider that an extremely brief summary, as the article has enough content for a great many posts, and the ideas presented therein are central enough that I suspect I will be revisiting Cook and Brown in the future. For now, it is enough to have the gist of their argument.

Another of the articles I examined was the Kumar and Chakrabarti (2012) piece, which reexamined the Challenger disaster from the perspective of bounded awareness. I’ll begin with the description of bounded awareness which they borrow from Bazerman and Chugh (2006):

…decision makers [experience] bounded awareness when they overlook relevant and readily available information, even while using other available information, and take a decision that is either suboptimal or entirely erroneous.

Ultimately, they conclude that responsibility for the Challenger disaster rests largely on the bounded awareness of NASA managers, which they attribute to a combination, identified in the Rogers Commission report (RCR 1986) and subsequently investigated by Starbuck and Milliken (1988), of repeated, uninterrupted successes and a gradual acclimatization to risk-taking bred by those same successes. This allowed managers to feel confident and to trivialize knowledge which should have alerted them to danger.

In particular, it is noted that many NASA engineers recognized the problems and alerted others to them, only to be dismissed. In that single point we can see both Polanyi’s original conception of tacit knowledge and Cook and Brown’s expanded vision. The engineers who protested the launch knew that things were wrong, that there was, to them, a clear and present danger of failure in the O-rings, but they were unable to convey its full import to NASA managers. Why? Cook and Brown would suggest several factors.

The individual tacit knowledge of the NASA engineers conflicted with the group tacit knowledge of NASA as a whole–the sense of impending danger due to mechanical wear didn’t mesh with the culture of confidence and increased risk-taking. Moreover, the explicit group knowledge that the O-rings were not supposed to show signs of wear was glossed over for much the same reason.

Additionally, that same explicit group knowledge was not allowed into the “generative dance”; there was, in fact, a deliberate failure to apply it to the knowing/practice of NASA operations.

In short, the issue of bounded awareness described by Kumar and Chakrabarti is explained and expanded upon by the concepts of Cook and Brown. It was not merely that NASA managers were confident and increasingly willing to take risks, but that this confidence and risk tolerance were integrated into the tacit group knowledge of NASA operations staff. They became the norm, to the point that they were questioned only by individuals with both personal tacit knowledge of danger and access to broader explicit knowledge of a potential source of that danger, and ignored even then!

 

I can’t say exactly how I arrived at this subject, as it seemed to fall together almost too neatly. That said, after much editing, I rather like how it turned out. It was certainly a pleasure to consider and write, so I suspect I may do more in this style: taking one theoretical piece alongside one empirical piece and shaking a bit to see what ideas happen to fall out.

 

References

Bazerman, M.H. and Chugh, D. (2006a). Decisions without blinders. Harvard Business Review 84(1), 88-97.

Bazerman, M.H. and Chugh, D. (2006b). Bounded awareness: focusing failures in negotiation. In Thompson, L.L. (Ed.), Negotiation Theory and Research: Frontiers of Social Psychology (pp. 7-26). New York, NY: Psychology Press.

Cook, S.D.N. and Brown, J.S. (1999). Bridging epistemologies: the generative dance between organizational knowledge and organizational knowing. Organization Science 10(4), 381-400. doi:10.1287/orsc.10.4.381

Kumar J, A. and Chakrabarti, A. (2012). Bounded awareness and tacit knowledge: Revisiting Challenger disaster. Journal of Knowledge Management, 16(6), 934-949. doi:10.1108/13673271211276209

Polanyi, M. (2009). The tacit dimension (Revised ed.). Chicago, IL: University of Chicago Press.

RCR (1986). Report of the Presidential Commission on the Space Shuttle Challenger Accident. science.ksc.nasa.gov/shuttle/missions/51-l/docs/rogers-commission/table-of-contents.html. Retrieved 1/27/16.

Starbuck, W.H. and Milliken, F.J. (1988). Challenger: fine-tuning the odds until something breaks. Journal of Management Studies 25(4), 319-40.


7 thoughts on “Bounded Awareness in the context of an Epistemology of Practice”

  1. This was a very well thought out and reasoned entry. I especially liked when you said “The individual tacit knowledge of the NASA engineers conflicted with the group tacit knowledge of NASA as a whole–the sense of impending danger due to mechanical wear didn’t mesh with the culture of confidence and increased risk-taking.” This was a real-world example of tacit knowledge that made me understand it better, so thank you! Can’t wait to read that article.


  2. Cook & Brown write a great article, yet they diverge from Polanyi in one important way. As you point out, they do not subordinate one grouping to the other, but Polanyi made explicit knowledge a derivative of tacit knowledge. It’ll be fun to explore this difference.


    1. Not precisely. I would agree that risk itself is considered a negative and generally seen as something to be reduced, both within KM and in many of the areas of work and the world which KM scholars study. There’s a fairly well-known article which describes risk-taking behavior as a balancing act in which decision makers weigh the potential risk and expected value of various actions when choosing, where “expected value is assumed to be positively associated and risk is assumed to be negatively associated, with the attractiveness of an alternative” (March and Shapira 1987). This isn’t necessarily the case for all scholars, of course, but I would posit that in general risk-taking has no inherent valuation in KM, that it’s a neutral decision-making behavior which happens to include the negative factor of risk in its calculations.
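
      In rough notation (my own shorthand, not a formula March and Shapira themselves write out), that claim reads something like:

      \[ \text{attractiveness}(a) = f\left(E[V_a],\, R_a\right), \qquad \frac{\partial f}{\partial E[V_a]} > 0, \qquad \frac{\partial f}{\partial R_a} < 0 \]

      where \(E[V_a]\) is the expected value and \(R_a\) the perceived risk of alternative \(a\): raising expected value makes an option more attractive, raising perceived risk makes it less so, and the decision maker trades the two off rather than treating risk-taking as good or bad in itself.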

      So, for example, the Challenger disaster: the risk taking behavior of NASA managers was not viewed as a negative because they were taking risks, but because the risks they took were high, unnecessary, and generated very little additional benefit over choices in which they weren’t taken. Like so many other things, I’d say that it depends highly on context.

      March, J.G. and Shapira, Z. (1987). Managerial perspectives on risk and risk taking. Management Science 33(11), 1404-18.


  3. You have incredibly well constructed connections between the two articles. I like the idea of using both one empirical and another theoretical article to bridge the research and form a larger conceptualization of their respective parallel philosophies.


    1. I was hoping for an effect like this when I decided to handle it that way. It seemed like trying to do pure theory would get a bit messy–those articles tend to be meaty enough to fill a post on their own, never mind trying to work in three or four.

