Plane Talk



An Isolated Incident: Risk Management and Organizational Culture

Posted on: June 28th, 2016

by Dr. Jerry Dibble

“It was an isolated incident.”  The last time I heard this phrase, I was seated at a conference room table in a client hangar.  But, as usual, it didn’t mean what those who used it thought it did.

Fortunately, the event in question had not resulted in personal injury or damage to the aircraft. In fact, at the time, the passengers were probably unaware that there had been an “incident” at all. But within two weeks it had led to the dismissal of the director of aviation and the chief pilot, created ill will among the members of the flight staff, and left everyone so sensitive to criticism that merely mentioning the event prompted the dismissive remark that it was simply “an isolated incident.” The group, we were assured, was conducting all of its operations according to the book, and its heightened attention to safety and security made further references to the event unnecessary.

The truth of the situation was quite different. When senior management asked us to assess the organization and its performance, we began by conducting half-hour interviews with each member of the department—the flight staff, maintenance technicians, schedulers, and other ground personnel. We also walked the hangar and made notes about safety equipment, cleanliness, and security. Our observations and interviews clearly indicated that, while the group’s intentions were quite good, their actual performance and risk management fell significantly below what the aviation department’s members believed them to be. In most respects, the group’s operations met standard practices at best, with only a few areas in which they were adhering to best practices or best practices plus (the standard we recommend and the one most corporate executives expect).[1]

We also observed a number of disturbing things about the culture of the organization. Like most aviation professionals, they were convivial and energetic, results-oriented, and completely at home in highly structured environments. But they were also conflict averse, so much so that, until it became absolutely necessary to address operational challenges or areas of disagreement, they would go out of their way to preserve a collegial, amiable environment in the department—an environment that depended heavily on their unexamined faith that most members of the group were doing a good job, agreed on fundamental principles for operating the aircraft safely, and had each other’s best interests at heart. But in some cases, a profile like theirs can mean trouble.[2]

For one thing, confidence in the excellence of a group’s work and in the good intentions of its members does not guarantee high performance. When we asked the individual members of the client group to rate the department’s performance on safety, customer service, and financial effectiveness, they consistently rated themselves between 4 and 5, the two highest ratings—not just on safety but on service and efficiency as well. But it was apparent to us, even from a preliminary survey, that ratings of 2 to 3 would have been more realistic and that a full audit would almost certainly support the lower ratings. It was clear there was nothing isolated about the event in question.

If you are fortunate enough to own a summer cottage, you may have made a late-night trip to the kitchen for a snack, turned on the light, and seen a cockroach run for cover. You could easily dismiss the event as an “isolated incident.” But, if you are being honest with yourself, you probably know that when you see one cockroach there are many others beneath the sink, in the woodwork, and in other places that you can’t see and don’t want to look. Isolated incidents in aviation are much the same—except, of course, that cockroaches at a summer cottage are merely mortifying; in aviation, the consequences can be literally mortal.

When you see a cockroach in your kitchen, best practices call for an apology to your guests and an appointment with the exterminator. In aviation, the best practice is to call in someone who can help the members of the organization see what they don’t know or don’t want to admit, because denial (defining an incident as “isolated”) should not be an option. The “isolated incident” is always a warning and an invitation to improvement. Here’s why.

Over the past two decades, organizational scholars have adopted a strikingly different approach to the study of organizational knowledge, one with important implications for business aviation.[3] As they see it, documentation of rules and practices, such as regulations, flight operations manuals, and checklists, is only one type of organizational knowledge—and perhaps not the one that has the most impact on the way tasks are actually performed. In reality, they say, the organization’s explicit or recorded knowledge—the set of regulations, practices, and policies that most departments follow—is merely the written form of the agreements, some explicit and some implicit, that the members of the department have made about the operation of the aircraft and the tasks it involves.

To be clear: the scholars agree that rules and procedures, industry regulations, and best practices are important, indeed a critical element in effective risk management. In most cases, they reflect the organization’s best knowledge about professional aviation—including the things that can go wrong, either with mechanical equipment or in the way people deploy it. Most aviation professionals know these rules and procedures well and do a good job of observing them. But in doing so, even the best and most professional of them mistakenly assume that safety and risk management are simply a matter of faithfully obeying FAA regulations, following the flight operations manual, completing checklists, and so on. They also assume that the written rules accurately and fully represent the way tasks are (or should be) performed and the way decisions are (or should be) made.

But the organizational theorist Haridimos Tsoukas contends that there is more to performing even the most routine activity than simply following the written rules. In fact, he would say that (1) in making a decision of any kind, the members of the organization are, to some extent, always interpreting written rules and guidelines—consciously or subconsciously deciding which rule applies to the current activity, the limits of its application, and so forth; and that (2) such decisions and interpretations always bring into play many pieces of information and many considerations that have not been recorded as organizational knowledge, for which there are no written rules, and which may vary considerably from one member of the group to another, one trip to another, and one aviation department to another.

For an easy example, we might look to stick and rudder skills, which in their own way vary as much from one aviator to the next as the “fist” of telegraph operators once did. In that case, even when the operators sent the same messages, using exactly the same code symbols and hardware, it was always possible for other operators to recognize the individual by the specific way he or she operated the equipment. The same is true for pilots, even when they follow the rules for climb rates, approaches and hundreds of other everyday tasks in the cockpit.

But Tsoukas’ observations go well beyond stick and rudder skills. “Any practice,” he reminds us, “is always richer than [the] formal representation of it” (p. 105). In other words, regardless of how detailed and comprehensive flight operations manuals and other documentation may be, individual members of an organization always understand and follow rules, procedures, and other generalized instructions only “in practice”—i.e., by applying them in specific, concrete circumstances. As a result, the performance of even the most routine activity is always “an interpretive act,” a situational application of established rules conditioned by the past history of the individual who carries it out; by formal organizational processes and procedures; and by a host of unforeseeable and therefore undetermined circumstances that make application of the rules an art rather than a science.

For obvious reasons, this observation disturbs most aviation professionals. Because they prefer highly structured, matter-of-fact environments, they believe that a good set of rules, operations manuals, checklists, etc. is the only reliable way to define the roles and activities for which they are responsible. For the same reason, the idea that the detailed, written processes and procedures in the FOM, MEL, SMS and other documents are open to interpretation can strike them as wrong-headed. After all, the whole point of standard practices, rules and regulations is to regularize practice and avoid individual interpretation.

Tsoukas agrees that idiosyncratic or uninformed interpretation can be disruptive and dangerous. But he also maintains that, even with the most detailed and exhaustive rules and process descriptions in place, putting them into practice always requires interpretation. As he sees it, the only possible choice is between social interpretation based on common agreements, values and goals (what he calls an “organizational narrative”) and unanchored individual interpretation. A completely objective interpretation and application of the rules by the person performing the action is impossible.  In short, even a perfectly written set of guidelines will not assure safe outcomes.

The scholars’ observations have important implications for our understanding of the role of safety rules and regulations. First and foremost, they suggest that the safety net of rules and regulations is not actually a safety net. It is merely a first line of defense that is most effective when it is least needed, a porous latticework of organizational knowledge stretched across a chasm of as yet unidentified risks and threats. When everything goes according to plan, the safety net protects well enough against known risks and produces acceptable outcomes.  But, like any net, it is full of holes, any one of which could provide the opening for an unanticipated and potentially dangerous “incident” to occur. At its very best, Tsoukas would say, the safety net is a work in progress, an incomplete fabric of interrelated individual and organizational commitments to the safe and effective practice of aviation.

Needless to say, the client group in question was not familiar with any of these views. They believed that because they had a detailed FOM, had filled all the roles in the organization chart with experienced professionals, put the usual safety procedures and checklists in place, and were working hard to meet their responsibilities, they must be performing at or above the best standards for business aviation.  But they had failed to take into account several factors, the import and the force of which would have been much clearer to them with the advantage of Tsoukas’ insights:

(1) With department growth, it had become necessary to recruit a number of new pilots over a short period of time. As a result, many members of the group were new to the organization. The way they applied and interpreted its rules, systems, and procedures—even the way they understood their roles in the organization—varied significantly from one person to another, sometimes unconsciously so.

(2) With the increased workload and the addition of new aircraft, the pressure on the department was far higher than it had ever been. As a consequence, there was little time to identify and fill in the gaps in the safety net, to discuss and agree on the alignment of the department with corporate goals, or even to address the logistical and personal issues imposed by tight schedules.

As a result, on the day of the “isolated incident,” at the end of a long day of flying, in bad weather and with a contract pilot in the right-hand seat, one of the gaps in the safety net made itself apparent in a way that was disturbing enough for at least one member of the flight crew to report it to the chief pilot. But, astonishingly, there was no formal discussion of the event, either by the crew in its immediate aftermath or subsequently by the department as a whole. Instead, everyone turned his or her back on it. And when it was no longer possible to ignore what had happened, the group began to look instead for a scapegoat. They assumed that since there was nothing wrong with the department’s standards of operation, the fault must lie with an individual: incompetence, a “personality problem,” or a willful departure from best practices.

What, then, should the client have done? First, of course, the crew ought to have discussed the event immediately.  If they had, they might have seen that it represented an implicit challenge to their assumption that the organization was operating at the level of best practices. They might also have seen—as was apparent from our interviews—that it arose not from individual incompetence but from a failure on the part of everyone in the cockpit to identify the risk factors inherent in the situation and to agree in advance on the best way to deal with them. Instead, they depended on a menagerie of incompatible assumptions and individual interpretations to carry the day.

Given enough time and enough help from outside sources, they might also have avoided the impulse that led them to turn away from the event without discussion, to see it as an “isolated incident” and nothing more. They apparently perceived it, consciously or unconsciously, as a threat to their professional competence, the continuity of their personal and professional relationships, the aircraft, and their passengers—and thus to the status quo. But scholars would define the event in terms that are at once richer and more provocative: as an invitation to deepen and broaden the organization’s practical knowledge and its commitment to a common interpretation and application of existing rules and procedures, and, beyond that, to strengthen the underlying fabric of values, judgments, and agreements that defined the organization’s culture.

Tsoukas and other scholars call that fabric an organizational “world view” or “organizational narrative.” What that means in plain language is that our actions, our goals, our decisions and our relationships with others are never independent threads. Consciously or unconsciously, we weave them together so that all of our interpretive acts are united, not merely by their locations in a series of processes and procedures but by a common understanding that serves as the ultimate reference point not only for corporate behavior in general but for operational safety, effectiveness and customer service in particular.

The real moral of the client’s story was that without relentless attention to the organizational culture—especially as so much changed around them—even the best documentation would have been powerless to prevent the series of events that led to our visit. By definition, safety and procedural documentation is a skeletonized record of the past, of what has already happened, been known, and been decided. A strong organizational culture, by contrast, urges us to look at how past interpretations relate to the here and now and to the future—to what might be. It enables us to see how all events reflect each other and everything else in the organization. And, finally, it enables us to anticipate what might come next through open discussion, mutual interpretive effort, and reliance on an extended community that spreads well beyond the department. For that reason, organizational culture is fundamental to the achievement of safety, high-quality customer service, and world-class financial performance. It is the lifeblood of all three and the single most important determinant of the character of the organization and its performance.

 

 

Jerry Dibble, MBA, Ph.D.

Dr. Jerry Dibble has over 20 years of management consulting experience with technology-driven corporations, professional services firms, and manufacturers of consumer products.  He holds a Ph.D. from Stanford University, an MBA from Georgia State University and a degree in electrical engineering from Purdue University.  Often cited in business periodicals, including Fortune and The Wall Street Journal, his work has won multiple awards, including the prestigious Communicator of the Year award from the Atlanta chapter of the International Association of Business Communicators.

Jerry’s work history includes five years as a product manager with a global industrial equipment firm; eight years as Director of Business Communication Programs and Associate Professor of Business Administration at Georgia State University; and two years as a vice president with a nationally recognized change management consulting firm. South-Western Publishing has published Jerry’s book, Communication Skills and Strategies: Guidelines for Managers at Work, as well as several monographs on leadership and management.

 



[1] For more on the way The VanAllen Group defines the various levels of performance and its rationale for doing so, see Pete Agur’s article, “Your Dog Is Ugly” (http://www.vanallen.com/index.php/your-dog-is-ugly/).

[2] Here and elsewhere, comments on the group’s profile are based on psychometric data from the Birkman Method®, which VanAllen uses in connection with many of its audits, search assignments, and new leader assimilation programs. As a result, it now has one of the largest Birkman databases in the world on aviation professionals, broken out by roles within the department and further segmented by the size and mission of the operation. That database allows us to work from a baseline of expectations about group and individual behaviors, spot distinguishing features of individual departments, and use that knowledge to promote change and improve the quality of conversation and decision-making in the group.

[3] See, for example, H. Tsoukas, Complex Knowledge: Studies in Organizational Epistemology (2005), especially Chapters 3 through 6, which summarize and interpret much of the current scholarship.

