Final Report
Sensemaking Symposium
23-25 October 2001
Command and Control Research Program
Office of the Assistant Secretary of Defense
for Command, Control, Communications and Intelligence
Dennis K. Leedom, Ph.D.
Evidence Based Research, Inc.
Acknowledgments
First of all, appreciation is expressed to Dr. David Alberts for his continued interest in furthering our understanding of sensemaking in the military and his foresight in sponsoring this symposium. Second, appreciation is expressed to the various presenters who offered their expertise and insight into the different aspects of sensemaking. In order of presentation, these individuals included Dr. Richard Hayes, Dr. John Nosek, Dr. Louise Comfort, Dr. Karlene Roberts, Dennis Wisnosky, Dr. Mariann Jelinek, Dr. Karen Carr, and Barry McGuinness. Finally, appreciation is expressed to Cara Christie, William Wood, Kristina Thompson, Telvyn Murphy, Eric Cochrane, Stacey Lakind, and Stevana Allman for their assistance in data analysis and note taking during the symposium.
Executive Summary
The present report summarizes the insights and recommendations of a symposium on sensemaking, sponsored by the Command and Control Research Program (CCRP) of the Assistant Secretary of Defense for Command, Control, Communications, and Intelligence, and held in Vienna, Virginia, on 23-25 October 2001. The goal of this meeting was to bring together knowledgeable researchers and practitioners from industry, academia, and government to cross-fertilize the best sensemaking ideas and practices. Such a gathering served ASD(C3I) in two ways. First, the meeting provided an opportunity to build upon (or refine) the conceptual framework derived from earlier work by the Information Superiority Metrics Working Group (ISMWG) and the "Foundations" Workshop on Sensemaking organized in conjunction with the Committee for Information and C2 Systems of the American Institute of Aeronautics and Astronautics (AIAA). Second, the meeting helped develop a better understanding of how different fields of research might contribute to a number of applied areas unique to the military. These areas include intelligence analysis, joint and coalition peace-keeping and peace enforcement operations, counter-terrorist operations, joint military operations, and homeland defense operations.
During the symposium, participants reviewed insights and findings from a number of research perspectives, including work focused on individual situation awareness, organizational sensemaking in ambiguous or uncertain environments, decisionmaking in high-reliability organizations during crisis, and computer-supported collaborative work.
From the ideas presented, it is clear that a number of research areas offer useful insight and guidance for improving sensemaking processes within a number of applied areas unique to the military, including joint and coalition operations across the spectrum of conflict. At the same time, the parochial and disconnected nature of much of this work presents a challenge for applying it to military interests. Accordingly, the third day of the symposium engaged the participants in working group discussions focused on identifying basic and applied research issues, along with highlighting the need for a framework for integrating these various research perspectives. Elements of this multidisciplinary framework include (1) a common taxonomy of sensemaking constructs and theories; (2) linkages among multiple levels of analysis; (3) a code of best practice for observation, measurement, experimentation, and modeling; (4) a guide for the application of sensemaking concepts and models to information technology development, training development, and organizational design; (5) standard operational scenarios that facilitate construction of a consistent body of knowledge and meta-analysis; and (6) a repository of empirical findings that motivate and guide research, experimentation, and mathematical modeling and simulation.
Introduction
A knowledge management workshop sponsored on 6-8 March 2001 by the Command and Control Research Program (CCRP) of the Assistant Secretary of Defense for Command, Control, Communications, and Intelligence (ASD(C3I)) identified sensemaking as an essential cognitive element of the military decisionmaking process (MDMP). As shown in Figure 1, participants of this earlier workshop viewed sensemaking as occurring within the cognitive domain while linking other critical MDMP elements across the information and physical domains of command and control.
Figure 1. Sensemaking Conceptual Framework
One of the most urgent recommendations from this earlier workshop was that a symposium be convened to bring together knowledgeable researchers and practitioners from industry, academia, and government to cross-fertilize the best sensemaking ideas and practices. Such a gathering would provide ASD(C3I) with an opportunity to (1) build upon (or refine) the conceptual framework and (2) develop a better understanding of how different fields of research might contribute to the enhancement of sensemaking in military command and control. Specific goals of this symposium included:
The present report summarizes the insights and recommendations of this symposium, held in Vienna, Virginia, on 23-25 October 2001. Appendix A provides a list of the participants. Please visit www.dodccrp.org and choose CCRP Events to view the individual presentations from the plenary sessions and brief outs from the working groups.
The Military Need for Sensemaking Research
While military commanders and their staffs have always engaged in "making sense" of their mission, their adversary, their own forces, and their operational environment, there exists a heightened requirement for addressing this process and its contribution to effective command and control. This requirement stems from the confluence of three trends in the current transformation of U.S. military forces.
First, there exists the need for current and future military forces to conduct a broader and more complex spectrum of operations, as compared with a decade ago (Wentz, 2001). Traditional wars between nation-states on conventional battlefields have given way to emergent and asymmetric threats involving cultural factions and transnational actors. Military operations are increasingly compromised by conflicting political, diplomatic, and legal issues. Decisions must be made in real time with simultaneous tactical, operational, and strategic implications. Today’s battlespace is often populated by large numbers of international and non-governmental organizations. And finally, alliance partners do not always share a common understanding of intent and strategy.
Second, in response to these more demanding requirements, U.S. military forces are beginning to employ new, more appropriate operational concepts and command approaches. Effects-based operations (EBO) are replacing historic models of attrition warfare. EBO are operations that synchronize and focus both lethal and non-lethal means against an adversary’s centers of gravity in order to defeat his will to continue. But, compared with past models of combat, planning and executing EBO require a deeper level of sensemaking across multiple functional areas and technical disciplines. Achieving this goal, in turn, requires a better understanding of the sensemaking practices and dynamics associated with joint and coalition operations.
Third, U.S. military forces are paralleling the information revolution in the commercial sector by adopting network centric warfare concepts. Network centric warfare represents a fundamental shift from platform-centric warfare and is defined as "an information superiority-enabled concept of operations that generates increased combat power by networking sensors, decisionmakers, and shooters to achieve shared awareness, increased speed of command, higher tempo of operations, greater lethality, increased survivability, and a degree of self-synchronization" (Alberts, Garstka, & Stein, 1999). Yet, as noted in the commercial sector, the potential for improving organizational decisionmaking through the introduction of information technology remains largely unrealized without (1) adequate understanding of the sociotechnical issues and (2) deliberate reengineering of the cognitive and information domains within the organization (NRC, 1999; Brynjolfsson & Hitt, 1998). Thus, the adoption of network centric warfare concepts and practices by the U.S. military will require a deeper understanding of (1) how sensemaking occurs at both the individual and organizational levels within a command and control system and (2) how sensemaking is shaped by information technology, battle staff training, and organizational design.
Historical Examples of Sensemaking Success and Failure
To gain insight into the role sensemaking plays in military success or failure, Evidence Based Research, Inc., conducted a retrospective analysis of 30 historical military battles, insurgencies, terrorist attacks, and other combat incidents involving 149 specific decision events. Of these events, 60 decisions were associated with military success while 89 were associated with military failure. Using the sensemaking framework developed during the earlier workshop, analysts subjectively assessed nine sets of factors believed to influence sensemaking during these decision events:
An analysis of these cases revealed that the information systems available to the decisionmakers generally tended to perform adequately. That is, the right data were collected and put together appropriately, decisions and rationale were shared, and information was put together in a form that facilitated awareness. However, prior knowledge was relatively less influential than emotions, beliefs, cognitive factors, and mental models--all components of sensemaking. A comparison of successful versus unsuccessful decision events across these cases also reinforces this focus on sensemaking. In this comparison, analysts computed the mean performance rating for each factor across all successful military cases, and subtracted the corresponding mean rating computed across the unsuccessful cases. As shown in Figure 2, the factor mean differences most strongly distinguishing successful versus unsuccessful military decision events in these cases reflect a number of sensemaking elements.
Figure 2. Factors Distinguishing Successful Versus Unsuccessful Military Decision Events
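To make the comparison concrete, the following sketch illustrates the factor-difference computation described above; the factor names, rating values, and data layout are hypothetical stand-ins for illustration only, not the actual case data analyzed by Evidence Based Research.

```python
# Hypothetical sketch of the factor-difference comparison described above.
# Ratings, factor names, and data layout are illustrative only.
from collections import defaultdict
from statistics import mean

# Each decision event: an outcome ("success"/"failure") plus subjective
# performance ratings (e.g., a 1-7 scale) for each assessed factor.
events = [
    {"outcome": "success", "ratings": {"mental_models": 6, "info_systems": 5, "emotions": 5}},
    {"outcome": "failure", "ratings": {"mental_models": 3, "info_systems": 5, "emotions": 2}},
    {"outcome": "success", "ratings": {"mental_models": 5, "info_systems": 4, "emotions": 6}},
    {"outcome": "failure", "ratings": {"mental_models": 2, "info_systems": 5, "emotions": 3}},
]

# Group the ratings by factor and outcome.
by_factor = defaultdict(lambda: {"success": [], "failure": []})
for event in events:
    for factor, rating in event["ratings"].items():
        by_factor[factor][event["outcome"]].append(rating)

# Mean rating over successful events minus mean rating over unsuccessful
# events; larger differences mark the factors that most strongly
# distinguish the two groups (the quantity plotted in Figure 2).
for factor, groups in by_factor.items():
    diff = mean(groups["success"]) - mean(groups["failure"])
    print(f"{factor}: {diff:+.2f}")
```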
While findings from these cases are preliminary and warrant further analysis, they support the argument that DoD research on command and control should place increased focus on sensemaking (what information means vis-à-vis human goals, expertise, culture, emotions, and biases)--in addition to its traditional focus on information technology (what information can be collected, stored, and displayed).
How Sensemaking Is Addressed in Research
A primary goal of the symposium was to bring together knowledgeable practitioners from different fields of research who might contribute to an understanding of sensemaking in a military command and control context. Planners of this symposium anticipated that each research area would offer a different perspective on sensemaking and its relationship to other topics of current interest within the CCRP community--e.g., recognition-primed decisionmaking, collaboration, situation awareness. In this regard, the symposium was successful. The invited speakers were able to examine sensemaking from a variety of different research perspectives. At the same time, these presentations illustrated the notion that each research community has often (1) attached its own unique definition to commonly used terms and/or (2) introduced specialized constructs or models that are not widely understood by researchers in other fields. The need for reconciling these different areas of research (at least as they are applied to military command and control) became apparent among participants during the final day of the symposium.
While the individual presentations are available at www.dodccrp.org, it is useful to summarize the different research perspectives that were represented in this symposium. Because there is already considerable overlap and crosstalk among these different perspectives, they do not represent independent areas of research. However, it is possible to highlight important differences among these research perspectives regarding their scope, focus, motivation, and applicability to military command and control. Each area of research could make a useful contribution to our understanding of sensemaking and how it might be enhanced in a complex, ambiguous, and high time-stress military decisionmaking environment. The following discussion highlights key aspects of each research perspective and then summarizes its major strengths and limitations with respect to the military’s need for sensemaking research.
Individual Sensemaking: A Focus on Situation Awareness
The first research perspective addresses individual sensemaking and is historically motivated by the desire to understand (1) situation awareness and its impact on individual operator or aviator performance, (2) shared situation awareness in a common task environment, and (3) how situation awareness can be enhanced through advanced information displays. Much of the research literature associated with this perspective falls within the field of cognitive psychology and references the seminal work of Mica Endsley (1995), who developed the model of situation awareness most often cited by others. This model posits that situation awareness is "the perception of the elements in the environment within a volume of space and time, the comprehension of their meaning, the projection of their status into the near future, and the prediction of how various actions will affect the fulfillment of one’s goals." This model provides a fairly concise definition of situation awareness; however, it is one that has been applied largely to task situations of a prescribed nature--i.e., task domains where operator goals are clearly defined and behavior is governed by procedural rules. As such, much of the research emphasis has been placed on how individual system operators interpret cues and warnings vis-à-vis stored patterns of experience and training. Finally, this definition of situation awareness subsumes an aspect of sensemaking, albeit in the limited sense that comprehension is assumed to be a passive process of pattern discovery rather than a more active process of pattern creation.
A related definition of situation awareness has been offered by Barry McGuinness, one of the presenters at the symposium. McGuinness begins his description of situation awareness with a quote from a pilot (Figure 3): "it’s knowing what’s going on so you can figure out what to do" (Perla et al., 2000). Basically, McGuinness builds upon Endsley’s steps of perception, comprehension, and projection by adding the additional steps of intention (understanding your options and courses of action relative to your goals) and metacognition (accounting for how reliable your situation awareness is likely to be).
Figure 3. Situation Awareness
Within this research perspective, situation awareness is distinguished from knowledge and sensemaking in the following manner:
Knowledge is defined as the capacity for action (doing, saying, thinking).
Situation awareness is defined as dynamic "situated" knowledge, or the capacity to act effectively in a given specific situation.
Sensemaking is defined as the process of creating situation awareness in situations of uncertainty.
Because much of this applied psychological research is grounded within the context of systems engineering and human factors, there exists a strong desire for concepts and performance to be measurable and for theories to be testable. Accordingly, sensemaking and situation awareness are viewed as working concepts that enable us to investigate and improve the interaction between man and information technology. Within this perspective, it is recognized that humans play a significant role in adapting and responding to unexpected or unknown situations, as well as recognized situations. Accordingly, as we move from concepts, metrics, and analysis to testable theories, we need to attribute the relative contributions of both humans and information technology in our models of system performance. It is further recognized that clear metrics do not currently exist for understanding how humans "make sense" and identifying what conditions facilitate "good" performance. Given the state of the art in studying this aspect of human-computer interaction, it is noted that research must build from qualitative description toward quantitative prediction of performance, using a range of investigation methods:
Summarizing this area with respect to the military’s needs, strengths of this research perspective include the following:
At the same time, this research perspective exhibits a number of limitations vis-à-vis its application to military command and control.
Organizational Sensemaking: How Sense Is Made in Ambiguous or Uncertain Environments
The second research perspective addresses sensemaking at the organizational level and is historically motivated by the desire to understand:
Unlike the previously discussed research area, which is largely dominated by cognitive psychologists and human factors engineers, a multitude of different research disciplines contribute to this second perspective: organizational psychology, sociology, management science, social anthropology, to name but a few. Much of the descriptive research in this field draws from the work of Karl Weick, who provided a comprehensive discussion of the social dynamics within an organization that lead to the creation of situational understanding and direction (Weick, 1995). In this work, Weick begins with a multitude of definitions applied to sensemaking in the social science literature and then proceeds to develop a number of basic properties of this process. These basic properties serve as a useful framework for sensemaking research as it is applied to military command and control:
Grounded in identity construction. Making sense of the environment influences, and is influenced by, one’s self-concept and personal identity. For example, how a military force defines its own mission and capabilities will, in turn, shape what kind of sense it makes out of its operational battlespace.
Retrospective. Making sense of the present is always grounded in past experience, including past decisions to adopt certain plans and goals. For example, how a military force makes sense of the battlespace will be shaped by both its training and experience, as well as by commander intent.
Enactive of sensible environment. Making sense involves the construction of reality by assigning authority to events and cues vis-à-vis a specific context, activity, or ontology. For example, instead of passively attempting to discover all truth about the battlespace, a commander will focus only on those aspects of the battlefield that fit certain mental models or stories derived from experience.
Social. Making sense involves the creation of shared meaning and shared experience that guides organizational decisionmaking. For example, military command and control involves many decisionmakers at various levels who must be led with a single vision and equipped with a common understanding of (1) what the situation is and (2) what they are attempting to accomplish.
Ongoing. Making sense is a continual process of refining understanding, taking action, and restoring equilibrium within the context of a specific project. For example, military command and control involves a continual process of monitoring, goal-directed planning, and execution against a thinking adversary in a dynamic environment.
Focused on and by extracted cues. Sensemaking involves the process of people noticing and extracting specific cues from the environment and then contextually interpreting those cues according to certain held beliefs, mental models, rules, procedures, stories, and so forth. For example, particularly when facing time stress and situational ambiguity, experienced commanders will intuitively focus on key events and decisions that can turn the course of the operation.
Driven by plausibility rather than accuracy. Sensemaking is driven by the need for a workable level of understanding that guides action, rather than by a search for universal truth. For example, when faced with situation uncertainty or information overload, commanders will simplify their information needs in order to make plausible but timely decisions that maintain momentum and advantage over an adversary.
In addition to outlining these properties of the sensemaking process, Weick also identifies a number of important ways in which organizations tacitly codify past knowledge and experience. These "minimal sensible structures," when combined with organizational goals, provide the contextual basis for interpreting the current situation and directing action:
Ideology. Shared, relatively coherent, emotionally charged beliefs, values, norms, cause-effect relationships, preferences for certain outcomes, and expectations that bind the organization together. They provide ready-made interpretation structures for supporting the belief side of sensemaking.
Third-Order Controls. Unspoken organizational premises (jargon, patterns of uncertainty absorption, unique communication channels, informal procedures, and personnel selection criteria) that shape the flow/content of information, constrain the search for options, focus the definition of risk, and constrain expectations. They act to delimit the belief side of sensemaking.
Paradigms. Internally consistent sets of simplifying heuristics about important objects in the world, how these objects act, how they relate to one another, and how they come to be known. They serve as alternative realities for linking belief and action.
Theories of Action. Organization-level cognitive structures that filter and interpret environmental signals as triggers for organizational response. They link perception to shaping action.
Tradition. Symbolic mental structures (patterns of action, patterns of means-ends behavior, organizational structures) that facilitate a practical, can-do, action-oriented stance toward the world. They provide the ready-made formulas for action.
Stories. Narrative structures that represent filtered, ordered, and affected accounts of experience based on a "beginning-middle-end" story sequence. They are used to guide action under conditions of crisis, complexity, and time pressure.
Building upon the work of Weick, Michael Zack (1999) defines different types of organizational ignorance as uncertainty (insufficient information or lack of confidence in the information), complexity (more information than can be processed or understood), ambiguity (lack of a conceptual framework for interpreting information), and equivocality (several competing or contradictory conceptual frameworks). As seen in Figure 4, having insufficient information or lacking an interpretive framework will lead an organization to engage in acquisitive processing--e.g., acquire additional information, use existing knowledge to infer or fill in missing data items, or confer among experts to suggest an interpretive framework or paradigm. By contrast, having too much information or too many competing explanations will lead an organization to engage in restrictive processing--e.g., establish information filtering criteria, decompose the problem space and delegate decisionmaking responsibility, or discuss/negotiate among experts to agree upon the best interpretation. Applying this model to military command and control, one notes that acquisitive processing has been the traditional focus of information technology development, while restrictive processing remains principally the responsibility of human decisionmakers.
Figure 4. Types of Situational Ignorance
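A minimal sketch of Zack's typology as summarized above follows; the acquisitive/restrictive mapping mirrors the text and Figure 4, while the function name and the wording of the responses are illustrative assumptions.

```python
# Sketch of Zack's (1999) typology as summarized above: ignorance stemming from
# too little information or no interpretive framework calls for acquisitive
# processing; too much information or too many competing frameworks calls for
# restrictive processing. Category names follow the text; everything else is
# an illustrative assumption.

ACQUISITIVE = {
    "uncertainty",   # insufficient information, or low confidence in it
    "ambiguity",     # no conceptual framework for interpreting the information
}
RESTRICTIVE = {
    "complexity",    # more information than can be processed or understood
    "equivocality",  # several competing or contradictory frameworks
}

def processing_mode(ignorance_type: str) -> str:
    """Return the broad processing response suggested for a type of ignorance."""
    if ignorance_type in ACQUISITIVE:
        return "acquisitive: gather more data, infer missing items, confer on a framework"
    if ignorance_type in RESTRICTIVE:
        return "restrictive: filter, decompose and delegate, negotiate a best interpretation"
    raise ValueError(f"unknown ignorance type: {ignorance_type}")

print(processing_mode("equivocality"))
```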
Organizational sensemaking is also closely related to the field of research on naturalistic decisionmaking--research motivated by dissatisfaction with traditional choice-theoretic models of human decisionmaking. Following the basic description of naturalistic decisionmaking developed by Gary Klein (1993), researchers have refined our understanding of how expert practitioners make extensive use of recognition decision models in high time-stress situations. Over the past decade, Klein’s model of recognition-primed decisionmaking has been accepted by many researchers as the basic paradigm for military command and control (cf., Kaempf, Klein, Thordsen & Wolf, 1996; Pascual & Henderson, 1997; Klein, 1997). However, it should be noted that this model applies primarily to situations in which expert decisionmakers have relevant experience for sensing and interpreting a dynamic situation.
But the more difficult and increasingly common case involves military situations where decisionmakers face a novel or unknown problem environment. In this case, we must begin to examine how decisionmakers cope with uncertainty. Based on a study involving students at the Israel Defense Forces Command and General Staff College, Lipshitz and Strauss (1997) identified a range of tactics employed by military officers in coping with situational uncertainty. These tactics included reduction of uncertainty (e.g., collecting additional information, seeking advice/support, applying doctrine), situation shaping or risk management (e.g., employing preemptive tactics, avoiding irreversible actions), and uncertainty suppression (e.g., ignoring uncertainty, taking action based on intuition, gambling).
Other studies of tactical military decisionmaking suggest that a variety of sensemaking models can be observed to operate under different time-stress and uncertainty conditions (Adelman, Leedom, Murphy & Killam, 1998). As shown in Figure 5, commanders are seen to engage in a constant cycle of monitoring and adjustment during the execution of a military operation. The identification of an anomaly or deviation from the approved plan will trigger a decision event. If sufficient time is available, the commander will engage his support staff in the formal development and examination of alternative courses of action--a deliberate, choice-theoretic type of sensemaking process. However, under conditions of high time-stress, the commander will employ a more abbreviated sensemaking strategy. If the commander’s experience allows him to recognize and interpret the evolving situation, he will likely engage in a recognition-primed decision process--a process typically limited to the commander and a few close advisors. But, if the situation remains unclear, then it is likely that the commander will attempt to manage risk and uncertainty by employing one of the coping tactics identified by Lipshitz and Strauss.
Figure 5. Sensemaking Strategies Employed by Military Commanders
Studies of organizational sensemaking also reveal that decisionmakers at various levels within an organization play different sensemaking roles. In a reprint of a classic Harvard Business Review article, John Kotter (1999) notes that effective general managers primarily contribute to organizational sensemaking by (1) setting agendas and (2) building and monitoring the organizational networks that successfully implement these agendas. Extending this notion to military command and control, one can observe different levels of a battle staff focused on different aspects of sensemaking in a military operation:
Commander. Maintains high-level situation awareness vis-à-vis commander’s intent in order to focus staff attention and adjust operational priorities
Principal Staff Advisors. Monitor and interpret specific areas of functional responsibility in order to improvise operations by adapting the unit’s capabilities to situation demands
Supporting Staff Sections. Collect and integrate specific information in accordance with staff procedures in order to develop situation displays and detailed action plans and orders
Summarizing this area with respect to the military’s needs, strengths of this research perspective include the following:
Despite these strengths, however, this field of research exhibits a number of limitations that must be overcome if it is to significantly contribute to the systematic improvement of command and control operations.
Organizational Sensemaking: How to Improve Decisionmaking During Crisis
There exists another research perspective focused on organizational decisionmaking that overlaps in some ways with the research just discussed. However, it differs in two respects: (1) it is more prescriptive in nature and (2) it is focused more specifically on high-reliability organizations--organizations that must continue to function during periods of crisis and in the face of situation uncertainty and complexity. Specifically, much of this research examines the social and structural causes of organizational breakdown and seeks to identify ways in which organizations can adapt. As a result, many of the studies found in this area focus on crisis events (e.g., earthquakes, terrorist attacks) or events in which a string of behaviors or decisionmaking errors led to a catastrophic outcome (e.g., shoot-down of UH-60 helicopters, World Trade Center attack). Like the perspective just discussed, this body of work reflects a multidisciplinary approach--e.g., organizational psychology, social anthropology, management science, military history.
As summarized during the symposium by Mariann Jelinek, traditional organizations are designed to produce stable, predictable performance by eliminating ambiguity and unauthorized behavior. Such organizations use task decomposition and specialization to narrow participant focus--a practice common in most command and control organizations. Emphasis is placed on control and managerial intent, while other cognitive resources are generally ignored. As a result, thought and attention tend to be limited to the managers of such organizations--with the result being consistency and rigidity of thinking. At the same time, these organizations do not respond well to crisis and ambiguity. If such organizations are to successfully adapt, they must organize for innovation by (1) emphasizing organizational change and learning, (2) facilitating shared cognition, and (3) embracing ambiguity as opportunity. Addressing cognition and decisionmaking from an overall systems perspective, Jelinek has identified three cognitive elements that contribute to the ability of organizations to respond effectively to crisis and ambiguity (Jelinek & Litterer, 1995). These elements include:
Shared management. Everyone in the organization (down to the lowest levels) is responsible for overall system performance
Mindful alertness to anomalies. Because data takes on meaning only in context, subordinates should be alert to patterns, anomalies, and change and push this information upward in the organization
Ambiguity absorption. Organizational design should attend to who deals with ambiguity in the organization, how data is matched up with those who provide context and interpretation, what attentional resources exist within the organization, and where shared interpretation is needed
Jelinek emphasizes data-based organizations that focus on real causes and real results, that emphasize learning and improvement, that facilitate information sharing in order to empower all participants, and that require decisionmakers to "listen down" to subordinates who have more direct access to situation awareness of the environment. Shared interpretation and attention to patterns/cues of change are valued over execution of mindless processes that treat anomalies as environmental noise. On the other hand, such high-performing organizations tend to be hard on people, leading to participant burnout and secondary effects such as high divorce rates. Because it is often easy to dismiss existing bureaucracies that work, organizations must strike a balance between stability and change.
A similar focus on organizational breakdown is seen in the work of Karlene Roberts, another presenter at the symposium. Roberts’s work draws from the sensemaking research of Karl Weick, the research of Gary Klein on recognition-primed decisionmaking, and the human error research of James Reason. Addressed within this body of research are several important issues regarding high-reliability organizations:
In her recent research on virtual organizations and incident command systems, Roberts and her colleagues found that a number of factors contribute to organizational reliability (Grabowski & Roberts, 1998; Bigley & Roberts, in press). These factors can be organized in terms of structuring mechanisms, cognition management mechanisms, and constrained improvisation. Structuring mechanisms refer to basic processes that can be employed to rapidly alter the formal structure of an organization during a crisis period. They include:
Structure elaboration: the process of building an organization on-site during a crisis to mitigate specific risks with whatever resources are available. This building process might persist until the crisis abates.
Role switching: the activation and deactivation of organizational roles, combined with the reassignment of personnel in accordance with the requirements of goals and plans.
Authority migration: the dynamic and temporary transfer of authority from fixed authority relationships in the organization to those individuals possessing relevant expertise in a crisis situation.
System resetting: the process of deliberately disengaging from the crisis environment when an initial strategy appears to be ineffective or inappropriate, so that the organization can be redirected or reconfigured in a more appropriate manner.
Cognition management mechanisms include techniques for manipulating the operational representation of the organization (including participant roles), the environment, the projected future, and their meaning in order to reshape the shared mental models that guide individual behaviors within the organization. These mechanisms include:
Developing: the process of creating a mental picture that "sizes up" the current situation in terms of which factors are significant, which elements of the situation can be categorized in certain meaningful ways, and what are the important cause-effect relationships. Developing provides the context in which objectives are formulated and roles activated.
Communicating: the process of transmitting an accurate and timely picture of the situation--typically upwards in the organization in order to facilitate smooth handoff of command and ensure that incoming resources efficiently coordinate their activities.
Shifting and nesting: the process of either off-loading (shifting) sensemaking responsibility to another individual (usually a lower-level participant with more direct situation awareness) or aggregating (nesting) the scope and detail of sensemaking responsibility in order to maintain perspective and coordination.
Constrained improvisation methods (similar to mission-type orders in military command and control) decentralize decisionmaking authority to subordinates in a way that permits adaptation and deviation from organizational rules and procedures while still accomplishing higher-level intent. Improvisation is considered legitimate and is supported to the extent that it fits with organizational goals and is not likely to cause harm. Improvisation can involve a number of different elements such as:
Improvisation of tools: the non-traditional or novel use of organizational tools and information systems in ways that effectively respond to the crisis with limited resources.
Improvisation of rules: a more radical departure from (or deliberate violation of) organizational rules when the crisis situation dictates a more creative response.
Improvisation of routines: an adjustment of standard operating procedures required in order to accommodate local circumstances.
Other research within this perspective is inspired by complexity theory, the notion that complex organizational behavior and interactions with the environment can arise out of relatively simple rules governing the behavior of the individual participants within the organization. Supporting evidence for adopting a complexity model can be found in a variety of organizational settings, including military command and control (cf., Waldrop, 1992; Lissack, 1996; Kirlik, Rothrock, Walker & Fisk, 1996; McMaster, 1996; Schmitt, 1997).
Of specific interest within this body of research are several concepts that appear to reflect or explain the behavior of socio-technical organizations such as commercial corporations, emergency management systems, and military command and control systems. The first concept is the notion of self-organizing systems, a phenomenon reflected in many natural systems (e.g., galaxies, chemical compounds, cells, organisms), but with particular application in organizations that must continually adapt to changing environments. The essence of self-organization is that system structure and order often appear without explicit pressure or influence from the outside world. Rather, this self-organization arises as a product of the system’s internal components and their behavioral properties. In terms of emergency management systems or military command and control systems, self-organization is seen to depend upon authority structures, information sharing, goal structures, and so forth.
The second concept refers to the notion that systems and organizations tend to operate in a region of complexity between two phase states: chaos and stability. As shown in Figure 6, small changes in component properties can push the organization into chaotic behavior and collapse or lock the organization into stable behavior and cohesiveness. Applying mathematical terminology such as percolation, attractors, homeostasis, and structural coupling, researchers within this field of study seek to develop both explanatory and predictive theories regarding the micro-level socio-technical factors that yield chaotic or stable behavior at the macro-level.
Figure 6. Zone of Operation in a Complex Adaptive System
Extending the arguments of Michael Lissack (1996), complexity theory offers insight regarding the relative contribution of information versus other forms of organizational infrastructure to organizational effectiveness. According to Lissack, investment in traditional infrastructure (defined here as organization structure, culture, hardware, training, and other fixed "assets") typically obeys the economic law of diminishing returns. That is, adding more and more structure, hardware, training, and so forth produces less and less incremental increase in organizational performance. By contrast, Lissack argues that investment in information potentially obeys a different law, one that is the inverse of diminishing returns. That is, there exists the potential for information to be leveraged in an expanding manner rather than a diminishing manner. This opportunity exists because information provides the organization with the means to effectively reorganize and perform in a more responsive manner to the environment. However, there are many obstacles and constraints within an organization that limit its ability to absorb and capitalize on new information. Thus, a critical question (which brings the discussion back to sensemaking) is how can organizations improve their ability to absorb and capitalize on new information in a changing environment?
One area of relevant work received extremely well by the symposium was that of Louise Comfort, who has focused on the performance of incident command systems and their response to contingency and crisis situations (cf., Comfort, 1994; Comfort et al., 1999). According to Comfort, organizational fragility is defined as the point at which the capacity for collective action collapses in a social environment. The attack on the World Trade Center illustrates the sensemaking collapse of the airport security systems, flight crews and passengers, and the first-responder rescue teams. In each case, participants faced unimaginable events, could not recognize the risks, and were unable to act in a cohesive manner to avert danger. More generally, disaster and crisis environments create difficult conditions for human decisionmakers and the socio-technical systems they interact with. Failure in one subsystem can trigger secondary and tertiary failures in other subsystems until the entire organizational structure collapses. Hence, the design of effective response systems requires a socio-technical approach. Implications for the new war on terrorism include:
Self-organization occurs in a complex adaptive system in the region where there is sufficient order to hold and exchange information, but where there is sufficient flexibility to adapt to a changing environment. Emerging systems respond to a perceived threat by initiating both (1) collective actions to achieve a set of stated goals and (2) innovative efforts to change the system’s status vis-à-vis the threat. Initial conditions required for adaptive system behavior in incident command systems include:
During the symposium, both Comfort and Jelinek pointed to a number of critical indicators that determine whether a socio-technical organization can produce and sustain adaptive behavior during a contingency or crisis situation. These factors include technical structure, organizational flexibility, and cultural openness (Comfort, 1999). As defined by Comfort, technical structure refers to the presence of both technical communication systems and organizational infrastructure designed to support the effective flow of information among participating actors and groups. For example, technical structure might include physical entities such as reliable communication networks and well-designed emergency operations centers. Technical structure might also include knowledge objects such as operational threat assessments, definitive response procedures, and supporting technical analyses.
Organizational flexibility refers to the existence of plans, authorities, interdisciplinary and inter-organizational knowledge, and trained administrative personnel that facilitate rapid adaptation from normal operations to contingency operations. For example, organizational flexibility is provided by published contingency plans; legal authorities that can be invoked during a crisis; multiple pathways of information exchange within the organization; and managerial personnel who are familiar with the culture, capabilities, rules, and procedures of other agencies or organizations with whom they must coordinate in a crisis situation.
Cultural openness within an organization refers to a basic willingness to rapidly move beyond existing organizational structures and processes in order to effectively meet the operating and coordination demands of a crisis situation. For example, cultural openness is reflected in a commitment to shared values beyond the organization; a readiness to accept new information from valid sources; a willingness to employ new procedures, rules, and coordination mechanisms; a willingness to open new avenues of information exchange with other agencies and organizations; an attitude of self-monitoring and critique; and a continual search for relevant, timely, and accurate information in order to achieve higher-level goals within the community.
Examining these factors, Comfort defines four types of organizational response. The first type of organizational response is termed "non-adaptive." This behavior characterizes organizations with low technical structure, low flexibility, and low cultural openness. Such organizations function under threat largely dependent upon outside assistance, and then revert to previous status after the threat has dissipated. A second type of response is termed "emergent adaptive." Emergent adaptive behavior characterizes organizations with low technical structure, medium flexibility, and medium cultural openness. Such organizations develop a mode of organization and action to cope with the threat, but cannot sustain collective action. A third type of organizational response is termed "operative adaptive." Operative adaptive behavior characterizes organizations with medium technical structure, medium flexibility, and medium cultural openness. Such organizations function well against the threat, but cannot translate methods of response into new modes of sustained operation and threat reduction. Finally, a fourth type of response is termed "auto-adaptive." Auto-adaptive behavior characterizes organizations with high technical structure, high flexibility, and high cultural openness. Such organizations are effective against the threat and can transfer lessons learned into sustained reduction of threat.
Figure 7. Types of Organizational Response to Crisis and Contingency
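The following sketch renders Comfort's four response types as a simple lookup over the three indicators described above. Treating the (low/medium/high) profiles as an exact table is an illustrative simplification of the framework, and the function name and fallback text are assumptions.

```python
# Simplified sketch of Comfort's four organizational response types as described
# above, keyed on the three indicators (technical structure, organizational
# flexibility, cultural openness). The profiles follow the text; collapsing the
# framework to a lookup table is an illustrative simplification.

RESPONSE_TYPES = {
    ("low", "low", "low"):          "non-adaptive",
    ("low", "medium", "medium"):    "emergent adaptive",
    ("medium", "medium", "medium"): "operative adaptive",
    ("high", "high", "high"):       "auto-adaptive",
}

def classify_response(structure: str, flexibility: str, openness: str) -> str:
    """Map an organization's indicator profile to one of Comfort's response types."""
    return RESPONSE_TYPES.get(
        (structure, flexibility, openness),
        "unclassified (profile not among the four archetypes)",
    )

print(classify_response("high", "high", "high"))  # -> auto-adaptive
```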
Based on this research, Comfort concludes that effective response against a continuing threat requires an auto-adaptive type of command and control organization. Such an organization is characterized by (1) appropriate use of technical systems to monitor, process, and disseminate information; (2) the ability to rethink organizational functions in order to achieve self-organizing actions that avert risk; and (3) the capacity to create new meaning from experience that enables effective action in dynamic situations. During the symposium, Dennis Wisnosky reinforced this notion based upon his work with industrial organizations.
The particular strengths of this third research perspective can be summarized in the following manner.
Yet this area of research is not without its own set of limitations.
Collaboration Systems: A Search for Meaningful Ontologies
The fourth research perspective represented in the symposium on sensemaking derives from the field of computer science, with some inspiration from the fields of psychology, sociology, and management science. It approaches group or team sensemaking as a process to be supported with computer software and hardware, rather than as a phenomenon to be fully understood in its own right. Much of the history of this field can be traced to the evolution of computer science research over the past several decades (Grudin, 1994). As the attention of computer scientists shifted from mainframe technology and support of corporate enterprises to the support of individual users and collaborative groups, there emerged a disciplined body of research known as "computer supported cooperative work." The research was motivated by the desire to enable broader participation in collaborative work by:
However, more recent reviews of the computer-supported cooperative work research area suggest that it has made limited contribution to sensemaking at the group/team and organizational levels (Gray, 1998). With the exception of human parallel processing, very little new has been introduced into the way humans have always collaborated. Software developed for collaboration has principally focused on assisting brainstorming and voting--processes that reflect only a small fraction of the socio-cognitive processes involved in collaborative sensemaking. The inability of most collaborative systems to survive beyond the whims of an individual champion is probably testimony that they have not addressed the critical sensemaking needs of the organization. Finally, the existing body of research in this area has been tied more to the software characteristics of individual prototype systems, rather than structured to provide insight into basic socio-cognitive processes. Hence, it is difficult to generalize findings from one research study to the next.
Recently, computer science research has attempted to address more directly the issue of how organizations socially construct knowledge and authority (Muller & Millen, 2000). Of principal concern is how organizations manage knowledge, as distinguished from either data or information (Figure 8). As part of this research, various models have been proposed to describe how knowledge is originated, refined, authored, and authorized within an organization--e.g., production flow model, democratic exchange model, hierarchical learning model, question-answering model, methods and procedures model, best practices model, consultant model. These models, in part, reflect different academic or empirical paradigms, while considering important knowledge management roles within an organization--e.g., resource-oriented staff, client-oriented staff, knowledge and authority staff, and information gatekeepers.
During the symposium, John Nosek’s presentation addressed the social construction of knowledge within groups and teams, and focused specifically on the types of knowledge structures needed for sharing meaning across different communities of expertise. This work is relevant to military command and control inasmuch as it assists our understanding of how to facilitate collaboration when participants are drawn from disparate fields of expertise--e.g., military, diplomatic, non-governmental. As part of this research, Nosek defines sensemaking as the basic process whereby people create shared meaning with others (Nosek, 1999). In this research, Nosek considers both ecological and constructionist models of knowledge to explain how actors (people or machines) interpret their environment. An ecological view argues that objects have meaning without conscious interpretation, whereas a constructionist view argues that actors observe stimuli and construct their meaning of objects. Applying these notions to social sensemaking, Nosek assumes that the ecological model holds whenever there is a clearly understood context, while the constructionist model applies in situations of greater ambiguity or uncertainty. Given the nature of most real-world collaborative situations, the constructionist model is seen to dominate.
Figure 8. Definition of Data, Information, and Knowledge
Hence, the basic issue for collaboration across different domains of expertise is the fact that different experts will construct different meaning from the same situation, based upon their relative knowledge and experience. If they are to collaborate effectively, these participants must find a mechanism for sharing and reconciling these different meanings. In attacking this issue, computer scientists adopt a number of constructs to describe this collaboration process (Hong, 1999):
Community of practice: a body of expert practitioners with shared histories of learning and common semantics that facilitate efficient dialog within the community
Community of interest: a group of participants from different communities of practice that forms around the need to frame and solve a specific problem
Boundary objects: externalized objects (representations) that serve to communicate and coordinate the perspectives of different communities, but where each participant has only partial knowledge of and control over the interpretation of the object [note: boundary objects can also mediate dialog between humans and computers]
Boundary spaces: social spaces where groups meet to discuss common interests [note: this can refer to either a physical space or a virtual space]
Boundary people: individuals who have an interest in and ability to understand "other" knowledge systems, and who can serve to mediate the flow of meaning across professional boundaries
In this context, Nosek asserts that effective organizational action depends upon the "right" thinking, by the "right" team members, at the "right" time. Accomplishing this requires the effective push and/or pull of knowledge within "collaboration envelopes," boundary objects and spaces tailored to provide specialized frameworks for (1) organizing dialog among different experts; (2) building bridges of meaning across different contexts; and (3) alerting participants to significant changes in meaning as the dialog progresses. Information technology can provide the set of tools needed to bridge across different communities of practice. Such tools are not limited to a single media, but must (1) effectively facilitate the chunking of knowledge and (2) be easy to use for both the sender and recipient.
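As one illustration of how a collaboration-support tool might represent the constructs listed above, the following sketch encodes communities of practice, communities of interest, and boundary objects as simple data structures; all class and field names are assumptions made for illustration, not definitions drawn from Hong (1999) or Nosek (1999).

```python
# Illustrative sketch of the collaboration constructs discussed above. All class
# and field names are assumptions, not definitions from the cited research.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class CommunityOfPractice:
    name: str                    # e.g., "military planners", "relief NGOs"
    shared_semantics: List[str]  # common terms understood within the community

@dataclass
class BoundaryObject:
    label: str  # shared external representation (map overlay, matrix, sketch)
    interpretations: Dict[str, str] = field(default_factory=dict)  # community -> partial reading

    def add_interpretation(self, community: str, meaning: str) -> None:
        """Record one community's (partial) interpretation of the object."""
        self.interpretations[community] = meaning

@dataclass
class CommunityOfInterest:
    problem: str
    members: List[CommunityOfPractice]
    boundary_objects: List[BoundaryObject] = field(default_factory=list)

# Usage: two communities of practice converge on a shared map overlay.
overlay = BoundaryObject("displaced-persons overlay")
overlay.add_interpretation("military planners", "movement-control constraint")
overlay.add_interpretation("relief NGOs", "aid-distribution priority areas")
coi = CommunityOfInterest(
    problem="coordinate a humanitarian corridor",
    members=[
        CommunityOfPractice("military planners", ["ROE", "LOC"]),
        CommunityOfPractice("relief NGOs", ["beneficiaries", "IDP camps"]),
    ],
    boundary_objects=[overlay],
)
print(coi.boundary_objects[0].interpretations)
```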
The strengths of this final research perspective can be summarized as follows. First, this research truly bridges the socio-technical process by explicitly considering both humans and computers as potential participants in the sensemaking process. This aspect of the research makes it well-positioned to contribute to the military’s concept of network centric warfare. Second, the research rightly acknowledges that the focus of information technology applications within an organization should be on the facilitation of collaborative sensemaking. As a result, the utility of information technology should be judged vis-à-vis the requirement to manage knowledge within the organization rather than merely to move data or information around the organization. Third, because of the quantitative structure imposed upon research within the field of computer science, this perspective is grounded in the desire to operationally define its constructs in measurable ways.
Limitations of this perspective are best acknowledged using the very language developed by this community. First, the research perspective employs constructs that are not shared or understood outside the community of computer science researchers engaged in developing software and hardware for computer supported cooperative work. Second, much of the development within this perspective is confined to collaborative forums that exclude other research communities addressing sensemaking--e.g., those engaged in research on situation awareness or complex adaptive systems. As such, these other communities of practice have not yet benefited from the insights and models proposed within the field of computer supported cooperative work.
Recommendations of the Symposium Working Groups
The final day of the symposium divided participants into two working groups, each with the task of addressing the following questions:
Basic research: What research needs to be done? Are there current research programs that need to be expanded, continued, or redirected? What areas of sensemaking are currently being ignored?
Applied research: What research needs to be done? Who might be the customers? What are the strengths, weaknesses, and gaps of current research?
Expert panel: For the future, who are the sensemaking research experts (from academia, industry, or military) who should advise the Department of Defense on research directions, characterize the state of the art, review on-going efforts, lead specific research thrusts, identify high-priority work or foundational work, and identify efforts from which data could be mined?
Practitioner panel: Who are the practitioners in applying sensemaking concepts to organizations and/or problem domains (not necessarily in the context of military command and control or interagency operations) who could convene periodically in parallel with the expert panel to frame, brainstorm, and evaluate results?
Organizational linkages: Which organizations should be involved in future sensemaking research--i.e., organizations that might give/receive assistance in the form of funding, research facilities, sharing on-going research programs, lending expertise, etc.?
While time constraints limited the depth of discussion of these issues, the two working groups were able to formulate complementary perspectives regarding the future challenges and direction of sensemaking research as it applies to military command and control. A synthesis of these perspectives can be summarized as follows. Regarding both basic and applied research, the working groups noted that the various academic disciplines can be arrayed along a spectrum extending from work that furthers our understanding of sensemaking at the individual level to studies that address sensemaking issues at the organizational, multi-organizational, and societal level. This research spectrum, shown in Figure 9, potentially contributes to a number of military and Federal government areas: e.g., the improvement of interdisciplinary teamwork in intelligence analysis, the enhancement of joint military command and control, and the synchronization of homeland defense operations.
Figure 9. Spectrum of Academic Research on Sensemaking
Within the area of basic research, there was general agreement on the need to develop a multidisciplinary framework for synthesizing the contributions of different research perspectives. Such a framework would:
At the applied level of research, the working groups identified the need to construct sensemaking test beds--specifically in those functional settings characterized by high time-stress, situational ambiguity, and the requirement for multiple areas of expertise to collaborate toward the framing and solution of problems. Applied areas unique to the military include joint and coalition peacekeeping and peace enforcement operations, counter-terrorist operations, and homeland defense operations. However, the working groups also suggested a number of other operational settings that could serve as effective research test beds: hospital emergency rooms, FEMA disaster response, wildfire-fighting teams, search-and-rescue operations, security traders, and community 911 operations. These various test beds would fall within an n-dimensional space of sensemaking, as suggested in Figure 10.
Finally, the working groups developed a number of lists, including (1) researchers nominated for the expert panel on sensemaking; (2) researchers and other individuals nominated for the practitioner panel; and (3) organizations identified as key linkages for sensemaking research. These names and organizations will be reviewed for participation in future sensemaking research events sponsored by ASD(C3I).
Figure 10. Dimensions of Applied Sensemaking Research
The Road Ahead
One goal of the symposium, to illuminate sensemaking from a variety of different research perspectives, was successfully achieved through the three days of presentations, discussions, and recommendations. From the ideas presented, it is clear that a number of research areas offer useful insight and guidance for improving sensemaking processes within a number of applied areas unique to the military, including joint and coalition peace-keeping and peace enforcement operations, counter-terrorist operations, and homeland defense operations. At the same time, the parochial and disconnected nature of much of this work reminds one of the parable involving the three blind men attempting to describe an elephant:
One day three blind men came upon an elephant in the jungle. Not being able to see the elephant, they each manually inspected different parts of the animal to determine what they had encountered. The one who grabbed the tail was sure the beast resembled a rope. The one who wrapped his arms around a thick leg was sure that it was very similar to a tree. The one who latched onto the trunk concluded that the animal was related to the snake.
Parable of the Three Blind Men and the Elephant
While the lack of research cohesion and consistency is not surprising, it is concluded that future progress in this area will depend upon bringing these various research perspectives together within a multidisciplinary framework. Given the parochial nature of academic work, however, it is unlikely that such a framework will emerge naturally in the near future. As a result, consideration should be given to the role that ASD(C3I) should play through the CCRP in (1) developing a multidisciplinary framework along the lines suggested by the working groups and (2) stimulating sensemaking research that will both fill gaps in our understanding and explicitly apply this body of knowledge to the unique interests of ASD(C3I). The essence of the multidisciplinary research framework is summarized in Figure 11, based upon the earlier recommendations of the symposium working groups.
Figure 11. Multidisciplinary Research Framework for Sensemaking
To initiate development of this framework, it is proposed that ASD(C3I) assemble (in early 2002) an expert group of both researchers and practitioners to draft a long-range roadmap for sensemaking research and its application to the military. The roadmap would address each element of the multidisciplinary framework outlined in Figure 11 and provide ASD(C3I) with an investment strategy for research and experimentation. Subsequently, ASD(C3I) would convene two formal workshops during 2002. The first workshop (to be held in the spring of 2002) would have an academic focus and serve to refine the first three foundational elements of the roadmap by:
Following this, a second workshop (to be convened in the late summer of 2002) would have more of a practitioner focus and serve to refine the remaining three elements of the roadmap. This workshop would invite researchers who could speak to the findings of the first workshop, as well as practitioners from each of the application areas plus relevant contributors from the areas of simulation modeling and field experimentation.
It is recognized that there exists broad interest in sensemaking across the defense research and development community. Accordingly, it is proposed that ASD(C3I) identify potential partners for collaborating with the CCRP on this long-range research plan. Participation might include research organizations from within the United States (e.g., the Defense Advanced Research Projects Agency, the National Science Foundation), research agencies of allied nations (e.g., the British Defence Science and Technology Laboratory, the Australian Defence Science and Technology Organisation), as well as professional societies and alliance committees focused on the analysis and improvement of military command and control systems (e.g., NATO Studies Analysis and Simulation Panel 39 on Analysis of the Military Effectiveness of Future C2 Concepts and Systems).
Finally, it is proposed that the CCRP undertake the writing of a book (or book series) that summarizes the insights gleaned from this symposium and the two proposed workshops, and provides formal documentation of the research framework. This book (or book series) would add to the CCRP's evolving body of knowledge regarding critical aspects of national defense transformation.
Sensemaking Attendee List

Participant - Affiliation
Amy Bolton - Naval Air Warfare Center
Barry McGuinness - BAE Systems
Bill Dyas - CNET/ETS
Bob Fleming - SPAWAR
Brian Tsou - Air Force Research Laboratory
Capt Michael Sarraino - FCTCL/CNET
Dave Signori - RAND
Dave Alberts - OASD
David Noble - EBR, Inc.
Dennis Leedom - EBR, Inc.
Dennis Wisnosky - Wizdom Systems, Inc.
Dick Wilson - EBR, Inc.
Gwendolyn Campbell - Naval Air Warfare Center
Jim Murphy - Dynamics Research Corporation
JoAnn Brooks - MITRE
John A. Poirier - OPNAV N6C/SAIC
John Buchheister - OASD(C3I)
John Nosek - Temple University
Julia Loughran - ThoughtLink
Karen Carr - BAE Systems
Karlene Roberts - Berkeley
Larry Wiener - OPNAV N6
Louise Comfort - U Pitt
MAJ James Sweeney - ODUSD (S&T)
Marchelle Stahl - ThoughtLink
Mariann (Sam) Jelinek - NSF/College of William & Mary
Mark Mandeles - J. de Bloch Group
Mike Letsky - ONR
Mitzi Wertheim - CNA
Richard Hayes - EBR, Inc.
Yan Yufik - Cognitive Scientist