
Views and Opinions

Can We Streamline Operational Planning?

by David J. Bryant



The Operational Planning Process (OPP) provides the Canadian Forces (CF) with a formal procedure intended to aid commanders and their staffs in developing effective plans. In operational conditions, however, time is often limited, and planners likely will not have access to all the information they need. This creates a tension between two competing demands: on one hand, planners want to be thorough and effective; on the other, they want to be fast and efficient. The OPP provides some guidance, but it may require more time and information than can practically be afforded in the fast-paced and complex environment of today’s “three-block war.”1 This article will address whether the OPP could be made more efficient while still remaining an effective tool for planning.

The Operational Planning Process

The OPP is laid out in CF doctrine,2 and it overlaps to some extent with The Estimate and Battle Procedure, which are other planning procedures used by the Land Force (LF).3 Figure 1 shows the five major steps of the Canadian Forces OPP, each of which can be decomposed into numerous sub-functions that collectively perform the analysis of the mission, the development and evaluation of Courses of Action (COAs), the selection of the best COA, and the development of that COA into a plan and a series of orders. Although described in doctrine as a fairly linear approach, the OPP is meant to be a highly iterative process to cope with the nonlinear nature of operations.4


Figure 1 – The five major steps of the CF Operational Planning Process.

The OPP belongs to a family of analytic and procedural methods used by military and business organizations.5 Most are descendants, in some form or other, of rational analysis models pioneered by psychologists Herbert Simon and Allen Newell,6,7 in which problem solving is approached in a linear, step-by-step method.8 Although the analytic approach to problem solving has been widely adopted, the time-constrained, uncertain nature of warfare makes it a domain in which analytic processes may be difficult to perform. Analytic procedures, for example, assume a closed, readily decomposable system. Warfare, however, is an activity in which many factors affect a battle from outside any arbitrarily drawn boundary – and all the elements within warfare interact in complex ways that make it virtually impossible to eliminate uncertainty.9 Consequently, the OPP can give the false impression that exhaustively listing and considering factors has rendered the battlefield understood and certain, while non-linear dynamics progressively reduce the accuracy of assumptions over time.

Thus, it is not surprising that numerous studies10,11,12 have found that expert military teams rarely exhibit behaviour consistent with the analytic planning processes in which they have been trained. Analytic processes like the OPP have fairly severe practical boundary conditions, of time, information, and resources (especially at the brigade and battalion levels), that limit their applicability to complex, real-world domains. Analytic procedures also tend to constrain creativity and initiative in planners.

An Alternative Approach to Planning

In light of concerns about the efficiency of analytic processes, many researchers have explored an alternative approach. Generally referred to as Intuitive Decision-Making,13 this approach begins with observations of experts in skilled domains, such as fire-fighting and military command and control, as a means to determine how experts actually solve complex problems, rather than concentrating on formal models of problem solving.14 The basic principle of intuitive decision-making is that reasoning and problem solving rely upon non-formal, recognition-based processes, such as pattern-matching and story building. Rather than attempting to devise multiple options and selecting the best among them, problem solvers devise a single COA and evaluate it against a satisficing criterion (is it acceptable?), working iteratively to evaluate, revise, and re-evaluate a promising COA until it is acceptable.15,16

Studies examining military planning have found that expert teams not only deviate from the prescriptions of the analytic planning process but also spontaneously shift into processes consistent with intuitive decision-making.17,18 Researcher A.J. Athens,19 for example, examined two historical case studies of operational planning in detail to determine how planning in these cases compared to the doctrinal procedure.

We can see the relevance of the intuitive approach to military planning in the results of studies that indicate that experts frequently deviate from formal procedures. The team of Serfaty, MacMillan, Entin, and Entin,20 for example, observed that naval officers performed simulated anti-submarine warfare scenarios by a three-stage process of matching the situation to a schematic memory representation, gathering information to elaborate the remembered case, and then recognizing an appropriate plan for action. Other studies21,22,23 have demonstrated that decision-makers focus on recognizing the situation in making command decisions. These findings are not limited to naval Command and Control. Leedom, Adelman, and Murphy,24 as well as Serfaty et al.,25 have also reported findings that support the use of recognition-based decision-making by army commanders in performing simulated missions.

Observing a Brigade Staff

A study we recently performed also illustrates the roles of analytic and intuitive processes in operational planning.26 We observed the planning Staff of 1 Canadian Mechanized Brigade Group (1 CMBG) during Exercise Virtual Ram, conducted at Canadian Forces Base (CFB) Edmonton, 21-25 January 2005. Our goal was to establish some generalizations concerning the way the OPP is applied in an operational context.

The exercise scenario for Virtual Ram assumed an operational environment with multiple threats and multiple civilian agencies, the vast majority being neutral to the military mission. For exercise play, the flank units, higher headquarters, and various civilian agencies were simulated by player cells at ‘HICON,’ the headquarters controlling the exercise, in this case, Land Forces Western Area (LFWA). The Primary Training Audience (PTA) was 1 CMBG operating in a field configuration. The Secondary Training Audience (STA) consisted of subordinate units (battalions and regiments) operating indoors in modified command post configurations. The exercise consisted of simulation-supported play within the framework of an exercise control matrix approved by LFWA.

Throughout Virtual Ram, we followed the Commander, Chief of Staff, G3 Plans, and G2 Plans, who all played pivotal roles in planning. We observed the activities of these individuals and their interactions with other staff. To understand how they planned the mission, we recorded what they did, their rationales, how they made decisions and performed actions, when they did things, and with whom. All these data went into constructing a detailed graphical representation of the planning process as it was performed during the exercise. To ensure accuracy, we validated the output of the data collection with the observed staff (i.e., the G3 Plans). We then compared what was actually done to the formal OPP, as it is laid out in army doctrine.

The planning staff followed the OPP throughout the exercise, but it did so in its own way, as appropriate to the specific conditions of the scenario. The staff performed four cycles through the process as they responded to new information from higher command and events in the exercise. Although the OPP is fairly linear, not all cycles involved all OPP steps. The staff omitted and abbreviated steps that were not needed, or were of lesser importance, in light of time pressure and missing information. Interestingly, even at the beginning of the exercise, the staff did not perform functions in the order in which they are outlined in the doctrinal OPP. Certain functions and sub-functions were abbreviated or aggregated into higher level functions. Furthermore, the planning team even moved ‘backward’ in the process at times, usually as the team acquired new information.

The planning staff was observed to omit and abbreviate OPP steps throughout all four planning cycles. This was especially true of numerous sub-steps under the major functions. In addition, throughout planning, the staff engaged in a great deal of ‘looping’ back and forth between functions, especially among lower level functions. The abbreviation and repetition of lower level functions, seemingly in groupings, suggested that these functions are strongly linked and performed as a continual process rather than in discrete steps. The staff, for example, frequently looped between COA Validation and the Staff Analysis, suggesting that analysis and validation were done simultaneously rather than as separate steps. Similarly, the review of the situation and the review of higher level information were, on occasion, performed simultaneously. Steps such as analyzing factors and making deductions were returned to five times throughout the four planning cycles, again illustrating the lack of rigid separation of steps within the OPP.

In general, the CMBG planning staff followed a step-by-step analytical decision-making approach for higher level OPP functions, but employed more intuitive processes to perform specific, individual functions. It appeared that the input of individual staff members to the OPP was intuitive, or at least based on their own estimates of the situation, compiled from various sources.

Generally, it appears that application of the OPP at the brigade level may be a hybrid of analytic and intuitive decision-making. The question is whether a hybrid process is better suited to the modern military arena, in which all three elements of a ‘three-block’ war may occur simultaneously within the same mission. Moreover, the CF has looked to smaller functional units, such as the task force, to serve with a large degree of initiative in the field. The traditional OPP, although a tool for arriving at a reasonable solution, is challenged when faced with a multi-faceted environment in which it is not clear which factors may become variables in the planning scenario.

How Can Operations Planning be Made More Efficient?

The results of our observations suggest that planners naturally bring intuitive decision-making processes to the planning process. The planning process should thus support planners in terms of their natural ways of thinking. In part, this entails an expansion of the role of experience in the process. The OPP already acknowledges the importance of experience to effective planning,27 but experience is also the basis of the intuitive approach, especially in the recognition of the situation and the diagnosis of potential problems that might arise.28 This has obvious training implications, notably, that military commanders need both to train to gain practical experience and to train to use that experience effectively.29

Planning can also be made more effective by increasing the role of critical thinking. Among the important functions of mental simulation is the discovery of problems in the plan.30 Planners must explore possible outcomes of the plan to determine how a plan might fail, the consequences of its failure, and the options that exist to enhance the plan. This is the fundamental rationale for contingency planning, in which branches (what if?) and sequels (what then?) of a COA are developed.31 Several procedures for ensuring sufficient critical evaluation have already been developed for military planning.32,33 These techniques teach planners how to identify assumptions and vulnerabilities, and how to test possible outcomes of the plan.

It is also important that planners share a common visualization of the battle space and the plan, including the commander’s intent. The OPP specifies in great detail the information necessary to plan effectively, much of which must necessarily be shared among a large number of individuals both within the planning staff, and among distributed units of the friendly force. Unfortunately, although shared understanding and the synchronization of forces that depends on it are key military tenets, the complexity and uncertainty of the battle space make it difficult to achieve.34 Thus, it is imperative that coordinated mechanisms exist within the planning organization to facilitate shared understanding.

Some Practical Measures

What specific changes to the OPP could be made to align it more closely with the intuitive decision-making approach?

Reduce the number of steps: Theories of intuitive decision-making argue that expert decision-makers perform best when they do not attempt to engage in extensive analysis, but, instead, employ recognitional strategies to assess the situation and retrieve a COA.35 A key implication of these theories, then, is that problem solving and planning should be based on a less formal and less extensive procedural basis. Recognition-based models suggest a planning process in which the initial Mission Analysis and COA Development stages of the current OPP are abbreviated by eliminating numerous specific steps within each stage. Reducing the extent to which the operation is broken down into specific factors, and eliminating the exhaustive review of these factors in the evaluation of potential COAs, has the effect of greatly reducing the number of stages in a planning process.

Pursue a single COA: Intuitive theories also suggest that experts need only pursue a single COA, rather than generate and compare multiple options. There are several reasons why this should prove an effective measure. As S. Whitehurst36 has pointed out, multi-attribute analyses depend upon a large volume of certain facts for the analysis to have any value. In domains where there is extensive uncertainty, especially in terms of ambiguity of data, factorial comparison among multiple alternatives simply no longer yields reliable results.

From the perspective of recognition-based decision-making, it is reasonable to retrieve a single COA and to evaluate it against satisficing criteria of effectiveness, rather than for optimality. Generally, experts have been observed to retrieve high quality solutions in their initial attempts to solve problems, suggesting that a single-COA approach is likely to produce a suitable solution quickly.37,38 It is also less cognitively demanding to mentally simulate the expected outcome of a single COA and judge its acceptability than to hold multiple mental simulations in mind at once.

Thus, comparing multiple COAs is inefficient, as it consumes cognitive resources while preventing planners from focusing on understanding the mission and the opponent.39 Multiple COA comparison is also inefficient, because it separates the development and evaluation phases. When three COAs are developed for comparison, two COAs will be developed that will not play any role in the operation. Nor do the discarded COAs usually provide any significant input into the selected COA (i.e., to make it better). When development of a single COA is more closely integrated with its evaluation, as in an iterative process of development, evaluation, and revision, ideas that would otherwise be diverted into separate COAs can be considered with respect to the COA developed and selected for implementation.40 Pursuing only one COA at a time also makes it easier for the commander to direct the COA development process, and to lend his or her expertise to the initial concept for the COA.41
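The iterative develop-evaluate-revise cycle described above can be made concrete with a short sketch. This is purely illustrative: the function names, the COA representation as a simple risk score, and the acceptability threshold are all hypothetical stand-ins, not elements of any doctrinal procedure.

```python
# Purely illustrative sketch of the single-COA "satisficing" loop described
# above. The COA representation, risk measure, and threshold are hypothetical.

def satisfice_single_coa(coa, acceptable, revise, max_cycles=10):
    """Iteratively evaluate and revise one course of action (COA) until it
    meets the acceptability criteria, rather than generating several COAs
    and comparing them against each other."""
    for _ in range(max_cycles):
        problems = acceptable(coa)   # mental simulation: flaws found in the COA
        if not problems:
            return coa               # first acceptable COA is adopted
        coa = revise(coa, problems)  # ideas are folded back into the same COA
    return coa                       # time runs out: take the best version so far

# Toy usage: a COA is acceptable once its estimated risk drops to 1 or below.
result = satisfice_single_coa(
    {"risk": 5},
    acceptable=lambda c: ["risk too high"] if c["risk"] > 1 else [],
    revise=lambda c, problems: {"risk": c["risk"] - 2},
)
# result["risk"] is now 1: the COA was revised twice (5 -> 3 -> 1), then accepted.
```

The point of the loop is that evaluation and revision operate on the same COA, so ideas that would otherwise be diverted into discarded alternatives are folded back into the option actually selected for implementation.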

Question assumptions early in the process: One function of mental simulation is to help planners identify assumptions inherent to their plan.42 This is necessary to predict outcomes accurately and to evaluate the plan. Thus, several writers have advocated the inclusion of an explicit step in planning to identify assumptions in the COA. Again, Whitehurst43 recommends that planners list assumptions as they develop a COA and judge the reliability of these assumptions as part of the evaluation. Similarly, researchers Fallesen and Pounds44 have argued that military staff be trained to perform ‘relevancy checks’ as a normal part of planning. Relevancy checks entail generating ‘what-if’ questions to consider possible problems with a COA, or potential events that would affect how that COA might work. Relevancy checks are a means to stimulate critical thinking, as well as to provide richer understanding of the mission and COA.

The value of assumption analysis is two-fold. First, explicitly identifying assumptions helps planners better understand the mission and the plan itself, which contributes to better analysis and communication. Second, when assumptions are explicitly identified, planners can evaluate their truth and consistency. A major part of COA evaluation should include identifying gaps, inconsistencies, and contradictions among assumptions, along with measures to resolve these problems. In addition, assumptions are key indicators of the information needs that will exist throughout an operation.
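One way to picture explicit assumption analysis is as a running list attached to the COA, in which each assumption carries a reliability judgement and a ‘what-if’ consequence. The sketch below is illustrative only; the field names and the example assumptions are hypothetical, not drawn from doctrine.

```python
# Illustrative sketch of explicit assumption tracking during COA development.
# Field names and example assumptions are hypothetical.

from dataclasses import dataclass

@dataclass
class Assumption:
    statement: str    # what the COA takes to be true
    reliability: str  # planner's judgement: "high", "medium", or "low"
    what_if: str      # consequence for the COA if the assumption fails

def information_requirements(assumptions):
    """Low-reliability assumptions become information needs
    to monitor throughout the operation."""
    return [a.statement for a in assumptions if a.reliability == "low"]

coa_assumptions = [
    Assumption("Bridge on the main axis remains intact", "low",
               "Main axis is blocked; switch to the fording site"),
    Assumption("Flank unit reaches its phase line on time", "medium",
               "Exposed flank; hold at the intermediate objective"),
]
needs = information_requirements(coa_assumptions)
# needs lists the assumptions whose truth must be actively confirmed.
```

Recording the ‘what-if’ consequence alongside each assumption ties the relevancy check directly to contingency planning: a failed assumption points immediately at the branch it triggers.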

Integrate planning and execution: Although there is sometimes a tendency among military planners to separate planning from the execution of a plan,45 it is important that planning and execution be viewed as part of an integrated system.46 Activities laid out in the OPP are just the initial stages in a process that requires constant evaluation and revision of the plan. Despite this, the OPP does not instruct planners to develop an explicit framework in which to evaluate the plan as the operation is conducted – a plan for evaluating the plan itself.47 Thus, whereas extensive analysis may be done to evaluate the suitability of a COA, the analysis of the outcomes of the plan as it is implemented can be somewhat ad hoc. For this reason, it makes sense to include a step in planning in which specific desired outcomes are identified, and measures of those outcomes developed. The measures serve as ways to judge the extent to which desired outcomes are achieved, which then indicate whether the plan is being performed as intended. A plan of assessment should be integrated with the plan of action and should directly relate to observable consequences of actions. Criteria for judging the success of achieving goals also help planners anticipate failures or problems before they become critical.48
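A ‘plan for evaluating the plan’ can be pictured as a mapping from each desired outcome to an observable criterion, checked as reports arrive during execution. Again, this is only an illustration; the outcomes, measures, and observed values are hypothetical.

```python
# Illustrative sketch of a plan-assessment framework: each desired outcome is
# paired with an observable measure. Outcomes, measures, and values are hypothetical.

def assess_plan(measures, observations):
    """Return the desired outcomes whose observed values fail their
    criterion, i.e., the parts of the plan that are off track."""
    return [name for name, criterion in measures.items()
            if not criterion(observations.get(name))]

# Desired outcomes and the observable criteria indicating they are being met.
measures = {
    "objective secured by H+6": lambda hours: hours is not None and hours <= 6,
    "civilian traffic rerouted": lambda done: done is True,
}
observations = {"objective secured by H+6": 8, "civilian traffic rerouted": True}
off_track = assess_plan(measures, observations)
# off_track flags "objective secured by H+6" (secured at H+8, later than intended).
```

Because each measure is tied to an observable consequence of action, the assessment can run continuously during execution rather than being reconstructed ad hoc after the fact.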

Facilitate shared visualization: Sharing the commander’s intent is the bedrock of shared understanding and synchronized operations.49 Yet, the best way to formulate and distribute the commander’s intent is still a topic of debate.50 Generally, shared intent is seen to consist of more than just the formal statements of intent propagated by the commander.51 Although explicit intent is shared through explicit communication in some form (usually written or verbal directives), sharing implicit intent is a long-term preparatory activity that must be supported by the whole military organization. Organizations must support development of shared implicit intent by supplementing formal activities, such as education and training in doctrine and procedures, with opportunities for teambuilding and personal interaction.52 These activities convey implicit knowledge, expectations, and values that people internalize.


Although empirical research is needed to validate the concepts identified in this article, there seems to be promise in the synthesis of intuitive and analytic concepts of planning. The OPP serves as a well-documented and logical framework in which to conduct operational planning. As deployed units become smaller and more flexible, however, we need to look at new concepts of planning. Integrating intuitive concepts with the OPP may enhance the efficiency and effectiveness of planning in time-constrained and uncertain operational environments.


Doctor David J. Bryant is currently a Defence Scientist with Defence R&D Canada – Toronto, where he is pursuing research on operational planning, inferential processes involved in situation assessment, and tactical picture compilation. He gratefully acknowledges inputs to this article from Tab Lamoureux, Lora Bruyn, Lisa Rehak, and Robert Vokac from Humansystems, Inc. 


  1. United States Marine Corps, Sustaining the Transformation (MCRP 6-11D). (Washington, D.C.: Department of the Navy, Department of Defence, 1999), at <http://www.doctrine.usmc.mil/signpubs/r611d.pdf>.
  2. Department of National Defence, Joint Doctrine Manual: CF Operational Planning Process (B-GL-005-500/FP-00), (Ottawa: Government of Canada, 2002).
  3. L. Bruyn, T. Lamoureux, and B. Vokac, Function Flow Analysis of the Land Force Operations Planning Process. DRDC Toronto Contractor Report (CR-2004-065). (Toronto: Defence R&D Canada – Toronto, Department of National Defence, 2004).
  4. Department of National Defence, Land Force: Volume 3: Command (B-GL-300-003/FP-000), (Ottawa: Government of Canada, 1996).
  5. J.J. Fallesen, “Decision Matrices and Time in Tactical Course of Action Analysis,” in Military Psychology, Vol. 7, 1994, pp. 39-51.
  6. A. Newell and H.A. Simon, Human Problem Solving (Oxford: Prentice-Hall, 1972).
  7. H.A. Simon, “Invariants of Human Behaviour,” in Annual Review of Psychology, Vol. 41, 1990, pp.1-19.
  8. J. Marr, The Military Decision Making Process: Making Better Decisions Versus Making Decisions Better (Fort Leavenworth, Kansas: Army Command and General Staff College, 2000).
  9. S. Whitehurst, Reducing the Fog of War: Linking Tactical War Gaming to Critical Thinking (Fort Leavenworth, Kansas: Army Command and General Staff College, 2002).
  10. G.A. Klein, Making Decisions in Natural Environments – Final Report. August 1994-December 1996 (ARI-SR-31), 1997.
  11. H.A. Kievenaar, Accelerated Decision Making at the Task Force Level (Fort Leavenworth, Kansas: US Army Command and General Staff College, 1997), p. 98.
  12. K.G. Ross, G. Klein, P. Thunholm, J.F. Schmitt, and H.C. Baxter, “The Recognition-Primed Decision Model,” in Military Review, July-August 2004, pp. 6-10.
  13. D.J. Bryant, R.D.G. Webb, and C. McCann, “Synthesizing Two Approaches to Decision Making in Command and Control,” in Canadian Military Journal, Vol. 4, No. 1, Spring 2003, pp. 29-34.
  14. Klein, Making Decisions...
  15. G.A. Klein, “A Recognition-Primed Decision (RPD) Model of Rapid Decision Making,” in G.A. Klein, J. Orasanu, R. Calderwood, and C. E. Zsambok (eds.), Decision Making in Action: Models and Methods (Norwood, New Jersey: Ablex, 1993), pp. 138-147.
  16. D.K. Leedom, L. Adelman, and J. Murphy, “Critical Indicators in Naturalistic Decision Making,” in Fourth Conference on Naturalistic Decision Making (Warrenton, Virginia: Klein Associates, 1998).
  17. Fallesen.
  18. G. Klein, S. Wolf, L. Militello, and C. Zsambok, “Characteristics of Skilled Option Generation in Chess,” in Organizational Behavior and Human Decision Processes, Vol. 62, No. 1, 1995, pp. 63-69.
  19. A.J. Athens, Unravelling the Mystery of Battlefield Coup D’Oeil (Fort Leavenworth, Kansas: US Army Command and General Staff College, 1992), p. 55.
  20. D. Serfaty, J. MacMillan, E.E. Entin, and E.B. Entin, “The Decision-Making Expertise of Battle Commanders,” in C. Zsambok and G. Klein (eds.), Naturalistic Decision Making (Hillsdale, New Jersey: Lawrence Erlbaum Associates, 1997), pp. 233-246.
  21. Klein, Wolf, Militello, and Zsambok.
  22. J.A. Cannon-Bowers and H.H. Bell, “Training Decision Makers for Complex Environments: Implications of the Naturalistic Decision Making Perspective,” in G. Klein and C.E. Zsambok (eds.), Naturalistic Decision Making (Mahwah, New Jersey: Lawrence Erlbaum Associates, 1997), pp. 99-110.
  23. G.L. Kaempf, G. Klein, M.L. Thordsen, and S. Wolf, “Decision Making in Complex Naval Command-and-Control Environments,” in Human Factors, Vol. 38, 1996, pp. 220-231.
  24. Leedom, Adelman, and Murphy.
  25. Serfaty, MacMillan, Entin, and Entin.
  26. L. Bruyn, L. Rehak, B. Vokac, and T. Lamoureux, Function Flow Analysis and Comparison of Doctrinal and Applied Operations Planning Process, DRDC Toronto Contractor Report (CR-2005-144) (Toronto: Defence R&D Canada – Toronto, Department of National Defence, 2005).
  27. Department of National Defence, Joint Doctrine Manual.
  28. G.A. Klein and C.E. Zsambok, Models of Skilled Decision Making. Paper presented at Visions – Proceedings of the Human Factors Society, 35th Annual Meeting, San Francisco, California, 1994.
  29. K. Neville, J. Fowlkes, and T. Strini, Facilitating the Acquisition of Mission Planning and Dynamic Replanning Expertise – Final Report. February-August 2002 (AFRL-HE-AZ-TR-2003-0016), 2003.
  30. G. Klein and B. Crandall, Recognition-Primed Decision Strategies – Final Report. November 1988-November 1991 (ARI-RN-96-36), 1996.
  31. Whitehurst.
  32. M.S. Cohen, J.T. Freeman, and B. Thompson, “Critical Thinking Skills in Tactical Decision Making: A Model and a Training Strategy,” in J. A. Cannon-Bowers and E. Salas (eds.), Making decisions under stress: Implications for Individual and Team Training (Washington, DC: American Psychological Association, 1998), pp. 155-189.
  33. R. Pliske, M. McCloskey, and G.A. Klein, Facilitating Learning from Experience: An Innovative Approach to Decision Skills. Paper presented at the Proceedings of the Fourth Conference on Naturalistic Decision Making, Warrenton, Virginia, 29-31 May 1998.
  34. J.C. Dejarnette, Keeping Your Dog in the Fight: An Evaluation of Synchronization and Decision-Making (Fort Leavenworth, Kansas, School of Advanced Military Studies, Command and General Staff College, 2001), p. 47.
  35. G.A. Klein and R. Calderwood, Investigations of Naturalistic Decision Making and the Recognition-Primed Decision Model – Final Report. July 1985-July 1988 (ARI-RN-96-43), 1996.
  36. Whitehurst.
  37. Klein, Wolf, Militello, and Zsambok.
  38. R. Lipshitz and O.B. Shaul, “Schemata and Mental Models in Recognition-Primed Decision Making,” in Klein and Zsambok, Naturalistic Decision Making..., pp. 293-303.
  39. Whitehurst.
  40. Whitehurst.
  41. Kievenaar.
  42. Klein and Calderwood.
  43. Whitehurst.
  44. J.J. Fallesen and J. Pounds, Identifying and Testing a Naturalistic Approach for Cognitive Skill Training. Paper presented at the Fourth Conference on Naturalistic Decision Making, Warrenton, Virginia, 1998.
  45. Bryant, Webb, and McCann.
  46. D.J. Bryant, Critique, Explore, Compare, and Adapt (CECA): A New Model for Command Decision-Making. DRDC Toronto (TR 2003-105). Toronto, Ontario: Defence R&D Canada – Toronto, Department of National Defence, 2003.
  47. R. Russell, In Support of Decision Making (Fort Leavenworth, Kansas: School for Advanced Military Studies, United States Army Command and General Staff College, 2003).
  48. Ibid.
  49. Dejarnette.
  50. D.J. Bryant, R.D.G. Webb, M.L. Matthews, and P. Hausdorf, Common Intent: Literature Review and Research Plan. Report to Defence and Civil Institute of Environmental Medicine (CR-2001-041). (Guelph, Ontario: Humansystems Incorporated, 2001).
  51. R. Pigeau and C. McCann, “Redefining Command and Control,” in C. McCann and R. Pigeau (eds.), The Human in Command: Exploring the Modern Military Experience (New York: Kluwer Academic, 2000), pp. 163-184.
  52. Bryant, Webb, Matthews, and Hausdorf.