SCIENCE AND THE MILITARY

Canadian militiamen and British soldiers repulse the American assault at Sault-au-Matelot, Québec, 31 December 1775, by Charles William Jefferys (1869-1951).

Wikimedia Commons

Is Your World Complex?

An Overview of Complexity Science and its Potential for Military Applications

by Stéphane Blouin

Stéphane Blouin, Ph.D., P.Eng., is a Defence Scientist at Defence R & D Canada, and an Adjunct Professor at Dalhousie University in Halifax. He holds degrees in mechanical, electrical, and chemical engineering. Over the years, he has held various R & D positions in Canada, France, and the USA related to applications in large-scale processes, automated assembly lines, robotics, and networks. His research interests cover real-time monitoring, control, and optimization of systems with non-linear, discrete-event, or hybrid dynamics.

Introduction

On New Year’s Eve 1775, American revolutionary General Richard Montgomery’s ill-fated decision to personally lead the assault through a fallen barricade in Québec City cost him his life and proved decisive in preserving a British presence in North America. Had Montgomery instead used his troops as a shield, he might have survived and captured Québec, which might now be part of the United States. This example illustrates how complex battles and wars can hinge on a single event or decision.

Over recent decades, publications bearing terms such as ‘complex systems’ and ‘complexity’ have proliferated in the management, economics, biology, and policy literature. Despite this massive documentation, notions associated with complexity remain difficult to understand, partly because of a lack of clarity with respect to definitions, concepts, and principles. This article provides an introduction to the concept of complexity, its tools, and its potential impact upon military operations.

Not surprisingly, explaining complexity is complicated. The complexity research field is not yet mature, and is more akin to a loose network of interconnected and interdependent ideas.1, 2 Most complexity concepts relate to how life, as described by the physical, biological, and social sciences, happens and evolves. The term ‘complex system’ denotes an assemblage of entities that interact according to rules and exhibit emergent behaviours through adaptation. Everyday examples of complex systems include stock markets, ecosystems, and immune systems.

Procter & Gamble, Southwest Airlines, and other private businesses have already claimed benefits from implementing complexity concepts.3, 4 Procter & Gamble (P&G) optimized the flow of raw materials for several of its confectionery products by injecting simple, ant-like rules into its supply-chain practices and software, the analogy being that when the path taken by ants becomes blocked, they collectively figure out a new and efficient route. Ultimately, P&G was able to cut routing time and costs in half. For Southwest Airlines, computer models showed that transferring packages to the most direct flights led to unnecessary handling and storage of packages. By allowing more roundabout routes, the carrier reduced the package transfer rate by 70 percent, thus saving millions of dollars.

Southwest Airlines, the launch customer for Boeing Aircraft’s new 737 Max, 13 December 2011.

Southwest Airlines

So, why should a military organization care about complexity? There are several good reasons:

  • the classic Newtonian5 approach, which assumes a machine-like operation, is often inadequate,
  • the potential breadth of military applications is large,
  • complexity often provides answers and insights not derivable from any other existing theory,6 thus potentially providing a distinct advantage, and
  • military organizations and their operations like wars and stabilization efforts are complex systems.

Examples from many disciplines and parallels with military operations are used here to convey the main ideas and concepts. Given the extent of the topic, the coverage is not exhaustive, and references to original publications are provided for the interested reader.  

Complex Systems Defined

Complex systems are those systems sharing all of the following properties:

  • A collection of interacting entities, such as hardware, software, and people
  • Component interactions based on rules
  • “Openness,” i.e., the exchange of energy, matter, and information with their surroundings
  • Emergent collective behaviour
  • Irreducibility: “the whole is more than the sum of its parts”
  • A capacity for adaptation and self-organization.

The distinctive trait of complex systems is that of emergence, where overall system behaviours emerge from interactions between components.

The Origins of the Complexity Concept

How does one defeat a decentralized terrorist group? How can we gain an advantage over an enemy force on a battlefield? How can we stabilize a region and build trust with its residents? These difficult questions have more in common with how living organisms evolve than with how a mechanical clock operates. In a similar fashion, the concept of complexity is shaped by questions about how life happens, and how it evolves in natural systems and societies.

The main driving forces behind complex systems research have been findings from the biological sciences, rapidly evolving computation technology, and the fact that the answers and insights provided by complexity often cannot be derived in any other way. Early research efforts in various disciplines eventually converged into overarching principles and universal properties, forming the complexity field as we know it today.

The evolution of complexity research is best illustrated by listing a few key results; more complete historical perspectives can be found in the literature.7 In the late 1950s, the cybernetics pioneer W.R. Ashby8 developed a law stipulating that the “space of possibilities” of a system should at least match the scale of the challenge to be met.9 For instance, compared to traditional war fighting, the key to success in today’s complex warfare is the capability of small units to act independently with relatively weak coordination, thus increasing their “space of possibilities.” This strategy is the exact opposite of the large-scale coherence of forces fielded in the First and Second World Wars.
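Ashby’s result is commonly formalized as the Law of Requisite Variety. In one standard entropy-based statement (the notation here is illustrative, not the author’s),

\[ H(O) \;\ge\; H(D) - H(R), \]

where \( H(D) \) is the variety of the disturbances to be faced, \( H(R) \) is the variety of responses available to the regulator (the force), and \( H(O) \) is the residual variety of outcomes: only by enlarging its own “space of possibilities” \( H(R) \) can a force reduce the uncertainty of the outcomes it must absorb.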

In 1963, the world-renowned mathematician and meteorologist Edward Lorenz10 published his computer simulation results about “strange attractors.”

National Oceanic and Atmospheric Administration (NOAA) satellite image of Hurricane Katrina, 24 August 2005.

NOAA

A strange attractor characterizes a system that is extremely sensitive to initial conditions and never settles into a predictable state. Lorenz showed that the weather is such a system, and that it therefore cannot be predicted with 100 percent accuracy. He introduced the “butterfly effect” metaphor in 1972 with a talk entitled “Predictability: Does the Flap of a Butterfly’s Wings in Brazil Set Off a Tornado in Texas?” Similar to the butterfly effect, large international consequences can result from small regional events. A classic example is the outbreak of the First World War following the assassination, in June 1914, of the Austrian Archduke Franz Ferdinand and his wife in Sarajevo.11
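The strange attractor Lorenz reported arises from a deceptively simple set of three coupled differential equations, now usually written as

\[ \dot{x} = \sigma\,(y - x), \qquad \dot{y} = x\,(\rho - z) - y, \qquad \dot{z} = x\,y - \beta z, \]

with the classic parameter values \( \sigma = 10 \), \( \rho = 28 \), and \( \beta = 8/3 \). Two trajectories started from nearly identical initial conditions diverge rapidly, which is the formal content of the butterfly-effect metaphor.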

Assassination of Franz Ferdinand, Archduke of Austria, and his wife Sophie, in Sarajevo, Bosnia, 28 June 1914, at the hands of Gavrilo Princip, by Achille Beltrame.

Gianni Dagli Orti/The Art Archive at Art Resource, NY

In 1967, the social psychologist Stanley Milgram conducted an experiment12 to model connectedness in human societies. His experiment revealed the “small-world,” or “six degrees of separation,” phenomenon: on average, any two people are separated by only about six acquaintances.13 Such a notion has a significant impact when a military force stabilizes a region by building trust with its residents through social networks.14

In 1983, Stephen Wolfram,15 the main developer of the Mathematica software, published simulation results showing that simple rules may lead to complex natural patterns. Indeed, Wolfram’s algorithm creates a pattern resembling that of a snail shell.
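As a concrete illustration (a minimal sketch, not Wolfram’s original code), the following Python fragment implements an elementary cellular automaton. Rule 30 is assumed here; its eight-entry lookup table fits in a single byte, yet the evolution from a single “on” cell produces the irregular triangular pattern often compared to the markings on a cone snail shell.

    RULE = 30  # any value from 0 to 255 selects one of the 256 elementary rules

    def step(cells, rule=RULE):
        """Apply the rule once to a row of 0/1 cells (periodic boundary)."""
        n = len(cells)
        nxt = []
        for i in range(n):
            left, centre, right = cells[i - 1], cells[i], cells[(i + 1) % n]
            index = (left << 2) | (centre << 1) | right   # one of the 8 neighbourhood cases
            nxt.append((rule >> index) & 1)               # read bit 'index' of the rule table
        return nxt

    # Start from a single "on" cell and print the resulting triangular pattern.
    row = [0] * 31
    row[15] = 1
    for _ in range(16):
        print("".join("#" if c else "." for c in row))
        row = step(row)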

Wolfram simulation pattern.

Stéphane Blouin

Wolfram’s algorithm creates a pattern resembling that of a snail shell.

http://fr.wikipedia.org/wiki/Fichier:C%C3%B4ne_textileIII.png

Much more complex patterns found in nature can also be generated, and yet the patterns are governed by simple rules.16 Another example where simple rules capture complex natural behaviour is Reynolds’ flocking simulation,17 published in 1987. An artificial life and computer graphics expert, C.W. Reynolds demonstrated how systems governed by three simple rules could reproduce the efficient, yet highly flexible, flocking behaviour observed in birds. Similar flocking rules have been encoded into unmanned aerial vehicles to ensure that they fly in formation while avoiding collisions.18
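The sketch below illustrates Reynolds-style flocking in Python. The three rules (separation, alignment, and cohesion) follow his 1987 formulation, but the two-dimensional setting, weights, and neighbourhood radius are illustrative assumptions rather than values from the original paper.

    import random

    class Boid:
        def __init__(self):
            self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)
            self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

    def neighbours(b, flock, radius=15.0):
        return [o for o in flock
                if o is not b and (o.x - b.x) ** 2 + (o.y - b.y) ** 2 < radius ** 2]

    def step(flock, w_sep=0.05, w_ali=0.05, w_coh=0.01):
        updates = []
        for b in flock:
            near = neighbours(b, flock)
            if not near:
                updates.append((0.0, 0.0))
                continue
            n = len(near)
            # Rule 1 - separation: steer away from nearby flockmates
            sep_x = sum(b.x - o.x for o in near) / n
            sep_y = sum(b.y - o.y for o in near) / n
            # Rule 2 - alignment: match the average heading of neighbours
            ali_x = sum(o.vx for o in near) / n - b.vx
            ali_y = sum(o.vy for o in near) / n - b.vy
            # Rule 3 - cohesion: drift toward the neighbours' centre of mass
            coh_x = sum(o.x for o in near) / n - b.x
            coh_y = sum(o.y for o in near) / n - b.y
            updates.append((w_sep * sep_x + w_ali * ali_x + w_coh * coh_x,
                            w_sep * sep_y + w_ali * ali_y + w_coh * coh_y))
        for b, (dvx, dvy) in zip(flock, updates):
            b.vx += dvx
            b.vy += dvy
            b.x += b.vx
            b.y += b.vy

    flock = [Boid() for _ in range(50)]
    for _ in range(200):
        step(flock)    # flock-like clusters emerge from purely local rules

No boid has any notion of the flock as a whole; the collective pattern emerges entirely from local interactions.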

Starlings flocking à la C.W. Reynolds.

Vince Mo/Flicker

The concepts of emergence, irreducibility, and adaptation are well illustrated by the example of flocking birds. The “V”-shaped pattern that migrating birds naturally adopt in flight is the classic example used to convey the concept of emergence. Emergence and irreducibility are closely related in the sense that the overall system, a bird formation, cannot be understood by studying the parts in isolation, i.e., a single bird. Adaptation is illustrated by the fact that if the leading bird is removed, any other bird can take the lead position. Similarly, some terrorist groups have taken on the structure of informal local groups in which any member can take on the role of leader.19 Likewise, the organized society of an ant colony is determined not by the dictates of the queen, but by local interactions among thousands of worker ants.20 This order arises, despite the absence of a centralized authority, through interactions between components governed by cooperation and competition.

V formation geese

500px.com/photo by Annette Gerard

It is remarkable how small a set of interaction rules can produce a very large number of outcomes in a complex system, and those same rules may lead to emergence. Referring to past examples, there were eight rules in Wolfram’s algorithm and three rules in Reynolds’ flocking birds. The game of chess has only a few dozen rules, yet after hundreds of years, we keep discovering new strategies for playing it. Conversely, ill-conceived rules defining local interactions between components can have undesired global consequences. For example, the blackout of 14 August 2003, during which 20 percent of the North American power grid went down, was the result of many local and interdependent interactions.
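To make the “eight rules” concrete: an elementary cellular automaton of the kind shown earlier updates each cell from a three-cell neighbourhood, so its rule table has

\[ 2^{3} = 8 \ \text{entries}, \qquad \text{and hence} \qquad 2^{8} = 256 \ \text{possible elementary rules}, \]

yet even one of these tiny rule tables can generate endlessly intricate patterns.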

Prior blackout 14 August 2003

NOAA

Blackout 14 August 2003

NOAA

The Complexity Approach

Advantages and Drawbacks

Complexity suggests new ways of thinking about problems, and new questions that should be answered. Some authors claim that complexity “… allows old concepts to be understood in different ways, allows for new generalizations about certain kinds of phenomena, and has unique concepts of its own.”21 Despite the limited predictability of complex systems, one may still draw valuable conclusions from studying them. Indeed, even though the weather is not fully predictable, meteorologists still know the space of possible outcomes.

The adoption of a complexity approach leads to outcome uncertainties. Just as a meteorologist knows that his or her forecasting model is not 100 percent accurate, a complexity practitioner must be comfortable with unanticipated outcomes and a less-than-perfect prediction capability. J.H. Holland,22 a pioneer in complex systems and non-linear science, states: “…with a careful research plan, under controlled conditions, using selected agents, complex systems do pretty much what they damn please.”

One of the greatest challenges for a complex system practitioner is that the outcome will be highly context or history dependent. A challenge for military applications is that commanders may find it difficult to rely upon systems that lack a quantifiable measure of effectiveness.23 Complexity, in its purest sense, is also challenging to use because it does not always indicate what people might need to do differently in specific contexts.24 These last two concerns can be partially addressed by testing distinct scenarios numerous times, and comparing their outcomes.

One of the most common tools for studying complex systems is a system model simulated on computers. The main difficulties with computer models are that they may lack scientific rigour, and that there is no consensus about the various model types or about their validity. Diane Hendrick, an active member of the Peace & Collaborative Development Network, raised an interesting question:25 “How useful and reliable can the model then be if the emergent properties are actually constrained by the model-makers interpretations?” Another difficulty with such computer models is calibrating them to produce strong correlations with real-world systems. Some critics claim that complexity models have shown only mixed results and limited applicability.

Points of Contention

The complexity approach also has its share of contention. In “What is Complexity Science, Really?”26, author S.E. Phelan states: “Complexity science introduces a new way to study nature’s laws that differs from traditional science. Complexity science posits simple causes for complex effects.” However, the fact that simple mathematical rules have occasionally generated behaviours similar to those found in nature or society does not prove that there exists a set of simple rules explaining every complex phenomenon in the world.

Clarity is often lacking about what “system” means in “complex systems.” An additional problem in modeling complex systems is that they often have fuzzy boundaries.27 After all, where does one ecosystem stop, and the next one start? This common situation poses serious challenges for calibrating computer models to produce strong correlations with real-world phenomena. One such example is the coupled ocean-atmosphere system, where neither component is independent of the other.

Many authors also confuse ‘complex systems’ with ‘complicated systems.’28 To clarify the difference between these concepts, consider a formation of migrating birds and a fighter jet. Both systems are made of multiple components interacting with one another. However, each fighter jet component has a clear role and function, and cannot adapt ‘on the fly’ as a flock of migrating birds can. Thus, the fighter jet is ‘complicated,’ but not ‘complex.’

Snow geese flying in arrowhead formation

National Geographic 1182545/Alaska stock images

Applying the Complexity Approach

Complexity concepts can be used individually or as part of an integrated approach to describe, understand, and model phenomena. Various methods can be used to study complex systems, and simulation using computer models is by far the preferred tool. Simulations allow a series of thought experiments to test various ‘what if’ scenarios.

The main categories of computer simulation for complex systems are system dynamics (SD), cellular automata (CA), and agent-based models (ABM). At the physical level, system dynamics is represented by equations obeying the laws of physics, while at the organizational level, system dynamics models are higher-level abstractions comprising feedback loops and inputs, where an input could be, for instance, a successful marketing campaign.
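In its simplest generic form (the variable names here are merely illustrative), a system dynamics model tracks stocks whose levels are driven by flows,

\[ \frac{dS(t)}{dt} = \text{inflow}(t) - \text{outflow}(t), \]

with feedback loops arising whenever the flows themselves depend on the current stock levels.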

A cellular automaton contains a large number of simple, identical components whose interactions are limited to neighbouring components. Each component has a finite set of possible values, and evolves in discrete time steps. An agent-based model involves a number of decentralized decision-makers (agents) interacting through prescribed rules.29
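The following Python fragment is a deliberately generic agent-based sketch, not a reproduction of any of the packages cited below: decentralized agents in one of two states interact with randomly chosen contacts under a single prescribed transmission rule, which is already enough to produce the non-linear spreading behaviour seen in rumours and epidemics. All states, probabilities, and population sizes are illustrative assumptions.

    import random

    class Agent:
        def __init__(self):
            self.state = "susceptible"

    def step(agents, contacts_per_agent=3, transmit_prob=0.1):
        """One discrete time step: each informed agent may inform random contacts."""
        informed = [a for a in agents if a.state == "informed"]
        for agent in informed:
            for contact in random.sample(agents, contacts_per_agent):
                if contact.state == "susceptible" and random.random() < transmit_prob:
                    contact.state = "informed"

    agents = [Agent() for _ in range(1000)]
    agents[0].state = "informed"            # seed the process with a single agent
    for t in range(50):
        step(agents)
    print(sum(a.state == "informed" for a in agents), "agents informed after 50 steps")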

A limited number of generic simulation packages exist for organizational and physical SD, whereas numerous packages offer CA and ABM capabilities. A recent review30 surveys more than 30 ABM simulation platforms for generic applications. Most CA applications enforce a specific rule set, and the best introductory example of a CA remains John Conway’s “Game of Life.”31 Among military-oriented applications, several nations, including the United States of America, New Zealand, and Australia, have developed sophisticated simulation packages geared toward specific needs.32

Gaining in popularity, agent-based models are used to explore a wide range of issues, from disease propagation and social networks, to manufacturing and combat.33 Agent-based sensitivity studies demonstrated that even a 50 percent reduction in air traffic would not dramatically slow the spread of certain types of pathogens. Policymakers thus now know that restricting air travel is unlikely to be the most effective policy tool for dealing with Severe Acute Respiratory Syndrome (SARS).34

Contagion spread of SARS outbreak

Christos Nicolsides/Juanes Research Group, MIT, at http://dx.plas.org/10.1371/journal.pone0040961

Military applications of agent-based models include studying the impact of degraded communications for army troops, investigating the integration and use of unmanned surface vehicles for naval operations, and exploring the impact of squad-size formations in an urban environment.35 A degraded-communication study36 evaluated how factors such as latency, maximum range, buffer size, accuracy, reliability, and jamming affected the ability of a networked force to conduct a company-level attack while using the Future Combat System (FCS). The FCS utilizes modern battlefield sensing, networking, and lethality features to engage the enemy at a standoff distance. Using MANA37 and the Caspian Sea as a fictitious battlefield area, the impact of the above communication factors was quantified by tracking the battle length and casualties. Through a large number of simulations with different settings, it was found that (1) a communication range resulting in coverage of less than 75 percent of the battle space has large and negative consequences, and that (2) a slow network is nearly as detrimental as a diminished communication range.

Another successful example of computer-based simulation relates to the coupling of DARNOS38 with a dynamic battle-space representation, thus enabling the analysis of different C2 (command and control) networking structures and the assessment of the operational effectiveness of a networked force.39 A squad-size study using agent-based models investigated the possibility of reducing the army infantry squad from 12 to 9 soldiers, while incorporating the futuristic concept of an Armed Robotic Vehicle (ARV).40

The study explored the impact of varying Blue Force characteristics, such as the squad size, the number of squads, the weapons, and sensor ranges, in an urban environment where Red Force troops operated. An interesting sensitivity outcome from the simulation results is that squads composed of 9 or 12 soldiers suffer a similar number of casualties as long as the ARV survives, but the survivability of smaller squads is greatly reduced when the ARV is disabled.

In the majority of cases, simulation outcomes can be classified into a relatively small number of distinct categories. As stated in “The Use of Complexity Science” by T.I. Sanders and J.A. McCabe,41 with respect to agent-based models in particular: “When used to support real world decision-making, these interactive computer-based models enhance our thinking and lead to better responses, fewer unintended consequences and greater consensus on important policy decisions.”

Among the lessons learned through the application of complexity theory, it is clear that the inherent context and history dependence of complex systems has implications in many fields. For instance, “… the success of a nation may be best explained not by its population’s virtues, its natural resources and its government’s skills, but rather simply by the position it took in the past, with small historical advantages leading to much bigger advantages later.”42 This last reference also highlights that in the realm of knowledge management and organizational learning strategies, ‘best practices’ may need to be replaced by ‘good principles,’ because what worked in the past may not work the next time around.

Another important lesson is that the quality of relationships between individuals may be more critical than the individuals themselves, just as a sports team with the best individual players behaving egoistically can lose to a cooperative team of less talented players. Complexity also implies that hierarchical organizations can never be as resilient as complex networks.43 Interesting military concepts include steering enemy forces either into a chaotic or an equilibrium mode.44 In the chaotic mode, the enemy force is subject to a decision overload in a short time frame, which can have a destabilizing effect. In the equilibrium mode, the enemy force moves closer to a linear behaviour, and thus becomes easier to predict and to defeat.

Members of Hamas’ national security forces demonstrate during a graduation ceremony at their destroyed security compound in Gaza, 2 December 2012.

Reuters RTR3B4Jn by Suhaib Salem

Complexity concepts are sought after because their outcomes often suggest unconventional and radical ideas, such as evolving at the edge of being out of control, where a system is most adaptive, flexible, and energized. Viewed through the lens of complexity, the system that can best and most quickly adapt will be the system that prevails. When a complexity viewpoint is adopted, “… one’s focus turns from knowing the world to making sense of the world, from forecasting the future to designing the future, from discovering the right force structure to keeping the force structure fluid, and from overcoming the limits of the system to unleashing the dynamic potential of the system.”45 This reinforces the belief that the “… capacity to tolerate uncertainty is a better predictor of success than straight cognitive ability.”46 Complexity “… also suggests that predicting the long-term future is less important ... than is maintaining the ability to learn and adapt to a rapidly changing and largely unpredictable environment.”47 A counter-intuitive complexity notion inspired by nature is that living organisms usually seek adequate solutions, as opposed to optimal solutions.

Complexity concepts have already been used to study various types of military operations.48 At the tactical level, concepts borrowed from complexity theory have led to new approaches for dealing with insurgencies and terrorists.49, 50 Interestingly enough, these tactics do not favour the eradication of specific members, but instead target their relationships. Making a network analogy, individual fighters, cells, tribes, and clans represent the network nodes, and relationships between nodes are the network links. In the present context, links between nodes could mean communication channels; financial, ideological, spiritual, or technological dependencies; sanctuary access; and so on. The proposed tactics recommend either reinforcing the network links to increase overall predictability, or ‘de-linking’ the loose base that provides the highest level of adaptation.

Both quantitative and qualitative assessments of results originating from complexity concepts have been performed. The successful military campaign of General Matthew Ridgway in the Korean War was qualitatively compared to complexity concepts to assess their potential as a basis for military practice.51 The comparative analysis showed strong correlations between complexity concepts and General Ridgway’s decisions throughout the conflict. A quantitative assessment was performed by comparing the outcomes of two agent-based models and JANUS, a commonly-used interactive high-resolution ground combat simulator, in the squad-size context described earlier.52 The outcomes of all three types of simulation showed strong similarities in identifying the key factors affecting squad performance.

General Matthew Ridgway

DefenseImagery.mil HD-SN-98-07578

On-Going Research & Dynamic Networks

The study of complexity was originally inspired by natural and social systems. Today, researchers are applying complexity concepts to understanding and designing man-made systems. A decade ago, D.G. Green and D. Newth raised the question, “How do we build artificial systems so that properties that emerge are the ones we want?”53 Since then, the emergence-simulation trend has morphed into an investigation of how one can instead influence emergent behaviours, knowing that “controlling a complex system” is an oxymoron.54 This change of attitude coincides with the fact that “complex networks,”55 a large class of complex systems, have recently experienced an explosion of research effort, driven by the need to better understand social networks, the propagation of diseases, electricity grid stability, and so on. Around the same time, the Information Technology professor D.G. Green56 demonstrated that any complex system inherits the properties of a very generic class of networks.

Today, questions related to network dynamics have significant momentum in the complexity community. This trend is likely to continue in the near future, as researchers are just starting to grasp the impact of local actions on large-scale networks, e.g., rumours spreading over social networks, or viruses propagating through computer networks. For many military operations, communication and data networks are critical for operating unmanned vehicles and off-board sensors. Given that the use of such systems will likely increase,57 we will soon face the challenge of managing heterogeneous networks, whose nodes have distinct capabilities and various levels of autonomy.

Internet blog map.

Matthew Hurst/Science Photo library

Current research on networks covers a wide range of activities pertaining to various network types, classified by their structures, their communication links, and their natural or man-made origins. One of the main questions is to determine the rules and connectivity necessary to prevent undesirable emergent behaviour. For instance, it can be shown that certain connectivity conditions are required for a network of distributed agents to reach a consensus by exchanging data.58 In the absence of such conditions, consensus cannot be reached, and each agent could hold a significantly different version of the truth, thus diminishing the potential for military mission success. Autonomy rules could also change dynamically, based upon the presence and configuration of the network, thus allowing a collective, rather than an individualistic, assessment of situations. For instance, once established, the network could enter a ‘survival’ mode and force mobile agents to manoeuvre in a formation pattern that favours strong network connectivity.
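A standard discrete-time consensus update of the kind analysed in the multi-vehicle literature (see Note 58) has each agent repeatedly nudge its value toward those of its network neighbours,

\[ x_i(k+1) = x_i(k) + \varepsilon \sum_{j \in \mathcal{N}_i} \big( x_j(k) - x_i(k) \big), \]

where \( x_i \) is agent \( i \)’s local estimate, \( \mathcal{N}_i \) its set of neighbours, and \( \varepsilon \) a small step size; the estimates converge to a common value when the communication graph is connected and \( \varepsilon \) is small enough, which is precisely the kind of connectivity condition alluded to above.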

Egypt influence network.

Kovas Boguta

Conclusion

Although complexity lacks integrated theoretical foundations, its concepts, tools, and principles are widely applicable to understanding and enhancing military effectiveness. Applications with potentially high benefits are those where the metaphor of life and living systems is a more adequate description than that of a machine operating with clockwork precision. The numerous examples presented here show that complexity concepts can impact military decisions at the tactical, operational, and strategic levels.

Largely unpredictable and uncontrollable, complex systems share distinctive traits across many disciplines. Whereas the original complexity research focused on investigating emergent behaviour in systems found in nature and societies, recent research trends include influencing the emergent behaviour of man-made systems.

Conclusions reached through complexity concepts often lead to unconventional guidance emphasizing autonomy, decentralization, and adaptation, while diminishing the importance of long-term predictions and rigid hierarchies. Such conclusions may encounter serious opposition from many establishments, including military formations, because they run somewhat contrary to the conventional way of thinking.

So, if your world is indeed complex, what are the advantages of adopting the complexity way of thinking? Complexity remains the most promising theoretical framework available today for studying questions pertaining to military structures and operations, owing to their strong similarities with how living organisms survive through adaptation, competition, and cooperation.

 

Robot soldier

Victor Habbick visions/ Science Photo Library

Acknowledgement

The author sincerely thanks Dr. Daniel Hutt for his thorough reviews which significantly improved the quality of this publication.

 

NOTES

  1. F. Heylighen, “What is Complexity?” Principia Cybernetica Web, 9 December 1996.

  2. D. Chu,  R. Strand, and R. Fjelland, “Theories of Complexity: Common Denominators of Complex Systems,” in Complexity, 2003, Vol. 8, No. 3, pp. 19-30.

  3. T. Plate, “Complexity Science as a New Strategic Tool,” in Quarterly Strategy Review CGEY Strategy & Transformation Practice Publication, April 2001, Vol. 1.

  4. G. Rzevski, “Application of Complexity Science Concepts and Tools in Business: Successful Case Studies,” 2009. Accessed on 20 September 2011 at www.complexitynet.eu.

  5. A Newtonian metaphor is the clock, i.e., finely tuned gears ticking along predictably and reliably keeping time.

  6. B.J. Zimmerman, “A Complexity Science Primer: What is Complexity Science and Why Should I Learn About It?” in Edgeware – Primer, 2000.

  7. “The Many Roots of Complexity Science.” Accessed on 19 September 2011 at http://tuvalu.santafe.edu/events/workshops/index.php/The_Many_Roots_of_Complexity_Science.

  8. W.R. Ashby, An Introduction to Cybernetics, (London: Chapman and Hall, 1957).

  9. Y. Bar-Yam, “Complexity of Military Conflict: Multiscale Complex Systems Analysis of Littoral Warfare,” Report for Contract F30602-02-C-0158, 2003.

  10. E.N. Lorenz, “Deterministic Nonperiodic Flow,” in Journal of the Atmospheric Sciences, 1963, Vol. 20, pp. 130–141.

  11. A.M. Saperstein, “War and Chaos,” in American Scientist, 1996, Vol. 83, pp. 548-557.

  12. S. Milgram, “The Small World Problem,” in Psychology Today, 1967, Vol. 2, pp. 60-67.

  13. J. Leskovec and E. Horvitz, “Planetary-Scale Views on an Instant-Messaging Network,” Proceedings of the 16th International Conference on World Wide Web, 2008.

  14. A.K. Shaw, M. Tsvetkova, and R. Daneshvar, “The Effect of Gossip on Social Networks,” in Complexity, 2010, Vol. 16, No. 4, pp. 39-47.

  15. S. Wolfram, “Statistical Mechanics of Cellular Automata,” Rev. Mod. Phys., 1983, Vol. 55, pp. 601–644.

  16. S. Camazine, “Patterns in Nature,” in Natural History, June 2003, pp. 34-41.

  17. C.W. Reynolds, “Flocks, Herds, and Schools: A Distributed Behavioral Model,” in Computer Graphics, 1987, Vol. 21, No. 4, pp. 25-34.

  18. S. Hauert, S. Leven, F. Ruini, A. Cangelosi, J.C. Zufferey and D. Floreano, “Reynolds Flocking in Reality with Fixed-wing Robots: Communication Range versus Maximum Turning Rate,” Proceedings of the IEEE International Conference on Intelligent Robots and Systems, 2011, pp. 5015-5020.

  19. M. Sageman, Leaderless Jihad: Terror Networks in the Twenty-first Century, (Philadelphia, PA: University of Pennsylvania Press, 2008).

  20. D. Mackenzie, “The Science of Surprise - Can Complexity Theory help us understand the Real Consequences of a Convoluted Event like September 11,” in Discover Magazine, 2002.

  21. B. Ramalingam, H. Jones, T. Reba, and J. Young, “Exploring the Science of Complexity: Ideas and Implications for Development and Humanitarian Efforts,” Development, Overseas Development Institute, 2008, Vol. 16, pp. 535-543.

  22. J.H. Holland, Hidden Order : How Adaptation builds Complexity, (New York: Helix Books, 1995).

  23. J.J. Goble, “Combat Assessment of Non-lethal Fires: The Applicability of Complex Modelling to Measure the Effectiveness of Information Operations,” School of Advanced Military Studies, AY 01-02, 2002.

  24. P. Beautement, and C. Broenner, Complexity Demystified: A Guide for Practitioners, (Axminster, UK: Triarchy Press, 2011).

  25. D. Hendrick, “Complexity Theory and Conflict Transformation: An Exploration of Potential and Implications,” University of Bradford, Center for Conflict Resolution, Department of Peace Studies, 2009.

  26. S.E. Phelan, “What is Complexity Science, Really?” in Emergence, 2001, Vol. 3, pp. 120-136.

  27. K.A. Richardson, P. Celliers, and M. Lissack, “Complexity Science: A Grey Science for the Stuff in Between,” Proceedings of the 1st International Conference on Systems Thinking in Management, 2000, pp. 532-537.

  28. J. Wendell, “Complex Adaptive Systems: Beyond Intractability, Conflict Research Consortium,” Boulder, CO: University of Colorado, October 2003.

  29. J.D. Farmer and D. Foley, “The Economy Needs Agent-based Modelling,” in Nature, August 2009, Vol. 460, pp. 685-686.

  30. R.J. Allan, “Survey of Agent Based Modelling and Simulation Tools,” Technical Report DL-TR-2010-007, Science and Technology Facilities Council, October 2010.

  31. M. Gardner, “Mathematical Games – The Fantastic Combinations of John Conway’s New Solitaire Game “Life””,  in Scientific American, October 1970, Vol. 223, pp. 120-123.

  32. To name a few, one finds the Irreducible Semi-Autonomous Adaptive Combat (ISAAC) and Enhanced ISAAC Neural Simulation Toolkit (EINSTein) from the US Marine Corps Combat Development Command and geared towards land combat, the Map Aware Non-uniform Automata (MANA) from New Zealand Defence Technology Agency used for modeling civil violence management and maritime surveillance and coastal patrols, the BactoWars from the Australian Defence Science and Technology Organization (DSTO) utilized to address problems in the littoral domain, the Conceptual Research Oriented Combat Agent Distillation Implemented in the Littoral Environment (CROCADILE), the Warfare Intelligent System for Dynamic Optimization of Missions (WISDOM), and the Dynamic Agent Representation of Networks of Systems (DARNOS) developed by the Australian Defence Force Academy and DSTO.

  33. V.E. Middleton, “Simulating Small Unit Military Operations with Agent-based Models of Complex Adaptive Systems,” Proceedings of the IEEE Winter Simulation Conference, 2010, pp. 119-134.

  34. Applications of Complexity Science for Public Policy – New Tools for Finding Unanticipated Consequences and Unrealized Opportunities, Organization for Economic Co-operation and Development (OECD), September 2009.

  35. T.M. Cioppa and T.W. Lucas, “Military Applications of Agent-based Simulations,” Proceedings of the IEEE Winter Simulation Conference, 2004, pp. 171-180.

  36. Ibid., pp. 173-175.

  37. See Note 32.

  38. Ibid.

  39. M.F. Ling, “Nonlocality, Nonlinearity, and Complexity: On the Mathematics of Modelling NCW and EB,” Proceedings of the 22nd International Symposium on Military Operational Research, 2005.

  40. Cioppa and Lucas, p. 178.

  41. T.I. Sanders and J.A. McCabe, “The Use of Complexity Science,” Washington Center for Complexity & Public Policy, October 2003.

  42. Ramalingam et al., p. 28.

  43. M.F. Beech, “Observing Al Qaeda through the Lens of Complexity Theory: Recommendations for the National Strategy to Defeat Terrorism,” Center for Strategic Leadership, Strategy Research Paper, 2004.

  44. P.J. Blakesley, “Operational Shock and Complexity Theory,” School of Advanced Military Studies, AY 04-05, 2005.

  45. C.R. Paparone, R.A. Anderson and R.R. McDaniel, “Where Military Professionalism meets Complexity Science,” in Armed Forces & Society, 2008, Vol. 34, No. 3, pp. 433-449.

  46. C. Rousseau, “Complexity and the Limits of Modern Battlespace Visualization, Command and Control,” in the Canadian Military Journal, Vol. 4, No. 2, Summer 2003, pp. 35-44.

  47. J. Gore, “Chaos, Complexity and the Military,” National Defense University, National War College, 1996.

  48. Examples include Command and Control, Strategic Planning, Stabilization, Support, and Peacekeeping Operations, Littoral, Air, and Asymmetric Warfare, Conflict Resolution, Common Operational Picture, Terrorism, and Network Robustness.

  49. D. Kilcullen, “Countering Global Insurgency,” in Small Wars Journal, 2004.

  50. Beech, pp. 1-16.

  51. E.D. Browne, “Comparing Theory and Practice – An application of Complexity Theory to General Ridgway’s Success in Korea,” Monograph, School of Advanced Military Studies, U.S. Army Command and General Staff College, 2010.

  52. Cioppa and Lucas, p. 178.

  53. D.G. Green and D. Newth, “Towards a Theory of Everything? - Grand Challenges in Complexity and Informatics,” in Complexity International, 2001, Vol. 8, pp. 1-12.

  54. F. Heylighen, “Complexity and Self-Organization”, Buildings, 2008, Vol. 5, No. 5, pp. 1-20.

  55. S.H. Strogatz, “Exploring Complex Networks,” in Nature, 2001, Vol. 410, pp. 268-276.

  56. D.G. Green, “Connectivity and the Evolution of Biological Systems,” in Journal of Biological Systems, 1994, Vol. 2, No. 1.

  57. Unmanned Systems Roadmap: 2007-2032, Washington, DC, Office of the Secretary of Defense, U.S.A. DoD, December 2007.

  58. W. Ren, R.W. Beard, & E.M. Atkins, “Information Consensus in Multivehicle Cooperative Control,” IEEE Control Systems, April 2007, Vol. 27, No. 2, pp. 71-82.