"Warfighting is fundamentally a human activity, in which humans choose what to do, consciously or subconsciously; rationally, irrationally or non-rationally", states Jim Storr. It is therefore susceptible to what behavioural economists call 'cognitive biases', expressed in heuristics – choices based on intuition that affect human judgement and weaken the theory of rational decision making. A realm so sensitive to cognitive biases provides fertile ground for actions designed to influence perceptions, and consequently the decisions of the rival, in our favour. For reasons elaborated below, Western armies find it difficult to meet the challenge of Perception Management. The time has come to connect cognitive and cultural biases to the conduct of war and to significantly expand the practice of perception manipulation.
It is reasonable for strategic and operational planners to envy crime-scene forensic analysts. The latter come to a scene after the event has occurred; if the police did their job properly, the scene will be sealed, and the analyst can do his job of collecting evidence in relative quiet and concentration; afterwards he takes the findings to a laboratory and uses the tools of science (chemistry, biology, physics…) to arrive at scientific conclusions. Nothing like the uncertainty, and often chaos, of the strategic environment, and even more so of the battlefield.
Researcher Itiel Dror reached a different conclusion. In an article titled How can Francis Bacon help forensic science? The four idols of human biases[i] he characterized the difficulties of forensic science in managing the biases of human thinking. He opens the article with a question that strategists and soldiers will empathize with: is forensic science actually a science and can one conduct a critical discussion of its paradigms? After concluding that it is indeed a science, he approaches it via Francis Bacon's four idols that bias human scientific research and uses them to analyze the failures of forensic scientists.
The first, Idola Tribus, defines our limitations as members of the human race, especially our difficulty in looking beyond the confines of our own point of view. Thus, Dror notes, the combination of high professionalism and the cumulative discovery of evidence causes forensic scientists to fix on a theory in the very first stages of their work, and then find it hard to abandon it for another.
The second, Idola Specus, focuses on the individual's personality, education, experience and world-view. Dror determines that forensic scientists see themselves as police detectives, and therefore identify with the need to use the evidence to point to a specific suspect – often one that is already in custody.
The third, Idola Fori, describes the effect of our interaction with our surroundings, especially our social and professional contacts. Dror notes that working together with police detectives causes the forensic scientists to adopt practices and terminology that do not necessarily support clear-cut scientific results.
The fourth, Idola Theatri, describes the determinations of blind faith based on narrow research and anecdotal observations. Dror argues that critical thinking is difficult when it might reveal human error in assessing the evidence.
Thus, in a relatively closed profession, based on precise scientific tools and affecting the fates of human beings, human biases can cause significant errors. Dror argues that the military profession is less sensitive to these errors, because its failures are exposed to all; soldiers therefore cannot avoid learning from their failures by retreating into closed and defensive behaviour.
I am sure that strategic and military planners will not agree with him. Our profession is a fertile field for biased judgements and erroneous decisions, which our rivals can exploit to their advantage in the struggle. In this article I will not reveal anything that is not known to the professionals in this field, and I am sure that on every topic I describe there is much more published material that I am not aware of and do not refer to.
My goal is to raise the discussion on the relationship between war, cognitive biases and Perception Management, and to argue that it is time for strategic and military planners to stop regarding it as a side-issue and focus their attention on it as is done in other areas of human activities such as politics and economics.
I have chosen to base my description of the relationship between human behaviour and the characteristics of war on Jim Storr's statement in his book The Human Face of War:[ii] "Warfighting is fundamentally a human activity, in which humans choose what to do, consciously or subconsciously; rationally, irrationally or non-rationally. Fundamentally, three things occur on the battlefield: men think, move and commit violence. All other activities support these functions"[iii].
He then continues: "The human, as an agent on the battlefield, can make rational choices… Conversely, soldiers may not always act rationally. Surprise, shock and fear all affect their behavior… the human is not just the victim of his emotions. He is an agent on the battlefield and he influences the outcome… His viewpoint or understanding of battle is highly important…"[iv].
Human perception is what leads to the results of a war: "…the normal condition for tactical success or defeat is the collective withdrawal of participation"[v].
Therefore the aim of the physical act in war is to affect human perception: "We seek a concrete mechanism which links manoeuvre and weapons effects to a reduction of individual and collective participation. The mechanism appears to involve shock, surprise and suppression"[vi].
Storr summarizes the characteristics of human battlefield decision-making: "Thus it seems that tactical decision-making should be very quick. It must 'deal with' many interrelated factors. It should aim to inflict damage whilst avoiding damage to one's own forces. It should exploit the strengths and weaknesses of the human beings involved in combat, both friendly and enemy. It is often undertaken in highly stressful circumstances, not least the fear of death or dismemberment. It should initiate, and accommodate the outcomes of, strong interactions between forces on the battlefield, be robust against rogue outcomes of those interactions and yet support the clear communication of intent from commanders to subordinates"[vii].
Beyond its physical dimension, war is a human phenomenon. Its results are determined by the understandings of people among themselves and between them and others. In his book Storr focuses on the tactical level. On the strategic level, which will not be analyzed here, some of these characteristics are strengthened because the activity is more on the cognitive level rather than in action; others are weakened because the personal survival of those involved is not immediately threatened. Therefore, war is profoundly affected by the biases of human thinking.
Until the beginning of the 1970s economic science based itself on the concept of homo economicus, claiming that people make rational economic decisions designed solely to maximize profit. Over time a new research field grew – behavioural economics – developing the analysis of cognitive biases, expressed in heuristics: choices based on intuition that affect human judgement and weaken the theory of rational economic decision making. This direction of research has broadened beyond economics and is now a central view of human behaviour.
Despite war being a human activity, the penetration of the cognitive bias concept into military theory has been slow and partial. One of the pioneers is Richard Heuer, a researcher of the Intelligence profession, who already in 1981 identified the effect of cognitive bias on intelligence estimates and the decisions based on them[viii]. He analyzed the effect of biases on intelligence work, pointed to the ability of deception operations to influence the intelligence picture and the decisions made by a rival, and offered solutions to the problem. His theory was based on the determination that "Circumstances under which human perceptions are most commonly distorted have significant implications for understanding the nature and limitations of intelligence analysis"[ix].
However, a search by the current author – admittedly neither methodical nor comprehensive – found that research and discussion on the relationship between war and cognitive biases has increased only in the last decade, and even then not at the core of military doctrinal debates[x].
Among other issues, the writers point out the following biases:
Availability Bias: the human tendency to estimate the probability of a current event based on examples readily available in memory, usually those that have occurred repeatedly. An example is the tendency of people to avoid flying after the September 11 attacks for fear of a repetition, resulting in heavier road traffic and more car accidents and deaths, even though, statistically, flying is much safer. A military example would be assuming a rival is deterred from certain actions because in the past he suffered heavily when acting in that manner.
Anchoring Bias: initial perceptions of an issue solidify, and it is very difficult to change them even when presented with conflicting facts. An example is the British success in planting in German minds the false impression that Cyprus was held by 20,000 soldiers, which led the Germans to choose not to conquer the island. Anchoring can thus create in the rival's thinking reasons to rule out certain courses of action a priori.
Confirmation and Disconfirmation Bias: emphasizing data that strengthen our concepts or weaken rival theories. The surprises at Pearl Harbor and in the Yom Kippur War are examples of channelling a rival's decisions to suit our needs by supplying him with false information.
Sunk Cost Fallacy: continued adherence to a course of action that is causing considerable losses only because we have already invested considerably in it. Exploiting it means causing the rival to adhere to a course of action that has already caused him considerable damage.
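To make the distortion concrete, the following sketch (my illustration, not drawn from any of the writers cited) contrasts a rational Bayesian analyst with a confirmation-biased one who only partially accepts evidence that undercuts his hypothesis. All probabilities and the discount factor are invented for the example.

```python
# Sketch: how confirmation bias distorts belief updating. A rational analyst
# applies Bayes' rule to every piece of evidence; a biased analyst discounts
# evidence that would lower his confidence in the favoured hypothesis.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Standard Bayesian update of P(hypothesis) given one piece of evidence."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

def biased_update(prior, likelihood_if_true, likelihood_if_false, discount=0.5):
    """Confirmation-biased update: only half of any downward shift is accepted."""
    posterior = bayes_update(prior, likelihood_if_true, likelihood_if_false)
    if posterior < prior:  # disconfirming evidence gets discounted
        return prior - discount * (prior - posterior)
    return posterior

# Ten pieces of genuinely disconfirming evidence, each twice as likely if the
# hypothesis is false (0.6) as if it is true (0.3).
rational, biased = 0.7, 0.7
for _ in range(10):
    rational = bayes_update(rational, 0.3, 0.6)
    biased = biased_update(biased, 0.3, 0.6)

print(f"rational analyst: {rational:.3f}")  # belief collapses
print(f"biased analyst:   {biased:.3f}")    # belief remains far higher
```

The rival who understands this mechanism need not disprove our hypothesis; he need only feed it, which is precisely the deception logic the surprises above illustrate.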
The various writers suggest possible solutions for managing these biases. One of the main problems is the method of military situational assessment and decision making – the OODA Loop of collecting information, analyzing it, making decisions based on it and implementing them – a process that is extremely sensitive to biases. Among the suggestions are:
Methods to enable decision makers to identify problems in their process: checklists and the addition of a formal analysis of competing hypotheses.
Red Teams, serving as Devil's Advocates and inserting external considerations to monitor the analysis and decisions.
Techniques for identifying rival deception efforts aimed at our decision-making process.
Conducting deception operations to create biases within the enemy's way of thinking.
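The analysis of competing hypotheses mentioned above, a technique associated with Heuer, can be sketched in miniature. The hypotheses, evidence items and scores below are invented for illustration; the point is the method's logic of rejection rather than confirmation.

```python
# Illustrative miniature of the Analysis of Competing Hypotheses (ACH).
# All evidence items, hypotheses and scores are invented for the example.
# Scores: +1 = consistent with the hypothesis, 0 = neutral, -1 = inconsistent.
EVIDENCE = {
    "troop trains moving north":   {"attack north": +1, "attack south": -1, "exercise": 0},
    "reserve units not called up": {"attack north": -1, "attack south": -1, "exercise": +1},
    "radio silence imposed":       {"attack north": +1, "attack south": +1, "exercise": -1},
}

def rank_hypotheses(evidence):
    """Rank hypotheses by how little evidence contradicts them.

    ACH works by rejection: the analyst counts inconsistencies rather than
    confirmations, which blunts the pull of a single favoured hypothesis.
    """
    hypotheses = {h for scores in evidence.values() for h in scores}
    inconsistencies = {
        h: sum(1 for scores in evidence.values() if scores[h] < 0)
        for h in hypotheses
    }
    return sorted(inconsistencies, key=inconsistencies.get)

ranking = rank_hypotheses(EVIDENCE)
print(ranking)  # 'attack south' is contradicted most and ranks last
```

Because the matrix makes every inconsistency explicit, it also shows a deceiver exactly which pieces of false evidence would most efficiently shift the rival's ranking.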
However, it is doubtful whether these methods will defeat the challenge. Williams summarizes: "The volatility, uncertainty, complexity, and ambiguity of our operating environment demand the military professionals make rapid decisions in situations where established military decision making process are either too narrow or ineffective… As a result, commanders find themselves engaged in more intuitive decision making… When subjective assessments, ego, and emotion are intertwined with cognitive processes, we realize that intuitive decision making is fraught with potential traps"[xi].
A realm so sensitive to cognitive biases provides fertile fields for actions designed to influence perceptions and consequently the decisions of the rival in our favour. I call this 'Consciousness Manipulation' or 'Perception Management'.
In recent years the world around the military profession, especially economics, marketing and politics, has been boiling with the manipulation of information – 'Fake News' – and of the emotions of those the manipulators wish to channel into making certain decisions. Research and writing in this field is at a peak, though some of it is not yet mature and is full of contradictions – see, for example, the contradictory studies of the effects of the Russian campaign to influence the 2016 U.S. elections.
Perception Management is the core of political campaigning. The following quotations are from a book by three leading Israeli strategy consultants[xii], but similar statements can be found in the writings of many others in the field: "Often it seems that campaigns' staffs want to encourage 'strategic voting' – to create a Bandwagon Effect[xiii], and will manipulate polls or pseudo-polls to cause swing voters to vote for them only because of the urge to join a winning team. The candidate must provide them a picture that shows him on the way to victory. There are many ways of achieving such data, mostly by the manner questions are presented and formulated"[xiv].
"Manipulations would not succeed if it was not for the media. During campaigns the media needs an infinite amount of materials, it will adopt and publish any piece of information found rolling around. Polls are an excellent way of pushing useless or fake information into the media"[xv].
"Politicians use 'spins' to divert public debate to topics comfortable to them. A 'spin' can repair the results of events that have already occurred or pre-empt a future event. 'Spins' are conducted by creating a story suitable to the teller; releasing facts and data useful for him…"[xvi].
"Donald Trump conducted a 2.0-generation social media campaign. He replaced Obama's infographics, structured video-clips and thought-out posts with personal tweets 'pulled from the cuff'. Most politicians work to have themselves referred to positively. Trump threw that rule out. Positively or negatively, all that mattered was to be talked about…"[xvii].
Recall Spiro Agnew's famous complaint: "The bastards changed the rules and didn't tell me".
The change in the military's attitude to Perception Management has already begun in Russia (and I am not referring to Western interpretation of what is mistakenly called the 'Gerasimov Doctrine'). In July 2018 a federal indictment was filed in the USA against 12 men accused of operating the Internet Research Agency LLC in St. Petersburg – the organization that conducted the interference operation in the American elections[xviii]. Though this is not written explicitly in the indictment, they are alleged to be agents of the Russian military intelligence (GRU).
The Chinese too have developed a Perception Management concept, based on their military's official Three Warfares doctrine: Public Opinion Warfare, Psychological Warfare and Lawfare.
Public Opinion Warfare precepts are: "'demoralize one’s opponent by a show of strength', 'create momentum to control the situation', 'assail strategic points', and 'seek the avoidance of injury'. In particular, it is critical to be the first to release information in a contingency and actively guide public opinion in order to achieve and preserve the initiative on the 'public opinion battlefield'. Beyond efforts to exploit an adversary’s shortcomings, the opponent’s attempts to engage in public opinion warfare must also be countered. For example, this approach is reflected in Beijing’s attempts to influence domestic and international public opinion with regard to the U.S. role in Asia…"[xix].
However, in the West official doctrines avoid explicitly delving into Perception Management. Thus, the definition of Information Operations in American doctrine:
"Information operations is the integrated employment, during military operations, of information-related capabilities in concert with other lines of operation to influence, disrupt, corrupt, or usurp the decision making of adversaries and potential adversaries while protecting our own (JP 3-13). An information-related capability is a tool, technique, or activity employed within a dimension of the information environment that can be used to create effects and operationally desirable conditions (JP 3-13). Examples of information related capabilities (IRCs) include military information support operations (MISO), military deception, operations security, public affairs, electronic warfare (EW), civil affairs operations (CAO), and cyberspace operations"[xx].
In the confined framework of this article there is no room to elaborate the failings of the American Information Operations doctrine quoted above, but from the first paragraph it is clear that information is regarded, first and foremost, as a functional medium, while the aspect of perceptions is secondary; that the basis for Information Operations is procedural and organizational rather than qualitative; that it is a very complicated integration of dozens of secondary efforts (some 30 different bodies are listed as involved in the Information Operations situational assessment); and that the concept is focused on integration with military operations and has no element of an independent effort.
The British discussion of Strategic Communications reveals a deeper understanding of the change:
"Information can no longer be routinely subordinated to the more familiar and comfortable concepts of manoeuvre and force. Too often in the past we have placed information on the periphery of our operations, failing to understand that reinforcing, or changing, the attitude and behaviour of selected audiences can have equal, if not greater, utility than force in securing operational objectives"[xxi].
The British manual continues with a description of the effect of perceptions on achieving strategic goals. However, when it discusses application it states: "Strategic communication requires the co-ordinated use of different information capabilities of Defence such as information operations including psychological operations and presence, posture, profile alongside defence diplomacy and in conjunction with other levers such as manoeuvre and fires. Co-ordination of these information capabilities with Media Operations or Public Affairs is also required and oversight by a unifying information or communication authority is helpful. However, while all communication should be coherent, a firewall must exist between the routine conduct of Media Operations and other influence activities in the operational space, which could include operational and tactical deception. This firewall helps meet Defence's obligation to inform truthfully"[xxii].
This example shows the difficulty of the military establishment in adjusting to the transformation of the realm of perceptions: it focuses on the coordination of complex efforts at the expense of stratagems that would enable creating and imbuing an effective and winning narrative, and it restricts its capabilities by requiring an "obligation to inform truthfully". The result: an insufficiently professional, hesitant, under-resourced effort that does not strive to achieve a favourable decision in the struggle over perceptions.
Why do Western armies find it difficult to meet the challenge of Perception Management? We can assume a number of possible reasons – Perception Management is considered:
An un-military profession and therefore not the domain of military personnel.
Diverting attention and resources from the 'true story', that is the use of weapons to direct focused violence against the rival.
Immoral and unfair, in that it requires military personnel to lie, cheat and deceive. This stands in contradiction to the Russian perception, which holds that "this is an effective tool for strategic subversion, by which one can decide wars and cause a regime-change even without employing regular forces on the kinetic battlefield"[xxiii].
Liable to involve the army – a state institution – in manipulating the principles of democracy, especially the rights to express an opinion and vote freely.
Too far from the anti-intellectual and mechanistic norms of behaviour that dominate armies in peacetime; as Storr writes: "… a depressing picture of armies forgetting old lessons and of authoritarian senior commanders. To that should be added a tendency not to improve in peacetime…"[xxiv].
A continuation of the trauma caused by the immature and flawed concept of Effects-Based Operations, which Colin Gray described (as quoted by Storr) as "…both unmistakably banal and dangerously illusory"[xxv], and of which General James Mattis noted: "It is my view that EBO has been misapplied and overextended to the point that it actually hinders rather than helps joint operations"[xxvi].
The Time Has Come
The time has come for a significant change in the way the Perceptions Domain is viewed in the military profession. Clausewitz identified the importance of the perception aspect when he wrote: "One might say that the physical seem little more than the wooden hilt, while the moral factors are the precious metal, the real weapon, the finely-honed blade… History provides the strongest proof of the importance of moral factors and their often incredible effect…"[xxvii].
Luttwak identified the complexity of the perception aspect of war when he defined two phenomena: the paradoxality of war – "the entire realm of strategy is pervaded by a paradoxical logic very different from the ordinary 'linear' logic by which we live in all other spheres of life. When conflict is absent… whenever, that is, strife and competition are more or less bound by law and custom, a noncontradictory linear logic rules, whose essence is mere common sense. Within the sphere of strategy… another and quite different logic is at work and routinely violates ordinary linear logic by inducing the coming together and reversal of opposites. Therefore it tends to reward paradoxical conduct while defeating straightforwardly logical action, yielding results that are ironical or even lethally damaging";[xxviii] and the 'post-heroic era' – "none of the advanced low-birth-rate countries of the world can play the role of a classic Great Power anymore… their societies are so allergic to casualties that they are effectively 'de-bellicized' or nearly so."[xxix]
Developments in information technologies, and the changes in human culture that stem from them, only amplify these two phenomena and the connection between them. The claim, which must be examined in depth, is that today paradoxality is penetrating realms that were considered linear, such as politics and business; that the post-heroic attitude strengthens the exploitation of the perception aspects over actual combat; and that the connection between them greatly increases the centrality of perception manipulation as an effective tool.
One can today claim that the perception aspect is central to the conduct of conflicts. Consciousness has become a much more powerful intermediary between the physical military action and its effects on the conduct of the rivals involved in it. The understanding that tactical events are translated into strategic meaning in the perception medium is very old; but the changes in technology and human culture have raised the potential influence of this activity to unprecedented levels of significance for the conduct of war.
If we accept this conclusion, then alongside the continued study and analysis of this activity, the following efforts are required:
Changing the cultural perceptions of the military to honour the arts of lying, manipulating, faking and deceiving in achieving strategic goals without recourse to violence. What is more moral and correct than to prevent the slaughter of people by lying to them? Western military personnel struggle to digest this idea, but the general public is more familiar with manipulation and finds it more acceptable than killing people.
Within this framework we must adopt practices characterizing political election campaigns. Wars are similar to election campaigns: they have clear goals; more or less clear time frames; they are a 'one shot' attempt to influence the situation; and they require rapid decision making to paralyze opposing perception manipulations and achieve domination of our own narrative. Not for nothing do political campaigns make such extensive use of the terms and practices of military campaigns. The relevant cognitive and cultural biases must be identified so as to exploit all of Francis Bacon's four idols, and we must charge ahead with them without shame or apology.
Perceptual manipulation requires a considerable investment of resources. Collecting and analyzing the relevant data requires diverting intelligence assets from divining the exact location of the enemy for targeting to understanding the enemy's perceptions and the enemy target-audiences relevant to our messages. It requires including the perception issue as a central element in the planning process, understanding that perceptions are the only bridge between tactical actions and strategic goals, so that actions must be aimed and designed to impact them; and, therefore, that investing forces and efforts in operations focused on the manipulation and influence of perceptions is a central element of the tactical action.
It is important to exploit the West's almost total domination of information technologies. The ability of the resource-poor ISIS to use Western information technologies for many months in order to manipulate perceptions is a badge of shame marking the failure of Western defence and military organizations to adapt to reality. The West, should it decide to do so, can eliminate its rivals from the perception domain with defensive and offensive methods.
It is important to understand that efforts can fail: there is enormous room here for trial and error in developing a concept of operations. Thus, in order to create an effective Anchoring Bias in the rival's perceptions, it is necessary to be the first to raise the issue in question so as to dominate the 'discussion'. Identifying the issues, planning and implementing the manipulation constitute a complex challenge. Another challenge is that in a political campaign it is enough to achieve 50% plus one of the votes, sometimes even less, to win, whereas in war the result must be much more lopsided. There are many dilemmas: for example, the rapid, almost absolute connectivity of today's information domain creates a situation in which a manipulation conducted on the rival is also completely exposed to the international audience and, worse, to one's own home audience. How can we overcome this? In the First Gulf War General Norman Schwarzkopf chose to lie to the audience of a press conference in order to strengthen his deception of the Iraqi enemy, and justified this in retrospect by the need for democracy to defend itself and reduce human casualties on both sides.
The time has come to connect cognitive and cultural biases to the conduct of war and to significantly expand the practice of perception manipulation. This is the present and, even more so, the future. En route we shall be required, as Itiel Dror tries to show for the profession of forensic science, to conduct a deep analysis of the military profession, debating the difficult questions of the artistic and scientific elements required in it and its ability to adapt to the changing perceptual environment surrounding it.
[i] Itiel E. Dror, "How Can Francis Bacon Help Forensic Science? The Four Idols of Human Biases", 50 Jurimetrics J. 93–110 (2009).
[ii] Jim Storr, The Human Face of War, Continuum: Birmingham war studies series, Cornwall, Great Britain, 2009.
[iii] Storr, p. 36.
[iv] Storr, pp. 48 – 49.
[v] Storr, p. 51.
[vi] Storr, p. 83.
[vii] Storr, pp. 130 – 131.
[viii] Richards J. Heuer, Jr, "Strategic Deception and Counterdeception: A Cognitive Process Approach", International Studies Quarterly, Vol. 25, No. 2, Symposium in Honor of Hans J. Morgenthau (Jun., 1981), pp. 294-327.
[ix] Heuer, p. 296.
[xi] CPT Chen Jing Kai, "Cognitive Biases: The Root of Irrationality in Military Decision-Making", Pointer, Journal of the Singapore Armed Forces, Vol. 42, No. 2.
[xii] Nigel Dobson-Keeffe and Major Warren Coaker, "Thinking More Rationally: Cognitive Biases and the Joint Military Appreciation Process", Australian Defence Force Journal, Issue 197 (2015).
[xiii] Iain King, "What Do Cognitive Biases Mean for Deterrence?", Real Clear Defense, February 12, 2019.
[xv] Cmdr. Tony Schwarz, "The Psychology of Operational Planning", Armed Forces Journal, March 20, 2014.
[xvii] Major Blair S. Williams, "Heuristics and Biases in Military Decision Making", Military Review, September-October 2010.
[xviii] Williams, p. 68.
[xix] Eyal Arad, Moshe Gaon and Erez Yaakobi, Killer Instinct: The Complete Guide for the Candidate and the Campaign Manager, Rishon LeZion: Miskal – Yediot Ahronot, 2018 (Hebrew).
[xx] The 'Bandwagon effect' is the phenomenon of a popular trend attracting even greater popularity, a result of the 'follow the crowd' mentality. In the barnstorming of the William Jennings Bryan campaign of 1900, bandwagons carrying musicians and candidates rolled through towns, and local politicos literally began hopping on the bandwagons to endorse, show support for and generate enthusiasm for their candidate. http://www.wordwizard.com/phpbb3/viewtopic.php?f=7&t=6642
[xxi] Eyal Arad, Moshe Gaon and Erez Yaakobi, p. 149.
[xxii] Eyal Arad, Moshe Gaon and Erez Yaakobi, p. 151.
[xxiii] Eyal Arad, Moshe Gaon and Erez Yaakobi, p. 178.
[xxiv] Eyal Arad, Moshe Gaon and Erez Yaakobi, pp. 214-215 (Srulik Einhorn: 'The Twitting politician').
[xxv] internet_research_agency_indictment.pdf, https://www.justice.gov/file/1035477
[xxvi] "Twelve Russians charged with US 2016 election hack", BBC News, 13 July 2018 https://www.bbc.com/news/world-us-canada-44825345
[xxvii] Elsa B. Kania, "The PLA’s Latest Strategic Thinking on the Three Warfares", ChinaBrief Volume XVI, Issue 13, August 22, 2016, pp. 10 -14.
[xxviii] Headquarters, Department of the Army, ATP 3-13.1: The Conduct of Information Operations, October 2018, chapter 1, 1-1 https://fas.org/irp/doddir/army/atp3-13-1.pdf
[xxix] Ministry of Defence, Joint Doctrine Note 1/12 (JDN 1/12) Strategic Communication: The Defence Contribution, p. V.
[xxxi] Joint Doctrine Note 1/12 (JDN 1/12) Strategic Communication, p. 1 – 8.
[xxxii] Dima Adamski, Cybernetic Operational Art, Eshtonot 11, August 2015, p. 36.
[xxxiii] Storr, p. 187.
[xxxiv] Storr, p. 12.
[xxxv] James N. Mattis, "USJFCOM Commander’s Guidance for Effects-based Operations", Parameters, Autumn 2008, p. 18.
[xxxvii] Clausewitz, C. von, On War, Howard, M. & Paret, P, (ed. and trans.), Everyman's library, 1993, p. 217.
[xxxviii] Edward Luttwak, Strategy: The Logic of War and Peace Revised and Enlarged Edition, The Belknap Press of Harvard University Press, 2001, p. 2.
[xxxix] Luttwak, pp. 73 - 74.