Cognitive Warfare

From Wikipedia, the free encyclopedia

Cognitive Warfare (CW) comprises military activities that focus on manipulating perceptions, beliefs, and decision-making processes in order to alter cognitive brain function.[1][2] While cognitive warfare is a form of PSYOP that utilizes propaganda, disinformation, and similar tools, it is distinguished from other information-related activities by its objective - "its goal is not what individuals think, but rather, the way they think."[3] Cognitive warfare refers to the way that human thought, reasoning, sense-making, decision-making, and behaviour may be engineered not only through the manipulation of information, but also through the A.I./ML network of algorithms that push information through the bloodstream of the internet.[1][2][4]


Cognitive warfare evolved as an extension of information warfare (IW) and psychological operations (PSYOPs).[2] Operations in the information environment are traditionally conducted through five core capabilities - electronic warfare (EW), psychological operations (PSYOPs), military deception (MILDEC), operational security (OPSEC), and computer network operations (CNO).[3][5] On one hand, information warfare aims at controlling the flow of information in support of traditional military objectives, mainly to produce lethal effects on the battlefield.[3] On the other hand, cognitive warfare degrades the capacity to know and to produce foreknowledge, transforming how situations are understood and interpreted by individuals and in the mass consciousness.[2][3] Rather than Weapons of Mass Destruction (WMD), scholars such as Dr. James Giordano call this form of warfare "Weapons of Mass Disruption (WMD2)".[6] For this reason, academics have called this "the contemporary age of mutually assured destruction."[1]

Cognitive Warfare weaponry


Cognitive warfare starts with data. When data is matched with personally identifiable information (PII), A.I. predictive models can determine private traits and attributes such as personality and behavioural vulnerabilities.[7] Using this psychological and psychographic profile, an influence campaign is created and adjusted in real time by A.I./ML models until the desired cognitive and behavioural effects on the individual and/or population are achieved.[8][1][4] The influence campaigns used in cognitive warfare and cognitive operations are executed through the private influence industry known as Strategic Communications (StratCom).[1][9][10] Strategic Communications is utilized in multiple sectors as a method of advertising and public relations.[1] So while cognitive warfare refers to these activities when used in military settings, cognitive operations are not exclusive to the military sphere.[1][3][4] There are multiple methods used to achieve the desired effects on cognitive brain function through strategic communications - two of the most notable being microtargeting and attitudinal inoculation.[4]

The effects of cognitive warfare and cognitive operations are achieved through attitudinal inoculation by using information for a conflicting purpose.[3][11] When weak counterarguments are presented in support of a narrative, the receiver seeks out identity-supporting information that further strengthens their threatened position - thus building psychological perseverance mechanisms such as confirmation bias.[4] The held opinion, attitude, or interpretation of events then becomes resistant to a stronger attack (hence the medical terminology, which refers to a vaccine).[4] This methodology significantly increases the resilience of audiences and enables them to withstand attempts by others to change their opinions, decisions, or interpretation of events.[2][4][12][11]

Microtargeting is a suggestive recommender algorithm (SeRA).[1] It is similar to the native algorithms on websites that promote related content and suggested media based not only on what a user likes, but also on what similar users like. In the case of microtargeting, rather than being served content that a website's algorithms predict they will like, users are instead served microtargeted messages from the website's sponsors. Websites essentially sell this capability to the highest bidder as an advertising technique. But not all 'sponsored' material is labeled as such, and not all strategic communications campaigns are designed to look like advertisements (primarily in political and military StratCom).[1][4]
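The collaborative logic described above can be sketched in a few lines. The following is a toy illustration only - the function names, trait vectors, and message variants are invented for the example, not any vendor's or contractor's actual system: a message variant is chosen for a target user by averaging how the most similar users responded to each variant.

```python
import math

def cosine_sim(a, b):
    """Cosine similarity between two trait/interest vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def pick_message(target, users, responses, variants, k=2):
    """Pick the sponsored-message variant predicted to resonate most with
    `target`: find the k users most similar to the target, then average
    their recorded engagement with each variant."""
    sims = sorted(((cosine_sim(target, u), i) for i, u in enumerate(users)),
                  reverse=True)
    top = [i for _, i in sims[:k]]
    scores = {v: sum(responses[i][v] for i in top) / k for v in variants}
    return max(scores, key=scores.get)

# Example: a target whose trait vector resembles users 0 and 1, who both
# engaged more with the fear-framed variant than the hope-framed one.
target = [1.0, 0.05]
users = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]]
responses = [
    {"fear": 0.9, "hope": 0.1},
    {"fear": 0.8, "hope": 0.2},
    {"fear": 0.1, "hope": 0.9},
]
print(pick_message(target, users, responses, ["fear", "hope"]))  # -> fear
```

The only difference from an ordinary content recommender is what is being recommended: here the candidate items are a sponsor's message variants rather than the site's own media.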

When an audience is microtargeted only with information likely to resonate with them, they are algorithmically segmented into what are known as online echo chambers, which do not allow the group to see a conflicting counterbalance of content.[11] Algorithmically segmenting audiences into echo chambers has defensive and offensive military applications.[4] In military counteroperations such as counterinsurgency (COIN), doctrine calls for separating and isolating populations from one another both physically and psychologically.[4][13] Not only does this prevent the threat network from recruiting neutral and friendly populations, it also assists in the development of psychological perseverance mechanisms like confirmation bias - "significantly, the methodology increases the resilience of susceptible audiences and enables them to withstand foreign propaganda effects."[12][11] In doing so, it helps counterinsurgents and third-party counterinsurgents leverage the population and the operational environment against the threat network.[4][13]
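Algorithmic segmentation of this kind can be illustrated with a minimal clustering sketch - again an invented example, not a fielded system: users are linked whenever their interest vectors are mutually similar, and the resulting connected components are the isolated "echo chambers."

```python
import math

def segment_audience(users, threshold=0.8):
    """Partition users into clusters ('echo chambers') by linking any pair
    whose interest vectors exceed a cosine-similarity threshold, then
    returning the connected components via union-find."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)

    parent = list(range(len(users)))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    for i in range(len(users)):
        for j in range(i + 1, len(users)):
            if cos(users[i], users[j]) >= threshold:
                parent[find(i)] = find(j)  # union the two clusters

    groups = {}
    for i in range(len(users)):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Two clear interest blocs yield two mutually isolated clusters.
audience = [[1.0, 0.0], [0.95, 0.05], [0.0, 1.0], [0.05, 0.95]]
print(segment_audience(audience))  # -> [[0, 1], [2, 3]]
```

Once the components are formed, no cross-cluster edge exists above the threshold - the algorithmic analogue of the physical and psychological isolation the doctrine describes.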

But these algorithms, in native suggestive recommender systems and microtargeting alike, have sociological and psychological side effects. Decades of peer-reviewed research show that echo chambers, in the physical world and online, cause asymmetric and political polarization, extremism, confusion, dissonance, negative emotional responses (fear, anger, etc.), reactance, microaggressions, and third-person effects.[14][15][16][17][18][19][20][21][22][23][24][25][26][27] Because of psychological perseverance mechanisms like confirmation bias, this can be highly problematic, based on the work of Nyhan & Reifler (2010), who found that even attempting to correct false beliefs often reinforces rather than dispels those beliefs among the people who hold them most strongly. This is known as the backfire effect - "in which corrections actually increase misperceptions."[28][29][30][31] Through confirmation bias, the mechanisms utilized in military counteroperations can thereby lead to the backfire effect and exacerbate the very issues they are attempting to counter - extremism, dissonance, polarization, etc.[1][4]

Anna Vladimirova-Kryukova is an associate of NATO StratCom and a data protection officer at COBALT Latvia. Speaking at the 2019 Riga StratCom Dialogue on strategic communications in cognitive warfare settings, she asserted, "microtargeting really exists and its micro-effects also exist. And yes it can actually impact you and the whole society."[32] Vladimirova-Kryukova continued at the NATO conference, "in the case of microtargeting we really can manipulate the minds of people. And this can lead to very serious results and very serious problems to wars and very serious conflicts, biases and can also impact lives and can lead to losing lives. Or it can endanger lives. And unfortunately I had such cases in my practice so it's really happening."[32] And because these tactics are completely clandestine, the threshold for reacting to these threats has been blurred.[3][8]

Objectives of CW & downstream effects


The objectives of cognitive warfare are to shape/control an enemy's cognitive thinking and decision-making; to manipulate and degrade a nation's values, emotions, national spirit, cultural traditions, historical beliefs, and political will; to achieve adversarial strategic geopolitical objectives without fighting; to influence human/societal reasoning, thinking, emotions, et al. in line with specific objectives; and to degrade a population's trust in its institutions.[2] In doing so, this allows for the weakening/disruption of military, political, and societal cohesion, and the undermining/threatening of democracy.[2] CW is also used to leverage extremist groups to create chaos, political violence, and crisis.[4] Through CW, cultural genocide can be facilitated by targeting cognitive biases to create hate and generate racism.[2] Cognitive warfare has also been used by authoritarian societies to restructure society and groom populations to accept "continuous surveillance."[2] This allows these authoritarian societies to "remove individuals/outliers who resist and insist on Freedom of Speech, Independent Thinking, etc."[2] Some of the harmful effects of cognitive warfare can be mitigated with cognitive security - cognitive resilience built through educational training in areas such as critical thinking, media literacy, awareness of these capabilities, and other relevant topics.[1][2]

'Cognitive Domain' or 'Cognitive Dimension'?


Traditionally, there are five domains of war where military actions are taken - Air, Land, Sea, Space, and Cyberspace. These are the spaces where warfare is traditionally waged. Actions taken in these domains aim to achieve effects in one or more of the three dimensions of war - the human dimension, the physical dimension, and the information dimension. But because "the brain is and will be the 21st century battlescape,"[6] western scholars and defense circles are now divided over the addition of a new domain, some calling it the 'cognitive domain' and others the 'human domain.' The Chinese military has made its own determination, deciding on the cognitive domain, with all activities therein called 'cognitive domain operations (CDO)'.[33]

History of Cognitive Warfare weaponry


Cognitive warfare weaponry has multiple agnostic applications, including commercial, political, and covert IW and CW military operations.[2][11] Gary Bonick JR. was the first to publish the DARPA origins of this weaponry, exactly how it is deployed, and its impacts on mental health and democracy.[1][4][34] DARPA began research and development on "sentiment detection and opinion mining" for "influence operations" using automated and semi-automated systems on July 11, 2010, in a program called 'Social Media in Strategic Communications (SMISC)'.[34][9] Cognitive warfare and cognitive operations are waged through the influence industry known as Strategic Communications (StratCom).[1][4] The scope of DARPA's SMISC program was to create automated/semi-automated systems for "influence operations" - "inducing identities" (engineering behaviour through identity politics), modeling these "emergent communities," creating "bots in social media", "automated content generation", "sentiment detection and opinion mining", "crowd sourcing," and "narrative structure analysis," among other purposes.[34][9] Erwin Gianchandani, a directorate head at the U.S. National Science Foundation who has published extensively and presented at numerous international conferences on computational systems modeling of biological networks, has called this DARPA program "Warfare from Social Media."[10][35][36][34]

Just after the start of the SMISC program, DARPA, the intelligence community, and Boeing Phantom Works began funding research at the University of Cambridge, also for sentiment detection and opinion mining.[37][38][39][40] This research produced the breakthrough of determining private traits and attributes from digital records of human behaviour.[7] This gave DARPA's SMISC platform the capability to target based on metrics of mental health gauged by a Big Five (OCEAN) score - a measure of the five-factor personality traits used in psychology to assess personality and behavioural vulnerabilities.
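The general method reported in that research - fitting linear weights that map binary digital records (e.g. page "likes") to a personality trait score - can be sketched as follows. All data and numbers here are synthetic and the code is an assumption-laden illustration of the published approach, not the actual Cambridge or DARPA models:

```python
import random

# Synthetic stand-in for the trait-prediction setup: rows are users,
# columns are binary digital-record features, and the target is a
# self-reported Big Five trait score generated from hidden weights.
random.seed(0)
n_users, n_likes = 200, 20
true_w = [random.gauss(0, 1) for _ in range(n_likes)]
X = [[random.randint(0, 1) for _ in range(n_likes)] for _ in range(n_users)]
y = [sum(w * x for w, x in zip(true_w, row)) + random.gauss(0, 0.1)
     for row in X]

# Fit linear weights by stochastic gradient descent on squared error.
w = [0.0] * n_likes
lr = 0.01
for _ in range(500):
    for row, target in zip(X, y):
        err = sum(wi * xi for wi, xi in zip(w, row)) - target
        w = [wi - lr * err * xi for wi, xi in zip(w, row)]

def corr(a, b):
    """Pearson correlation between two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (z - mb) for x, z in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((z - mb) ** 2 for z in b) ** 0.5
    return cov / (va * vb)

# Predicted trait scores correlate strongly with the "true" scores.
pred = [sum(wi * xi for wi, xi in zip(w, row)) for row in X]
print(corr(pred, y) > 0.9)  # -> True
```

The point of the sketch is only that a modest linear model over behavioural records suffices to recover a trait score once enough labeled profiles exist to fit the weights.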

After the military-funded technology was built, Cambridge University opened a for-profit spin-off of its Psychometrics Center called 'Cambridge Personality Research.' This wing of the university offered a product called "Preference" to political clients.[41] For a fee, it would leverage people's data, gathered for academic purposes from Facebook, Twitter, etc., to target political audiences based on metrics of mental health, amongst other categorical variables.[41][42][43] Beyond the military-industrial complex, Facebook also partnered on this early cognitive research.[44] And it was Facebook that ended up with the patent for it.[45]

The technology eventually found its way to the premier NATO military contractor SCL Group.[46][47][48] SCL amalgamated this OCEAN modeling with its own version of the SMISC weapon, which it called the Ripon Platform.[46][47][48][49] Until 2015, SCL Group was the only private military contractor in the world licensed to deploy weapons-grade communications tactics.[50] Its methods were deemed "weapons grade communications tactics" and export-controlled by the British government.[51] The founder and owner of this premier NATO contractor also founded and owned the U.K.-based Influence Advisory Panel, advising on influence standards and reviewing the best practices of allied governments.[52][53][54] In 2015, the U.K. government lifted the export designation - just before the company bypassed U.S. election laws to deploy these weapons-grade tactics on behalf of the Trump campaign in the 2016 U.S. presidential election, through a front-facing shell company known as Cambridge Analytica.[51][55][56][57][58][59][60][61] During the 2016 presidential campaign, this premier NATO contractor targeted those who scored highly neurotic on their OCEAN profiles - specifically targeting paranoid-minded individuals having paranoid ideations with fear-based information and disinformation.[62][11][63][51] A whistleblower from the company, Christopher Wylie, came forward and explained on multiple occasions why this demographic was targeted with these cognitive warfare tactics - "with the view that these are the people that Steve Bannon would be able to use in order to spark an insurgency in the United States."[64][65] Less than two years after his statements, the United States saw what was determined to be an insurrection at the U.S. Capitol building in Washington, D.C., a trait of an insurgency.

During the presidential transition period, just weeks after the 2016 U.S. presidential election, Obama's State Department signed a contract with this same military contractor to adopt its strategies for the Global Engagement Center (GEC) to counter violent extremism. According to a December 2016 no-bid contract for the DoS GEC, the State Department stated: "In sum. After six years of research and countless demonstrations by companies wishing to sell their services, the GEC is not aware of any companies that approach the sophistication and effectiveness of SCL in designing data-driven influence campaigns that demonstrably work."[66][67]

The front-facing shell company was brought to the attention of Congress, but the single hearing held on the matter focused on data privacy protection rather than on what was being done with the data.[38] Congress was led to believe the military technology used in cognitive operations was snake oil, despite the aforementioned DoS comments and the company's premier status in the field of Strategic Communications.[38] Because SCL Group's front-facing shell company went defunct just prior to the hearing, the U.S. Congress took no further action to investigate the matter - despite the U.K. Parliament determining:

"SCL's alleged undermining of democracies in many countries, by the active manipulation of the facts and events, was happening alongside work done by the SCL Group on behalf of the UK Government, the US Government, and other allied governments. We do not have the remit or the capacity to investigate these claims ourselves."[68]

References

  1. ^ a b c d e f g h i j k l m Bonick JR, Gary. Political & Cognitive Warfare or Strategic Communications (in English/Spanish). ISBN 979-8321366752.
  2. ^ a b c d e f g h i j k l Masakowski, Yvonne, PhD. (April 11, 2022). "Newport Lecture Series: "Artificial Intelligence & Cognitive Warfare" with Yvonne Masakowski". YouTube.
  3. ^ a b c d e f g ""#RigaStratComDialogue | Strategy Talk by NATO General Paolo Ruggiero"". YouTube.
  4. ^ a b c d e f g h i j k l m n o Bonick JR, Gary. The Political-Military Complex: A Retrospect of Counterinsurgency (COIN) and Counter Violent Extremism (CVE). ISBN 979-8321176016.
  5. ^ Wilson, Clay, 'Information Operations, Electronic Warfare, and Cyberwar: Capabilities and Related Policy Issues', LIBRARY OF CONGRESS WASHINGTON DC CONGRESSIONAL RESEARCH SERVICE, 2007 Mar 20
  6. ^ a b ""Dr. James Giordano: The Brain is the Battlefield of the Future"". YouTube. October 29, 2018.
  7. ^ a b Kosinski, Michal; Stillwell, David; Graepel, Thore (March 11, 2013). "Private traits and attributes are predictable from digital records of human behavior". PNAS. Archived from the original on June 18, 2018.
  8. ^ a b ""Target Audience Analysis" – Joint Warfare Center" (PDF).
  9. ^ a b c "DARPA Solicitation – Social Media in Strategic communications". Archived from the original on May 31, 2023.
  10. ^ a b Gianchandani, Erwin (July 23, 2011). "DARPA: Learning Warfare from Social Media".
  11. ^ a b c d e f ""Cambridge Analytica whistleblower Christopher Wylie testifies before Congress – watch live"". YouTube.
  12. ^ a b ""Countering propaganda: NATO spearheads use of behavioural change science" | Date: May 12, 2015". Archived from the original on April 15, 2016.
  13. ^ a b "Field Manual 3-24 | INSURGENCIES AND COUNTERING INSURGENCIES (MCWP 3-33.5) (THIS ITEM IS PUBLISHED W/ BASIC INCL C1)" (PDF).
  14. ^ Bail, Christopher A.; Argyle, Lisa P.; Brown, Taylor W.; Bumpus, John P.; Chen, Haohan; Hunzaker, M. B. Fallin; Lee, Jaemin; Mann, Marcus; Merhout, Friedolin; Volfovsky, Alexander (September 11, 2018). "Exposure to opposing views on social media can increase political polarization". Proceedings of the National Academy of Sciences. 115 (37): 9216–9221. doi:10.1073/pnas.1804840115. PMID 30154168.
  15. ^ "Democracy for Realists: Why Elections Do Not Produce Responsive Government" | By: Achen CH, Bartels LM | Princeton Univ Press, Princeton | 2016
  16. ^ "Public Opinion and Policy in the American States" | By: Erikson RS, Wright GC, McIver JP | Cambridge University Press, Cambridge, UK | 1993
  17. ^ "When  the People  Speak:  Deliberative Democracy  and  Public Consultation" | By: Fishkin  JS | Oxford University Press, Oxford | 2011
  18. ^ "A   new  era   of   minimal  effects?   The   changing foundations of political communication" | J Commun 58:707–731 | By: Bennett   WL,  Iyengar   S | 2008
  19. ^ Sunstein C (2002) Republic.com (Princeton Univ Press, Princeton)
  20. ^ "The political blogosphere and the 2004 U.S. election: Divided they blog" | By: Adamic, L., & Glance, N. | 2005, August 21 | Paper presented at the 3th International Workshop on Link Discovery (pp. 36–43), Chicago, IL. New York, NY: ACM
  21. ^ Wollebæk, Dag; Karlsen, Rune; Steen-Johnsen, Kari; Enjolras, Bernard (April 2019). "Anger, Fear, and Echo Chambers: The Emotional Basis for Online Behavior". Social Media + Society. 5 (2): 1–14. doi:10.1177/2056305119829859.
  22. ^ Del Vicario, Michela; Bessi, Alessandro; Zollo, Fabiana; Petroni, Fabio; Scala, Antonio; Caldarelli, Guido; Stanley, H. Eugene; Quattrociocchi, Walter (2015). "Echo chambers in the age of misinformation". arXiv:1509.00189 [cs.CY].
  23. ^ "Mapping social dynamics on Facebook: The Brexit debate" | By: Del Vicario, M., Zollo, F., Caldarelli, G., Scala, A., & Quattrociocchi, W. | 2017 | 'Social Networks,' 50, 6–16.
  24. ^ "The filter bubble: What the Internet is hiding from you" | By: Pariser, E. | 2011 | New York, NY: Penguin Press
  25. ^ ""The Law of Group Polarization" | By: Cass R. Sunstein | John M. Olin Program in Law and Economics Working Paper No. 91, 1999".
  26. ^ ""Unintended Effects of Public Relations in Strategic Communication: A First Synthesis" | By: Jie Xu | 'Asia Pacific Public Relations Journal', 2019, Vol 20, p1 | ISSN: 1440-4389" (PDF).
  27. ^ ""Unintended Effects of Advertising: An Updated Qualitative Review" | By: Jie Xu | January 2020 | Review of Communication Research 8:1-16 | DOI:10.12840/ISSN.2255-4165.021".
  28. ^ Nyhan, Brendan; Reifler, Jason (2010). "When Corrections Fail: The Persistence of Political Misperceptions". Political Behavior. 32: 303–330.
  29. ^ Wills, Matthew (April 3, 2017). "The Backfire Effect".
  30. ^ "A dissonance analysis of the boomerang effect" | By: Cohen, Arthur R. | March 1962 | 'Journal of Personality' 30 (1): 75–88 | Doi:10.1111/j.1467-6494.1962.tb02306.x. | PMID 13880221
  31. ^ "Silverman, Craig (June 17, 2011). "The Backfire Effect: More on the press's inability to debunk bad information". Columbia Journalism Review, Columbia University (New York City)".
  32. ^ a b ""Discussion. Microtargeting and the ethics of the attention economy." | May 7, 2020 | 'NATO StratCom COE' YouTube Channel | recording of a breakaway discussion from the "Riga StratCom Dialogue 2019" that took place in Riga, June 11, 2019". YouTube. May 7, 2020.
  33. ^ Baughman, Josh. "How China Wins the Cognitive Domain" (PDF).
  34. ^ a b c d Bonick JR, Gary. History of Behavioural Engineering: Eugenics, The Mental Hygiene Movement & the Tavistock Institute. ISBN 979-8322255901.
  35. ^ Gianchandani, Erwin. "UVA Alumnus Aims to Accelerate Breakthroughs in Critical and Emerging Technologies as Head of New NSF Directorate".
  36. ^ "Erwin Gianchandani - National Science Foundation".
  37. ^ Kosinski, M.; Wang, Y.; Lakkaraju, H.; Leskovec, J. "Mining big data to extract patterns and predict real-life outcomes".
  38. ^ a b c "Cambridge Analytica whistleblower Christopher Wylie testifies before Congress – watch live". YouTube.
  39. ^ "The business of social networking". Archived from the original on August 28, 2011.
  40. ^ "IMSC News – USC Infolab – University of Southern California | February 2000" (PDF). Archived from the original (PDF) on September 2, 2023.
  41. ^ a b "Homepage – Products/Services – Preference tool".
  42. ^ ""Trait Prediction Engine" | University of Cambridge – The Psychometrics Center". Archived from the original on June 20, 2016.
  43. ^ ""With friends like these..." | PUBLISHED: April 22, 2011 | University of Cambridge – 'Research News'". Archived from the original on April 24, 2011.
  44. ^ ""CSAR Lecture: Dr. David Stilwell, on 'Big Data Psychometrics'" | November 4, 2019 | 'CSAR' YouTube Channel". YouTube. November 4, 2019.
  45. ^ ""US8825764B2 – Determining user personality characteristics from social networking system communications and characteristics – Google Patents"".
  46. ^ a b Albright, Jonathan (October 13, 2017). "Cambridge Analytica: the Geotargeting and Emotional Data Mining Scripts".
  47. ^ a b "Disinformation and 'fake news': Final Report Contents – Aggregate IQ". Parliament.uk.
  48. ^ a b Locklear, Mallory (April 13, 2018). "A look at the ad-targeting tools AggregateIQ left exposed online". Engadget.
  49. ^ ""Digital, Culture, Media and Sport Committee" | Wednesday May 2, 2018, Meeting started at 2.36pm, ended 4.16pm | Witnesses: Chris Vickery, Director, Cyber Risk Research, UpGuard".
  50. ^ "Countering propaganda: NATO spearheads use of behavioural change science". Archived from the original on April 15, 2016.
  51. ^ a b c "Digital, Culture, Media and Sport Committee Hearing | Tuesday April 17, 2018, Meeting started at 10.51am, ended 1.34pm".
  52. ^ "Three Crucial Questions about Target Audience Analysis" (PDF).
  53. ^ "Influence Advisory Panel – 'The Panel'". Archived from the original on August 19, 2014.
  54. ^ "Influence Advisory Panel - 'Nigel Oakes.'". Archived from the original on February 25, 2015.
  55. ^ ""Disinformation and 'fake news': Final Report – Digital, Culture, Media and Sport Committee"".
  56. ^ ""A RESPONSE TO MISSTATEMENTS IN RELATION TO CAMBRIDGE ANALYTICA"" (PDF).
  57. ^ ""THE DARK TRUTH ABOUT CAMBRIDGE ANALYTICA'S TIES TO TRUMPWORLD" | BY: MAYA KOSOFF | May 23, 2018, 12:28 PM". Vanity Fair. May 23, 2018.
  58. ^ ""Trump camp paid $5.9m to cambridge analytica from July 29 through Dec 12, 2016. Before then, ZERO h/t @lachlan" | 'Sam Stein' Twitter Post | @samstein". Archived from the original on June 9, 2019.
  59. ^ ""After working for Trump's campaign, British data firm eyes new U.S. government contracts" | By: Matea Gold and Frances Stead Sellers | February 17, 2017, at 6:03 p.m. EST". The Washington Post.
  60. ^ ""OFFICE OF THE DEMOCRATIC LEADER | Interview of CHRISTOPHER WYLIE"" (PDF).
  61. ^ ""INTERVIEW OF ALEXANDER NIX" | U.S. House of Representatives, Permanent Select Committee on Intelligence, Washington, D.C. | Thursday, December 14, 2017" (PDF).
  62. ^ ""Whistleblower: Cambridge Analytica Aimed To Trigger Paranoia And Racial Biases" | May 16, 2018 10:50 AM ET | By: Laura Sydell, Ian Wren". NPR.
  63. ^ ""Written Statement to the United States Committee on the Judiciary | In the Matter of Cambridge Analytica and other Related Issues"".
  64. ^ ""Christoper Wylie: Cambridge Analytica | The Project" | November 28, 2019 | 'The Project' YouTube Channel". YouTube. November 28, 2019.
  65. ^ "Whistleblower Explains How Cambridge Analytica Helped Fuel U.S. 'Insurgency'" | October 8, 2019, 2:45 PM ET | Heard on Fresh Air | By: Terry Gross. NPR. Archived from the original on October 8, 2019.
  66. ^ "U.S. Department of State Case #: F-2017-05742 | Doc #: C06612648 | July 30, 2018".
  67. ^ "Research, Strategy and Analysis | U.S. Department of State | Office of The Under Secretary for Public Diplomacy and Public Affairs | Global Engagement Center (GEC)".
  68. ^ ""Disinformation and 'fake news': Interim Report" | DCMS Committee | Fifth Report of Session 2017–19 | HC 363 | July 29, 2018" (PDF).