Class 13: Responses to Political Violence

Addressing violence online

Opening notes

Key concepts review

  • political violence - the use of force by a person or group with a political motivation/purpose
  • essay question example
  • concepts from previous class meetings

Klausur essay question example

  1. Broad introduction
  2. Elaborate in detail
  3. Describe examples
  4. Concluding summary

What can states do to prevent or reduce political violence? Describe several options and discuss advantages and disadvantages.

Key concepts (1)

  • political violence - the use of force by a person or group with a political motivation/purpose
    • e.g., assault, robbery, rioting, insurgency, assassination, terrorism, rebellion, guerrilla warfare and civil war, revolution
    • can be differentiated by nature of the objectives, the targets of attacks, the organisational structure of groups, and by the repertoire of actions

Key concepts (2) - causes of PV activity

  1. broad environment/contextual factors (macro-level)
    • preconditions: factors that set the stage for PV over the long run
    • precipitants: specific events that immediately precede the occurrence of PV
  2. circumstances and actors (meso-level)
    • PV as part of ‘strategy’ (for certain goals) and may be ‘rational’
  3. psychological variables that encourage or inhibit (micro-level)
    • ego-defensive needs, cognitive processes, socialization — but also normality
    • evolving dynamics of commitment, risk, solidarity, loyalty, guilt, revenge, isolation

Key concepts (3)

  • radicalisation (attitudinal) - social and psychological process of increased commitment to extremist political or religious ideology
    • mirrored by deradicalisation
    • cognitive alignment - recognition of some conditions as wrong → framing of those conditions as unjust and violence as just → singling out of specific responsibilities, and the demonisation of the other
  • engagement (behavioural) - participation in politically violent activity
    • mirrored by disengagement

Key concepts (4)

  • strategy - a combination of a claim (or demand), a tactic, and a site (or venue); alternatively, consisting of 3 elements:
    1. Targeting - who/what is being acted upon by tactics
    2. Tactics - types of collective action and manner of their performance
    3. Timing - some moments present greater opportunity than others

Key concepts (5)

  • radical subcultures - a cultural group within a larger culture with its own traits, beliefs, and interests, typically distinct from and sometimes at odds with the larger culture
  • leadership - Weber’s 3 ideal types: legal, traditional, charismatic

Key concepts (6)

  • foreign fighters - individuals who travel to a conflict zone from another territory (prima facie evidence of radicalism → engagement in political violence; a ‘security failure’ by authority of origin state?)
    • motivations: ideology, benefits, interpersonal connections, etc.
    • state response options: praise/support, ignore/disregard, programmatic intervention, criminal justice intervention, forceful intervention

Key concepts (7)

  • electoral violence - coercion directed towards actors and/or objects during the electoral cycle … part of a menu of electoral manipulation
    • intra-systemic violence - “try to win under the existing system”; “suppress or drown out the voices of political opponents”
    • anti-systemic violence - “depress participation as much as possible in order to undermine the legitimacy of the election”

Key concepts (8)

  • escalation: a rise in the frequency and/or severity of violent actions
    • framing logics, strategic logics, organisational logics, constituency/social logics
  • restraint: a deliberate restriction (either reducing or completely stopping) of violent actions
    • strategic logics, moral logics, logic of ego maintenance, logic of outgroup definition, organisational logic
  • dimensions of repression/social control - identity of repressive agent, character of repressive action, whether repressive action is observable
    • many repressive options…

Notable research findings (1), by class number

–2. (a) economic factors are not reliable predictors of terrorist activity; (b) social factors help drive right-wing terrorism (Piazza 2017)

–3. (a) paths of radicalisation: ideological, instrumental, solidaristic (della Porta 2018; Bosi and Della Porta 2012); (b) 5 barriers to mass violence: i. viewed as counterproductive, ii. preference for interpersonal violence, iii. changes in focus availability, iv. internal org. conflict, v. moral apprehension (Simi and Windisch 2020)

–5. post-conflict radical milieu can be key factor in mobilising for political violence (Metodieva 2022)

–7. common profile of ISIS foreign fighters: male, well-educated, urban, unmarried, and young (Morris 2023)

Notable research findings (2), by class number

–8. violence decreases turnout, but the effect is larger for anti-systemic violence; intra-systemic violence appears intended to selectively depress turnout among opposition supporters (Harbers, Richetta, and van Wingerden 2022); non-violent campaigns are more than twice as likely to achieve full or partial success compared to violent ones (Chenoweth and Stephan 2011): they are better at eliciting broad and diverse support, create more defections among the opposition, have a broader set of tactics at their disposal, and often maintain discipline even in the face of escalating oppression; violence complementing already high and continuing mobilisation is effective in making a regime more sensitive to protest costs (Kudelia 2018)

Notable research findings (3), by class number

–10. people may shift their attitudes about political violence… when a different movement poses a new situational variation (Setter and Nepstad 2022); extremists (esp. Islamists) gain more discursive space after attacks; politicians from right-wing parties were more visible than politicians from left-wing parties in political debates after extreme-right and Islamist attacks; the content of public debates after terrorist attacks was related to the ideological motive behind the attack; terrorist attacks reduce the public legitimacy of extremist actors and their political agenda in public debates; the legitimacy of Islam decreases to a greater extent after Islamist attacks than the legitimacy of nationalism does after extreme-right attacks (Völker 2023)

Notable research findings (4), by class number

–12. bans: attitude towards violence is not a clearly important factor; two key conditions: veto player agreement and (especially) securitization (Bourne and Veugelers 2022); bans can be motivated by social pressure mechanisms; (specific) visibility is important for bans; the German government applies an instrumental logic rather than a legal logic in banning decisions

Tips for preparing for Klausur

  • review class slides
  • reread your notes from readings
    • maybe (re-)read a couple of the required readings
  • think through cases you know of
  • think through other cases we discussed (through readings or your peers’ expertise)

  • don’t panic

Political violence online

  • Opening questions: what is political violence online?
  • Common uses of online tools by extremists
  • Counterspeech

Starting questions

  • What is violence online?
    • What forms does it take? (What qualifies as ‘violence’?)
    • Where does it happen?
    • Who are the perpetrators? Who are the victims?
    • Are there problems particular to violence online compared to elsewhere?
  • How have politically violent groups/actors that you know of used the internet?

Common uses of online tools by extremists

  • ‘agitprop’
  • recruitment
  • coordinating
  • training
  • financing

Extremist uses of online tools - agitprop

  • ‘agitprop’ (agitation and propaganda)
  • setting agenda: what issue(s) to focus on
  • spreading narratives: how to view those issue(s)
  • create and/or distribute related content: writing, pictures, audio, videos, games, etc.
  • use multiple channels (e.g., mainstream, like FB, Twitter; fringe, like 4chan, 8kun, gettr)
    • alternative news outlets

Extremist uses of online tools - recruitment

  • create/manage online spaces (forums, chatrooms, groups)
  • communicate with sympathisers
    • aided by agitprop that resonates with susceptible individuals
    • impart sense of purpose or belonging

Extremist uses of online tools - coordinating

  • encrypted messaging among group members
  • logistical preparations
  • plan and arrange (privately or publicly) meetings, events, protests, attacks

Extremist uses of online tools - training

  • guides or tutorials to operational activity:
    • fleeing to join group
    • avoiding detection
    • skills training (fighting, weapons, tools, hacking)

Extremist uses of online tools - financing

  • activities range from legal and/or Terms-of-Service-compliant to illegal and/or Terms-violating:
    1. Donations/self-funding
    2a. Sale of goods (merchandise, music, real estate, etc.)
    2b. Sale of services (memberships, events, etc.)
    3. Criminal activities

Models of ‘counterspeech’ - Saltman, Kooti, and Vockery (2021)

RQs:

Beyond measuring the basic metrics of reach and engagement, can [online intervention programmes] show behavioral change and/or sentiment shift in the intended target audience exposed to this content? Could exposure to counterspeech in at-risk or radicalized audiences perhaps have the unintended consequence of further radicalization, or act as a catalyst to the radicalization process? How best can private tech companies work with non-governmental organizations (NGOs) and experts in the PVE/CVE space?

  • goal? deradicalisation or disengagement?
  • model of policy remedy? (what actors are involved?)

FB’s P/CVE concept - (Saltman, Kooti, and Vockery 2021)

two counterspeech modes to test - (2021)

  • developed with FB’s Counterterrorism and Dangerous Organizations Policy team and the Safety Research team.
  • A/B model: activated by “a hard indicator of engagement with a violent extremist group or piece of content”; sends relevant counterspeech over a period of time
    • tested in English (UK) and Arabic (Iraq) spaces
    • partnered with (1) International Center on Security and Violent Extremism (ICSVE) (U.S.), (2) ConnectFutures (UK), and (3) Adyan Foundation (Lebanon)
  • Redirect initiative: assumes passive viewing and engagement with content can be a gateway to active engagements with extremism; aims to intervene ‘early’, connecting certain search terms to resources and redirects
    • informed by Life After Hate (U.S.) NGO

findings - (Saltman, Kooti, and Vockery 2021)

  1. no evidence that counterspeech does harm
  2. A/B test: among highest risk group, decreased engagement with violent extremist content observed
  3. focusing on behavioural signals to define an at-risk audience is helpful
  4. single, isolated signals of one shared piece of violent extremist content are misleading indicators of individual attitudes
  5. counterspeech videos must be short and clear
  6. Redirect Initiative (intervening on passive content searches) model yields increases in engagement with online resources and off-line practitioners and resources
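The A/B comparison behind finding 2 amounts to a difference in mean engagement between a treated and a control group. The numbers below are synthetic, purely to illustrate the design, not the study's data:

```python
import random

random.seed(0)

# Synthetic per-user engagement counts with violent extremist content:
# a control group (no counterspeech) and a treated group (shown
# counterspeech), where treatment is assumed to reduce engagement.
control = [random.randint(0, 10) for _ in range(500)]
treated = [max(0, x - random.randint(0, 3)) for x in control]

def mean(xs):
    return sum(xs) / len(xs)

# A negative difference is consistent with reduced engagement
# among the treated (highest-risk) group.
effect = mean(treated) - mean(control)
```

In the actual study the comparison is restricted to behaviourally defined risk tiers (finding 3), which this sketch does not model.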

a coda from recent research (Fielitz and Marcks 2021)

(Authors are writing about right-wing extremism—but their points apply more broadly.)

Counter-speech has become the most important form of action for projects against right-wing extremism on the Internet.

  • “supporters of far-right organisations are highly resistant to fact-based arguments”
  • poorly handled counter-speech risks “triggering defensive reflexes or serving as a stooge [that] far-right online activists can easily [instrumentalise]” — so may sometimes be better to remain silent, avoid elevating FR narratives
  • three dilemmas democratic actors face (cannot ‘fight fire with fire’):
    1. Polarisation dilemma: using emotional speech (similar to extremists’ speech) to counter extremist speech can fuel polarisation, undermining democracy
    2. Truth dilemma: spreading fake news (as some extremists do) is inconsistent with good-faith democratic values
    3. Mobilisation dilemma: (digital) mobilisation against extremist speech (a) likely increases attention for extremist speech and (b) can lead to a ‘digital arms race’

Poll: addressing extremism online

A QR code for the survey.

Take the survey at https://forms.gle/91eNe9j9fPzkqRVz5

  • Who should define what is extremist content?
  • Who should shape policy responding to extremist content?
  • What should predominant approach be?
  • Is deplatforming effective for dealing with online extremism?
  • Should criminal penalties exist for spreading disinformation?

Multi-platform activity (Mitts 2025) - regulation challenges

  • assumption: stronger action by platform companies will decrease extremists’ ability to exploit the internet
    • this assumption is plausible from an isolated-platform perspective, less so from a multi-platform perspective
    • platforms largely moderate content in isolation, but extremist actors coordinate activity across multiple platforms
  • adaptation mechanisms:
    • platform migration: move to alternative platforms
    • messaging: moderate discourse on regulated platforms
    • mobilisation: problematise platforms’ content moderation policies/practices

Regulation approaches (Gorwa 2024)

  • contest - legally binding, enforceable rules
    • executive orders; legislatures pass laws (e.g., data protection, competition regulation, consumer safety, cybersecurity)
  • collaborate - non-binding, voluntarily enacted rules designed with government input, occasionally featuring binding procedural constraints
    • may be agreed by a mix of industry, firm, and civil society stakeholders → implemented voluntarily by industry
  • convince - using existing channels to raise grievances rather than striving for new rules
    • often occurs in the shadows, with few public-facing elements
    • ‘cheap’ to pursue and implement → but also ‘cheap’ consequences (no binding sanctions, probably least impactful)
  • likelihood of approach success depends on political will (sufficient demand for change) and power to intervene (shaped by state’s market power, regulatory capacity, domestic and international context, and norms)
  • trend of platform governance hybridization

Major extant regulation, forums, etc. (cf. Gorwa 2024; Conway et al. 2023)

‘dangerous actors’ on Meta (Biddle 2021)

  • categories on Facebook’s blacklist: extreme right, Islamist, drug cartel, extreme left, Buddhist nationalist, separatist
  • e.g., CasaPound

CasaPound v. Facebook (e.g., Golia and Behring 2020)

  • CasaPound (e.g., Froio et al. 2020): (neo-)‘fascist’ organisation in Italy, highly active on digital media

  • 9.9.2019: Facebook deactivates CasaPound page (and representatives), arguing the content is ‘hate speech’ and ‘incitement to violence’, violating Facebook’s Terms of Use

  • CP argues (before the Court of Rome) that ‘it proposed an update of historical fascism that exclusively values its social policies, and that it has publicly condemned racial laws’; that it does not violate Facebook’s terms of service; and that CP is protected by article 21 of the Italian constitution

  • see further at: https://globalfreedomofexpression.columbia.edu/cases/casapound-v-facebook/

  • what is at stake here? who decides (and who ought to decide) what is impermissible, and who makes the policy responses?

CasaPound v. Facebook (e.g., Golia and Behring 2020)

  • Court decision:
    • ‘Facebook holds a special position and its mission aims to uphold freedom of expression’
    • CP page deactivation violated its rights as a political party (under article 49 of the Constitution)
    • ordered FB to reactivate page(s) and pay a penalty of 800 EUR for each day of deactivation

CasaPound v. Facebook (e.g., Golia and Behring 2020)

  • Facebook appealed (unsuccessfully), arguing it is ‘a private company operating for profit protected by art. 41 of the Constitution’ and that:

the order had erroneously attributed a special nature to the contract between the social network and the user, when it was instead an ordinary contract under civil law. In the absence of any legal basis, according to Facebook, it is not possible to attribute public service obligations to private sector players such as the protection of freedom of association and expression. Likewise, Facebook argued that it is not required to ensure special protection to some users such as organizations engaged in political activities by virtue of their role in the political debate.

  • Zuckerberg initially referred to Facebook as a ‘utility’…

Dealing with extremism online: effects, legitimacy

Predominant approach and deplatforming

  • poll questions: what should be the predominant approach? is deplatforming effective?

Deplatforming effects

  • diminishing the scale of influence (Ghaffary 2022)
    • Facebook, Youtube: billions of users
    • Parler, Gettr (e.g.): at most a few million users
    • Telegram: a few hundred million users, little regulation
  • the ‘whack-a-mole’ problem: extremist social media accounts removed, but reappear on other sites and/or under aliases

Deplatforming effects (Thomas and Wahedi 2023)

  • RQ: How does removing the leadership of online hate organisations from online platforms change behaviour in their target audience?
  • cases: six network disruptions (i.e., deplatforming) on Facebook
    • NB: the researchers are/were Meta employees
  • finding: network disruptions reduced the consumption and production of hateful content

The results suggest that strategies of targeted removals, such as leadership removal and network degradation efforts, can reduce the ability of hate organizations to successfully operate online.

  • BUT, finding comes from looking at one platform in isolation

Deplatforming effects (Chandrasekharan et al. 2017)

  • 10 June 2015, Reddit banned several subreddits, including: r/fatpeoplehate and r/CoonTown
  • RQ1: What effect did Reddit’s ban have on the contributors to banned subreddits?
  • RQ2: What effect did the ban have on subreddits that saw an influx of banned subreddit users?
  • findings:
    • many users from banned subreddits became inactive
      • led to a drop in Reddit users (some migrated to other platforms)… what’s the significance of this finding?
    • volume of active users’ posting mostly unchanged
    • a dramatic decrease in hate speech usage by the treatment users post-ban
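The core measurement behind that last finding is a pre/post comparison of hate-speech rates among treatment users. A minimal sketch with invented numbers (not the paper's data):

```python
# Share of each treatment user's posts containing hate-lexicon terms,
# before and after the subreddit ban (all values invented for illustration).
pre_ban = {"user_1": 0.12, "user_2": 0.30, "user_3": 0.05}
post_ban = {"user_1": 0.02, "user_2": 0.10, "user_3": 0.01}

def mean_rate(rates: dict) -> float:
    return sum(rates.values()) / len(rates)

# A negative change corresponds to the observed post-ban decrease
# in hate-speech usage by treatment users.
change = mean_rate(post_ban) - mean_rate(pre_ban)
```

The actual study matches treatment users against control users and measures usage of subreddit-specific hate lexicons, which this sketch omits.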

Content moderation, deplatforming legitimacy (Pradel et al. 2024)

  • key concept: Toxic speech as consisting of…
    1. incivility,
    2. intolerance, and
    3. violent threats
  • experimental design: randomly exposed people (in U.S.) to toxic speech social media posts → effect on users’ content moderation preferences
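The experimental logic above can be sketched as random assignment followed by a difference in means; the support probabilities below are assumptions for illustration, not the study's estimates:

```python
import random

random.seed(1)

def simulate_respondent(condition: str) -> int:
    """Return 1 if the respondent supports removing the post, else 0.
    Support probabilities are invented for illustration."""
    p = 0.6 if condition == "toxic" else 0.2
    return 1 if random.random() < p else 0

# Randomized exposure: each simulated respondent sees either a toxic
# or a neutral post; the difference in removal support estimates the
# effect of toxic content on moderation preferences.
toxic = [simulate_respondent("toxic") for _ in range(1000)]
neutral = [simulate_respondent("neutral") for _ in range(1000)]
effect = sum(toxic) / 1000 - sum(neutral) / 1000
```

Random assignment is what licenses the causal reading: any systematic difference in stated moderation preferences can be attributed to the exposure, not to pre-existing differences between groups.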

A coda: disinformation culpability

Should criminal penalties exist for spreading disinformation?

Any questions, concerns, feedback for this class?

Anonymous feedback here: https://forms.gle/NfF1pCfYMbkAT3WP6

Alternatively, please send me an email: m.zeller@lmu.de

References

Bailard, Catie Snow, Rebekah Tromble, Wei Zhong, Federico Bianchi, Pedram Hosseini, and David Broniatowski. 2024. “‘Keep Your Heads Held High Boys!’: Examining the Relationship Between the Proud Boys’ Online Discourse and Offline Activities.” American Political Science Review 118 (4): 2054–71. https://doi.org/10.1017/S0003055423001478.
Biddle, Sam. 2021. “Revealed: Facebook’s Secret Blacklist of Dangerous Individuals and Organizations.” The Intercept, October.
Bosi, Lorenzo, and Donatella Della Porta. 2012. “Micro-Mobilization into Armed Groups: Ideological, Instrumental and Solidaristic Paths.” Qualitative Sociology 35 (4): 361–83. https://doi.org/10.1007/s11133-012-9237-1.
Bourne, Angela K., and John Veugelers. 2022. “Militant Democracy and Successors to Authoritarian Ruling Parties in Post-1945 West Germany and Italy.” Democratization 29 (4): 736–53. https://doi.org/10.1080/13510347.2021.2012160.
Chandrasekharan, Eshwar, Umashanthi Pavalanathan, Anirudh Srinivasan, Adam Glynn, Jacob Eisenstein, and Eric Gilbert. 2017. “You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech.” Proceedings of the ACM on Human-Computer Interaction 1 (CSCW): 1–22. https://doi.org/10.1145/3134666.
Chenoweth, Erica, and Maria J. Stephan. 2011. Why Civil Resistance Works: The Strategic Logic of Nonviolent Conflict. New York: Columbia University Press.
Conway, Maura, Ashley A Mattheis, Sean McCafferty, and Miraji H Mohamed. 2023. “Violent Extremism and Terrorism Online in 2023.” Brussels: EU Radicalisation Awareness Network (RAN) Policy Report.
della Porta, Donatella. 2018. “Radicalization: A Relational Perspective.” Annual Review of Political Science 21: 461–74. https://doi.org/10.1146/annurev-polisci-042716.
Earl, Jennifer. 2007. “Leaderless Movement: The Case of Strategic Voting.” American Behavioral Scientist 50 (10): 1327–49.
Fielitz, Maik, and Holger Marcks. 2021. “When Counter-Speech Backfires: The Pitfalls of Strategic Online Interaction.” Global Network on Extremism & Technology. https://gnet-research.org/2021/07/26/when-counter-speech-backfires-the-pitfalls-of-strategic-online-interaction/.
Froio, Caterina, Pietro Castelli Gattinara, Giorgia Bulli, and Matteo Albanese. 2020. CasaPound Italia: Contemporary Extreme-Right Politics. Abingdon: Routledge.
Ghaffary, Shirin. 2022. “Does Banning Extremists Online Work? It Depends.” Vox, February.
Golia, Angelo Jr, and Rachel Behring. 2020. “Online Fascist Propaganda and Political Participation in CasaPound v. Facebook.” Verfassungsblog: On Matters Constitutional, February.
Gorwa, Robert. 2024. The Politics of Platform Regulation. Oxford: Oxford University Press.
Harbers, Imke, Cécile Richetta, and Enrike van Wingerden. 2022. “Shaping Electoral Outcomes: Intra- and Anti-systemic Violence in Indian Assembly Elections.” British Journal of Political Science, October, 1–17. https://doi.org/10.1017/S0007123422000345.
Kudelia, Serhiy. 2018. “When Numbers Are Not Enough: The Strategic Use of Violence in Ukraine’s 2014 Revolution.” Comparative Politics 50 (4): 501–21.
Metodieva, Asya. 2022. Foreign Fighters and Radical Influencers: Radical Milieus in the Postwar Balkans. Taylor & Francis.
Mitts, Tamar. 2025. Safe Havens for Hate: The Challenge of Moderating Online Extremism. Princeton, NJ: Princeton University Press.
Morris, Andrea Michelle. 2023. “Who Becomes a Foreign Fighter? Characteristics of the Islamic State’s Soldiers.” Terrorism and Political Violence, January, 1–19. https://doi.org/10.1080/09546553.2022.2144730.
Piazza, James A. 2017. “The Determinants of Domestic Right-Wing Terrorism in the USA: Economic Grievance, Societal Change and Political Resentment.” Conflict Management and Peace Science 34 (1): 52–80. https://doi.org/10.1177/0738894215570429.
Pradel, Franziska, Jan Zilinsky, Spyros Kosmidis, and Yannis Theocharis. 2024. “Toxic Speech and Limited Demand for Content Moderation on Social Media.” American Political Science Review 118 (4): 1895–1912. https://doi.org/10.1017/S000305542300134X.
Rauchfleisch, Adrian, and Jonas Kaiser. 2024. “The Impact of Deplatforming the Far Right: An Analysis of YouTube and BitChute.” Information, Communication & Society, May, 1–19. https://doi.org/10.1080/1369118X.2024.2346524.
Saltman, Erin, Farshad Kooti, and Karly Vockery. 2021. “New Models for Deploying Counterspeech: Measuring Behavioral Change and Sentiment Analysis.” Studies in Conflict and Terrorism 0 (0): 1–24. https://doi.org/10.1080/1057610X.2021.1888404.
Setter, Davyd, and Sharon Erickson Nepstad. 2022. “How Social Movements Influence Public Opinion on Political Violence: Attitude Shifts in the Wake of the George Floyd Protests.” Mobilization: An International Quarterly 27 (4): 429–44. https://doi.org/10.17813/1086-671X-27-4-429.
Simi, Pete, and Steven Windisch. 2020. “Why Radicalization Fails: Barriers to Mass Casualty Terrorism.” Terrorism and Political Violence 32 (4): 831–50. https://doi.org/10.1080/09546553.2017.1409212.
Thomas, Daniel Robert, and Laila A. Wahedi. 2023. “Disrupting Hate: The Effect of Deplatforming Hate Organizations on Their Online Audience.” Proceedings of the National Academy of Sciences 120 (24): e2214080120. https://doi.org/10.1073/pnas.2214080120.
Völker, Teresa. 2023. “How Terrorist Attacks Distort Public Debates: A Comparative Study of Right-Wing and Islamist Extremism.” Journal of European Public Policy, October, 1–28. https://doi.org/10.1080/13501763.2023.2269194.