Syria’s transition after the overthrow of the Bashar al-Assad regime in December 2024 marks a historic turn, but the sociopolitical and psychological legacies of violence remain deeply entrenched. Peacebuilders in Syria and in the diaspora are now engaged in nascent efforts to promote healing and reconciliation across communities. To be successful, peacebuilding efforts must extend to the digital information landscape. In the aftermath of war, many algorithmic systems – powered by artificial intelligence – continue to amplify digital content that focuses on grievances and divisions and perpetuates misinformation. If these algorithmic influences go unchecked, they risk reproducing polarization and impeding trust in institutions and peace processes.
Syrians now face a transitional phase marked by political volatility, where the narratives of legitimacy and national identity are actively renegotiated. While armed hostilities have largely ceased, the informational terrain remains divisive and volatile: algorithmic recommendation systems on social media continue to shape how Syrians interpret the past war, and most critically, how they imagine the future. Under the Assad regime, official media served as tightly controlled instruments of state messaging, pushing many Syrians toward social media platforms like Facebook for uncensored local news. After the regime’s collapse, the absence of clear, trusted official information channels has left social media to fill the resulting information vacuum, deepening fragmentation and competing interpretations of events.
Building peace and fostering social integration in transitional contexts requires moving toward shared understanding about the past and common horizons for the future. Yet the algorithmic systems that organize and shape online spaces are often invisible and difficult to analyze, let alone influence. Users are rarely aware of why they see certain content, how recommendations are generated, or how repeated exposure influences perception. Algorithmic curation differs from traditional editorial judgment. Instead of structured gatekeeping, algorithms amplify patterns of interaction, which in post-conflict Syria can include anger, fear, and grievance as dominant drivers. The result is not a neutral reflection of public sentiment but a digitally mediated construction of reality that may deepen social divides.
Algorithmic amplification contributes to “symbolic violence,” defined as the imposition of interpretive frameworks that normalize certain narratives while marginalizing others. In post-war transitions such as Syria’s, this can manifest in several ways:
- Reinforcing “echo chambers” of polarized interpretations of the conflict’s causes and actors, impeding mutual recognition of suffering.
- Prioritizing emotionally charged narratives over deliberative, reconciliatory discourse.
- Undermining institutional credibility when platforms magnify misinformation or partisan agendas.
Post-conflict transitions are particularly vulnerable to algorithmic harms. Unlike stable democracies with robust media ecosystems, Syria’s newly forming civil and political institutions are still gaining legitimacy. Syrians are turning to online platforms for updates on governance, service delivery, and security. In this vacuum, misleading or fabricated content circulates rapidly through trust-based networks, particularly on encrypted messaging platforms such as WhatsApp, where rumors and sectarian narratives are difficult to verify or correct. Competing actors can exploit this uncertainty to sow mistrust. This has direct implications for how Syrians assess transitional justice and the legitimacy of new institutions. Without some degree of narrative convergence, the transitional government’s efforts to move forward risk being perceived as partial, exclusionary, or even illegitimate.
In this context, those working to build peace in the aftermath of war need to factor in and engage the heightened importance of AI-powered digital spaces. Specifically, they should take the following steps:
- Integrate digital dynamics into conflict analysis and peace process design. Peace actors must treat algorithmic mediation not as a peripheral technical issue but as a central component of conflict dynamics and of shaping post-conflict transitions.
- Promote shared narratives online to reinforce transitional justice initiatives. Anticipating algorithmic distortions, programs that encourage collective memory-making and truth-telling should incorporate digital engagement into communication strategies.
- Embed digital literacy in reconciliation programming. Beyond addressing fact verification, digital literacy efforts can give citizens a greater ability to critically interpret algorithmically mediated content and interrogate its role in shaping meaning.
- Engage with technology platforms to encourage risk mitigation. Local actors and international partners should engage with technology companies to discuss ways to mitigate harms in conflict-affected contexts, focusing on reducing amplification of divisive narratives during sensitive transitional periods.
Syria’s transition after the overthrow of the Assad regime represents a hopeful but fragile moment. The absence of widespread organized violence does not mean the absence of conflict dynamics. Peace in Syria requires constructing shared meanings and rebuilding trust through a multi-channeled approach and ensuring that the platforms through which citizens encounter information do not inadvertently re-entrench old divisions. Addressing algorithmic risk is not merely technical; it is central to sustaining a durable and just peace.
Eskandar Ataallah is a doctoral student in international peace studies at the University of Notre Dame’s Kroc Institute for International Peace Studies and a doctoral student affiliate at the Kellogg Institute, both part of the Keough School of Global Affairs. He is a Syrian peacebuilder who worked previously with USAID, the UN, and Catholic Relief Services.