Cognitive Resonance Theory (CRT) offers a novel framework for examining the interactions among algorithmic personalization, emotional engagement, and the formation of echo chambers in digital media environments. The primary aim of this study is to conceptualize and position CRT as a comprehensive framework for understanding how personalization and emotional engagement contribute to echo chambers and societal polarization. The study employs a theoretical synthesis approach, integrating insights from communication studies, psychology, and sociology to explore how algorithmically curated content fosters cognitive alignment and reinforces existing biases. Existing research shows that engagement-driven algorithms on platforms such as YouTube and TikTok prioritize emotionally resonant content, amplify polarizing narratives, and limit informational diversity. The framework highlights the societal and ethical challenges posed by algorithmic systems, including the spread of misinformation, the erosion of shared understanding, and the deepening of ideological divides. CRT extends existing communication theories by linking algorithmic personalization and emotional resonance to the reinforcement of echo chambers, addressing critical gaps in how technology design shapes psychological and behavioral outcomes. In particular, it explains how algorithmic systems create self-reinforcing feedback loops: personalization rewards engagement, engagement signals sharpen personalization, and exposure narrows accordingly, intensifying selective exposure and cognitive rigidity. The study underscores the need for ethical interventions, such as promoting algorithmic transparency, enhancing content diversity, and fostering deliberative public discourse, to mitigate the polarizing effects of personalization technologies. CRT offers actionable insights for policymakers, platform designers, and communication practitioners, advancing strategic communication research and guiding the development of more inclusive, equitable, and resilient digital ecosystems.
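Although CRT is stated verbally, the self-reinforcing loop it describes can be illustrated with a minimal simulation. The sketch below is not drawn from the paper; the content pool, the engagement weighting, the drift rule, and every parameter value are illustrative assumptions. It models a recommender that over-samples stance-aligned, emotionally charged items and sharpens its personalization after each engaged session; running it prints a shrinking feed diversity alongside a drifting user leaning, the selective-exposure dynamic the abstract describes.

```python
import math
import random
import statistics

random.seed(7)

# Toy content pool: each item is a political stance in [-1, 1].
POOL = [random.uniform(-1, 1) for _ in range(2000)]

def serve_feed(leaning, beta, k=20):
    """Engagement-weighted sampling (the personalization step).
    beta is the personalization strength: higher beta means the recommender
    bets harder on stance-aligned content; the (1 + |s|) factor stands in
    for the engagement boost of emotionally charged, more extreme items."""
    weights = [math.exp(-beta * abs(s - leaning)) * (1 + abs(s)) for s in POOL]
    return random.choices(POOL, weights=weights, k=k)

leaning, beta = 0.2, 0.5  # mild initial leaning, weak initial personalization
for step in range(25):
    feed = serve_feed(leaning, beta)
    # Feedback loop: consuming the feed pulls the user's leaning toward it,
    # and each engaged session sharpens the personalization.
    leaning = 0.8 * leaning + 0.2 * statistics.mean(feed)
    beta += 0.3
    if step % 6 == 0:
        print(f"step {step:2d}: leaning={leaning:+.2f}, "
              f"feed diversity (stance spread)={statistics.pstdev(feed):.2f}")
```

The design choice to grow beta with engagement is what closes the loop: without it, the feed's diversity stays roughly constant, whereas with it the stance spread of served items falls steadily, mirroring the narrowing of informational exposure that CRT attributes to personalization technologies.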