
OALib Journal
ISSN: 2333-9721


Cognitive Resonance Theory in Strategic Communication: Understanding Personalization, Emotional Resonance, and Echo Chambers

DOI: 10.4236/oalib.1113171, PP. 1-24

Subject Areas: Psychology, Sociology

Keywords: Algorithmic Personalization, Cognitive Resonance Theory, Echo Chambers, Emotional Resonance, Misinformation, Polarization, Strategic Communication


Abstract

Cognitive Resonance Theory (CRT) offers a novel theoretical framework for examining the complex interactions between algorithmic personalization, emotional engagement, and the formation of echo chambers in digital media environments. The primary aim of this study is to conceptualize and position CRT as a comprehensive framework for understanding how algorithmic personalization and emotional engagement contribute to echo chambers and societal polarization. The study employs a theoretical synthesis approach, integrating research and theoretical insights from communication studies, psychology, and sociology to explore how algorithmically curated content fosters cognitive alignment, reinforces biases, and amplifies societal polarization. Existing research shows how engagement-driven algorithms on platforms such as YouTube and TikTok prioritize emotionally resonant content, amplify polarizing narratives, and limit informational diversity. The framework highlights the societal and ethical challenges posed by algorithmic systems, including the spread of misinformation, the erosion of shared understanding, and the deepening of ideological divides. CRT extends existing communication theories by linking algorithmic personalization and emotional resonance to the reinforcement of echo chambers, addressing critical gaps in how technology design shapes psychological and behavioral outcomes. It explains how algorithmic systems create self-reinforcing feedback loops that intensify selective exposure and cognitive rigidity. The study underscores the need for ethical interventions, such as promoting algorithmic transparency, enhancing content diversity, and fostering deliberative public discourse, to mitigate the polarizing effects of personalization technologies. CRT offers actionable insights for policymakers, platform designers, and communication practitioners, advancing strategic communication research and guiding the development of more inclusive, equitable, and resilient digital ecosystems.

Cite this paper

Gombar, M., Križ, M. and Cvitković, A. (2025). Cognitive Resonance Theory in Strategic Communication: Understanding Personalization, Emotional Resonance, and Echo Chambers. Open Access Library Journal, 12, e3171. doi: https://doi.org/10.4236/oalib.1113171

