Journal of Science Policy & Governance | Volume 22, Issue 01 | March 6, 2023
Op-Ed: Supporting Democracy through Content-Neutral Social Media Policies
Christopher L. Quarles
Corresponding author: [email protected]
Keywords: social media policy; content moderation; information infrastructure; de-amplification
https://doi.org/10.38126/JSPG220108
Executive Summary
The internet and social media carry vast amounts of new information every second. To make these flows manageable, platforms engage in content moderation, using algorithms and humans to decide which content to recommend and which to remove. These decisions have profound effects on our elections, democratic debate, and human well-being. The U.S. government cannot directly regulate these decisions due to the scale of the content and the First Amendment. Rather than focusing exclusively on whether or what content gets moderated, policy-makers should focus on ensuring that incentives and processes create an information infrastructure that can support a robust democracy. These policies are most likely to be content-neutral. Three content-neutral mechanisms are promising targets for policy: process, transparency, and de-amplification.
Christopher L. Quarles is a PhD candidate in the School of Information, a researcher with the Center for Ethics, Society & Computing, and a fellow at the Stone Center for Inequality Dynamics. His current research focuses on how information technology affects how we group ourselves, and on systemic trends in inequality and opportunity. In the long term, he hopes to have a practical impact on the way our information infrastructure evolves to support humanity.
DISCLAIMER: The findings and conclusions published herein are solely attributed to the author and not necessarily endorsed or adopted by the Journal of Science Policy and Governance. Articles are distributed in compliance with copyright and trademark agreements.
ISSN 2372-2193