Often when we talk about infrastructure, our focus is on efficiency, effectiveness, and what innovation toward those ends might look like. The activities of making and sustaining infrastructures are numerous. Infrastructuring includes innovation, design, and construction. Just as important, and in need of more attention, are the critical activities of maintenance and repair that infrastructures require. Yet even with a renewed focus on maintenance and repair, we often overlook another possibility for infrastructuring: deconstruction.
Focusing on the deconstruction of infrastructures is complicated by the fact that we are amid an age of renewed interest in (and need for) new infrastructural investment. Funding for public infrastructure has plummeted in recent decades. Between 2007 and 2017, public spending on infrastructure in the United States declined by $9.9 billion (Kane & Tomer, 2019). Divestment in infrastructure has been, at least in part, a response to infrastructures working as hoped and becoming invisible (Star, 1999). However, the time of convenient invisibility has run out. Infrastructures that seemed to be humming along have eroded, broken, and otherwise fallen short of need in the midst of a pandemic, civil unrest, and overt threats to democracy. For all these reasons, and more, our collective focus has been on building a scaffolding around the critical infrastructures on which we rely (energy systems, schools, and healthcare among them) and working to make them more resilient, accessible, and sustainable.
We face an important moment to ask where new investment should be aimed, namely the $1.2 trillion to be allocated under the US Bipartisan Infrastructure Law. How much of this funding should prioritize innovation, maintenance, and repair? If infrastructure is the foundation of our most intractable problems, and the root at which solutions must be aimed, then we ought also to ask which infrastructures should be dismantled.
There is evidence of an appetite for this kind of thinking about infrastructure even within the Bipartisan Infrastructure Law itself. While the majority of the funding will go toward building infrastructures, the law includes $1 billion for the Reconnecting Communities initiative to remove urban roadways that divide or disrupt communities (a far smaller figure, it is worth noting, than the $20 billion originally proposed for the initiative). As long-overdue infrastructure investment starts to flow, we face a critical moment to reckon with which infrastructures merit rebuilding, which require rethinking, and which should perhaps be unbuilt or broken down.
These questions apply to multidimensional infrastructures, including our digital information infrastructures. When it comes to information infrastructures, we ought to ask what it means to interrupt and/or deconstruct an infrastructure, and on what grounds we should participate in these activities. Information infrastructures include the structures and systems that allow for communication, knowledge sharing, and political discourse (Bowker et al., 2010). The design and building of this infrastructure is largely taking place online via private social media platforms. Access to information infrastructures, like broadband internet, remains inequitable, but for those with access, social media platforms constitute a predominant source of information about the state of the world. Social media is an information and social infrastructure that necessitates maintenance and repair, and perhaps, thoughtful deconstruction.
Social media researchers are asking a range of questions about these systems, including: Which digital literacy interventions help users confirm the veracity of news? How effective are source reliability labels at helping users vet news sources? What kinds of fact-checking are perceived as most trustworthy among users? How pervasive is state-backed media in propaganda campaigns globally? How entrenched are structural racism, misogyny, and xenophobia in our information and social infrastructures?
Each of these questions sheds light on particular actors, activities, affordances, and phenomena that culminate in the question of what to do about the pervasive problem of mis- and disinformation. A multidimensional infrastructure framework can provide an important vantage point on these problems at scale. For example, in a study investigating the social and information infrastructures created by QAnon communities in Italy, Pasquetto et al. (2022) make the case for applying an infrastructural lens, rather than the lens of a specific campaign or movement, to the issues of mis-/disinformation. In their view, an infrastructural approach allows you to “co-investigate [multiple units of analysis] as they operate in relation to the organizational, collaborative, and procedural, epistemic work put in place by the disinformation agents” (p. 25). Furthermore, “by studying an infrastructure, as opposed to studying a campaign or a narrative, we see the … operation from a distance, and we visualize all its complexities and distributions” (p. 25).
However innovative they may be, our information and communication technologies are built within a social and material reality marred by structural racism, misogyny, and xenophobia. Scholars including Safiya Noble and Ruha Benjamin argue that these logics of oppression are infused into the very code upon which our digital infrastructures are increasingly built. Information infrastructures are not immune. Rachel Kuo argues that studies of disinformation and propaganda must account for structural racism, misogyny, and xenophobia. Dr. Kuo and her co-authors call for more researchers to investigate the ways these phenomena shape, and are shaped by, hierarchies of power in our political discourse (Nguyễn et al., 2022). Doing so is essential to understanding how information infrastructures are co-constitutive of power hierarchies and of the maintenance or dissolution of hegemonic social orders. Disinformation and propaganda are thus products as well as producers of social inequity, and social inequity has implications for multidimensional infrastructures. This work makes clear the urgency of qualitative studies of misinformation, particularly once misinformation is conceptualized as part of an information infrastructure, since we understand all infrastructure to be cultural, historical, and built out of a particular geopolitical and economic context.
Qualitative, and more specifically community-based participatory, methods are well positioned to continue this work in collaboration with diaspora communities. The cultural, historical, and structural roots and implications of our information infrastructures matter as we turn to the types of actors responsible for the spread of mis-/disinformation and propaganda.
While these biases can emerge in social media organically, they can also be planted more deliberately by state-backed efforts to sow division.
Samantha Bradshaw studies government and technology with an emphasis on the implications of emerging technology for democracy. Her recent work has focused on the role of state-backed media in endemic disinformation. Her work builds on the cultural, historical, and structural underpinnings of this problem that are the focus of Dr. Kuo’s work, but homes in on state actors and their role in these campaigns. Dr. Bradshaw’s work focuses on Russian state-backed media as a key actor in Western disinformation ecosystems. Bradshaw and her co-authors (Bradshaw et al., 2022) argue that when we focus too closely on one problem in information infrastructures (i.e., covert disinformation by state actors), we risk losing sight of another significant problem area (i.e., overt propaganda from state-backed media). This information infrastructure functions along a “full spectrum,” meaning that platforms, through action or inaction, exert political power. For instance, with regard to the war in Ukraine, platform politics results in real (physical, embodied, and geopolitical) influence.
Zooming out a bit, organized social media campaigns are becoming more commonplace among governments both authoritarian and democratic (Bradshaw & Howard, 2017). Depending on a campaign’s organizational structure and practices, its reach is more or less difficult to track. Interestingly, democracies seem to have the most disparate and varied approaches to propaganda, as well as some of the densest campaign activity. This speaks to a more patchwork infrastructure that is likely difficult to study and difficult to manage. We know that the flows of mis-/disinformation are not strictly a function of state-backed media. The dispersed nature of these flows makes them difficult to track, but important to understand.
Too often we assume that disinformation emerges from the periphery to infect mainstream information infrastructures. However, Michael Bernstein has been tracing the origins of misinformation via meme sharing. Contrary to popular assumptions, he and his coauthor have found that widely shared cultural artifacts (i.e., memes) originate in core communities, which points to the importance of the cultural relevance of core communities for “culture-making” (Morina & Bernstein, 2022). Insofar as this relates to misinformation, it points us to a wider phenomenon of information sharing dominated not by periphery-to-core dissemination, but perhaps more so by core-to-periphery or core-to-core dissemination. Rather than thinking about misinformation or disinformation as generated by the periphery and by extreme or fringe groups, this work positions us to reflect on the role of the mainstream (core) media in mis-/disinformation production and spread.
These findings are important in the context of our guiding question: what infrastructures should we unbuild? And perhaps more importantly, how much needs to be unbuilt in order to make these infrastructures more resilient to future abuses? What the research highlighted above has made clear is that our information infrastructures are imbued with logics of oppression that diminish our ability to connect across difference and propel the spread of harm through mis-/disinformation. As we reckon with questions about decentralization and democratization, to what degree will our information infrastructures have to fundamentally change in order to avoid the recreation of these same problems?
When it comes to solutions, platforms appear unwilling or unable to make the broad structural changes necessary to get to the root of these issues. Instead, patchwork solutions attempt to alleviate some of the harm, though these “solutions” generally fall short. Dr. Bernstein and his co-authors have investigated one of the most prominent strategies for mitigating the harm of mis-/disinformation: content moderation. As this work shows, the efficacy of content moderation hinges on perceived legitimacy, which platforms have a responsibility to consider thoughtfully in the design of content moderation protocols (Pan et al., 2022). Although perceived legitimacy turns out to be tied more to alignment with a user’s own opinion than to the moderation process used, there is still room to design content moderation protocols that engender greater trust and confidence. This would require an emphasis on experts (with more done to establish their legitimacy as experts), paired with digital juries and algorithms.
When it comes to innovating strategies for combating mis-/disinformation, Kevin Aslett shows us that not all approaches are as effective as we might hope. Dr. Aslett and his co-authors suggest that digital literacy guides should be wary of telling readers to “do their own research” to verify news they have already encountered, because doing so may deepen their initial beliefs and increase their exposure to misinformation (Aslett et al., 2022b). Rather, findings suggest that guides would be more effective if they encouraged readers to read entire articles rather than headlines alone, and to investigate the sources of the information (Aslett et al., 2022a). The hope is that source reliability labels can function both as a screen against misinformation that prevents readers from clicking through and, cumulatively, as a learning effect that builds a stronger independent ability to identify misinformation. Findings suggest that labels may be most useful for heavy consumers of misinformation, though, notably, this is a small subset of the general population (Aslett et al., 2022c).
To understand mis-/disinformation as an infrastructural problem is to look beyond individual instances and actors to the structures that fortify its continuation. What can be done when the platforms that constitute mis-/disinformation infrastructures are unwilling to unbuild those infrastructures enough to make any meaningful dent in the harms they create? This question has led many to call for an internet infrastructure built with the public interest, rather than commercial interests, at its core. But the question remains how we get there and just how much we will have to undo, disassemble, and unbuild. To put it in physical terms, what would “taking it down to the studs” mean for our information infrastructure’s reliance on private social platforms? What can we learn from physical infrastructures that have been repurposed and changed substantially without losing their fundamental aim of connection? And how might it be in the big platforms’ interest to do so, especially in light of the push toward decentralization of their networks?
If we approach the mis-/disinformation crisis as an infrastructural problem, we are poised to think about the structures of power that uphold the status quo, and to treat the taken-for-granted aspects of our current information infrastructures as things that can be disassembled, deconstructed, and discarded in order to make space for something better.
This research brief was written by Madison Snider, In-House Research Fellow at Siegel Family Endowment.
Works cited
Aslett, K., Godel, W., Sanderson, Z., Nagler, J., Bonneau, R., Persily, N., & Tucker, J. A. (2022a). Testing The Effect of Information on Discerning the Veracity of News in Real-Time. https://kaslett.github.io/Documents/Effect_Information_CSMaP.pdf
Aslett, K., Sanderson, Z., Godel, W., Persily, N., Nagler, J., & Tucker, J. A. (2022b). Do Your Own Research? Searching for Additional Information Online About Misinformation Increases Belief in Misinformation. https://kaslett.github.io/Documents/Do_Your_Own_Research_Aslett_et_al.pdf
Aslett, K., Guess, A. M., Bonneau, R., Nagler, J., & Tucker, J. A. (2022c). News credibility labels have limited average effects on news diet quality and fail to reduce misperceptions. Science Advances, 8(18), eabl3844. https://www.science.org/doi/10.1126/sciadv.abl3844
Bowker, G. C., Baker, K., Millerand, F., & Ribes, D. (2010). Toward information infrastructure studies: Ways of knowing in a networked environment. In International handbook of internet research (pp. 97-117). Springer, Dordrecht.
Bradshaw, S., DiResta, R., & Giles, C. (2022, August 17). How Unmoderated Platforms Became the Frontline for Russian Propaganda. Lawfare. https://www.lawfareblog.com/how-unmoderated-platforms-became-frontline-russian-propaganda-0
Bradshaw, S., & Howard, P. (2017). Troops, trolls and troublemakers: A global inventory of organized social media manipulation. https://ora.ox.ac.uk/objects/uuid:cef7e8d9-27bf-4ea5-9fd6-855209b3e1f6
Kane, J. W., & Tomer, A. (2019). Shifting into an era of repair: US infrastructure spending trends. Brookings Institution. https://www.brookings.edu/research/shifting-into-an-era-of-repair-us-infrastructure-spending-trends/
Morina, D., & Bernstein, M. S. (2022). A Web-Scale Analysis of the Community Origins of Image Memes. Proceedings of the ACM on Human-Computer Interaction, 6(CSCW1), 74:1-74:25. https://doi.org/10.1145/3512921
Nguyễn, S., Kuo, R., Reddi, M., Li, L., & Moran, R. E. (2022). Studying mis- and disinformation in Asian diasporic communities: The need for critical transnational research beyond Anglocentrism. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-95
Pan, C. A., Yakhmi, S., Iyer, T. P., Strasnick, E., Zhang, A. X., & Bernstein, M. S. (2022). Comparing the Perceived Legitimacy of Content Moderation Processes: Contractors, Algorithms, Expert Panels, and Digital Juries. Proceedings of the ACM on Human-Computer Interaction, 6(CSCW1), 1–31. https://doi.org/10.1145/3512929
Pasquetto, I. V., Olivieri, A. F., Tacchetti, L., Riotta, G., & Spada, A. (2022). Disinformation as Infrastructure: Making and Maintaining the QAnon Conspiracy on Italian Digital Media. Proceedings of the ACM on Human-Computer Interaction, 6(CSCW1), 84:1-84:31. https://doi.org/10.1145/3512931
Star, S. L. (1999). The ethnography of infrastructure. American Behavioral Scientist, 43(3), 377-391.