Yvonne Eadon is an assistant professor in the School of Information Science at the University of Kentucky. At the time of this interview, she was a Postdoctoral Research Fellow at the Center for Information, Technology, and Public Life at the University of North Carolina. Conceptualizing conspiracy theorizing as a form of knowledge production, Dr. Eadon investigates feminized conspiracy theories and the gender dynamics within online communities that form around them, as well as how information institutions interface with and assist researchers with alternative or conspiratorial viewpoints.
Before 2016, exploring conspiracy theories might have been a fun pastime—falling down internet rabbit holes and debating fringe ideas with friends. Fast forward eight years, and these once-obscure notions have taken center stage, shaping many of the hot-button issues in the upcoming presidential election. The internet has made it easier than ever for conspiracy theories to spread, allowing ideas that once circulated in small circles to gain widespread visibility. Social media algorithms amplify this, creating echo chambers where misinformation flourishes. As a result, the line between fact and fiction has blurred for many. We sat down with Yvonne to delve deeper into the rise of online conspiracy theories, why they matter, and how language shapes our understanding of technology and the information ecosystem—touching on everything from Taylor Swift to the widespread belief that your phone is always listening.
Tell us about yourself and your background. What brought you to this point in your career? What drew you to this area of research and your current projects?
I came into my PhD program at UCLA wanting to study technology and museum spaces, book history, and artists’ books. I was interested in curatorial questions. That’s very different from what I do now.
The fall 2016 election came the same year I started my PhD, and I became really interested in conspiracy theorists and the informational implications of electing a conspiracy theorist to the highest office in the land. My work examined how people who believe in conspiracy theories do their research and look for information, both in formal institutions such as archives and libraries and in informal spaces, including the internet.
For my dissertation, I interviewed people who consider themselves serious researchers of conspiracy theories around topics such as UFOs or the JFK assassination. They were often very uncomfortable with being labeled “conspiracy theorists” and preferred to be called investigative journalists or researchers.
I talked to two women who were UFO researchers based outside of the US. Talking to them was fascinating and enlightening because they approached their research in such a radically different way from the men I spoke to who were researching the same topics. They interviewed people who had had encounters with extraterrestrials from a position of care, wanting to help them, in a classically feminist mode of interviewing. I became really interested in how they navigated the very male-dominated world of UFO research.
From there, I became more interested in questions about how different groups of people—including women and queer people—relate to conspiracy theories. Which ones do they relate to and why?
You recently published an article, “‘You Could Hear a Hair Pin Drop’: Queer Utopianism and Informal Knowledge Production in the Gaylor Closeting Conspiracy Theory,” in which you examine knowledge production and conspiracy theory in relation to Taylor Swift and “Gaylor” theories among particular communities. Can you tell me a bit more about that work?
“Gaylor” is this idea that Taylor Swift is queer and closeted and communicates to her queer fans through deeply coded messages that reflect queer history. For example, “dropping hairpins” was mid-century slang in LGBTQ+ communities for dropping hints about your sexuality—the term was also used in one of the first journalistic accounts of Stonewall, which referred to the event as “the hairpin drop heard round the world.” Taylor Swift used the term “hairpin drop” in two different songs, leading some fans to believe that she was referencing the phrase as a way to “drop hairpins” about her own sexuality. There are many, many different nodes of evidence like that within the Gaylor conspiracy theory.
What do pop culture conspiracy theory studies around topics like Gaylor teach us about the workings of conspiracy in knowledge production more broadly?
I should start by pointing out that my definition of “conspiracy theory” is different from the one that many people hold. The common definition centers on conspiracy theories of the far right, which often express racism and antisemitism. My definition is very broad and flexible: it refers to a group of people trying to accomplish something in secret. So, it can be a one-world government or a surprise party.
Conspiracy theories emerge when there are a series of absences or clues. If someone is a fan of something or has a particular interest in the topic, researching the conspiracy theory can be compelling and intellectually stimulating. If you look at the JFK assassination, there are a lot of really weird absences. That’s also true for Taylor Swift. Swift has in fact talked openly about how she communicates in a coded way with her fans through her lyrics, music video visuals, album art, fashion choices, and social media posts.
Absences or a series of clues become more compelling when they relate to a piece of the theorist’s identity. So some people relate to Taylor Swift as a queer person because they themselves haven’t felt safe or ready to come out at certain points in their lives. I’m interviewing “Gaylors” right now. One person told me about a six-year relationship in which both people remained closeted because they weren’t in a circumstance where it was safe to come out. They weren’t ready to be publicly queer. This is someone who really relates to Taylor Swift in that way.
You can see the same kind of thing happening with wildly different results in other areas of conspiracy theory. For example, those who believe in the QAnon conspiracy are often afraid of immigrants coming to take their jobs. That deep story about immigrants activates the anger and entitlement at the center of white identity. On the other hand, Gaylor beliefs may activate the kind of loneliness and longing that can accompany navigating the world as a queer person.
A lot of people also talk about conspiracy theories as they relate to fear. Certainly a lot of the big, dangerous, political conspiracy theories like QAnon do come from feelings of fear. But I also think it’s not always fear that spawns conspiracy theories or motivates a community. There are many other strong embodied emotions, as well as threats to, or insights into, one’s identity. There is a whole emotional landscape, functioning at different levels of harm, that has been under-explored in studies of conspiracy theory.
A lot of people might immediately dismiss conspiracy theories as untrue. How do you respond to that idea as a scholar examining theorists’ practices? What methodologies do you use to understand the conspiracy theorists’ perspectives?
I try to study conspiracy theories where I can really understand and empathize with how people would come up with these ideas. I don’t look at hugely harmful conspiracy theories that might be racist or antisemitic, and that could be immediately dismissed as untrue. Instead, I try to look at conspiracy theories that emerge because we don’t have enough data. My methodological orientation is toward living in the gray areas and liminal spaces of belief, making space for possibility.
You also argue that conspiracy theories help users make sense of black-boxed technology. How do conspiracy theorists’ content creation efforts inform users’ understanding of algorithms?
There’s been a lot of good work on how people make sense of how algorithms function—the data they use and how they prioritize certain things over others.
There are a couple of really interesting examples of this. One theory that many people hold is that our phones are listening to us. This idea has been around since about 2016. People might see something they discussed with a friend advertised to them online, so they assume the phone was listening.
As you said, there are these black boxes that contain enormously accurate and well-developed data shadows of us as users. Advertising algorithms pull from those shadows to determine what is advertised to us. But if you’re a layperson, you’re not always thinking about how the internet works, so the most likely explanation for the advertisement might be that your phone is listening to what you say.
For example, a friend of mine was looking at her phone when we visited the beach in North Carolina. She saw ads for Shibumi Shades, a sunshade that works in the wind and is very popular at beaches in North Carolina. We had been talking about Shibumis earlier in the day and she said, “Oh my God! My phone is listening to me! It’s advertising these things.” It was actually a location thing: because we were on the beach in the Carolinas, the platform was advertising these sunshades to us. Functionally, the phone is kind of listening to you, but not through the microphone. It’s through all of the other little data points—the location in this case—that it is “listening.” That’s a really interesting example of how people understand algorithms.
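A minimal sketch can make that kind of “listening” concrete. The example below is purely illustrative: the data fields, ad inventory, and scoring rule are all invented for this piece, not drawn from any real ad platform. It simply shows how location and interest signals alone, with no microphone at all, can surface an eerily well-timed ad.

```python
# Toy illustration: picking an ad from a "data shadow" of passive signals.
# No audio is involved; location and social/interest signals alone can
# make targeting feel like eavesdropping. All fields and weights are invented.
from dataclasses import dataclass, field

@dataclass
class DataShadow:
    location: str  # coarse region, e.g. IP- or GPS-derived
    recent_searches: list[str] = field(default_factory=list)
    friends_interests: list[str] = field(default_factory=list)

# Hypothetical ad inventory with targeting criteria.
ADS = {
    "beach sunshade": {"keywords": {"beach", "sun", "shade"}, "regions": {"NC coast"}},
    "ski gloves": {"keywords": {"ski", "snow"}, "regions": {"CO mountains"}},
}

def score(ad: dict, user: DataShadow) -> int:
    """Rank an ad by its overlap with the user's passive signals."""
    signals = set(user.recent_searches) | set(user.friends_interests)
    s = len(ad["keywords"] & signals)
    if user.location in ad["regions"]:
        s += 2  # location alone can dominate the ranking
    return s

# A user who merely went to the beach, with a friend who likes beaches:
user = DataShadow(location="NC coast", friends_interests=["beach"])
best = max(ADS, key=lambda name: score(ADS[name], user))
print(best)  # -> "beach sunshade", from location + social signals only
```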
Here’s an example of the flip side of this. People sometimes theorize that a given algorithm—like the TikTok algorithm, for example—is pushing certain creators and not others. This gets at something that is true. It’s been shown repeatedly that because of biased training data and other factors, algorithms like TikTok’s have significant racial biases. They actually do deprioritize creators of color and the platform does push white creators. Algorithmic folk theories like this are sometimes correct, so we can’t dismiss them outright.
In another recent publication, “Combating contamination and contagion: Embodied and environmental metaphors of misinformation,” you and your co-author, Stacy Wood, explore the possibilities and limitations of health and environmental metaphors for understanding our information ecosystem. How did this work come to be?
Stacy and I collaborate frequently, and we were talking about how there wasn’t much scholarship on how metaphors around mis- and disinformation specifically can shape how we think about the problem and how we develop solutions. The piece is more theoretical than empirical; it emerged from those conversations and subsequent reviews of the literature on mis- and disinformation.
The public health and environmental metaphors that we often use to describe mis- and disinformation—“infodemic” and “information pollution,” for example—are often used as a kind of shortcut for communicating the direness of the situation. But if we’re uncritically invoking public health and environmental metaphors all the time, it can really limit how we think about the gaps in the information landscape and how we devise creative and critical solutions.
The article points to several efforts to find solutions to this problem, including individual literacy efforts and governance mechanisms. In what ways do those efforts have potential?
Governance puts the burden on the system, whereas literacy puts it on the individual. To me, developing solutions rooted in platform governance makes more sense, because the system should be responsible for its own failings. But, under capitalism, it’s quite difficult to get companies to care about misinformation or governments to care about regulating misinformation.
It’s also incredibly difficult to categorize certain things as misinformation and others as not. Satire is a really interesting example of the challenge of categorization. For instance, a satire website tweeted that Justin Timberlake had drugs popular in the gay community in his system when he was arrested for a DUI this summer. A lot of people interpreted this as fact, so it ended up essentially functioning as misinformation. Issues around misinformation are hard to disentangle, which speaks to why we need metaphorical terms to talk about something so challenging.
Returning to the question of public health and environmental metaphors specifically, there’s an inherent moral weight assigned to pandemics and environmental disasters. Disease is bad. Pollution is bad. But information doesn’t function in such black-and-white terms. We can’t always talk about information quality and information integrity on the internet in a binary. Information is really hard to actually pin down in terms of moral weight and social value.
What are the biggest challenges you face in approaching your research? What obstacles stand in the way of this kind of research more broadly?
One of my biggest challenges is that it can be hard for my research to seem valuable or hard-hitting because it often deals in traditionally “feminine” topics. For example, many people see my Gaylor research and don’t really understand its broader applicability or insights. I’m also femme myself, so I think my research topic and my gender presentation together can create challenges in that area. This issue is even more difficult for my participants, some of whom repeatedly experience dismissal and derision from the mainstream world because their beliefs seem silly, marginal, or ridiculous.
You’re working on a book proposal right now. Tell us more about that project.
It’s still early days. I’m hoping to fold in some of my research with women, including UFO researchers and Gaylor researchers. I might also talk to some “Free Britney” people. That’s an interesting example because it was a conspiracy theory that turned out to be true, and it was also very big in the gay community. I also hope to include examples that relate to true crime—the recurring urban legends and moral panics about suburban white women being trafficked, for instance.
I want to look at how these little examples of conspiratorial thinking can come into queer communities and groups of women online and in-person. I am interested in how this also shapes online community-building and collective knowledge production.
What are you reading, watching, and listening to right now that you would recommend to readers, and why?
I’m reading Tamara Kneese’s Death Glitch right now, which is about how platforms are changing our experiences around death. It’s fantastically written and researched. I’m also always listening to “Who? Weekly,” a celebrity gossip podcast that’s so much more than that: it’s a wide-ranging methodological project that categorizes celebrities in a very specific way, and it has built a huge community of pop culture obsessives. It’s really fun. I’m also listening to “Fur and Loathing,” a true crime podcast about the chlorine gas attack on a furry convention in 2014. It’s a really interesting example of how the internet, true crime, and fringe communities all boil together to create some pretty fascinating results.
More from Yvonne Eadon:
- “Combating contamination and contagion: Embodied and environmental metaphors of misinformation” (with Stacy E. Wood), Convergence, May 2024
- “‘You Could Hear a Hair Pin Drop’: Queer Utopianism and Informal Knowledge Production in the Gaylor Closeting Conspiracy Theory,” Social Media + Society, April 2024