Sara M. Watson reflects on the importance of building a responsible tech ecosystem
Last year, technology critic and independent analyst Sara M. Watson began investigating the role of responsible and ethical tech functions in industry. She was intrigued by their potential to shape product development, tech procurement, and usage decisions across a range of companies. Watson is now mapping the responsible tech ecosystem as a Siegel Research Fellow at All Tech Is Human.
We sat down with Watson to find out what responsible tech is, why she felt it was important to engage a variety of stakeholders in pursuing her research, the urgency of the questions she’s investigating, and how she plans to engage with other members of the Siegel Research Fellow community in the year to come.
Tell us a bit about yourself. What brought you to this point in your career? How do you and your work fit into the responsible tech ecosystem?
I’m an independent analyst and technology critic. And I’ve worn many hats throughout my career: ethnographer, enterprise IT analyst and CIO advisor, startup content strategist. No matter what, I’ve always been researching and writing about tech.
While I’ve been in tech for my entire career—starting as the odd duck English major intern in the office of the CIO at Liberty Mutual insurance company—I’ve also had a complex relationship with owning my role in the ecosystem. I’m not an engineer or a coder. The extent of my programming skills stops at editing HTML or figuring out how to compose a custom ringtone on my Ericsson brick phone as a teen.
But I’ve always been interested in how technology shapes and influences culture and society. I cobbled together a media studies degree between literature and film studies at the moment when Facebook was founded, video streaming first became possible on the web, and the first iPhone came out. I dabbled in meme studies with a few nerdy recent grads in the Web Ecology Project and at ROFLCon, long before memes became weaponized and commodified in the infinite-scroll attention economy. And I was organizing panels about how we pay with our data as early as 2011, long before we widely understood that the stakes of surveillance capitalism reached far beyond targeted advertising. I’ve always been an interdisciplinarian and a bridger of audiences and worlds.
Feral academic, corporate queer. Thankfully there are a lot of folks like me—with nonlinear career trajectories and interdisciplinary backgrounds—in the responsible tech ecosystem who wouldn’t necessarily identify as a “technologist” but are deeply invested in understanding and shaping tech’s role in society.
How did you come to research the responsible tech ecosystem? Why did you decide to pursue this research at All Tech Is Human, a nonprofit organization?
I came to All Tech Is Human as a Siegel Research Fellow by way of research I conducted as a principal analyst at Forrester last year, where I was looking at the rise of responsible and ethical tech functions in the industry. I was intrigued by the introduction of groups like the Office of Ethical and Humane Use of Technology at Salesforce, and wanted to explore how their remit was starting to impact product development. I was also interested in how this work might trickle into enterprise tech decision-making for those who buy and deploy technology in support of their businesses—not just in the tech industry, but at all companies, from CPGs to banking to manufacturing and beyond.
The premise is that because every company uses technology to reach customers and employees, every company is accountable for tech’s impacts on its stakeholders. Therefore, every company needs a responsible tech strategy.
I had only just scratched the surface of that project when I left Forrester last year, because I believe I can have greater impact by engaging the broader set of movers and shakers in the responsible tech space in this moment, rather than waiting for these questions to trickle into the boardrooms and C-suites of non-tech firms. The generative AI moment has certainly accelerated that trajectory, but seeing how ESG has become a divisive flashpoint tells me we have a long way to go before we see responsible tech KPIs and benchmarks in board meetings.
What is responsible technology, and who are the folks advocating, building, and researching in that space?
Responsible technology starts from the belief that all technology embeds human values and decisions in its design. No technology is neutral. Put another way, all tech is human!
Every element of technology is the product of human decisions and values. What responsible and ethical tech efforts attempt to do is articulate those values and assumptions, make deliberate and informed decisions, and account for their impacts on every stakeholder.
But of course it’s not easy to agree on shared values, especially on a global scale. So we need spaces like All Tech Is Human to convene stakeholders across industry, policymakers and regulators, researchers and academics, community members, and even students who are eager to find their place in a tech-centric society and workforce.
What are some of the big questions driving your work as a Siegel Research Fellow at All Tech Is Human (ATIH)?
For five years or so, ATIH has been doing this resourcing and relationship-building work very well, helping folks “find the others” and building and connecting parts of the community. Our network is robust, and yet we have only an anecdotal understanding of the folks who make up our community. We quip that everyone is a unicorn, with a meandering career path that led them to tech from a “non-traditional” background. That’s fundamentally what diversifying the tech pipeline looks like.
But what does that diversity look like across the network in real numbers? How many folks have crossed over from policy to industry, for example? Who has a social science or humanities background, and what roles are they in now? We have an opportunity to describe different personas across the ecosystem; with a clearer understanding of that segmentation, we can better serve the community’s needs with content, programming, and resources going forward. Answering those questions will also set us up to measure the impact members of the community have on the trajectory of tech and society issues over a longer time horizon.
What are the most urgent needs within the responsible tech ecosystem at this moment?
I’m personally really interested in this particular moment of responsible tech realignment. On the one hand, the call for responsible tech has never been more urgent in the face of AI developments, threats to democracy, and global conflicts that are polarizing speech and driving the spread of misinformation. Yet over the last few years, we’ve seen contraction and layoffs across the tech industry, with trust and safety and responsible innovation teams often the first to suffer from belt-tightening austerity measures.
I’m interested in investigating the state of these self-regulatory responsible tech efforts while these competing forces are at play. How many of the people we have helped get hired through ATIH job boards still have roles in those Big Tech orgs today? How effective is a self-regulatory theory of change when it is so deeply dependent on market conditions? We have a unique vantage point as a resource where many different stakeholders converge and convene to tackle these questions within a broader theory of change.
What are the biggest challenges you face in approaching these research questions? What obstacles stand in the way of this kind of research more broadly?
Responsible tech practices can be seen as a liability at companies. Gaming out potential harms and unintended uses becomes evidence of complicity if the exercise doesn’t actually change product design or incentive structures. We’ve seen what that risk looks like in the Facebook Files. So what incentives do firms have to support impact assessments and consequence-scanning exercises in the development lifecycle?
Until we can do more to connect the carrot of consumer benefits to rebuilding trust, we’ll have to rely on the stick of policy to incentivize this work. Most importantly, all of these levers of tech regulation have to be in play all at once. It’s not an “either/or”—it’s a “yes/and.” That means getting engineers, policymakers, consumer demand, and societal norms and values in alignment in this fast-moving, multidimensional chess game.
Who are some stakeholders in the responsible tech ecosystem that are overlooked and underprioritized? Why should their voices be heard?
Youth. Even at our recent event highlighting youth issues in tech, the youngest panelist was a recent college grad. Young people aren’t yet present at mixers, at events, or in our ATIH Slack community. There are obvious challenges and barriers to engaging them, but the more we can listen to youth experience and harness their energy for change, the better positioned we will be to imagine better futures for our relationship with technology.
All Tech Is Human has prioritized resource-building for students looking to embark on careers in tech that align with their values and visions for the world they want to live in. The mentorship program is a phenomenal starting point for building and sharing those connection points between generations.
What do you hope the impact of your research can be for society? What methods will help you achieve those outcomes?
I’ve always been motivated by projects that help clarify and reframe ways of thinking about complex problems. At its best, research informed by constructive technology criticism helps us take nothing for granted: it examines power structures, interrogates where powerful language and ideology stem from, and imagines alternative futures. Constructive technology criticism aims to bring stakeholders together in productive conversation rather than pitting them against each other.
Constructive technology criticism poses alternative possibilities. It skews toward optimism, or at least toward the idea that future technological societies could be improved. Beyond intellectual arguments, constructive criticism is embodied, practical, and accessible, and it offers frameworks for living better with technology. I hope I can bring that critical care to the work ATIH has already done as it develops into this next phase of ecosystem building.
How are you planning on leveraging the Siegel Research Fellow community in doing this work? Are there any other Siegel Research Fellows whom you are especially excited to learn with and from during your fellowship year?
I jumped at the opportunity to present my early research design and scoping to the Siegel Research Fellow cohort at one of the first fellows meetings this fall. From my experience as a fellow at the Berkman Klein Center for Internet and Society, I know just how much I benefit from getting input early and often—especially to get unstuck when I am tempted to boil the ocean. I’ve already gotten some very practical advice on methods and tools, as well as moral support in tackling this very rich but open-ended research opportunity!
I’m also really looking forward to exploring opportunities to collaborate with fellow interdisciplinarian Caroline Sinders, a Siegel Research Fellow and the founder of the human rights and design lab Convocation Research+Design. Their work and intersectional career are inspiring, and I hope we can find a chance to explore creative research interventions together. A recursive self-regulating solutions speculative chatbot based on an All Tech Is Human training corpus, perhaps…? I’ve been exploring theories of narrative change through complementary creative responsible tech projects, like A People’s History of Tech, and by contributing to Eternal You, a documentary premiering at Sundance that explores startups using AI to create avatars of deceased relatives that their loved ones can speak with.
More from Sara M. Watson:
- Report: Toward a Constructive Technology Criticism, by Sara M. Watson, published by Columbia Journalism Review, October 2016
- Podcast: Tackling Technology’s Trust Problem, with guest Sara M. Watson, hosted by Forrester, recorded August 2022
- Blog post: “Now Is Not The Time To Pull Back On Responsible And Ethical Tech Efforts,” by Sara M. Watson, published by Forrester, September 7, 2022
- Blog post: “Technology Is Not Neutral,” by Sara M. Watson, published by Forrester, April 2022