Interview with Digital Rhetorician Estee Beck on Algorithmic Surveillance

University of Minnesota Codework Collaborative

The CodeWork team is proud to present a special CodeWork interview with digital rhetorician Dr. Estee Beck. We discuss digital rhetoric: what it is, and how it helps illuminate issues regarding algorithmic surveillance.


Interview with digital rhetorician Dr. Estee Beck on digital rhetoric, algorithms, and surveillance studies

Digital rhetorician Dr. Estee Beck discusses the nature of digital rhetoric and its role in helping others to understand the implications of tracking technologies in networked environments.

Algorithmic Surveillance Interviewee

Dr. Estee Beck | Digital Rhetoric, Surveillance

Dr. Estee Beck is an Assistant Professor of rhetoric and writing at the University of Texas at Arlington in the Department of English. Her work focuses on understanding the rhetorics of tracking technologies and agency in networked spaces. Her scholarship can be found in Computers & Composition and Hybrid Pedagogy.

Interview transcript with Dr. Estee Beck

The CodeWork team is thrilled and honored to present this interview between CodeWork member, Chris Lindgren, and digital rhetorician, Dr. Estee Beck.

Please feel free to share this interview (#codework). You can contact either Chris (@lndgrn) or Dr. Beck (@estee_beck) on Twitter.

(Image: locating an object out of many. Source of original image: Tarbell, 25 Apr. 2012)

Lindgren: For our readers, could you provide some information about what your field of digital rhetoric is and its aims?

Beck: What I find intriguing about digital rhetoric is its historical naming. Both Elizabeth Losh, in her 2009 book Virtualpolitik, and Douglas Eyman (2012), in a blog post on the Sweetland Digital Rhetoric Collaborative, credit Richard Lanham with coining “digital rhetoric” in his 1993 book, The Electronic Word. In the chapter “Digital Rhetoric and the Digital Arts,” Lanham argued the computer is a “rhetorical device” (p. 31). He asked humanities educators to consider how electronic texts figured in the curriculum. Certainly, scholars cannot overlook Lanham’s contribution to digital rhetoric and digital humanities.

However, I tend to view digital rhetoric emerging from computers and writing in the mid to late 1970s and early 1980s. Through the work of people like Hugh Burns, Charles Moran, Lisa Gerrard, Kathleen Kiefer, Cynthia Selfe, Bill Wahlstrom, Gail Hawisher, and so many others, the rise of software development, programming, and computer-based writing happened in connection with writing in composition classrooms. Speaking of computers and writing, Gail Hawisher, Paul LeBlanc, Charles Moran, and Cynthia Selfe chronicled the history of the field in Computers and the Teaching of Writing in American Higher Education: 1979-1994, where they make mention of the many people and projects involved in what I consider to be digital rhetoric and composition work.

In some ways, I also see the work of computers & composition and digital rhetoric as currently distinct. Much of the scholarship happening in digital rhetoric relies upon rhetorical theory--a qualitative finding Crystal VanKooten drew attention to in her talk on research methodologies at the inaugural Indiana Digital Rhetoric Symposium in April 2015, organized by Justin Hodgson and Scot Barnett. VanKooten analyzed the scholarship and identities of the speakers in attendance at the symposium (many of whom identify as digital rhetoricians doing digital rhetoric) and found that 63% of those speakers performed rhetorical theory in their digital rhetorical work. Arguably, much of the rhetorical theory digital rhetoricians contribute is not tied to pedagogy or to the composition classroom at undergraduate and graduate levels. While computers and composition has decades of scholarship in theory and practice, much of its work positions the social, political, cultural, and economic conditions of technology use in relation to pedagogy and/or composition itself. This is all to acknowledge that there are complementary theories and practices in both disciplines, with a history extending into the 1970s and 1980s.

Lindgren: Thank you for that survey and the resources, too! Considering this wide array of scholarship, where and how do you situate your work? If you could, maybe, define digital rhetoric within the scope of your research agenda.

Beck: My work draws on transdisciplinary scholarship from architecture, computer science, feminism, philosophy, critical theory, media studies, surveillance studies, and of course rhetoric & composition, along with computers & writing and digital rhetoric. What I’m looking at within these fields are ethics, aesthetics, and morals. Who defines--and gets to define--an ethics for working with people and objects? How are values and beliefs shaping intellectual work in ways that expand and limit inquiry? In what ways are beauty and desire commodified in digital spaces, and for what audiences? What interests me about these questions and differing fields of inquiry is how power functions to promote and oppress ways of knowing about public and private communities. More importantly, how does an ethical engagement with people, language, animals, and objects promote more inclusive and divergent thought and action paths?

From these disciplinary perspectives, I tend to view digital rhetoric as a subfield of rhetoric and complementary to the subfield of computers and writing. As a subfield, digital rhetoric is concerned with the practices of informing, exploring, analyzing, making, and theorizing how people, materials, and institutions circulate in digital spaces. However, I also want to keep this definition elastic enough to account for changes in perspectives in years to come.

Lindgren: I have a follow-up question to your emphasis on transdisciplinary and being "elastic": We, here, at CodeWork, are using this term, "codework," for others across a wide-range of disciplines to interpret. How does codework play into digital rhetoric? Specifically, I'm wondering how you see it in your own scholarly work.

Beck: Interestingly, while at the Indiana Digital Rhetoric Symposium, I had the honor of speaking with Kevin Brock, who is doing fascinating research with computer code and rhetoric. He’s looking to the subfield of critical code studies—a discipline where scholars apply critical theory and hermeneutics to computer code—as a corollary to what he calls “rhetorical code studies.” Such studies within rhetoric might be concerned with rhetorical analysis of computer code or with using the tools and methods rhetoricians use to examine discourse. When I think of the term “CodeWork” in relation to digital rhetoric, a rhetorical code studies comes to mind.

I also see CodeWork as acknowledging the scholar-teachers who’ve advocated for programming and code in digital rhetoric and complementary fields like composition, rhetoric, and scientific & technical communication. Annette Vee has a fabulous article on computer programming as literacy, and she and James Brown, Jr. have a forthcoming special issue of the journal Computational Culture on the intersection of rhetoric and code. Writing scholar Karl Stolley has urged writing teachers and scholars to learn to code and has talked a great deal about source code--about experiencing and crafting code as an act of writing.

Yet, I also look to the future of CodeWork with a rhetorical code studies and think about how such studies benefit from a diverse contribution of code, theory, and scholarship from people of color, from queer bodies, from disabled bodies, and so on. I want to experience and learn from a multicultural scholar-teacher base about computer code within digital rhetoric.

In my own scholarship and teaching, I tend to think of code and digital rhetoric theoretically. Currently, I’m interested in how values and beliefs become encoded and transmitted in computer code, and how those ideologies function when end-users interact with computer code. I’m also interested in a resulting ethics of code--how regulation of code (I’m thinking of Lawrence Lessig here) responds to political, social, and cultural registers of what’s good and right--and how those registers may (or may not) serve the interests of corporations and governments rather than the public. Surveillance, privacy, and algorithms are all concepts that intersect with an ethics of code.

Lindgren: It's great to know that scholars, such as yourself, are taking on issues surrounding code, surveillance, and regulation. I’m also interested in learning more about how values and beliefs become encoded/transmitted in/through computer code. I’ve particularly found theories of mediation to be useful methodologically to shed some light on these ways that values and beliefs are carried and rejected in and through technology. How do you see yourself investigating this process as rhetorical?

Beck: What I’m trying to get at in my work is how the operation and function of algorithms perform/embody rhetoric, since algorithms are language objects that effect real change on/within human and machine processes. So, if rhetoricians contend algorithms are rhetorical agents, then how does having a non-human object expand rhetoric’s legacy of focus on the human? What I’m really after in my work, however, is what and how the general population views algorithms. Specifically, if the public considers algorithms as having a type of agency and/or a persuasion, then what critical tools will the public use to unpack the underlying ideologies of algorithmic structures? And, by becoming a more informed citizenry about the invisible processes underneath the interface, how will a critical public use information for the betterment of the self and the community? It’s not that I tend to think algorithms = bad. However, I do tend toward a soft determinist stance and view that technology does, in some ways, shape human activity. As a result, prosumer culture should critique underlying mechanisms for “x” reasons, whatever those reasons are for a person or community.

Lindgren: Your work, from what I gather, is calling for others in rhetoric to see algorithms as persuasive, or as having the capacity to be persuasive. Could you walk us through a specific example of what this looks like, or how it unfolds?

Beck: When I first began thinking about persuasive computer algorithms, I was thinking about how Facebook’s newsfeed functioned to provide personalized results and, of course, those pesky advertisements as well. At the time, I was captivated by Edward Snowden’s disclosures about the government collecting telephone metadata on US citizens. As journalist Glenn Greenwald continued reporting about the National Security Agency’s treasure trove of data and their collection efforts, I began thinking that the data algorithms collected was a form of surveillance.

I started reading in the fields of surveillance studies and computer science to understand how scholars and programmers wrote about online surveillance and computer code. As a rhetorician, I noticed how cultural, social, political, and economic values and beliefs framed surveillance and programming cultures. Early on, I declared—without much evidence at the time to support the claim, mind you—that algorithms had persuasive abilities, through my reading of cultural values in the literature. I spent well over a year gathering support for that claim.

It wasn’t until I read technology studies scholar Lucas Introna’s (2011) article, “The Enframing of Code: Agency, originality and the plagiarist,” that I was able to make a clearer connection with algorithmic persuasion. In this article, Introna argued that authorial agency transfers to computer code through an encoding process, i.e., the rule-governed behaviors, values, morals, and ethics of the author extend into the written languages of computer code. When end-users interact with such code, the users are “re-users” of those behaviors, leading to performativity (à la Butler). I take up Introna’s argument and contend persuasion becomes encoded when programmers perform any act of writing code. The three appeals--logos, pathos, and ethos--are already embedded in computer code, through the historical, cultural, social, political, and economic conditions and environments programmers work within.

Take the no longer used EdgeRank algorithm by Facebook, for example. The algorithm, Σ u_e w_e d_e, scored the stories users would most likely interact with by summing, over each edge e (an interaction such as a post, comment, or ‘like’), the product of u_e, the affinity between the viewing and interacting users; w_e, the weight of the edge type (creation, ‘like,’ connection, view); and d_e, the decay of time from the moment of interaction. In thinking about the means of persuasion, how does this algorithm operate with logos, pathos, and ethos? Perhaps the most public and compelling case occurred with the Facebook emotional contagion study by researchers Adam Kramer, Jamie Guillory, and Jeffrey Hancock (2014). By adjusting the algorithm to feature more or less positive posts in the newsfeed, the researchers found that users then posted more or less positive posts on their own accounts.
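The EdgeRank sum Σ u_e w_e d_e can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not Facebook's actual implementation: the exponential decay function, the affinity values, and the edge weights below are assumptions made for the sake of the example.

```python
# Hypothetical sketch of EdgeRank-style scoring: score(story) = Σ u_e * w_e * d_e
from dataclasses import dataclass


@dataclass
class Edge:
    affinity: float   # u_e: closeness between the viewer and the interacting user
    weight: float     # w_e: weight of the edge type (e.g., comment > 'like')
    age_hours: float  # used to compute d_e, the time decay


def decay(age_hours: float, half_life: float = 24.0) -> float:
    """d_e: older interactions count for less (assumed exponential decay)."""
    return 0.5 ** (age_hours / half_life)


def edgerank(edges: list[Edge]) -> float:
    """Score a story by summing u_e * w_e * d_e over its edges."""
    return sum(e.affinity * e.weight * decay(e.age_hours) for e in edges)


# A fresh 'like' from a close friend outranks a stale comment from a stranger.
fresh_like = [Edge(affinity=0.9, weight=1.0, age_hours=1.0)]
stale_comment = [Edge(affinity=0.1, weight=2.0, age_hours=96.0)]
assert edgerank(fresh_like) > edgerank(stale_comment)
```

The point of the sketch is rhetorical as much as technical: each multiplier encodes a value judgment (whose interactions matter, which actions count more, how quickly attention should fade) made by programmers and then silently re-performed whenever a user scrolls the feed.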

Lindgren: I’m intrigued by this concept of enframing as, perhaps, a phenomenon to study as rhetorical and as writing. In what ways do you think other scholars and researchers should conduct such research in the scope of code and surveillance?

Beck: Right now, I’m interested in connections and collaborations with empirical researchers, who can study code and surveillance using a variety of methods and methodologies. One area I’ve been curious about is a workplace/programmer study with those who write code. What are their thoughts about what they do? How do they see the code they write as “writing” or as part of (or not) a surveillance network? What are the considerations programmers employ when writing code, i.e., machine limitations or cultural impact? Now, I think it’d be cool to study a place like Facebook about their coding practices, and employee perspectives about their analytics, but whether that could happen or not is another story.

With that said, I also see connections with those who work in environmental/sustainability; cultural; and feminist rhetorics. What might the perspectives be of a cultural rhetorician about code and surveillance, or the environmental impact of maintaining a surveillance culture, or the political and bodily dimensions of writing female-identified bodies in surveillance states? Just as researchers in surveillance studies have explored surveillance from multiple critical perspectives over the last thirty years, so can those in rhetoric and writing. After all, we write, read, and play in digital spaces.

Lindgren: Your work in rhetoric seems to be coming to terms with how rhetoric and surveillance studies can/should be better integrated, due to increasing efforts by governments and corporations to extend their gaze and influence with networked technologies. How do you see rhetoric’s role in understanding algorithms impacting such issues of surveillance?

Beck: Foremost, I think the larger field of rhetoric and composition needs more scholarship on issues of surveillance and privacy, whether it’s connected to the classroom, to activism, or to theories about a rhetorics of surveillance. Certainly, there’s already existing scholarship about surveillance and privacy applied to identity, course management systems, assessment, research, power, social media, classrooms, and gaming (Beck, 2015; Beck, Blair, & Grohowski, forthcoming; Crow, 2013; Hawisher & Selfe, 1991; Hawkes, 2007; Janangelo, 1991; Reyman, 2013; Vie, 2014). However, what educators and scholars within rhetoric and composition now have to contend with are issues of biometrics in wearables, sensor technologies in the Internet of Things movement, and social media analytics and tracking, because millions of people use these everyday items.

In their (1999) book Remediation, Jay David Bolter and Richard Grusin use the term “immediacy”: the idea that computing technology is rendered invisible to human experience while people are engaged with the technology. As Cynthia Selfe (cf. Beck, 2013) has elsewhere remarked, when people stop questioning interfaces--and arguably engage in immediacy--that’s when ideology works most strongly. So, my curiosity about the Internet of Things movement, where algorithms collect and sort biometric and click data, is about how people give up their data because of the perceived benefits of a product or service. Why is that? How has the rhetoric of sharing, of online networking, of connecting with others led people to participate in algorithmic surveillance? This is just one area where I think digital rhetoricians can take up scholarship: by examining the political and social campaigns of connection and benefit many companies tout to get people to use their products and services while, behind the scenes, collecting and profiting off people’s data.

I think it’s an exciting time to work within digital rhetoric, especially with so many concerns over surveillance and privacy circulating in the news and social media on a weekly basis. Rhetoricians are poised to enter into these conversations, and I look forward to continued scholarship in this area.

Lindgren: I want to pick your brain about the recent push for wearables and the “Internet of Things.” In what domains are you interested in unpacking issues surrounding rhetoric and surveillance? Maybe you could speak to why you’re drawn to these domains. Essentially, what needs examining and why?

Beck: The push toward tracking health-related items has me curious about what the companies behind those consumables do with customer data; so, it’s a concern within the domain of civic and social life. Of course, with a lot of software coming out these days in a perpetual “beta” stage, there’s always a movement to collect user data in order to improve the product for prosumers (see Toffler, 1980). With local, state, and federal governments in the United States, there’s legal cause to share data with those entities in the interest of building cases against those who would harm themselves or others. There’s also the sharing of anonymized data with researchers to learn more about specific populations for studies. Each of these disclosures to external parties comes with its own set of risks, as the public has seen in media coverage of the Edward Snowden disclosures and the Facebook emotional contagion study.

What concerns me more is the disclosure of prosumer data to other third parties like insurance companies, legal firms, financial entities, and other organizations and businesses that would use such data to make decisions impacting populations of people. Of course, surveillance researchers have been talking about this concern for well over a decade (see the journal Surveillance & Society for a starting point). And I’m looking at this concern from a writing studies perspective--how is the general public writing in the surveillance state, providing data and metadata about their search habits and location histories for organizations and governments to churn for profit or other reasons?

In the area of the “Internet of Things” and prosumer data, I attended a great panel at Computers & Writing 2015 in Menomonie, Wisconsin, by Ohio State University graduate student Kaitlin Clinnin, recent OSU graduate Katie DeLuca, and recent Michigan State University graduate Katie Manthey, who discussed using technologies like FitBit and Tumblr (not a wearable, but a site that does mine metadata) in talks on bodily control, fat acceptance, and speaking back to patriarchal and neoliberal discourses about female bodies. What excites me about their collaborative work is how it challenges and subverts dominant narratives about female bodies--but more importantly, how data-tracking technologies in wearables sustain dominant narratives of Western values of bodily aesthetics, and how their work challenging those narratives in wearable spaces can encourage empowerment and civic change.

Lindgren: The Internet of Things brought us into a more recent fad of scholarship surrounding novel tech, but I like how one of your projects, Writing in the Surveillance State (WISS), seems to be exploring this issue of enframing as it unfolds by simply writing in digital environments. Writing is an activity that has always been linked to material technology, so I’m wondering if you could speak to how you envision this newer WISS project will develop: What are its aims, main questions / points of inquiry? Where do you see it going?

Beck: This project is inspired by Colin Beaven’s No Impact Man blog, turned book and film. Beaven’s work on developing a no-net environmental impact by making incremental personal behavioral changes fascinates me intellectually. What does it mean to take a stand and reorder everyday habits? How do such changes lead to other reorientations about the world around us?

While Beaven frames his work as responding to the environmental crisis, I don’t see (at the time of this interview, at least) surveillance and privacy as on par with that level of instability. However, I certainly do view surveillance and privacy online as potentially moving in such a direction, hence the project.

The main aim of this year-long work addresses my personal and professional uses of digital technologies connected to surveillance and privacy, and how I navigate spaces that use tracking technologies or offer little privacy protection. What I’m thinking of with this project is how an everyday person can learn more about the places he/she writes in; how sites and apps collect (or don’t) data; what these companies do with such data; and how governments respond and interact in a digital surveillance state.

As far as where this project will lead? I’m hoping the project will develop into materials and lessons learned for not just educators in rhetoric and writing but for the public as well.

Lindgren: On that note about the potential for public uptake of your work, how do you see your research as something tractable and practical for everyday folks in everyday situations?

Beck: Again, this is what I’m hoping the writing in the surveillance state project will provide--ways everyday people can learn about surveillance and privacy online and make informed decisions on how to participate with a degree of privacy (or not). There’s already a wealth of great work in this area, from the educational materials provided by the Electronic Frontier Foundation (EFF) to the work by activists and employees of the American Civil Liberties Union (ACLU). I suppose I’m just starting to figure out how to bring educational contributions informed by theories, methods, and practices in writing and rhetoric to a more public audience--and, in the process, meet people who work in or across these areas and who continue to inspire projects, like many of the people noted in this interview. It takes a collective to address and work on large-scale issues such as surveillance and privacy, and I’m honored and happy to be part of a large network.

References (provided by Dr. Estee Beck)

Beck, Estee. (2013). Reflecting upon the past, sitting with the present, and charting our future: Gail Hawisher and Cynthia Selfe discussing the community of Computers & Composition. Computers and Composition, 30(4), 349–357.

Beck, Estee N. (2015). The invisible digital identity: Assemblages of digital networks. Computers and Composition, 35, 125–140.

Beck, Estee N., Blair, Kristine L., Grohowski, Mariana C. (Forthcoming). Subverting virtual hierarchies: A cyberfeminist critique of course management spaces. In Jim Purdy and Dánielle Nicole DeVoss (Eds.), Making space: Writing instruction, infrastructure, and multiliteracies. Sweetland Digital Rhetoric Collaborative/University of Michigan Press.

Bolter, Jay David, & Grusin, Richard. (1999). Remediation: Understanding new media. Cambridge, MA: MIT Press.

Brock, Kevin. (2013). Engaging the action-oriented nature of computation: Towards a rhetorical code studies. Dissertation. North Carolina State University.

Clinnin, Kaitlin, DeLuca, Katherine, Manthey, Katie. (2015, May 29). Building a body: Intersections in technology, literacy, and body in online spaces. Papers presented at Computers & Writing. Menomonie, WI.

Crow, Angela. (2013). Managing datacloud decisions and “big data”: Understanding privacy choices in terms of surveillance assemblages. In Heidi McKee & Dánielle Nicole DeVoss (Eds.), Digital Writing Assessment. (chapter 2). Logan, UT: Computers and Composition Digital Press.

Eyman, Douglas. (2012, May 16). On digital rhetoric [Web log post]. Sweetland Digital Rhetoric Collaborative. Retrieved from

Greenwald, Glenn. (2013, June 5). NSA collecting phone records of millions of Verizon customers daily. [Electronic version]. The Guardian. Retrieved from

Hawisher, Gail E., LeBlanc, Paul, Moran, Charles, & Selfe, Cynthia L. (1996). Computers and the teaching of writing in American higher education, 1979–1994: A history. Norwood, NJ: Ablex Publishing.

Hawisher, Gail E., & Selfe, Cynthia L. (1991). The rhetoric of technology and the electronic writing class. College Composition and Communication, 42(1), 55–65.

Hawkes, Lory. (2007). Impact of invasive web technologies on digital research. In Heidi McKee and Dánielle Nicole DeVoss (Eds.), Digital writing research: Technologies, methodologies, and ethical issues. (pp. 337–352). Cresskill, NJ: Hampton Press.

Introna, Lucas. (2011). The Enframing of Code: Agency, originality and the plagiarist. Theory, Culture and Society, 28(6), 113–141.

Janangelo, Joseph. (1991). Technopower and technopression: Some abuses of power and control in computer-assisted writing environments. Computers and Composition, 9(1), 47–64.

Kramer, Adam D. I., Guillory, Jamie E., & Hancock, Jeffrey T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. PNAS, 111(24), 8788–8790.

Lanham, Richard A. (1993). The electronic word: Democracy, technology, and the arts. Chicago: The University of Chicago Press.

Lessig, Lawrence. (2006). Code: Version 2.0. New York: Basic Books.

Losh, Elizabeth M. (2009). Virtualpolitik: An electronic history of government media-making in a time of war, scandal, disaster, miscommunication, and mistakes. Cambridge, MA: MIT Press.

McKee, Heidi. (2011). Policy matters now and in the future: Net neutrality, corporate data mining, and government surveillance. Computers and Composition, 28(4), 276–291.

Reyman, Jessica. (2013). User data on the social web: Authorship, agency, and appropriation. College English, 75(5), 513–533.

Toffler, Alvin. (1980). The third wave: The classic study of tomorrow. New York, NY: Bantam.

VanKooten, Crystal. (2015, April 10). Methodologies for research in digital rhetoric: A survey of an emerging field. Paper presented at the Indiana Digital Rhetoric Symposium. Bloomington, IN.

Vie, Stephanie. (2014). Casual surveillance: Why we should pay attention to Candy Crush Saga and other casual games. First Person Scholar. Retrieved from