Theorycrafting Algorithms: Teaching Algorithmic Literacy

 

Rebekah Shultz Colby—University of Denver

 

Keywords: theorycraft; algorithm; literacy; ethics; intersectional; critical race theory

 


ABSTRACT

Because algorithms form the audiences that reach us online, students need algorithmic literacy tactics as well as rhetorical awareness when learning to write online. Students also need to learn an ethics of relational care: attending to online groups marginalized not only linguistically but also algorithmically. This article examines student writing to explore how students can use theorycrafting to systematically test an algorithm, gaining more critical and ethical awareness of how the algorithm functions and forms publics online. Finally, this article explores how students can create multimodal intersectional counternarratives in response, which they can then circulate more effectively online to construct deliberately inclusive counterpublics.

 


CONTENTS

Introduction

 

Online Publics and the Social Sphere

 

Studying How Algorithmic Circulation Forms Online Publics

 

Theorycrafting Algorithms

 

Teaching Ethical Algorithmic Awareness

 

Students Ethically Theorycrafting Algorithms in the Writing Classroom

 

Teaching Students How to Compose Multimodal Counternarratives

 

Notes

 

Works Cited


 

INTRODUCTION

 

John Trimbur argued over 20 years ago that writing teachers need to attend to the ways that writing circulates outside the classroom. With online writing, this means that today we also need algorithmic literacy: understanding how algorithms operate in circulating writing online. In this way, algorithmic literacy goes hand in hand with rhetorical awareness as we need algorithmic literacy to understand how audiences are formed and networked through algorithms online so that our writing reaches its intended audience. For this reason, John Gallagher has gone as far as to say that algorithms become our audience when we write online (“Writing”). Furthermore, by allowing certain actions while foreclosing others, these algorithms construct ethically embodied arguments when we interact with them (Brown, Ethical Programs) as they also actively co-construct our actions and habits, our ethos, online (Holmes, Rhetoric). 

To teach algorithmic literacy, this article explores how first-year students can theorycraft, or systematically test a specific aspect of how an online algorithm circulates writing. More precisely, this article examines student IMRD research papers from a first-year writing research class I taught where students tested algorithms on social media platforms such as TikTok, YouTube, and Instagram or looked at search algorithms such as Google to analyze a particular aspect of how their algorithms circulate. In this way, students also gained a greater rhetorical awareness of how the algorithm they tested forms online publics and the ethics involved in forming such a public. While this class originally only asked students to research and write about algorithms, in this article, I propose assignments I plan to teach in the future for how students can create multimodal intersectional counternarratives in response to the ethos their theorycrafted algorithm constructs. 


ONLINE PUBLICS AND THE SOCIAL SPHERE

 

When we read and write online, there is no doubt that the algorithms, or sets of processes or procedures run by a computer, within social media and search engines play a central role in shaping not only public discourse but how that public sphere is algorithmically formed as well as culturally conceptualized. To see the powerful force algorithms play in constructing political publics, I could point to the increased circulation of misinformation online contributing to mistaken theories of election fraud and mistrust of the Covid vaccine or the fanning of hate speech leading to the genocide of minority Muslims in Myanmar (Asher). I could also point to the positive activism constructing political publics, from the Arab Spring (Howard, Duffy, Freelon, Hussain, Mari, and Maziad) to Black Lives Matter (Mundt, Ross, and Burnett).

As a result, the public sphere is a notion both complicated and fraught and seems to become only more complicated online. However, this complication is to be expected when scholars show how the idea of the public sphere has been culturally formed. For instance, Michael Warner writes that the idea of a public, especially one that can be mobilized through rhetorical discourse for political action, is “a practical fiction” that has sedimented through use into “a cultural form” (8), as most socially useful forms such as genres do. After all, as a practical, cultural fiction, the notion of the public sphere historically can be traced back to the property and slave-owning ancient Greek men of the polis and then, as Habermas also traces, to when it became an important part of economic and political life for the bourgeois during the Enlightenment, a notion of the public sphere that continues to culturally develop today. 

Nancy Fraser further defines the public sphere as encompassing three distinct entities: “the state, the official-economy of paid employment, and arenas of public discourse” (57). Habermas details how the public sphere developed so that the bourgeois could have some political control over the economic sphere, especially as their control varied depending on economic state controls; as a result, because the public can influence the state through voting in democratic societies, the public sphere is not only essential but meant to be a deliberative political space where opinions about politics can be formed and discussed. 

Fraser, however, critiques Habermas for portraying this deliberative public sphere as fairly unified, arguing that all public spheres are made up of many competing groups, some of which are considered subaltern because they exist beyond the boundaries of the normative culture of the official public sphere and tend to be excluded. She further argues that the more stratified a society is, the more normative the public sphere will be, and the less freedom citizens will have to discuss their political views. Instead, media feed the public their beliefs, often packaged as editorial opinion presented as if it were the opinion of the public itself. It could be argued that before the Internet became a widespread part of household life, US broadcast mass media operated in this vein: citizens were told what to believe and usually had only local outlets in which to deliberate or influence others, as influencing the larger broadcast mass media as fully contributing members of the public sphere was almost impossible.

In contrast to the more normatively controlled and nonparticipatory aspects of broadcast mass media, C. Wright Mills constructs a more fully democratically participatory public, defining a public thus: 

(1) Virtually as many people express opinions as receive them. (2) Public communications are so organized that there is a chance immediately and effectively to answer back any opinion expressed in public. Opinion formed by such discussion (3) readily finds an outlet in effective action, even against—if necessary—the prevailing system of authority. (303–04)

In other words, citizens in Mills’s public sphere have access to a forum where they can deliberatively discuss their views and can take rhetorically effective political action. Specifically, Mills argues that in such a public sphere, authoritarian institutions that would attempt to take away this participatory freedom cannot penetrate. In constructing this definition of a public, Mills was writing in the 1950s when, as Matt Barton and Dan Ehrenfeld put it, “mass media broadcasting’s unidirectional flow of opinion and information appeared to be the greatest obstacle to democratic deliberation” (3).

The internet seems to hold out the promise of creating spaces for Mills’s conception of a public where citizens can more widely circulate their views and even attempt political action. As a result, Warner offers a helpful lens for how the public sphere is constructed that can be applied online, writing that a public can be constructed by “conditions that range from the very general—such as the organization of media, ideologies of reading, institutions of circulation, text genres—to the particular rhetorics of texts” (14). Within this definition, Warner includes “organization of media” and “institutions of circulation,” which, while including traditional institutions of broadcast media, could also include social media and search algorithms online that facilitate larger circulation. Within this framework, Warner argues that counterpublics can form: groups that culturally run counter to the normative values of the larger broadcast public sphere such as queer and African American cultures. Because counterpublics hold a subaltern position to the larger public sphere, they may be silenced by it even though their voices are essential for a fully functioning democratic polis.


STUDYING HOW ALGORITHMIC CIRCULATION FORMS ONLINE PUBLICS

 

In defining the construction of public spheres, Warner stops short of examining how they are formed online and how “institutions of circulation” circulate online discourse to do so. This is unfortunate, as addressing how rhetorical discourse circulates online to form publics is complicated by the fact that online discourse groups are often fragmented and constantly shifting. 

Furthermore, online publics are made even more complex as algorithms control most online circulation, creating a complex collaboration between humans and machines in forming publics. Algorithms are sets of processes or procedures, as Ian Bogost defines them, and sets of tasks users complete, as Kevin Brock defines them. In fact, Bogost goes on to argue that algorithms construct procedural arguments, constructing a procedurally enacted rhetoric through their processes. An algorithm’s procedural logics then co-construct circulation with writers and readers online. In coining the term “rhetorical velocity” for texts that are written and designed to optimize online circulation, Jim Ridolfo and Danielle DeVoss pave the way for further examination of how algorithms shape online circulation. For instance, they mention that Google indexes mailing archives for specific time lengths and that documents should be saved in certain formats to be posted on certain platforms. Furthermore, in addressing how algorithms construct Warner’s “organization of media,” Annette Vee and Timothy Laquintano add that the gatekeeper editors of traditional broadcast publishing and other media have not disappeared online; they have just changed to become algorithmic and now include “corporate giants like Amazon, algorithms that determine bestseller lists, and automated reputation measures” (44).

In studying algorithms and algorithmic circulation, scholars within the digital humanities have constructed the subfield of code studies to study how code constructs meaningfully enacted texts or procedural arguments through algorithms. Mark Marino argues that critical code scholars should not only examine how code operates but also examine it critically to see how it signifies as a social text. Within rhetoric and writing studies, rhetorical code studies has followed Bogost’s lead in examining how code constructs, as James Brown puts it, “compositions and even arguments” (“Crossing” 29). 

Because online circulation is often a complex interplay between human and machine agency, Axel Bruns, Jean Burgess, and Tracy Hayes have studied how the Twitter hashtag constructs online publics that are overlapping, participatory ad hoc publics based on shared beliefs, purposes, or affinities. The hashtag is particularly compelling to study as an interplay of human and machine agency in constructing circulation as it was developed and suggested by Twitter users (Bruns and Burgess). Furthermore, while the hashtag constructs an algorithmic sorting function that controls how tweets circulate, the hashtag is a function that is also actively co-authored by Twitter writers, giving them agency in how their tweets are sorted. As a result, Bruns and Burgess studied how Twitter’s hashtag forms ad hoc publics. They examine how the hashtag allows publics to form around political topics and argue that instead of a fragmented and isolated public forming, the hashtag constructs “a patchwork of overlapping public spheres” (6). Twitter users can then use the hashtag to find any tweet about a related topic. Thus, while users may not agree with everything said in the tweets that result from their hashtag search, they can see the full ecology of opinion that emerges in forming a rhetorical public around their topic with the hashtag. Bruns and Burgess discuss how hashtags are also temporal and can form publics around past, current, or future events as when someone is running for office. Drawing on James Paul Gee and Elisabeth Hayes’ conception of an affinity space after studying how fan groups such as gamers form online, Hayes adds to Bruns and Burgess’ definition of the ad hoc public formed by hashtags by arguing that they also form around a shared purpose or passionate affinity.
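The sorting function Bruns and Burgess describe can be pictured with a toy sketch. The code below is purely illustrative, not Twitter's actual implementation, and the sample tweets are invented; it simply shows how a user-authored tag acts as a sort key that gathers scattered posts into one ad hoc public.

```python
# Toy illustration (not Twitter's code): a hashtag acts as a
# user-authored sort key that pulls scattered tweets, written
# by different people, into one ad hoc public.
tweets = [
    "Polls open at 7am #ausvotes",
    "New sourdough recipe!",
    "Watching the count live #ausvotes",
    "Debate highlights thread #ausvotes #auspol",
]

def by_hashtag(stream, tag):
    """Collect every tweet carrying the tag, whoever wrote it."""
    return [t for t in stream if tag in t.lower()]

public = by_hashtag(tweets, "#ausvotes")
print(len(public))  # three of the four tweets join this public
```

Note that the writers, not the platform, supplied the sorting key: the same list filtered on "#auspol" would assemble a different, overlapping public from the same stream.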

Because algorithms play such a central role in forming online publics through how they circulate writing, recent scholars have studied how writers learn algorithmic logics on social media platforms so that they can optimize how they circulate their writing. In this way, these scholars are examining how online writers gain algorithmic literacy, or an understanding of how algorithms circulate writing to certain audiences, constructing a particular public in the process, a literacy both Vee and Douglas Eyman argue effective online rhetors should possess. In fact, Angela Glotfelter coined the term algorithmic circulation to more clearly focus on how algorithms affect online writing circulation. Conducting a study of YouTube, Facebook, and Twitter, Glotfelter found that her participants learned the logics of the platforms’ algorithms, gaining algorithmic literacy as they participated within them, often experimenting and learning by trial and error. Participants also learned by reading publisher updates and participating in forums about algorithmic updates, and they adapted strategies for boosting their content that worked with a platform’s algorithmic logic. For instance, participants would “avoid genres or phrases known to be algorithmically downgraded” (6), use the meta-data available within the platform to target specific audiences, figure out how to optimally interact with others’ posts to increase engagement with them and their target audience, and use the platform’s built-in metrics to gauge who their audiences were and how many were engaged with their posts.

To help students gain the algorithmic literacy these scholars have studied, many writing teachers have explored how to teach it in their classes so that students can better use algorithms for their rhetorical ends in circulating their writing online. In “Writing for Algorithmic Audiences,” Gallagher has argued that students should be taught to rhetorically compose not only for an audience of people but also for algorithms. Thus, students can “think their audience as the processes and procedures by which YouTube prioritizes their videos” (27) in its circulation algorithm. To teach students algorithmic literacy so that they learn to use algorithms to circulate writing in rhetorically effective ways, some writing teachers offer reflective heuristic questions (Edwards, “Circulation”) or offer open reflective opportunities for students to analyze how algorithms work through their own experiences using them (Gallagher; Koenig). 


THEORYCRAFTING ALGORITHMS

  

However, while getting students to reflect on their experiences using an algorithm is helpful, as students have undoubtedly spent many hours on social media platforms and search engines, algorithms are often proprietarily black boxed, or hidden so that users cannot see them, often, as Vee and Laquintano argue, to prevent users from “gaming” them. Consequently, students may need more information about how algorithms are coded and function to understand them more fully. To that end, in one assignment, Gallagher asks his students to research and read any information online about the algorithm, even asking them to find documentation about that algorithm online (“Ethics”). In another assignment adapted from usability testing, he asks them to test out an algorithm such as Google’s PageRank algorithm with their own websites (“Writing”).

However, Gallagher’s assignment (“Writing”) might be too ambitious, at least for most first-year writing students with limited coding experience. Algorithms parse an immense amount of data, giving them an almost mystical quality, yet they are often doing very simple tasks, just at massive scale (Wachter-Boettcher). Because users may not be able to read an algorithm’s code due to proprietary black boxing, or may lack the literacy to understand the code, many algorithms seem more complex than they are. Consequently, Gallagher’s assignments and his struggles with them (“Writing”) inspired me to develop an assignment asking first-year writing students to theorycraft an aspect of an algorithm. Instead of learning about the entire algorithm, students ran focused experiments that tested one specific aspect of the algorithm to understand how it worked within online circulation. 

Theorycrafting became widely used online within World of Warcraft (WoW) and League of Legends (LoL) gaming communities (Colby and Shultz Colby; Reimer). Theorycrafting, which can also be referred to as meta-gaming, also comes out of a larger tradition of using metrics to game or optimize play within sports. For instance, Michael Lewis details how Billy Beane was able to optimize his baseball team using computer-run statistical analyses (Paul, “Optimizing Play”). However, both Brown in Ethical Programs and Christopher Paul in The Toxic Meritocracy of Video Games argue that theorycrafting meets resistance in sports culture for one main reason: players are human. They are prone to error and luck. They do not run on algorithms or machines. Videogames, however, do run on algorithms. While players are still prone to idiosyncratic behavior and mistakes, the algorithms running the outcomes of their play are not; as a result, the algorithms can be tested more systematically than human sports behavior can. Within gaming, players use theorycrafting to find the most optimal gear, talent tree, or spell/ability combination. When trying to find an optimal weapon such as a sword, players can theorycraft by measuring only one independent variable, the sword, to find the dependent variable, the damage per second (dps). As much as possible, players attempt to control for the outside influence of external variables by wearing the same gear (except for the sword) in the same zone fighting the same type, level, and number of creatures or mobs. They repeat this process for each sword. Then they look at their dps score for each sword, picking the sword that gives them the highest dps on average.
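The analysis step of this sword experiment is simple enough to sketch in a few lines of Python. The sword names and dps readings below are hypothetical stand-ins for what a player's damage meter might record over repeated, controlled trials.

```python
from statistics import mean

# Hypothetical dps readings from repeated, controlled trials:
# same gear (except the sword), same zone, same mobs each run.
trials = {
    "Sword A": [412.3, 398.7, 405.1, 410.9],
    "Sword B": [421.8, 430.2, 418.5, 425.0],
    "Sword C": [407.4, 409.9, 399.2, 404.8],
}

# Average each sword's dps across its trials, then pick the
# highest performer: one independent variable (the sword),
# one dependent variable (mean dps).
averages = {sword: mean(dps) for sword, dps in trials.items()}
best = max(averages, key=averages.get)
print(f"Best sword: {best} ({averages[best]:.1f} dps)")
```

Averaging over several runs is what tames the idiosyncratic, human side of the data: individual pulls vary with luck and player error, but the underlying algorithmic output stabilizes across trials.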

Testing social media algorithms, while slightly more complicated, especially as students may not have readily available metrics on hand such as dps meters, is still roughly comparable. As an illustrative example I gave my class, if students want to measure how much “liking” a specific type of content will affect the types of content that appears on their “For You” feed on TikTok, they can open two new identically created TikTok accounts that use different email addresses. On one account, the control, students do not interact at all with their feed. On the other account, students actively like a particular type of content, a video of a specific dance for instance, recording how many times they liked it. Then, after waiting a day, students can record how many times that particular type of dance video appeared in their feed, averaging it with the other content that appeared, in both the control account and the test account in which the student actively liked the dances. In this way, students can measure how much interacting with their feed in a specific way will affect what type of content circulates within it. While students will not fully understand the complexity of the “For You” TikTok algorithm, they will still gain a fundamental algorithmic literacy tactic about a core aspect of it: how likes affect circulation of specific videos within their feed, which they can then better leverage rhetorically when they post on TikTok.
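The comparison between the control and test accounts reduces to a simple before-and-after proportion. The tallies below are invented for illustration; a student would substitute their own counts of what appeared in each feed after the waiting period.

```python
# Hypothetical tallies of video types seen in each account's
# "For You" feed after one day (100 videos sampled per feed).
control = {"dance": 4, "cooking": 38, "pets": 33, "comedy": 25}
test    = {"dance": 41, "cooking": 21, "pets": 19, "comedy": 19}

def share(feed, category):
    """Fraction of the sampled feed made up of one content type."""
    return feed[category] / sum(feed.values())

# How much did actively liking dance videos shift the feed?
lift = share(test, "dance") - share(control, "dance")
print(f"Control: {share(control, 'dance'):.0%}, "
      f"Test: {share(test, 'dance'):.0%}, lift: {lift:+.0%}")
```

The control account matters here for the same reason the identical gear matters in the sword experiment: without it, a student cannot tell whether the dance videos appeared because of their likes or because the platform was pushing that content to everyone.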

Theorycrafting an aspect of an algorithm works because an algorithm mediates an interaction between user and computer: the user inputs a value, which the algorithm processes and returns as output. Code itself can also be seen as a grammar. Code works by assigning a value to an object variable, the noun, and then calling a method: making the object variable “do” something, the verb. In other words, the code performs a function, often running an algorithm, or a set of procedural steps, frequently involving some mathematical calculation, applied to that object variable. Theorycrafting works as an experiment because it attempts to control for as many variables and functions as possible in order to test how one specific function affects one specific variable as output.

While students are not learning how to code by theorycrafting, they are still learning algorithmic literacy tactics. By learning how the code enactively functions in a systematic way, students are partially unboxing the black box of proprietary code the algorithms run from. Furthermore, theorycrafting shows students how to optimize and possibly even game the algorithm for their own rhetorical circulation purposes.


TEACHING ETHICAL ALGORITHMIC AWARENESS

 

However, it is not enough to understand how an algorithm functions. Brown argues that to be an ethical rhetor online means to also understand the ethics an algorithm constructs with its functions as they powerfully shape our rhetorical choices online and co-construct our ethos as rhetors as a result (Ethical Programs), while also constructing our publics online as well. In fact, Steve Holmes takes algorithmic ethics a step further with procedural habits, arguing that algorithms procedurally form our habits, or, drawing from Aristotle, our hexis, which, over time, form the ethos of our character online (Rhetoric).

To this end, several scholars within rhetoric and writing studies have examined the ethics of specific algorithms and interfaces. Andrew Pilsch examined Facebook’s Flux algorithm and argues that its feed-forward architecture consistently asks for a response, encouraging “hot takes” or knee-jerk, emotional reactions rather than thoughtful, deliberative reflection. Jennifer Sano-Franchini examined how the algorithm works together with design features of the interface to also encourage emotional knee-jerk reactions. She found that the interface emphasized concision, speed, and limited perspectives, creating a filter bubble (Pariser). Further building this reactive filter bubble, identities were decontextualized, as danah boyd also found, so that nuances of political belief or complexities of identity were left out. In fact, Pilsch and Sano-Franchini’s studies are confirmed by the whistleblower Frances Haugen’s release of files from Facebook, as reported in the Wall Street Journal (“Facebook”). Dustin Edwards analyzed Content ID, a database determining copyright and video circulation on YouTube. He found that, while all YouTube creators must submit their videos to Content ID, Content ID only protects creative ownership for corporate entities; ordinary people must defend their authorship from Content ID instead or have their videos removed. This controls who is authorized to circulate online content on YouTube and sets the stage to reinstate the uniflow networks of mass broadcast media (“Circulation”). 

As these more fine-grained examinations of Facebook and YouTube algorithms illustrate, algorithms are often not programmed with ethics in mind in forming online ad hoc publics. In fact, these studies illustrate the dangers algorithms pose in forming filter bubbles, whether corporate, political, or both, that further alienate and polarize the perspectives of ad hoc publics as they restrain our ability to hear diverse voices. At best, algorithms reify corporate interests, as Edwards argues with YouTube’s enactment of copyright in Content ID (“Circulation”), reinstating an older form of the public sphere in which corporate interests control broadcast mass media and ordinary citizens have little to no input. At worst, algorithms such as those on Facebook have the potential to destabilize democracy, especially if misinformation, disinformation, and hate speech are left to circulate unchecked. Algorithms tend to reflect the biases of their programmers or, if algorithms are self-learning and learn from modeling user behavior, they reflect the biases of users (Kearns and Roth; Noble). For instance, Google’s search algorithm models the search phrase trends of users, which can be racist and sexist, further marginalizing minority groups and inhibiting them from forming online ad hoc publics, as Safiya Noble discovered when she typed in the phrase “black girls,” looking for activities for her young nieces, and found porn instead. 

Because algorithms are often not procedurally ethical, scholars within writing studies have conducted studies of ethical interventions of circulation algorithms, either conducting an intervention themselves or studying how others conducted one to leverage it for future rhetorical action within online publics. Steve Holmes and Rachel Graham Lussos programmed a Twitter bot to spout rhetorical protests championing feminist game scholars and journalists and female game designers within GamerGate discussions on Twitter, creating an example of how a machine algorithm can be a rhetorically ethical actor within online discourse (“Cultivating”). Lavinia Hirsu studied how tech-savvy Romanians used the folksonomic logic of popular Google search term tags to create a campaign in which their ethnic identity was changed from search terms labeling Romanians as “racist,” “thieves,” and “stupid,” to “smart” or simply “Roma.” Ryan Shepherd examined how Trump supporters during the 2016 election gamed Reddit by gaining access to the Reddit moderator sticky function so that their Trump posts circumvented the normal sorting algorithms, and then quickly upvoted the posts so that they remained at the top of the Reddit feed. While the actions of the Reddit Trump supporters could be considered unethical, the patterns Shepherd’s Reddit study found could still be leveraged for more ethical purposes.

However, because algorithms form enactive procedural arguments that construct our online publics and our actions and resulting character online, teachers should go one step further in teaching algorithmic literacy and should teach students how to ethically examine an algorithm’s procedural logics. In other words, teachers should teach students to be ethically critical of algorithms instead of merely “functional” users of them, as Stuart Selber argues, so that they have the rhetorical power to use them for more ethical ends. Fortunately, many rhetoric and writing scholars have also taken up the challenge of asking students to be ethically aware of how algorithms operate as they learn about them. Gallagher asks his students to write algorithmic narratives that critically analyze the ethical values algorithms reproduce. Additionally, Abby Koenig asks students to reflect on their experiences using algorithms in journals, which she argues moves students from a surface-level understanding of how the algorithms function—Selber’s functional literacy—to Selber’s critical literacy, in which students also understand the ethics the algorithms were reinforcing. Similarly, Edwards poses a set of heuristic questions to help students better understand how an algorithm’s function creates ethical effects (“Circulation” 72).

However, much like Quintilian exhorting his pupils to be “good” so that they could use rhetoric effectively, rhetoric teachers must also teach ethics if they wish to teach students how to use algorithms ethically. This raises the question of what type of ethics to teach, a question made even more complicated by the fact that ethics is at best a set of flexible guidelines that need to be applied with the practical wisdom of phronesis to the varied contingencies of specific situations. To help solve this conundrum, Shannon Vallor argues for twelve technomoral virtues, such as humility, justice, courage, care, and empathy. Using a virtue ethics framework developed from Aristotle’s Nicomachean Ethics, she argues that because we live in a complex interdependent social network with each other, for eudaimonia, or flourishing, to happen, we need to live in harmony as the flourishing of one group affects the flourishing of another. We can do this by using reciprocal relational care with each other as much as possible. Using a critical race framework alongside Vallor, I argue that an ethics of relational care means not only being aware of marginalized groups online, but also being aware of how technology such as algorithms further marginalizes them. An ethics of relational care also means working to disrupt these marginalizing algorithms so that writing by minority groups becomes as much of a part of the public sphere online as writing from those in the majority.[i]

This definition of ethics as relational care that treats marginal voices equitably within discursive spaces stems from Nedra Reynolds’ definition of ethos as place. Ethos by definition is situated. It is our reputation within a community; however, ethos is also defined by our physical place and how we dwell within that space. In other words, our ethos is constructed by the affordances of our place, whether these are the affordances of a social position within a community or, as Thomas Rickert argues, the material affordances of a specific place, which further construct our actions, and over time, our hexis, or habits (Hawhee; Holmes, Rhetoric). 

In invoking ethos as social reputation within a community that is also constructed within a physical place, Reynolds also reminds us that communities are seldom utopic or monolithic. After all, both women and slaves were forbidden from speaking in public within the Greek polis. As a result, in arguing for a definition of ethics as relational care that works to equitably treat marginal voices, I recognize that I am also echoing 50 years of linguistic struggle within rhetoric and writing studies. The Conference on College Composition and Communication’s (CCCC) Students’ Right to Their Own Language was published in 1974, stating: “We affirm the students’ right to their own patterns and varieties of language—the dialects of their nurture or whatever dialects in which they find their own identity and style.” However, rhetoric and writing studies still struggles to fully affirm marginalized dialects and rhetorical forms within our own classrooms as both April Baker-Bell and Carmen Kynard lament. This struggle for linguistic justice within the writing classroom also persists despite decades of CCCC chair addresses arguing to allow equal rhetorical space for minority voices within our classrooms from scholars such as Shirley Wilson Logan, Keith Gilyard, Jacqueline Jones Royster, Vershawn Ashanti Young, Malea Powell and many others. For instance, in her 2003 CCCC chair address, Logan critiques the superficial multiculturalism often embraced by university administrators by stating, “What we are doing is substituting some version of ‘diversity’ for the hard work of acting affirmatively to correct the consequences of past discrimination and denial of rights, particularly of African Americans. That said, what attention are we paying to our changing linguistic demographics?” (334). 
In other words, within the university, the myth of a monolithic, unchanging Standard English discourse that is deemed more grammatically correct than the grammatical rules governing other dialects has always worked to displace minority dialects and rhetorical forms within the writing classroom despite extensive work from linguists (Canagarajah; Gee; Pennycook) and the scholars noted above within rhetoric and writing studies.

Furthermore, defining ethos as place calls attention to material affordances and how they construct our ethos within a specific space by affording or limiting our actions. However, ethos as material affordance also makes us attend to subjectivity and positionality: as Sara Ahmed argues, some materials may be economically afforded by some, easily “near to hand” (loc. 84), but not by others. This attention is critically important when discussing technology, access, and computer literacies such as algorithmic literacy, a point Cynthia Selfe has consistently made. For instance, in Race, Rhetoric, and Technology, Adam Banks heartbreakingly retells how his high school could not afford a computer lab full of functional computers.

Unfortunately, as materially discursive spaces, online spaces are also hardly utopic, and counterpublics often engage in even more pronounced struggles for linguistic justice, as GamerGate illustrates. In her study of discourse in online spaces, Lisa Nakamura argues that the default subjectivity online is always male, white, middle-class, and heterosexual. In other words, the default body of Habermas’ dominant public sphere is still white, male, and heterosexual, a holdover from the larger constitutive forces of Enlightenment logics. Writers online are held up to this norm and judged against it, especially as this norm is often so culturally normalized that it remains subconscious. Sadly, as a result of this “standardized” norm, it is not surprising that counterpublic groups such as women, BIPOC, and queer writers often experience harassment online (Gelms; Reyman and Sparby). 

Online, furthermore, the white, male, heterosexual default does not end at discourse. It also includes the design of online spaces: from the programming of circulation algorithms to the usability designed within interfaces. As far back as 1994, Cynthia and Richard Selfe critiqued Apple and Windows for using files as the default visual interface, which privileges Western users, specifically the white men who have traditionally been imagined as default office workers. Similarly, Noble found that Google’s search engine was not designed with the needs of young Black women in mind.

Consequently, teaching an ethics of relational care means making students not only aware of these default online norms but also asking them to interrogate and disrupt them, creating discursive spaces for their own transformational counterpublics as a result. Much as Selfe and Selfe discuss with respect to interfaces, this means introducing algorithms by discussing with students whom technology, including algorithms, is not designed for and how these non-default users could be better included by the design. It also means having students read and discuss technological counterstories (Martinez; Solorzano and Yosso) to these design biases, such as Noble’s Algorithms of Oppression. It means asking students to theorycraft algorithms, running targeted and systematic studies of one aspect of how an algorithm functions, to partially uncover how it circulates writing online so that they can better understand the ethics of how it functions and the publics it creates. Finally, it means helping students design rhetorically ethical social media posts in response, which also fosters Selber’s rhetorical technological multiliteracy, and use the algorithmic awareness gained through theorycrafting to circulate them, helping them build counterpublics so that we have more diverse and equitable ad hoc publics online. With these social media posts, students design critically intersectional counternarratives in response to white, male, heterosexual, middle-class biases in design.[ii]

In the rest of this article, I examine student papers from a first-year writing class I taught, which was IRB-exempt, in which students theorycrafted algorithms. These papers illustrate how students can learn more about how circulation algorithms function, gaining algorithmic literacy tactics and a further ethical understanding of those algorithms.[iii] While my students did not post multimodal rhetorical intersectional counternarratives in response to the algorithmic logics they discovered by theorycrafting, the last section proposes a future class in which students do so.


STUDENTS ETHICALLY THEORYCRAFTING ALGORITHMS IN THE WRITING CLASSROOM

 

The purpose of my first-year writing research class is to introduce students to traditional academic textual research using peer-reviewed sources and to primary, field-based research using quantitative and qualitative research methods. In my class, students theorycrafted, or conducted systematic quantitative studies of, circulation algorithms on social media platforms and search engines. To do qualitative research, students also conducted interviews with users or even with the algorithm’s programmers if students had access to them. In conducting their theorycrafting, students looked at a range of algorithms: what factors influence the search function on Google, how what users listen to on Spotify influences what music it recommends, what personal information dating apps privilege, and what influences circulation algorithms on more traditional social media platforms such as Twitter, TikTok, and Instagram. They then wrote IMRD-style papers about their research, including brief literature reviews using academic sources.

To introduce students to theorycrafting algorithms, I not only demonstrated for students how to functionally theorycraft TikTok, but I also situated this algorithmic examination within a larger critical discussion of the ethics of algorithmic functions, drawing on critical race theory as well as computers and writing scholarship such as Selber’s and Selfe and Selfe’s within these discussions. Students were first shown that no technology is neutral: its design always conveys a specific set of ethical and cultural values. Students then examined how technology design constructs our habits, which directly constructs a specific ethos for ourselves, especially over time (Holmes, Rhetoric). Finally, students discussed how design often privileges some users while marginalizing or completely excluding others. For instance, to engage students in examining the implicit values embedded within technology’s structure and use and how these may marginalize minority users, I first had students read an excerpt from Algorithms of Oppression, and we watched The Social Dilemma as a class.[iv] We then discussed problems with social media and how its design, especially its algorithms, may foster some of these problems.

To further introduce students to ethically theorycrafting algorithms, students engaged in low-stakes class activities modeling theorycrafting. These activities matter because, while most games are made of quantitative data such as levels and scores, students rarely think about social media platforms in quantitative ways unless they have previously dug into a platform’s analytics. For instance, to model theorycrafting Google’s search algorithm, I had students do a Google search for the phrase “global warming” and email me the top five websites Google gave them. In Excel, I then charted and graphed the class’s results. While this charting and graphing has the added benefit of showing them how to theorycraft a well-known algorithm, the data also began a conversation about the ethics of Google’s search algorithm when I asked them why they thought they received the top five websites they did and what ethical values this constructs for them as Google users. In this way, while students were tasked with discovering Selber’s functional literacy within the algorithm they were studying, they often critically analyzed how the algorithms functioned ethically within their discussion findings, also demonstrating Selber’s critical literacy.
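For instructors who prefer scripting to Excel, the class tally could be reproduced in a few lines of Python. The sketch below is purely illustrative: the student lists and domains are invented, not data from my class.

```python
from collections import Counter

# Hypothetical top-five results emailed by three students
# (domains invented for illustration).
student_results = [
    ["epa.gov", "nasa.gov", "wikipedia.org", "nytimes.com", "un.org"],
    ["nasa.gov", "epa.gov", "wikipedia.org", "noaa.gov", "un.org"],
    ["epa.gov", "wikipedia.org", "nasa.gov", "un.org", "bbc.com"],
]

# Flatten the lists and count how often each site appears across
# the class—the same tally the Excel chart visualized.
counts = Counter(site for top5 in student_results for site in top5)

for site, n in counts.most_common():
    print(site, n)
```

A chart of these counts immediately surfaces the discussion question above: why do a handful of sites dominate every student’s results?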

Unsurprisingly, within their research projects, students were quite interested in using theorycrafting to partially unbox search engine and engagement algorithms on social media platforms, especially as engagement algorithms may be the most powerful in forming online publics. So that students could track engagement consistently and reliably, I advised that a solid theorycraft project need not investigate more than one variable. This did not mean, however, that more ambitious students did not try to track more. In my class, April[v] tracked five variables on TikTok: trending topics, trending songs, trending hashtags, video length, and the length of time a video had been online. She supported her hypothesis that short videos posted about trending topics, with trending songs and hashtags, had the most interactions and views within the first 12 hours of the post, showing that the TikTok algorithm privileged these variables the most. To gauge engagement, she used the engagement metrics TikTok already provides users, looking at the number of views, likes, comments, and shares. She tracked the same videos, looking at the same trending and non-trending hashtags, songs, and content, over 12-, 24-, and 48-hour periods.
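A theorycraft like April’s amounts to structured data collection. The brief Python sketch below illustrates one way such observations could be recorded and compared at each checkpoint; all numbers and field names are hypothetical, not April’s actual data.

```python
# Hypothetical observations in April's style: each record notes whether a
# video used trending topics/songs/hashtags, its length, and its view
# counts at the 12-, 24-, and 48-hour checkpoints (numbers invented).
videos = [
    {"trending": True,  "length_s": 20, "views": {12: 5200, 24: 9100, 48: 12000}},
    {"trending": True,  "length_s": 15, "views": {12: 4800, 24: 8000, 48: 10500}},
    {"trending": False, "length_s": 60, "views": {12: 300,  24: 700,  48: 1100}},
    {"trending": False, "length_s": 50, "views": {12: 450,  24: 900,  48: 1400}},
]

def mean_views(records, trending, hour):
    """Average views at a checkpoint for trending vs. non-trending posts."""
    sample = [r["views"][hour] for r in records if r["trending"] == trending]
    return sum(sample) / len(sample)

# Compare the two groups at each checkpoint.
for hour in (12, 24, 48):
    print(hour, mean_views(videos, True, hour), mean_views(videos, False, hour))
```

Keeping records in this shape makes it straightforward to test a single variable at a time, as I advised, or to layer in additional variables as April did.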

By tracking TikTok’s engagement algorithms, April was learning algorithmic literacy tactics to understand how to write for the TikTok engagement algorithm in a way similar to how Gallagher argues students should learn to write for algorithms as well as people (“Writing”). By better understanding the posting variables that influenced TikTok’s engagement algorithm, April could also better employ what Edwards terms tactical rhetoric (“On Circulatory”), deliberately working with what she knew optimized engagement in her posts to optimize the rhetorical velocity of her post’s circulation on TikTok. She also exhibited tactics that Glotfelter’s study participants used, such as rhetorically utilizing trending hashtags and content to optimize algorithmic circulation. While April did not completely understand everything there is to know about TikTok’s engagement algorithm or its underlying code, she still has gained enough algorithmic literacy to optimize the circulation of her post in a rhetorically savvy way for her core audience, TikTok’s algorithm, so that she can reach the most human viewers possible. 

Furthermore, by becoming aware of which factors most influence TikTok’s circulation algorithm, April also gained some critical awareness of those factors. For instance, she discovered that videos that are 45 seconds or longer do not circulate as well as shorter content does, writing in the discussion section of her research paper that “[t]he shorter videos have an advantage over the longer ones in that TikTok users do not have to focus on them.” She corroborated this finding with an interview in which a user admits “that shorter videos have a better chance at keeping her attention long enough to finish and interact with them.” Consequently, April critically understands that the TikTok algorithm caters to users with short attention spans and that longer videos that are more substantive and nuanced will not circulate as well.

Another student, Julio, wanted to know if he could uncover shadowbanning on TikTok’s engagement algorithm. Many Black creators have protested TikTok for downplaying, or shadowbanning, their content on the algorithm when it is about race or when a white content creator sings or dances the same content and receives thousands more views. To make matters worse, the Black artist is often the creator of the song or dance, but the white content creator often does not credit the Black artist (Pruitt-Young). While shadowbanning of Black content has yet to be definitively proven, with TikTok’s executives protesting that they support Black creators (Mitchell), executives have admitted to suppressing queer content (Botella). Black content such as Black Lives Matter (BLM) posts may get shadowbanned in a similar way as queer content because of TikTok’s fear of harassment, as content from both groups may provoke offensive reactions from racist and homophobic users. However, instead of targeting those who are racist and homophobic and banning them for their racist and homophobic writing and actions, algorithms will often downgrade engagement, or moderators will take down content deemed “controversial,” because the content deals with race or sexual identity (McCluskey). Taking down or algorithmically downgrading content deemed “controversial” effectively silences those of difference from the ad hoc public sphere of social media and promotes a false public of universal white heteronormativity instead. 

Consequently, to theorycraft shadowbanning, Julio wanted to see if #BLM tags were shadowbanned on TikTok and found that, at least in his limited theorycraft project, they were not. He created two accounts: one where he actively clicked on content with #BLM and another where he did nothing but randomly scroll. After three days of actively counting BLM content and calculating its share of all the content that came across his feed, he found that 61.7% of the “For You” posts were related to BLM on the account where he had actively clicked on content with #BLM tags. In contrast, only 8.3% of the content on the account where Julio had done nothing was BLM related. So, while BLM content is not that popular on the TikTok algorithm normally, users can actively seek out this content by clicking on it, and the algorithm will reflect their BLM interests. 
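The arithmetic behind Julio’s comparison is simply the share of sampled “For You” posts tagged #BLM on each account. The sketch below uses hypothetical tallies chosen to be consistent with the percentages he reported; his actual sample counts are not given in his paper.

```python
def blm_share(blm_posts, total_posts):
    """Percentage of a 'For You' sample tagged #BLM, to one decimal place."""
    return round(100 * blm_posts / total_posts, 1)

# Hypothetical tallies consistent with Julio's reported percentages:
# 37 of 60 sampled posts on the active account, 5 of 60 on the passive one.
active = blm_share(37, 60)    # account that clicked #BLM content
passive = blm_share(5, 60)    # account that only scrolled

print(active, passive)  # 61.7 8.3
```

However simple, this calculation is the core of the theorycraft: the same feed-sampling procedure, run on two accounts that differ in only one behavior, isolates the effect of clicking #BLM content.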

In uncovering how the TikTok algorithm privileges content, Julio also gained ethically critical awareness about it. He argued that the algorithm still suppressed Black content as his study showed that the algorithm did not naturally promote it unless a hashtag like BLM was attached. In his discussion, he wrote, “This discovery I think would cause turmoil across the platform and especially with Black content creators as only a select amount of people are able to see their posts. They would feel suppressed as their information they post cannot be accessed worldwide and [can] only [be] looked at if specifically searched for.” Consequently, Julio ethically interrogates his theorycraft findings, critiquing ways that the algorithm still engages in shadowbanning. 

Students were also interested in theorycrafting Google’s algorithm, specifically how searches varied between different users. August created a new Google account and compared it to his usual account, looking specifically at his search results for shoes. In his usual account, he found an entire line of pictures advertising specific shoes from the shoe store Zalando; in total, nine of the 11 ads were from Zalando, his favorite shoe store. On his new account, his shoe search contained no ads and only three websites of shoe stores, of which Zalando was only one. This theorycrafting showed August how Google prioritizes personal advertising based on past search history and effectively monetizes the search function, which many users may erroneously think of as a neutral function. 

While August’s theorycraft was limited to two accounts, the algorithmic literacy tactics he gained from it still point to the fact that Google’s ads, while clearly marked, can blur Fraser’s economic and state public spheres. As such, August critically discovered that the Google algorithm generates ad content based on previous searches, rhetorically alerting him to the fact that the Google search engine is not neutral: it prioritizes certain search results over others based on a complex formula derived from a user’s previous searches and on how many websites hyperlink to a search result using the PageRank formula (Pariser; Vaidhyanathan). Furthermore, these findings made August critically aware that the Google search engine preys on customer demographic information derived from user search histories, which users give away for free in exchange for the convenience the search algorithm provides. August wrote in his discussion section, “This means that the data is essentially willingly given away in order to make the search experience better.”
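For readers unfamiliar with PageRank, the formula works by letting each page’s rank flow to the pages it links to, iterated until the values settle. The Python sketch below is a minimal illustration of this core idea only, with an invented three-page link graph and the damping factor of 0.85 from the original published description; Google’s production search combines PageRank with many other signals, including the search-history personalization August uncovered.

```python
def pagerank(links, d=0.85, iters=50):
    """Simplified PageRank by power iteration.

    links: dict mapping each page to the list of pages it links to.
    d: damping factor (probability of following a link vs. jumping randomly).
    """
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}          # start with equal rank
    for _ in range(iters):
        new = {v: (1 - d) / n for v in nodes}   # "random jump" share
        for v, outs in links.items():
            if outs:
                share = rank[v] / len(outs)     # rank flows along links
                for u in outs:
                    new[u] += d * share
            else:
                for u in nodes:                 # dangling page: spread evenly
                    new[u] += d * rank[v] / n
        rank = new
    return rank

# Invented graph: pages "a" and "b" both link to "c"; "c" links back to "a".
ranks = pagerank({"a": ["c"], "b": ["c"], "c": ["a"]})
print(ranks)
```

Even in this toy graph, the most-linked-to page ends up ranked highest, which is the intuition behind why heavily hyperlinked websites dominate search results.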


TEACHING STUDENTS HOW TO COMPOSE MULTIMODAL COUNTERNARRATIVES 

 

As my examination of student theorycrafting demonstrates, teaching students to partially unbox an algorithm to learn how it circulates writing not only teaches valuable algorithmic literacy tactics but also teaches students how to use these same logics to circulate their own rhetoric online effectively. However, because theorycrafting can hand students a great deal of rhetorical power, teaching it also entails teaching how algorithms construct an ethical ethos for users and how to use these algorithmic literacy tactics ethically. Finally, teaching students how to theorycraft allows them to use the algorithmic literacy they have gained to construct rhetorically effective, critically intersectional counternarratives in response to the unethical ways algorithms can operate, a critical step in fostering counterpublics that create a more diverse ad hoc public online. 

In reflecting about what their theorycrafting research projects revealed about their algorithm’s ethics, students could also be asked to construct critically intersectional counternarratives that act as counterarguments to the procedurally enacted ethics the algorithms construct. In this way, students are not just ethically critiquing the algorithms in their studies, engaging in Selber’s critical literacy, but they are also engaging in Selber’s rhetorical literacy in using what they have learned about specific algorithmic circulations to rhetorically effect ethical change. Unfortunately, because of the time limitations of a 10-week quarter, I did not ask students to compose multimodal counternarratives in my class. In this section, I outline a pedagogy for a future class that teaches students to construct multimodal critically intersectional counternarratives for their intended online audiences as a rhetorical intervention to the algorithm’s unethical processes.

To prepare students to design rhetorically effective multimodal posts and videos, I will introduce visual design concepts such as Robin Williams’s contrast, repetition, alignment, and proximity, as well as Gunther Kress and Theo van Leeuwen’s design salience and information value, so that students understand how images, text, and layout work together for rhetorical effect and purpose. Using these principles, we will analyze social media videos and picture posts. Then we will turn class into a studio space where students design their videos and posts. While I will offer tutorial resources for common video editing applications, I will also allow students to design with whatever image and video editing software they feel comfortable using while encouraging students to help each other. Finally, students will create social media campaign plans in which they identify what social media platform they will use, who their ideal target audience will be, why this is their ideal audience, how they will use the algorithm to target this audience, what the ethical message of their campaign is, and what multimodal written and visual design elements they will use to rhetorically make their ethical message clear. However, because counterpublics (i.e., women, BIPOC, LGBTQ) tend to be targets of online harassment, as Bridget Gelms, Jessica Reyman, and Erica Sparby show, students who do not feel comfortable widely circulating their message online can instead explain why and how they will minimize this circulation within their social media campaign plan.

Students can then use their algorithmic literacy to form counternarratives to the silencing norms circulating more widely within the social media platform, creating spaces for counterpublics within the more dominant online public sphere visible on that platform. For instance, after Julio tested shadowbanning with his theorycraft, he could create a multimodal post sharing his finding that he did not find direct evidence of shadowbanning on TikTok. If he worked together with April’s theorycraft on TikTok’s engagement metrics, however, he could still design video posts deliberately promoting BLM content through TikTok’s engagement algorithm to ensure that more mainstream white adolescent audiences were exposed to such content: pairing a BLM hashtag with a currently trending hashtag, using a currently trending song in his video, and making sure that his BLM content was integrated with currently trending content to optimize circulation. In this way, he would be leveraging the algorithmic literacy tactics he gained from theorycrafting, using Edwards’s tactical rhetorics, to ensure that BLM content is not shadowbanned on the TikTok platform and receives optimal circulation instead. He could also include a shoutout to Black artists whose songs and dances are the most popular but who have not received the credit they deserve. For instance, Julio could use a song created by a Black artist in his video, deliberately citing the artist’s name to rhetorically counter uncredited appropriation and shadowbanning. In using the rhetorical knowledge about algorithmic circulation gained from his theorycrafting, Julio could also offer a corrective to the false white heteronormative TikTok ad hoc public created by shadowbanning practices and promote a more realistic, healthier ad hoc public that includes minority voices. 
This student example thus illustrates how asking students to compose ethical multimodal intersectional counternarratives is critically important in constructing an equitable ad hoc online public in which counterpublic minority voices are heard.

Finally, students can reflect on what happened when they circulated their multimodal counternarrative posts, using their algorithmic circulation knowledge gained from theorycrafting by answering a series of reflective questions. Were they successful in creating online spaces for counterpublics and what did they learn in attempting to do so? For instance, were students able to reach their intended audience effectively? If not, what else should they find out about how the algorithm operates so that they will be better equipped to reach this audience in the future? If they did reach their intended audience, what was the rhetorical impact? How did audiences react? Why did students find these audience reactions rhetorically effective or not? If audiences did not respond in ways students wanted, how could students rhetorically change their message with their writing and visuals in the future? Thus, students can reflect on how to be more rhetorically effective in future posts.

In conclusion, theorycrafting can be an accessible yet powerful way for students to partially demystify the black box of how social media algorithms operate. As such, students gain algorithmic literacy tactics in using the algorithms to their own rhetorical ends. For instance, through theorycrafting, students can discover how algorithms circulate writing and use this knowledge to circulate their own writing to their intended audiences. However, students also need to be taught to theorycraft within an ethical framework: they need to know how algorithms construct an ethos for their users, and they also need to be guided to use their algorithmic literacy tactics in rhetorically ethical ways, especially ways that value the ad hoc publics of minority counterpublic voices online.


NOTES 


[i] While Aristotle infamously believed that only Greek men were capable of eudaimonia, excluding all other groups such as women, slaves, and foreigners, Vallor argues that this is false and true flourishing can only happen if relations of care are extended to all groups.

[ii] Because most of my students are also white and middle to upper class, I do not want to call their critical narratives counterstories, even if they examine race. They are critical responses to the biases of design, so I call them critically intersectional counternarratives instead as they still follow the critical race theory work of decentering white male privilege in design and creating larger spaces online for marginalized groups.

[iii] The Institutional Review Board at the University of Denver determined this study, #1859394-1, exempt from full IRB oversight. 

[iv] Students could also listen to the Facebook Files from the podcast The Journal, produced by The Wall Street Journal, which explores how Facebook’s engagement algorithms foster online hate groups and the spread of misinformation.

[v] I am using pseudonyms for my students’ names.


WORKS CITED 

Ahmed, Sara. Queer Phenomenology: Orientations, Objects, Others. Duke UP, 2006.

Asher, Saira. “Myanmar Coup: How Facebook Became the ‘Digital Tea Shop.’” BBC News, 4 Feb 2021, https://www.bbc.com/news/world-asia-55929654.

Baker-Bell, April. Linguistic Justice: Black Language, Literacy, Identity, and Pedagogy. Routledge, 2020.

Banks, Adam. Race, Rhetoric, and Technology: Searching for Higher Ground. National Council of Teachers of English, 2008.

Barton, Matt, and Dan Ehrenfeld. “Online Public Spheres in the Era of Fake News: Implications for the Composition Classroom.” Computers and Composition, vol. 54, 2019, https://doi.org/10.1016/j.compcom.2019.102525.

Bogost, Ian. Persuasive Games: The Expressive Power of Videogames. MIT P, 2007.

Botella, Elena. “TikTok Admits It Suppressed Videos by Disabled, Queer, and Fat Creators.” Slate, 4 Dec. 2019, https://slate.com/technology/2019/12/tiktok-disabled-users-videos-suppressed.html.

boyd, danah. It’s Complicated: The Social Lives of Networked Teens. Yale UP, 2014.

Brock, Kevin. “Enthymeme as Rhetorical Algorithm.” Present Tense: A Journal of Rhetoric in Society, vol. 4, no. 1, pp. 1–8.

Brown, James. Ethical Programs: Hospitality and the Rhetorics of Software, U of Michigan P, 2015.

---. “Crossing State Lines: Rhetoric and Software Studies.” Rhetoric and the Digital Humanities, edited by Jim Ridolfo and William Hart-Davidson, U of Chicago P, 2015, pp. 20–33.

Bruns, Axel, and Jean Burgess. “The Use of Twitter Hashtags in the Formation of Ad Hoc Publics.” Proceedings of the 6th European Consortium for Political Research (ECPR) General Conference 2011. The European Consortium for Political Research (ECPR), 2011. 

Canagarajah, Suresh. Translingual Practice: Global Englishes and Cosmopolitan Relations. Routledge, 2013.

Colby, Richard, and Rebekah Shultz Colby. “A Pedagogy of Play: Integrating Computer Games into the Writing Classroom.” Computers and Composition, vol. 25, 2008, pp. 300–12.

DeVoss, Danielle, and Jim Ridolfo. “Composing for Recomposition: Rhetorical Velocity and Delivery.” Kairos, vol. 13, no. 2, 2009, http://kairos.technorhetoric.net/13.2/topoi/ridolfo_devoss/intro.html.

Edwards, Dustin. “Circulation Gatekeepers: Unbundling the Platform Politics of YouTube’s Content ID.” Computers and Composition, vol. 47, 2018, pp. 61–74.

---. “On Circulatory Encounters: The Case for Tactical Rhetorics.” Enculturation, vol. 25, 2017, http://enculturation.net/circulatory_encounters.

Eyman, Douglas. Digital Rhetoric: Theory, Method, Practice. U of Michigan P, 2015.

“Facebook Files.” The Journal, from The Wall Street Journal, 8 December 2021, https://www.wsj.com/articles/the-facebook-files-a-podcast-series-11631744702.

Fraser, Nancy. “Rethinking the Public Sphere: A Contribution to the Critique of Actually Existing Democracy.” Social Text, vol. 25/26, 1990, pp. 56–80.

Gallagher, John. “The Ethics of Writing for Algorithmic Audiences.” Computers and Composition, vol. 57, 2020, 102583.

---. “Writing for Algorithmic Audiences.” Computers and Composition, vol. 45, 2017, pp. 25–35.  

Gee, James Paul. How to Do Discourse Analysis: A Toolkit. Routledge, 2011.

Gee, James Paul, and Elisabeth Hayes. Women and Gaming: The Sims and 21st Century Learning. Palgrave Macmillan, 2010.

Gelms, Bridget. “Volatile Visibility: How Online Harassment Makes Women Disappear.” Digital Ethics: Rhetoric and Responsibility in Online Aggression, edited by Jessica Reyman and Erica Sparby, Routledge, 2020, pp. 179–94.

Gilyard, Keith. “Literacy, Identity, Imagination, Flight.” College Composition and Communication, vol. 52, no. 2, 2000, pp. 260–272.

Glotfelter, Angela. "Algorithmic Circulation: How Content Creators Navigate the Effects of Algorithms on their Work." Computers and Composition, vol. 54, 2019, 102521.

Habermas, Jurgen. The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society, translated by Thomas Burger, MIT P, 1991.

Hawhee, Debra. Bodily Arts: Rhetoric and Athletics in Ancient Greece. U of Texas P, 2004.

Hirsu, Lavinia. "Tag Writing, Search Engines, and Cultural Scripts." Computers and Composition, vol. 35, 2015, pp. 30–40.

Holmes, Steve. The Rhetoric of Videogames as Embodied Practice: Procedural Habits. Routledge, 2018.

Holmes, Steve, and Rachel Graham Lussos. “Cultivating Metanoia in Twitter Publics: Analyzing and Producing Bots of Protest in the #GamerGate Controversy.” Computers and Composition, vol. 48, 2018, pp. 118–38.

Howard, Philip, Aiden Duffy, Deen Freelon, M. M. Hussain, Will Mari, and Marwa Maziad. “Opening Closed Regimes: What Was the Role of Social Media During the Arab Spring?” Social Science Research Network, 2011, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2595096.

Jones Royster, Jacqueline. “When the First Voice You Hear Is Not Your Own.” College Composition and Communication, vol. 47, no. 1, 1996, pp. 29–40.

Kearns, Michael, and Aaron Roth. The Ethical Algorithm: The Science of Socially Aware Algorithm Design. Oxford UP, 2019.

Koenig, Abby. “The Algorithms Know Me and I Know Them: Using Student Journals to Uncover Algorithmic Literacy Awareness.” Computers and Composition, vol. 58, 2020, 102611.

Kress, Gunther, and Theo van Leeuwen. Reading Images: The Grammar of Visual Design. Routledge, 1996.

Kynard, Carmen. Vernacular Insurrections: Race, Black Protest, and the New Century in Composition-Literacy Studies. State U of New York P, 2013.

Laquintano, Timothy, and Annette Vee. “How Automated Writing Systems Affect the Circulation of Political Information Online.” Literacy in Composition Studies, vol. 5, no. 2, 2017, pp. 43–62, https://doi.org/10.21623/1.5.2.4.

Marino, Mark. Critical Code Studies, MIT P, 2020.

Martinez, Aja. Counterstory: The Rhetoric and Writing of Critical Race Theory. National Council of Teachers of English, 2020.

McCluskey, Megan. “These TikTok Creators Say They’re Still Being Suppressed for Posting Black Lives Matter Content.” Time, 22 July 2020, https://time.com/5863350/tiktok-black-creators/.

Mills, C. Wright. The Power Elite. Oxford UP, 1999.

Mitchell, Taiyler Simone. “Black Creators Say TikTok’s Algorithm Fosters a ‘Consistent Undertone of Anti-Blackness.’” Insider, 24 Aug. 2021, https://www.insider.com/a-timeline-of-allegations-that-tiktok-censored-black-creators-2021-7.

Mundt, Marcia, Karen Ross, and Charla Burnett. “Scaling Social Movements Through Social Media: The Case of Black Lives Matter.” Social Media and Society, vol. 4, no. 4, 2018, pp. 1–14.

Nakamura, Lisa. Cybertypes: Race, Ethnicity, and Identity on the Internet. Routledge, 2002.

Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York UP, 2018.

Pariser, Eli. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. Penguin Books, 2011.

Paul, Christopher. “Optimizing Play: How Theorycraft Changes Gameplay and Design.” Game Studies, vol. 11, no. 2, 2011, http://www.gamestudies.org/1102/articles/paul.

---. The Toxic Meritocracy of Video Games: Why Gaming Culture Is the Worst. U of Minnesota P, 2018.

Pennycook, Alastair. Language as a Local Practice. Routledge, 2010.

Pilsch, Andrew. “Events in Flux: Software Architecture, Detractio, and the Rhetorical Infrastructure of Facebook.” Computers and Composition, vol. 57, 2020, 102584.

Powell, Malea. “2012 CCCC Chair’s Address: Stories Take Place: A Performance in One Act.” College Composition and Communication, vol. 64, no. 2, 2012, pp. 383–406.

Pruitt-Young, Sharon. “Black TikTok Creators Are on Strike to Protest a Lack of Credit for Their Work.” National Public Radio, 1 July 2021, https://www.npr.org/2021/07/01/1011899328/black-tiktok-creators-are-on-strike-to-protest-a-lack-of-credit-for-their-work.

Reimer, Cody. “Toward a Broader Conception of Theorycrafting.” The Ethics of Playing, Researching, and Teaching Games in the Writing Classroom, edited by Richard Colby, Matthew S. S. Johnson, and Rebekah Shultz Colby, Palgrave Macmillan, 2021, pp. 295–314.

Reyman, Jessica, and Erica Sparby. “Introduction: Toward an Ethic of Responsibility in Digital Aggression.” Digital Ethics: Rhetoric and Responsibility in Online Aggression, edited by Jessica Reyman and Erica Sparby, Routledge, 2020, pp. 1–14. 

Reynolds, Nedra. “Ethos as Location: New Sites for Understanding Discursive Authority.” Rhetoric Review, vol. 11, no. 2, 1993, pp. 325–38.

Rickert, Thomas. Ambient Rhetoric: The Attunement of Rhetorical Being. U of Pittsburgh P, 2013.

Sano-Franchini, Jennifer. “Designing Outrage, Programming Discord: A Critical Interface Analysis of Facebook as a Campaign Technology.” Technical Communication, vol. 65, no. 4, 2018, pp. 387–410.

Selber, Stuart. Multiliteracies for a Digital Age. Southern Illinois UP, 2004.

Selfe, Cynthia. Technology and Literacy in the Twenty-first Century: The Importance of Paying Attention. Southern Illinois UP, 1999.

Selfe, Cynthia, and Richard Selfe. “The Politics of the Interface: Power and Its Exercise in Electronic Contact Zones.” College Composition and Communication, vol. 45, no. 4, 1994, pp. 480–504.

Shepherd, Ryan. “Gaming Reddit’s Algorithm: r/the_donald, Amplification, and the Rhetoric of Sorting.” Computers and Composition, vol. 56, 2020, 102572.

Solorzano, Daniel, and Tara Yosso. “Critical Race Methodology: Counter-Storytelling as an Analytical Framework for Education Research.” Qualitative Inquiry, vol. 8, no. 1, 2002, pp. 23–44.

“Students’ Right to Their Own Language.” College Composition and Communication, vol. 25, no. 3, 1974, pp. 1–32.

Trimbur, John. “Composition and the Circulation of Writing.” College Composition and Communication, vol. 52, no. 2, 2000, pp. 188–219.

Vaidhyanathan, Siva. Antisocial Media: How Facebook Disconnects Us and Undermines Democracy. Oxford UP, 2018.

Vallor, Shannon. Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting. Oxford UP, 2016.

Vee, Annette. Coding Literacy: How Computer Programming Is Changing Writing. MIT P, 2017.

Wachter-Boettcher, Sara. Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. W. W. Norton & Company, 2017.

Warner, Michael. Publics and Counterpublics. Zone Books, 2002.

Williams, Robin. The Non-designer’s Design Book: Design and Typographic Principles for the Visual Novice. Peachpit Press, 1994.

Wilson Logan, Shirley. “Changing Missions, Shifting Positions, and Breaking Silences.” College Composition and Communication, vol. 55, no. 2, 2003, pp. 330–42.

Young, Vershawn Ashanti. “2020 CCCC Chair’s Address: Say They Name in Black English: George Floyd, Breonna Taylor, Atatiana Jefferson, Aura Rosser, Trayvon Martin, and the Need to Move Away from Writing to Literacies in CCCC and Rhetoric and Composition.” College Composition and Communication, vol. 72, no. 4, 2021, pp. 623–39.