Writing Educator Responsibilities for Discussing the History and Practice of Surveillance & Privacy in Writing Classrooms

Estee N. Beck

Opening Remarks

Admittedly, I'm a new kid on the block when it comes to talking about issues within writing infrastructures (cf. DeVoss, Cushman, & Grabill, 2005). However, because of the media attention surrounding the Edward Snowden disclosures and an interest in how surveillance operates beneath any front-end interface, surveillance and privacy flip a switch for me. Thus, I was honored to speak with fellow digital writing studies colleagues—Angela Crow, Jennifer DeWinter, Laura Gonzales, Heidi McKee, Colleen Reilly, and Stephanie Vie—at a 2015 Computers & Writing Town Hall on the topics of surveillance, privacy, and net neutrality. In the talk, I asked the audience members two questions:

  1. What responsibility, if any, do writing teachers have in teaching their students about surveillance and privacy in writing classrooms?
  2. In addition, if writing educators do have a responsibility to do so, how might we integrate such discussions and projects into existing curricula?

I want to answer these two questions—in this section for the 20th anniversary issue of Kairos—by speaking back to the legacy of scholarship addressing surveillance and privacy in writing studies research. I also want to point a way forward for readers when integrating such discussions into virtual and physical classrooms.

The first section plays with a provocation of an imagined surveillance state, taken from the fictional vision of Dave Eggers's The Circle, but teases out how such a future may be in the making. In the next section, I provide a brief literature review to piece together the work in this area for future researchers and educators. Finally, I answer the two questions that frame this section by providing concrete pathways of action for the classroom, the community, and global digital spaces.

A Provocation

". . . But you're now the ambassador. You're the face of it. The benign, friendly face of it all. And the closing of the Circle—it's what you and your friend Francis made possible. Your mandatory Circle account idea, and his chip. TruYouth? It's sick, Mae. Don't you see? All the kids get a chip embedded in them, for safety, when they're infants. And yes, it'll save lives. But then, what, you think they suddenly remove them when they're eighteen? No. In the interest of education and safety, everything they've done will be recorded, tracked, logged, analyzed—it's permanent. Then, when they're old enough to vote, to participate, their membership is mandatory. That's where the Circle closes. Everyone will be tracked, cradle to grave, with no possibility of escape" —Dave Eggers, The Circle, p.481.

In The Circle, Eggers imagines a post-information age revolution, where a megalithic Silicon Valley company, the Circle, provides all-inclusive storage of individuals' personal data. Eggers fictionalizes a company with tentacles in every market, holding over 88% of search, mobile, and free-mail Internet services. Examples include a service named "TruYou," a treasure trove of data from bank accounts, credit cards, emails, passwords, social media accounts, and other such items connected with consumerism, and later a service named "TruYouth," a type of tag-and-track technology for children, marketed to assist law enforcement officials and concerned parents in locating missing children. The large-scale tracking and storage of human movement, habits, and values feels unsettling, right? Is it possible that one company with vast resources could (or would) conceive, engineer, and socially market such all-encompassing products? Well, consider the following:

Blackboard and Canvas, two learning management systems commonly used within universities, use data analytics to track student engagement, including the amount of time students spend logged into their systems and their clicks across modules. This data is available for faculty to use, but students cannot view what is collected about them.
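To make that tracking concrete, here is a minimal sketch, in Python, of how engagement metrics might be computed from a click log. The event format and field names are invented for illustration; actual LMS analytics pipelines are proprietary and far more elaborate.

    from collections import defaultdict

    # Hypothetical LMS event log: (student_id, module, event_type, timestamp in seconds).
    # Invented data and format; real LMS analytics code is proprietary.
    events = [
        ("s01", "week-1", "login", 0),
        ("s01", "week-1", "click", 40),
        ("s01", "week-1", "logout", 900),
        ("s02", "week-1", "login", 10),
        ("s02", "week-1", "logout", 70),
    ]

    time_logged = defaultdict(int)                  # total seconds per student
    clicks = defaultdict(lambda: defaultdict(int))  # clicks per student, per module
    last_login = {}

    for student, module, event, ts in events:
        if event == "login":
            last_login[student] = ts
        elif event == "logout" and student in last_login:
            time_logged[student] += ts - last_login.pop(student)
        elif event == "click":
            clicks[student][module] += 1

    for student in sorted(time_logged):
        print(student, time_logged[student], dict(clicks[student]))

Five log lines are enough to rank two students by "engagement," which is exactly the asymmetry at issue: the computation is trivial for whoever holds the log, and invisible to the students inside it.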

Facebook, the social media giant, collects millions of metadata records from its users for advertising revenue. According to Facebook's Data Policy (2015a), the company collects myriad data points, including data related to location, file creation, content engagement, frequency and duration of use, network usage, payment arrangements, device information, third-party data from partnering websites and apps, and more. While Facebook uses the data to provide and develop services and assures users that the data is safe and protected, the company also provides nonpersonally identifiable information to advertising partners. In the first quarter of 2015, Facebook (2015b) reported revenue of $3.54 billion, with 73% of its advertising revenue coming from mobile.

Amazon, an electronic commerce company, and Netflix, a provider of on-demand streaming media, offer what they call recommendations. Amazon uses algorithms to suggest content based on customers' ratings of products, likes, and purchases. Netflix's "Taste Preferences" similarly employs a recommendation algorithm accounting for streaming history, ratings, and the availability of movies and television shows, along with drawing on content from other members with similar viewing interests. Combined, the sites' algorithms provide data to the companies to drive profit.

Figure: A screen shot of Estee's Amazon recommendations based upon recent book orders and her Amazon wish list as of August 15, 2015.
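Neither Amazon nor Netflix discloses its production models, but a classic technique in the same family is user-based collaborative filtering: score the items a person has not yet rated by how similarly other customers with shared tastes rated them. Here is a minimal sketch with invented users, items, and ratings:

    import math

    # Invented ratings matrix (user -> {item: rating}); purely illustrative.
    ratings = {
        "ana":  {"book_a": 5, "book_b": 3, "book_c": 4},
        "ben":  {"book_a": 4, "book_b": 3, "book_d": 5},
        "cleo": {"book_b": 2, "book_c": 5, "book_d": 4},
    }

    def cosine(u, v):
        """Cosine similarity over the items two users have both rated."""
        shared = set(u) & set(v)
        if not shared:
            return 0.0
        dot = sum(u[i] * v[i] for i in shared)
        norm_u = math.sqrt(sum(u[i] ** 2 for i in shared))
        norm_v = math.sqrt(sum(v[i] ** 2 for i in shared))
        return dot / (norm_u * norm_v)

    def recommend(user):
        """Rank items the user has not rated, weighted by neighbor similarity."""
        scores = {}
        for other, their_ratings in ratings.items():
            if other == user:
                continue
            sim = cosine(ratings[user], their_ratings)
            for item, rating in their_ratings.items():
                if item not in ratings[user]:
                    scores[item] = scores.get(item, 0.0) + sim * rating
        return sorted(scores, key=scores.get, reverse=True)

    print(recommend("ana"))  # ['book_d']

Even this toy version makes the underlying bargain visible: the recommendation works only because every rating and purchase has been retained and can be compared against everyone else's.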

Google, a multinational technology company, uses location services, search tracking, and other methods to tailor user experience across its Internet-related services and products. Google's tracking antennae reach far and deep, so much so that simply googling "how Google tracks people" yields over 84,000,000 results (as of August 7, 2015), with pages of reporting on how, when, where, and why the company surveils its global customer base.

Now, undeniably, I'm framing this hard data with a dash of fallacy for attention. After all, the fictionalized media company Eggers visualizes is a totalitarian regime and reportedly (according to Eggers) is not based on any real-life tech company (cf. McSweeney's). However, these six companies—all online—use digital technologies and share multiple commonalities: servers upon servers of big data, vast storage space for people to share content and ideas, and services marketed socially as ways to share and connect online.

Since 2013, the U.S. public has had a vested interest in surveillance and privacy because of revelations about the National Security Agency's (NSA) telephone metadata collection, disclosed by former private contractor Edward Snowden and reported by then-Guardian journalist Glenn Greenwald (2013). The story of the NSA—harnessing and exploiting a regulated and controlled network—provides a touch point for contemporary educators to explore issues of digital surveillance and privacy within and outside of classrooms.

However, just as international researchers in the field of surveillance studies have been discussing issues of video and digital surveillance and privacy for decades, so have scholars and educators in computers and writing, technical communication, and writing and rhetoric studies.

Overview of Surveillance and Privacy Scholarship in Writing Studies

In 1991, Gail Hawisher and Cynthia Selfe planted a seed for future research into surveillance and privacy practices connected with course management systems (CMSs). These systems—later repackaged as learning management systems, or LMSs, with more bells and whistles than their CMS predecessors—allowed instructors to deliver course content in virtual spaces to students in both distance and hybrid courses. Programmers of early CMS technologies coded a surveillance apparatus into the platforms, giving educators methods of tracking student performance.

Noting this technological advancement, Hawisher and Selfe (1991) warned writing educators about the effects surveillance might have upon relationships with students:

Instructors inspecting electronic spaces and networked conversation have power that exceeds our expectations or those of students. In addition, many students who know a teacher is observing their conversation will self-discipline themselves and their prose in ways they consider socially and educationally appropriate. (p. 63)

This early investigation, perhaps one of the first examinations in writing studies of surveillance in online writing spaces, addressed a responsibility of educators: protecting student learning from potential corruption. In that same year, Joseph Janangelo (1991) also warned educators about the abuses of power and control in computer-mediated writing spaces because of the surveillance techniques then available to teachers through the monitoring of composing and computer time. These early works are compelling because they describe the modification of behaviors and the shifting relations between students and teachers.

Figure: A panoptic prison structure.

Since those early conversations, scholars have continued to address surveillance and privacy concerns with course and learning management systems (cf. Beck, Grohowski, & Blair, 2015; Payne, 2005). Scholars have also traced the effects of surveillance upon the student/teacher relationship when the panoptic logic of plagiarism-detection systems engulfs academic integrity (Zwagerman, 2008), and have shown how Turnitin, for example, operates as a surveillance apparatus through its digital archive of student papers (Purdy, 2009).

Further discussions have emerged about the potential harm digital researchers may face when collecting data for scholarly projects, as Lory Hawkes (2007) has chronicled. Heidi McKee (2011) discussed tracking technologies, data mining, and government surveillance and called for teachers to educate each other and students about data-mining practices. Jessica Reyman (2013) presented a case for the effects data mining has upon intellectual property and agency on the Web. Angela Crow (2013) illustrated the intersections of digital surveillance and big data connected to writing program portfolios, as well as the concerns educators and students may have about student identity in online spaces. Stephanie Vie (2014) has discussed the role of privacy policies—often long legal documents most people do not read because of their cumbersome nature—in connection with computer games. In addition, I have talked about the invisible digital identity computer algorithms create about people, and how those data points shape what people experience online (Beck, 2015).

We've been having these conversations for some time, but as technology continues to advance and revelations about how companies and governments use metadata arise, educators and researchers in computers and writing, technical communication, writing studies, and rhetoric must be at the forefront of those discussions in our classrooms and in our scholarship. I think we can all agree that surveillance, privacy, and even net neutrality are large political and social issues, especially in light of Cynthia Selfe's comments during an interview:

I also think that people's relationships to technology will continue to be important because technology is disappearing in terms of being naturalized. In a sense, technology disappears into the background. When the technology disappears, ideologies are working the most strongly. We must re-attend every time to technology, and how we use it in our endeavors. (Beck, 2013, p. 353)

In many ways, this message is similar to her powerful call to the field to pay attention to technology and literacy (Selfe, 1999). More than fifteen years after Selfe's work, I ask how educators are continuing to pay attention to technology in ways that explore digital surveillance and privacy in connection with writing infrastructures.

Paying Attention to Surveillance and Privacy in Writing Infrastructures

As educators, the responsibilities we face when talking about surveillance and privacy in our writing classrooms include a host of historical, social, political, and financial discussions. Foremost, we need to have frank talks with our students about how power and the potential for abuse arise when any type of surveillance apparatus—be it the cookie and web beacon tracking technologies embedded in Google search and location history or the mechanisms available in LMSs—is embedded in an infrastructure where writing and communication happen. We also need to read through, just as Stephanie Vie (2014) has discussed, the privacy and use policy statements about data collection and tracking in just about every type of software we use in our classrooms. But, more than that, we need to develop what Aaron Toscano (2010) called a "critical technological awareness" to make needed systemic changes in surveillance and privacy practices (p. 16).
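To give one of those frank talks some teeth, it helps to show students how little machinery a web beacon requires: a third-party server returns an invisible 1x1 image, tags the browser with an identifying cookie on first contact, and logs which page embedded the image. The following is a minimal, standard-library Python sketch of that mechanism, not the code any real tracker runs:

    import uuid
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # A minimal 1x1 transparent GIF, the classic "web beacon" payload.
    PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
             b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00"
             b"\x01\x00\x00\x02\x02D\x01\x00;")

    class Beacon(BaseHTTPRequestHandler):
        def do_GET(self):
            # Look for an ID cookie set on an earlier visit (simplified parsing).
            visitor = None
            for part in self.headers.get("Cookie", "").split(";"):
                name, _, value = part.strip().partition("=")
                if name == "id":
                    visitor = value
            self.send_response(200)
            self.send_header("Content-Type", "image/gif")
            if visitor is None:
                visitor = uuid.uuid4().hex
                self.send_header("Set-Cookie", f"id={visitor}")  # tag the browser
            self.end_headers()
            self.wfile.write(PIXEL)
            # The Referer header reveals which page embedded the pixel.
            print(visitor, "viewed", self.headers.get("Referer", "unknown page"))

    HTTPServer(("localhost", 8000), Beacon).serve_forever()

Embed <img src="http://localhost:8000/pixel.gif"> in two different test pages, load each, and the console ties one persistent visitor ID to both pages: the same basic move commercial trackers scale across millions of sites.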

When developing such awareness, practices, and techniques, local and institutional learning outcomes and norms will, of course, drive how much (or little) such talks happen during class time. Some suggestions for integration include

  • exploring websites during a class period (e.g., takethislollipop.com, a rather revealing instructional app that connects with Facebook to illustrate the dangers of leaking personal information in visible spaces, and the consumer portals of large data-broker firms like Acxiom and BlueKai, which show people the amount of personal data collected about them from public and third-party sites),
  • summarizing and analyzing privacy policy statements from large social media companies,
  • examining data usage statements from large Internet companies with an integrated proposal project to make ethical changes to protect the privacy of consumers,
  • engaging in community activism with social media, developing multimodal public service announcements about the effects of surveillance and privacy upon the general public, and
  • supporting organizations like the Electronic Frontier Foundation or the American Civil Liberties Union in their efforts to fight corporate and government surveillance practices.

The list could go on to include watching and critiquing documentaries about these issues and developing browser plug-ins to block certain features connected with harmful surveillance and privacy practices.

As we take stock of where English studies and allied fields have been for this 20th anniversary issue of Kairos, honoring the scholarship and editorial work of those past and present keeps me energized with the promise of what's to come over the next 20 years. If the collective energies, discussions, and works show us anything, it's that the broader discipline of writing studies contributes a critical perspective to the political, social, and economic infrastructures in writing spaces. It is up to us to continue this important work.

References

Amazon. (n.d.). Retrieved April 29, 2015, from http://www.amazon.com

Beck, Estee. (2013). Reflecting upon the past, sitting with the present, and charting our future: Gail Hawisher and Cynthia Selfe discussing the community of Computers and Composition. Computers and Composition, 30(4), 349–357.

Beck, Estee. (2015). The invisible digital identity: Assemblages of digital networks. Computers and Composition, 35, 125–140.

Beck, Estee, Grohowski, Mariana, & Blair, Kristine. (2015). Subverting virtual hierarchies: A cyberfeminist critique of course management spaces. In Jim Purdy & Dànielle Nicole DeVoss (Eds.), Making space: Writing instruction, infrastructure, and multiliteracies. Sweetland Digital Rhetoric Collaborative/University of Michigan Press. Retrieved January 1, 2016, from http://www.digitalwriting.org/ms/index.html

Blackboard. (n.d.). Retrieved August 7, 2015, from http://www.blackboard.com

Canvas. (n.d.). Retrieved August 7, 2015, from http://www.canvaslms.com

Crow, Angela. (2013). Managing datacloud decisions and "big data": Understanding privacy choices in terms of surveillance assemblages. In Heidi McKee & Dànielle Nicole DeVoss (Eds.), Digital writing assessment and evaluation (chapter 2). Logan, UT: Computers and Composition Digital Press. Retrieved January 1, 2016, from http://ccdigitalpress.org/dwae/02_crow.html

DeVoss, Dànielle Nicole, Cushman, Ellen, & Grabill, Jeffrey T. (2005). Infrastructure and composing: The when of new-media writing. College Composition and Communication, 57(1), 14–44.

Eggers, Dave. (2013). The Circle: A novel. New York, NY: Alfred A. Knopf.

Facebook. (2015a). Retrieved April 28, 2015, from http://www.facebook.com/about/privacy/update

Facebook. (2015b). Facebook reports first quarter 2015 results. Retrieved from http://investor.fb.com/releasedetail.cfm?ReleaseID=908022

Greenwald, Glenn. (2013, June 6). NSA collecting phone records of millions of Verizon customers daily. The Guardian. Retrieved from http://www.theguardian.com/world/2013/jun/06/nsa-phone-records-verizon-court-order

Google. (n.d.). Retrieved August 5, 2015, from https://www.google.com/

Hawisher, Gail E., & Selfe, Cynthia L. (1991). The rhetoric of technology and the electronic writing class. College Composition and Communication, 42(1), 55–65.

Hawkes, Lory. (2007). Impact of invasive web technologies on digital research. In Heidi McKee & Dànielle Nicole DeVoss (Eds.), Digital writing research: Technologies, methodologies, and ethical issues (pp. 337–352). Cresskill, NJ: Hampton Press.

McKee, Heidi. (2011). Policy matters now and in the future: Net neutrality, corporate data mining, and government surveillance. Computers and Composition, 28(4), 276–291.

McSweeney's. (n.d.). A brief Q&A with Dave Eggers about his new novel, The Circle. Retrieved July 30, 2015, from http://www.mcsweeneys.net/articles/a-brief-q-a-with-dave-eggers-about-his-new-novel-the-circle

Netflix. (n.d.). Retrieved April 29, 2015, from https://www.netflix.com/

Payne, Darin. (2005). English studies in Levittown: Rhetorics of space and technology in course-management software. College Composition and Communication, 56(3), 483–507.

Purdy, Jim. (2009). Anxiety and the archive: Understanding plagiarism detection services as digital archives. Computers and Composition, 26(2), 65–77.

Reyman, Jessica. (2013). User data on the social web: Authorship, agency, and appropriation. College English, 75(5), 513–533.

Toscano, Aaron A. (2010). Using I, Robot in the technical writing classroom: Developing a critical technological awareness. Computers and Composition, 28(1), 14–27.

Vie, Stephanie. (2014). "You are how you play": Privacy policies and data mining in social networking games. In Jennifer DeWinter & Ryan Moeller (Eds.), Computer games and technical communication: Critical methods and applications at the intersection (pp. 171–187). Burlington, VT: Ashgate.

Zwagerman, Sean. (2008). The scarlet p: Plagiarism, panopticism, and the rhetoric of academic integrity. College Composition and Communication, 59(4), 676–710.