Using Critical-Reflective Methods
Sections: Critical Reflective Methodologies | Methods for Current Project
As Patricia Sullivan and James E. Porter (1997) noted, writing and researching practices "all require a reflectiveness and critical awareness to be done well" (p. 21).
For this reason, my discussion of methods for this project is a bit nontraditional: I will first discuss the critical-reflective methodologies that are central to my research method (and were born from a previous project), and then I will discuss the methods for this project.
Critical-Reflective Methodologies
I lean heavily on Sullivan and Porter's (1997) discussion of critical-reflective research methodologies to shape my own theoretical perspectives and methodologies. Sullivan and Porter argued that researchers often view methods as something they "acquire" and then continue to use in an almost one-size-fits-all fashion. In response to this attitude toward research, they called for researchers to realize the situatedness of their methods, to realize that methodology "is itself an act of rhetoric, both with our participants in research studies and with our colleagues in a given research field" (p. 13). Method is not just a step in the journey; it is inherently tied to current discussions in the field, the questions we ask, and the answers we receive.
Drawing on feminist and postmodern perspectives, Sullivan and Porter detailed what they consider a "critical research practice" that involves critically considering research methods, aims, and discussions. Their goal was to encourage researchers to reflect on their own research and research practices—a methodology that promotes postmodern critique. In this methodology, researchers' own situatedness, as well as the situatedness of their participants, departments/workplaces, fields, and so on, will always inform the research.
In my case, my work on a previous project (West, 2016), also related to Yik Yak, inherently informed my methods for this project. In that project, I struggled to collect data from a platform that was both ephemeral and anonymous — and even more, I struggled to find research methods on which I could base my own. Methods from previous research in composition studies and technical and professional communication (TPC) did not map onto anonymous spaces easily.
In order to answer my research question for that project—how are students communicating and interacting on Yik Yak?—I approached it as a researcher whom Sullivan and Porter (1997) would call a "traditionalist," using a "select-and-then-apply-a-method" approach (p. 65). I sought to ground my methods as best I could in the methods of my chosen fields, composition studies and TPC. I dutifully collected a corpus of Yik Yak posts, coded and categorized them, and was (and still am) happy with the work that I did.
But with further reflection, I came to understand that though my research revealed much about Yik Yak and the community of users I studied, I still had another, perhaps larger, question: not how are users communicating on this online platform, but why are users communicating on anonymous platforms at all, especially in the face of all the justifiable criticisms of such applications? To answer this question, I needed to reflect on my previous research and the survey of social media research practices I had done to that point, and I needed to figure out a method to answer that question. But the method would need to be unlike my previous collection and analysis of posts, which had only answered the question of how — I would need to go to the users themselves.
Project Methods
Sections: Mixed Methods Research Design | Mixed Methods in Current Project | Survey Design | Interview Design
In my project, I seek to address the following research questions:
- Q1. Why are students using anonymous applications?
- Q2. What are students' perceptions of anonymity as a characteristic of these applications?
- Q3. How can researchers navigate these applications?
To find the answers to my questions, I went to the student-users themselves. While I could interpret some data as a user myself, I recognized that I approached the research space from a different rhetorical and hierarchical situation (that of academic researcher and instructor) than most users. To develop a more complex understanding of use and communication, I gathered both quantitative and qualitative data from participants in the form of an IRB-approved mixed-methods study.
Mixed Methods Research Design
To answer my research questions, I employ a sequential mixed methods strategy, born from critical reflection on my previous project. Mixed methods research (MMR) design combines both quantitative research and qualitative research to produce multiple means of data collection (Creswell, 2008).
For the most part, qualitative studies still dominate the research landscapes of composition studies and TPC, though TPC scholars have noted a turn toward "big data" research that has made mixed methods and quantitative approaches more prominent (McNely, Spinuzzi, & Teston, 2015). Similarly, research in composition—especially in computers and writing—has begun to take note of a lack of quantitative research studies. In Jennifer Bowie and Heather McGovern's (2013) study of the field's scholarship from 2003-2008, they argued, "Given the prevalence of and respect for quantitative research in other fields, investigating potential for and current use of quantitative research in computers and writing seems wise" (p. 245). In my research, I hope that bringing together qualitative and quantitative methods will help increase the overall validity of my research across a variety of fields.
Mixed Methods in Current Project
To gather data, I used a survey for quantitative data and then sought to expand that data with qualitative data from interviews with four participants. My survey provides more general data to support my preliminary ideas about the prevalence of anonymous platforms, and it also gives me a better idea of how and why these platforms are being used (see Ferro & Zachry, 2014, and their survey on professional social media use).
The survey results shape my interviews, and these interviews give me the opportunity to communicate directly with student-users and ask more questions about the ways and contexts of their compositions. Without the added benefit of the interviews, I would have only my own interpretation of the survey's results. While this is an often-used method of interpretation, I sought to work with participants to gain a deeper understanding.
I acknowledge that my choice to combine methods does, at least to some extent, stand in opposition to Sullivan and Porter's postmodern feminist approach. Feminist researchers have most often championed qualitative methodologies because quantitative methods, the dominant research model in many fields, are often associated with masculinist bias (Brannen, 1992). As Julia Brannen (1992) pointed out, however, some feminists believe that there should be no method set aside as purely feminist. Instead, the researcher should seek to ensure that the results of the research are interpreted in such a way that acknowledges and demonstrates a variety of perspectives. Brandy Dieterle (2021) has recommended "a feminist digital research ethic that draws on an ethic of care" for social media research rather than a particular method. Sullivan and Porter (1997) lauded feminist methodologies but noted that feminist researchers are often not "continuously critical of their methods" (p. 63). In that vein, I seek to be critical both of my methods and those that have come before mine, and I see that both qualitative and quantitative modes of inquiry can be beneficial for answering my research questions.
Reflective Audio: Survey Demographics
TRANSCRIPT: To get an idea of the range of student-users I surveyed, I included questions for demographic information. Shortly after completing my project, I observed other researchers talking about the placement of demographic questions in surveys. Many agreed that the information should be placed at the end of the survey: doing so can help reduce survey fatigue and foreground the most important research questions (which, in my case, did not include demographic information). If I were to conduct this survey again, I would place these demographic questions at the end.
Survey Design
In my survey approach, I sent an electronic survey to the instructors of composition and technical writing classes at the University of Arkansas (Basic Writing, Reading Strategies, Composition 1 and 2, including online and honors, Technical Composition 2, Advanced Composition, Essay Writing, and Technical and Report Writing). The purpose of this survey was to acquire some contextual data about students' use of anonymous platforms.
The electronic survey was hosted by Google Forms because user accounts at the University of Arkansas at this time were associated with Google, and thus all users had access to Google Forms. The survey had four main sections: Demographic Information, Yik Yak, Snapchat, and Whisper (Snapchat was included because the larger project included ephemeral social media). The survey opened with a welcome screen that briefly described the survey and outlined the instructions for the survey process (Sue & Ritter, 2012). In total, the survey included 27 multiple-choice questions on a multiple-page layout.
The survey was distributed to all instructors of composition and technical writing, and these instructors were asked to proctor the survey during class (for face-to-face courses) or post the link for a 24-hour period on their course management site (for online courses). Neither the instructors nor their students were under any obligation to participate in the survey. This survey was provided to 625 students and yielded 558 responses, an 89.3% overall response rate.
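The response rate reported above follows directly from the two counts; a minimal sketch of the arithmetic (counts taken from the survey description):

```python
# Survey response-rate arithmetic, using the counts reported above.
distributed = 625  # students who received the survey
completed = 558    # responses collected

rate = completed / distributed * 100
print(f"{rate:.1f}%")  # 558/625 = 89.3% to one decimal place
```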
For this webtext, I am focusing on anonymous applications; for that reason, most of the survey responses that I will discuss come specifically from the Yik Yak section.
Reflective Audio: Self-Reporting
TRANSCRIPT: Sullivan and Porter (1997) noted that the research method often evolves as constraints—both from the method and the environment—influence the situation. They argued that computers and composition studies, in particular, contributes to the field by "challeng[ing] the sanctities of method" and encouraging researchers to "use the heuristic quality of method to aid them in dealing with shifts over time" (p. 66). Though they primarily focused on the ways in which qualitative methods shift throughout the research project, I saw a need for this heuristic quality in the quantitative portion of my project. In my survey design process, I came across many limitations that had to be considered as I worked through the design of this instrument, especially in asking students to self-report about their uses of anonymous social media platforms. Self-reporting likely led some students to choose different answers than they normally would because they might have been afraid of the implications of their answers (for example, if they admitted that they used Yik Yak to bully someone). To temper this tendency, the survey was anonymous and email addresses left for interview contact were not connected with the survey answers. As a researcher, though, I should be aware that some discrepancies may occur with more sensitive questions, and when users are choosing to be in an anonymous space, they may not always be willing to share what they do there.
Interview Design
To follow up on the results of my survey process, I conducted interviews with four participants who had left their contact information in an interview contact form attached to the survey. In designing my interview process, I began by composing traditional question-and-answer items. As John Creswell (2008) noted, this type of interview design is well suited to collecting background information on participants. I followed these questions by incorporating a discourse-based interview method (Odell, Goswami, & Herrington, 1983), a method designed specifically for nonacademic writing. Though usually employed for workplace writing, these types of interviews have proven useful for inquiry about the discourse conventions that frequently occur in social media spaces.
I combine this interview method with Liza Potts's (2015) idea of reflexive (auto)ethnography. As Potts noted, reflexive autoethnography is a way of articulating the experiences of non-active participants. Though I am a user of many of the social media platforms I studied, I am not part of my target population and thus likely have different ideas and experiences. However, as Potts explained, her status as a user of the fan-space she studied served "as a way to understand the structure in which the research takes place" ("Sussing Out, Reflexive (Auto)Ethnography," para. 2). Given my prior research on Yik Yak, my access to campus-local social media spaces, and my position as a user, I felt capable of leveraging my knowledge of these spaces in designing interview methods for student-users.
Reflective Audio: Interview & "Natural Field Settings"
TRANSCRIPT: Interviews were the best way to communicate one-on-one with my research population, because I did not have the ability to observe participants in a more natural field setting. Even so, this remains a limitation of the method: my presence as a researcher could have biased responses. In my study, however, the "natural field setting" could be anywhere. Capturing students composing on these platforms in a natural way would be nearly impossible because even in another location, I would still be acting as researcher. In addition, the setting of the platform itself does not allow for very easy data collection, or much information about the users' identities, motives, or context. These interviews, then, even with the caveat of asking participants to consider their use of these platforms outside of a more natural setting, were the best way I had of observing and asking questions about participants' practices.
In addition, at any time researchers rely on interview data, there is a question of the validity of that data. Odell, Goswami, and Herrington (1983) noted that interview data is not always regarded as the most conclusive form of evidence. However, they responded to these criticisms by noting, "We are using interviews to identify the kinds of world knowledge and expectations that informants bring to writing tasks and to discover the perceptions informants have about the conceptual demands that functional, interactive writing tasks make on them" (p. 228). Since my survey provides general context, I structured my interviews to explore these kinds of "world knowledge and expectations" that my participants held about anonymous platforms.
Interview Process
Reflective Audio: Participants
TRANSCRIPT: In keeping with technical and professional communication's emphasis on user-centered and participatory design (Salvo, 2001; Spinuzzi, 2003, 2005), I see value in going directly to student-users to find answers to my research questions. All participants come from the University of Arkansas's composition and technical writing classes. These classes serve many students across different years and majors, and, because my research holds implications for both compositionists and technical and professional communicators, these students' ideas about anonymous platforms are most likely to align with those of students frequently found in these types of courses. I recognize that by choosing to reach out to student-users, I am first viewing participants in this role, rather than as unique personalities (Sullivan & Porter, 1997, p. 31). Thus, while I hope to draw larger implications for students, instructors, and other researchers, I wish to first acknowledge that there is no "representative" student-user on these platforms, and many situational, social, economic, institutional, and ideological factors are at play among participants.
Interviews were approximately 30 minutes and were held in an on-campus conference room. Interview questions were divided into three major sections.
The first set of questions focused generally on the participants' use of anonymous applications, their personal feelings about these types of applications, and the benefits and drawbacks they saw in using these applications.
The middle section of the interview followed discourse-based interview practices and asked participants to comment on posts from two anonymous applications: Yik Yak (2 posts) and Whisper (3 posts). Not all posts were shown to each participant; posts were chosen based on the participants' familiarity with the platform and the time remaining in the interview. If the participant had never used one of the applications, they were given a brief explanation of the platform and then asked to comment based on the context they could gather from a static screenshot. These screenshots were obtained from campus or campus-adjacent locations: Yik Yak's campus feed and Whisper's "nearby" feed. Participants were asked to talk through their thoughts about the post, including how they would respond if they saw the post, how their own posts may be similar, and whether this post met their expectations for what should be posted on the application.
The final section of the interview focused more specifically on rhetorical inquiries such as audience and purpose, how anonymous applications differ from other social networking sites, and how anonymous applications function in community creation and maintenance.
Conclusion
By using a mixed methodology that combines both surveying and interviewing, I explore anonymous and ephemeral platforms in a different way than my previous research project. My current methodology combines both composition studies' proclivity for exploring sites of informal writing and technical and professional communication's emphasis on participatory and user-centered research design. Additionally, I use the idea of reflexive (auto)ethnography to both acknowledge my positioning as researcher and to leverage my knowledge as a user of the platform.
My critical-reflective approach allows me to build on research methods from a previous project, in which I discovered how complex these systems could be and how developing a corpus of data could not tell me everything I wanted to know. Sullivan and Porter's (1997) work encouraged me to view my methodology not just as a process that eventually yields an end goal; developing a sound methodology is a research process in itself. There is no perfect methodology; after all, if there were, we would all use the same process. There are many popular methodological tools on which we rely in the field (North, 1987), but an exciting prospect is seeing how we can further develop and evolve these methodological approaches for new and different research spaces. Methodology is a process, and this project makes use of critical-reflective practices to improve that process.