Bite-Sized Science: Student Interactions with Public Science Communication

Caitlin Martin, PhD
Sandy Branham, PhD
Jenna Hejnar, M.A.
Jennifer Wojton, PhD
Jessica Snitko, PhD

Developer:
Luke Mauro
Web Concept Designer:
Kristiyan Stefanov

Understanding Public Science Communication Through Literacy and Media Studies

Our research sits at the nexus of literacy studies and media studies. We see these as two overlapping fields that provide lenses for examining literacy development across personal and professional contexts in an increasingly connected, always-online U.S. culture. These areas suggest that learners develop literacy via sociocognitive processes as they engage with texts and tools in educational and extracurricular settings, a framework that is key for documenting how individuals engage with public science communication and how educators might leverage extracurricular experiences to enhance media and literacy studies in the classroom. Below, we overview how central tenets of scholarship in public science communication and literacy studies, including information literacy, scientific literacy, digital literacy, and news literacy, inform and contextualize our study.

Public Science Communication

There are many ways to understand science communication (Burns et al., 2003), and our multidisciplinary backgrounds initially led to differing ideas and assumptions about what public science communication is and how our students might engage with it. T. W. Burns et al. (2003) defined science communication as "the use of appropriate skills, media, activities, and dialogue to produce one or more . . . personal responses to science," such as awareness, enjoyment, interest, opinions, or understanding (p. 191). Ultimately, public science communication is a rhetorical process that involves not just accurate transmission of data, but also the consideration of the audience's values, beliefs, and knowledge base to engage them meaningfully (National Academies of Sciences, Engineering, and Medicine, 2017). This broad definition encapsulates our interest in STEM students' engagement in public science communication, particularly the impact that any engagement may have on their understanding of science or decision to pursue STEM careers and professions.

Because we teach communication classes in which students learn to communicate complex ideas to specific audiences, we operationalize a somewhat transactional definition of science communication: the practice of conveying scientific information, research findings, and the implications of scientific work to nonexpert audiences. This includes a broad spectrum of activities, from public lectures, news media articles, and educational outreach to social media posts and documentaries. The goal is to make scientific concepts accessible, understandable, and relevant to the public, policymakers, and other stakeholders. Effective science communication builds trust between the scientific community and society, making complex topics more approachable and sparking informed discussions about scientific advancements and their impacts. This definition, unfortunately, reinforces the deficit view of public science communication, which "presumes that the public is ignorant about science and that, if people were more educated by scientists, they would be supportive" (Gigante, 2018, p. 5).

However, science communication scholars have since rejected this model in favor of more robust models that "focus on public engagement with science issues, . . . [though] the ideals of the deficit model remain intact" (Gigante, 2018, p. 5). There has been significant pressure from the public on the scientific/academic community to communicate findings to the lay public in a way that empowers them to understand and use what is sometimes critically important information. These pressures reinforce the idea that experts should transmit clear and understandable information to those with less knowledge/expertise. As Peter Weingart and Lars Guenther (2016) succinctly noted, "Science communication, in the general sense of the term, is the crucial link between the world of knowledge production and the general public. Thus, the credibility of science is actually dependent on the credibility of science communication" (p. 2). This is particularly important to consider as venues for public science communication change and adapt alongside communication technologies.

We intentionally provided a simplified definition to participants in our study: Public science communication refers to communication about scientific subjects that occurs in popular press or on social media sites. This definition stems from our assumption that college students were likely to engage with public science communication on social media sites. We also wanted to distinguish between the science communication they encounter in their academic settings and the kinds of science communication a broader, more general public may also have access to.

The proliferation of digital and social media has made public science communication efforts more complex, while also increasing access to information, to communicators, and to audiences. Carmen Pérez-Llantada and María-José Luzón (2023) explained that "knowledge claims expressed in a scientific article are reformulated, simplified, adapted, expanded and, more generally speaking, transformed into, say, online news, lay summaries, and blogs, all of them falling outside the domain of expert-to-expert communication of science" (p. 21), such that the creators of public science communication may not be scientists themselves. Previously, sites for public science communication would include popular press magazines and books, as well as white papers and public service announcements produced by government agencies. These documents were produced and/or vetted by experts and provided their credentials. In social media spaces, however, users who view public science communication content are provided with little information about creators beyond their names and follower counts. We focus on TikTok in the following examples because individuals between 16 and 24 comprise 60% of its 80 million monthly users (Doyle, 2024), increasing the likelihood that our students will be familiar with the platform and perhaps encounter public science communication there. Neil deGrasse Tyson, who holds a PhD in astrophysics from Columbia University, had 5.7 million followers on TikTok as of 2025, but his profile identified him only as the host of StarTalk and author of To Infinity and Beyond: A Journey of Cosmic Discovery. Hank Green, who has degrees in biochemistry and environmental studies, had 8 million TikTok followers as of 2025. But unless users watch his pinned intro video, they may only see that his stand-up special is available on DropoutTV. 
Like other social media sites, TikTok has a verification feature, but that verification only ensures the authenticity of an account, not any other credentials or qualifications a user may have.

Other social media sites, personal blogs, and websites can also provide access to a plethora of scientific communication without emphasizing the creators' credibility. The ease of online production and the virality of online information, particularly on platforms such as YouTube and TikTok, may increase opportunities for the public to encounter misinformation and skepticism about issues that require widespread public understanding, such as climate change and public health crises. In this digital age, encountering public science communication (like all forms of communication online) requires different literacy practices than were necessary to understand or determine the credibility of public science communication in the previous print-based culture. The need to understand the difference between a piece of information that is popular (algorithmically and by "likes") and a piece of information that is accurate is one such shift.

Literacy Studies

As a field, literacy studies has also moved from a limited notion of literacy to a more complex one, shaping how scholars understand the interrelated acts of reading and writing. "Earlier research on literacy by psychologists, historians, and anthropologists," Julie Lindquist (2015) has explained, "was largely motivated by the question of what literacy, once it is acquired by individuals and societies, makes happen" (p. 99). Literacy studies' roots in psychology led to a cognitive framework for literacy acquisition, in which "reading and writing were treated as things people did inside their heads" (Gee, 2015, p. 35). Over time, however, researchers have moved the discipline to a sociocultural view, defining literacy as a set of negotiations within particular contexts resulting in "distinctive ways of participating in social and cultural groups" (p. 35). In the New Literacy Studies view, individuals acquire literate abilities through an apprenticeship model as they enter new activity systems and take up those systems' rules, norms, tools, and behaviors (Gee, 2015; see also Ito et al., 2009; boyd, 2014; Jenkins, 2006).

When looked at in this way, literacy is highly influenced by the tools and technologies available to individuals and communities, and when considering public and social media spaces, it can be difficult to separate from digital literacy, or "the ability to access, manage, understand, integrate, communicate, evaluate and create information safely and appropriately through digital technologies for employment, decent jobs and entrepreneurship" (United Nations Educational, Scientific and Cultural Organization [UNESCO], 2018, p. 6). As cultures, their tools, and the individuals who use them change, literate practices also change. Henry Jenkins et al. (n.d.) championed this idea in their white paper Confronting the Challenges of Participatory Culture: Media Education for the 21st Century, emphasizing that the evolution of digital media has created a "participatory culture" in which most people are both consumers and producers of media content. This view of literacy studies is particularly useful for studying public science communication. Jenkins et al. stressed the need for identifying new literacy practices that have come from digital culture and provided a framework for identifying some prevalent new literacy practices. As researchers, we were curious to understand how students enrolled in a STEM institution engaged with public science communication, imagining that their engagement would position them as both consumers and creators in social media spaces. Ultimately, understanding this multifaceted view of literacy practices could help educators better understand the practices used to assess public science communication and develop additional ways to teach these methods.

Information Literacy

Information literacy is critical to understanding how individuals encounter and understand public science communication by focusing attention on the extent to which an individual is able to critically consider the messages being sent via various media. The American Library Association (2024), quoting a 1989 Association of College and Research Libraries (ACRL) report, defined information literacy as "a set of abilities requiring individuals to 'recognize when information is needed and have the ability to locate, evaluate, and use effectively the needed information.'" The rise of social media has led to a shift from a competency-based view of information literacy to one that offers "a richer, more complex set of core ideas" by focusing on the "foundational ideas" of an information ecosystem (ACRL, 2015, p. 7). The first foundational idea from the ACRL (2015) is that "Authority is constructed and contextual," which emphasizes that information literate individuals recognize that there are differing types of expertise that are valued differently by different people; part of developing expertise in information literacy is developing strategies for recognizing expertise that move beyond "basic indicators of authority, such as types of publications or author credentials" (p. 12).

As our previous TikTok science communicator examples suggest, determining the credibility and authority of information in online environments is a crucial component of information literacy. Understanding the source and credibility of information is especially important for considering public dissemination of scientific information because of the crucial decisions that citizens, governments, and institutions make based on scientific discoveries. The COVID-19 crisis brought this importance starkly into focus, as the world depended on public science information to make decisions about how to protect themselves from infection and how best to treat symptoms when infected (Scheufele et al., 2021). The rapidly changing information available, especially in the early months of the pandemic, illustrated the need for responsible research and innovation, which, according to Maria Loroño-Leturiondo and Sarah R. Davies (2018), presupposes the need for scientists "to be active in considering the implications of their work" and to communicate those findings, and the process of scientific research itself, clearly (p. 171).

Lateral Reading

As misinformation continues to run rampant on the internet, and as the use of AI not only spreads misinformation more quickly but also makes it harder to identify, scholars have begun to investigate how reading practices might make individuals more or less susceptible to misinformation. Several studies have found that lateral reading, the process of quickly moving across websites to assess the credibility of information, is a more effective way of verifying credibility than linear reading, the process of staying on one website and attempting to verify the credibility of the information being presented (Breakstone et al., 2021; Wineburg et al., 2022; Wineburg & McGrew, 2019).

In "Lateral Reading and the Nature of Expertise: Reading Less and Learning More When Evaluating Digital Information," Sam Wineburg and Sarah McGrew (2019) investigated the differences in reading practices between historians with PhDs, professional fact checkers, and undergraduate students at Stanford, finding that fact checkers were able to assess the credibility of websites more accurately and in less time than the historians or undergraduate students. One significant difference between these populations is that historians and college students were more likely to read linearly, attempting to determine the accuracy of the information while remaining on the site. However, fact checkers were more likely to employ lateral reading strategies, where "one leaves a website and opens new tabs along the browser's horizontal axis, drawing on the resources of the Internet to learn more about a site and its claims" (p. 31). While students and historians were more likely to practice close reading strategies, "checkers ignored massive amounts of irrelevant (or less crucial) text in order to make informed judgments about the trustworthiness of digital information. In short, fact checkers read less but learned more" (p. 32). Lateral reading as a literacy practice in digital media shows the extent to which literacy practices are changing and where expertise in this area might reside.

Scientific Literacy

Understanding and acting on public science communication necessitates a degree of scientific literacy, or a capacity to understand, critically evaluate, and engage with scientific concepts, processes, and evidence in a way that enables individuals to make informed decisions about scientific issues and phenomena. Jon D. Miller (1998) argued that scientific literacy "might be defined as the ability to read and write about science and technology" (p. 204). However, Miller went on to complicate this notion, noting that "given the wide array of scientific and technical applications in everyday life, scientific literacy might include everything from reading the label on a package of food, to repairing an automobile, to reading about the newest images from the Hubble telescope" (p. 204). Thus, science literacy encompasses the ability to comprehend basic scientific principles, recognize the nature of scientific inquiry, evaluate the reliability and credibility of scientific information, and apply scientific knowledge to interpret and address real-world problems.

Susanna Priest (2014), in "Critical Science Literacy: What Citizens and Journalists Need to Know to Make Sense of Science," argued, "Science literacy is often defined (or at least measured) as awareness of a collection of important scientific facts," but that this conception of science literacy, as well as the conception of science literacy as a set of discrete skills, is incomplete because the skills that people require to evaluate scientific claims differ from the skills scientists use to conduct research and develop scientific claims (p. 138). Priest (2014) advocated for reframing science literacy as "critical science literacy," which requires an awareness and "understanding of how science actually works, as a set of social institutions and practices that are governed by a set of normative and procedural assumptions" (p. 144). Thus, the responsibility for understanding science communication is in the hands of both the scientists and the citizens; while scientists and science communicators "are responsible for describing research in context, explaining details about the risks, controversies, and uncertainties surrounding the research process [and] engag[ing] with citizens respectfully as stakeholders" (Gigante, 2018, p. 5), citizens are also responsible for engaging with scientists and science communicators as citizens work to understand the scientific process.

In order for citizens to effectively assess the credibility and accuracy of scientific information, in addition to scientific knowledge and an understanding of how to assess the credibility of an author or publication venue, individuals must also "know something about the sociology of science, as well as something about the philosophy of science, to navigate a world full of competing truth claims about science" (Priest, 2014, p. 144). As academics and researchers, we may recognize and understand "the risks, controversies, and uncertainties surrounding the research process" (Gigante, 2018, p. 5), but others may assume that scientific knowledge shared is rooted in some objective, unquestionable fact—a point underscored by research suggesting that although the general public's ability to recall scientific facts has remained high, "knowledge of scientific methods and thinking appears to be less widespread" (National Academies of Sciences, Engineering, and Medicine, 2017, p. 30). Researchers have found that only 26% of Americans "could explain 'what it means to study something scientifically,' and only half of Americans (53 percent) had a correct understanding of randomized controlled experiments" (p. 30). As Priest (2014) pointed out, misconceptions about how science generates and views knowledge may impact the general public's ability to effectively evaluate public science communication.

Scientific literacy is difficult for non-scientists to develop, and assessing the extent of someone's scientific literacy is complex and challenging. Studies typically assess students' understanding of a particular scientific concept or "individual aspects of scientific literacy skills" rather than "scientific literacy as a whole" (Gormally et al., 2017). To address this challenge, Cara Gormally et al. (2017) developed the Test of Scientific Literacy Skills, which Pamela M. Propsom et al. (2023) employed in a longitudinal study to understand the impact of science general education. They found "rather small benefits of science general education overall, though there were larger improvements for some demographic groups (i.e., women, first-generation college students)," while STEM majors showed much greater development in their scientific thinking skills than non-STEM majors (p. 2). The challenge of developing scientific literacy is concerning because "we are more dependent than ever on the individual consumer of scientific information to have the wisdom to decide which claims are not credible, which are as yet tentative, and which should be accepted at face value as 'the truth'—or, at the very least, as today's truth" (Priest, 2014, p. 140). Less capacity for scientific literacy, then, may contribute to views of "science" as a source of stable, objective truth.

Digital Literacy

It would be difficult to understand how individuals engage with public science communication that occurs on social media sites without also considering the role of digital literacy. UNESCO's (2018) Global Framework for Reference on Digital Literacy defined digital literacy as "the ability to access, manage, understand, integrate, communicate, evaluate and create information safely and appropriately through digital technologies for employment, decent jobs and entrepreneurship. It includes competences that are variously referred to as computer literacy, ICT [Information and Communication Technology] literacy, information literacy and media literacy" (p. 6). Literacy practices in the digital age are constantly developing alongside technology, as new media open up new opportunities to create and consume information.

A complicating factor for digital literacy is an individual's ability to understand digital genres, an idea contested by linguistics scholars (Belcher, 2023). There is consensus, however, that digital genres require rhetors to undergo a process that "reorganize(s), reinterpret(s), and refocus(es) [information] to achieve their communicative intention and satisfy the expectations of their imagined audiences" (Pérez-Llantada & Luzón, 2023, p. 12). Diane D. Belcher (2023) noted that digital genres have entirely blurred the lines between audience/author and general consumer/critic (p. 36). This has created a mega-audience that is trying to consume information not actually intended for them. Digital genres are muddy, especially since social media influences perceptions of who gets to create public science communication and how. Scientists might generally argue that only scientists should convey scientific information to the public. In fact, Sara E. Brownell et al. (2013) argued for formalized communication training for college student scientists to transform information effectively for a lay audience. However, current issues in public scientific information can be perceived as having little to do with poorly articulated information from scientists, but rather with the overabundance of well-articulated information from non-scientists.

Social media enables the decentering of expertise, for better or for worse. The open-access nature of the internet allows anyone to produce knowledge, so that unreliable and noncredible people can produce knowledge that is just as publicly available as that of trained researchers (Weinberger, 2011). The Wisdom of the Crowd phenomenon assumes that a large group of ordinary people can be wiser than any one expert (Arazy et al., 2006; Sunstein, 2006). Unfortunately, not all crowds are wise, and the Internet "contains people who know much less than they think" (Weinberger, 2011, p. 63). In fact, Zizi Papacharissi (2002) argued that much of what is shared online consists of "hasty opinions" rather than knowledge (p. 16). Groups whose members demonstrate systematic bias can produce knowledge that is, at best, incorrect and, at worst, dangerous. Due to the ease of sharing information online, these falsehoods can quickly gain many believers (Weinberger, 2011). The extent to which students in our interviews would consider the danger of misinformation and acknowledge the potential consequences of scientific misinformation being spread was of particular interest to us.

News Literacy

The spread of misinformation across the Internet has renewed interest in news literacy as a distinct literacy. Melissa Tully et al. (2022) argued that news literacy comprises "five knowledge and skills domains—context, creation, content, circulation and consumption" (p. 1593). This definition stands in contrast to a perspective of news literacy that "is sometimes narrowly framed as the transfer of verification skills so consumers can check facts and sources and identify misinformation" (p. 1591). That is, news literacy includes fact-checking steps but also involves situating information in broader contexts that account for the creators and consumers of that information.

When considering how news literacy relates to engagement with public science communication, it is important to consider what "counts" as news, particularly in our contemporary digital landscape. While news may have once come from an established newspaper or a news program on network television, cable, the Internet, and streaming media have widened our options for receiving news. Pew Research Center (2024a) found that, in 2024, 86% of U.S. adults "at least sometimes get news from a smartphone, computer or tablet, including 57% who say they do so often." In contrast, only 33% of Americans "often get news from TV," with 26% "often or sometimes get[ting] news in print" (Pew Research Center, 2024a). When looking at which platforms U.S. adults use to get their news on digital devices, news apps or websites and search engines were the most common, with two-thirds of respondents sometimes getting news in these ways. However, 54% sometimes get news from social media, and 27% sometimes get news from podcasts (Pew Research Center, 2024a). Facebook and YouTube are the most common social media sites for U.S. adults to get news, with "about a third of U.S. adults say[ing] they regularly get news on each of these two sites" (Pew Research Center, 2024b).

This shift in accessing news to include a wider variety of digital sources is particularly important in light of social media's role in the spread of misinformation. In a study on the spread of information on Twitter (now X), researchers found that misinformation traveled significantly faster than truth: true stories rarely reached more than 1,000 users, while the top 1% of false stories reached between 1,000 and 100,000 people, and false news traveled approximately six times faster than true news (Vosoughi et al., 2018). Additionally, Soroush Vosoughi et al. (2018) found that "falsehoods were 70% more likely to be retweeted than the truth" (p. 1149). It is also important to note that this study was conducted after removing bots from the sample analysis, determining that "false news spreads farther, faster, deeper, and more broadly than the truth because humans, not robots, are more likely to spread it" (p. 1150). Thus, news literacy and science literacy are inextricably linked, as much of the scientific information consumed by the general public is distributed via news media.

It remains necessary to consider what new literacy practices make it possible to consume and create (scientific) information in societies saturated with digital and social media. In this way, we expect our study to reveal which sources of scientific information students counted as sites for public science communication and which they counted on as reliable. We return to the relationship between literacies and understanding public science communication in our analysis, where we explore more fully the strategies that study participants shared in their interviews.

Methodology and Methods

Our research team initially wanted to better understand the role that public science communication has in our students' lives. We set out to use TikTok-length interviews to learn more about how students enrolled at a STEM-focused institution consume or disseminate scientific information in public spaces. Our methodology and approach were most heavily influenced by the two members of our team who planned and conducted the majority of the research. Caitlin Martin and Sandy Branham are rhetoric, composition, and writing studies scholars who took a more exploratory approach to the project with a broad research question. Jennifer Wojton studies digital culture and is especially interested in media literacy. Jenna Hejnar has a literary studies background, and this study was her first foray into human subjects research. Jessica Snitko is a communications scholar who conducts primarily quantitative research on social media user interaction. This interdisciplinarity means that the frames commonly accepted without question in one area of study may be complicated by or in conflict with the frames of research from other areas. By stating this directly, we hope to model and address a frequent challenge of interdisciplinary study.

Our research questions changed over time as we settled on our methods, conducted our research, and began to analyze our data. Ultimately, this webtext shares our findings on three research questions:

  1. Has the prevalence of public science information contributed to students' decisions to pursue specific degrees?
  2. What sources of public science information do our students encounter, and how do they assess their accuracy?
  3. How do our students think the general public determines the accuracy of public science information?

To answer these questions, we conducted short semistructured interviews with 21 students enrolled at our private STEM institution in Florida. We chose short interviews partly to echo the brief social media content we wanted to ask students about. We held interviews in our Center for Communication and Digital Media (CCDM), which provides writing and communication support and has a variety of digital recording and production tools. One member of our team read and discussed the consent form with our participants. As part of this process, participants could choose to give us permission to use the video recording of their interview in our publication. After they consented, another member of the team conducted the interview. We first provided a definition of public science communication as "communication that occurs in the popular press or on social networking sites," and then we asked the following questions:

  1. What is your year and major?
  2. Where do you get scientific information?
  3. How do you determine its accuracy?
  4. How does the public determine the accuracy of scientific information?
  5. Why does this matter?
  6. If you've experienced public science communication (before your time at ERAU), what part did that play in your decision to pursue a STEM degree—or not a STEM degree?

Interviews ranged from 2 to 6 minutes, depending on the length of participant responses and whether the interviewer asked any follow-up questions. We held two days of interviews, with some team members recruiting students and others conducting interviews.

To begin the coding process, each member of the research team watched the videos and took notes on themes they noticed in participant responses. After discussing these general trends, three team members took primary responsibility for analyzing video data for our three research questions. After analyzing and coding on our own, we met to develop consensus on the thematic codes we present in our analysis. Three members of the research team then took responsibility for drafting scripts of the videos in our analysis section, which received feedback from other members of the team before final recording.

Participant Demographics

Of the 21 participants included in our study, 20 were undergraduate students and one was a master's student (refer to Table 1).

Table 1. Participants by Academic Year

| Year | Number of Students |
| --- | --- |
| First-year | 6 |
| Sophomore | 1 |
| Junior | 11 |
| Senior | 2 |
| 2nd-year master's | 1 |

In addition, 18 of the students were in STEM fields (refer to Table 2), such as aeronautical science and engineering. Only three participants represented fields that are more closely aligned with the humanities: Two participants majored in communication and one in homeland security and intelligence studies. Though some students reported double majors or specific program tracks within a major, we used institutional records to determine the broadest name of their primary major, which we report in the table below. We decided to exclude double majors and minors from our reporting both to simplify our analysis and because the naming conventions at our institution can sometimes cause confusion between degree plans in multiple areas: For instance, our College of Aviation offers a major in aeronautics, while our College of Engineering's Aerospace Engineering program has an aeronautics track, and the two are not always distinguished in short conversations like those of our interviews. Once we realized we needed to verify responses against institutional records, we found it simplest to report everyone's official major or program of study.

Table 2. Participant Majors
Major | Number of Students
Aeronautical Science | 3
Aeronautics | 1
Aerospace Engineering | 6
Astronomy & Astrophysics | 3
Civil Engineering | 1
Communication | 2
Engineering Physics | 1
Homeland Security & Intelligence | 2
Mechanical Engineering | 1
Occupational Safety Management | 1

We also used institutional data to determine participants' self-reported race, ethnicity, and gender identity. Of our 21 respondents, the majority reported their race as white (refer to Table 3). Race and ethnicity information, however, is limited by the way our institution collects data: For instance, two students were labeled "not Hispanic" on campus records and three were labeled "Hispanic," but the rest had no entry for ethnicity. In addition to race and ethnicity, we also found that two of the 21 respondents were international students, something both had discussed in their interviews.

Table 3. Participant Race and Ethnicity
Race and Ethnicity | Number
Asian | 2
Black/African American | 3
White | 14
Multiple | 1
Not specified | 1
Hispanic | 3

Finally, 12 of our participants identified as male and nine as female. Our study, then, has a higher percentage of female representation (roughly 43%) than our overall institution, which reported 27% female enrollment on our campus in fall 2023.

Limitations

Our small sample size limits our study in that our findings are not generalizable. Choosing to study only students at our STEM institution may also be a limitation, though we saw it as an interesting site for a pilot study of public science communication, given that our students may be more motivated than most to engage with it. We also expected that STEM students might respond differently to questions about the accuracy and credibility of public science communication, given their interest and developing expertise. We see our study as a valuable starting point for continued research.

One widely expressed concern in interview-based research relates to social desirability theory, which suggests that some individuals give responses that present themselves in ways they see as socially desirable (Bergen & Labonté, 2020; Huang et al., 1998). Because we conducted the interviews ourselves and introduced ourselves as faculty members, using more formal titles, some participants may have tried to anticipate and provide the answers they believed we "wanted" rather than sharing their genuine processes or thoughts. This influence may have been exacerbated for participants who were previous students of one or more members of the research team. Like other teacher–researchers who study students they've worked with closely, we recognize that "students might feel they needed to please [us] in their interviews," as Elizabeth Wardle (2007, p. 71) put it. Researchers, however, are fundamentally "invisible interlocutors" (Harding et al., 2022, p. 86) who shape responses through the design of their research, regardless of previous relationships or research methods. Here, we present our analysis with the recognition that our complex role as teacher–researchers influences not only our participants' responses but also our analysis and discussion of the findings. In our conclusion, we suggest strategies for continued study that help mitigate some of these effects.

Analysis

Where do respondents "get" scientific information?

Watch video

How did participants determine the accuracy of scientific information?

Watch video

How does the public determine the accuracy of scientific information?

Watch video

Why does it matter how the public assesses the accuracy of scientific information?

Watch video

Did students experience public science communication before enrolling at our university?

Watch video

How did public science communication influence their decision to earn a STEM degree?

Watch video

Discussion & Conclusion

References

  • American Library Association. (2024). Information literacy. Retrieved 2024 from https://literacy.ala.org/information-literacy/
  • AP News. (2023, April 8). Report: Florida officials cut key data from vaccine study. AP News. https://apnews.com/article/florida-ladapo-covid19-vaccines-c498ffcb2393a1fffd692e1687e62e4e
  • Arazy, Ofer, Morgan, Wayne, & Patterson, Raymond. (2006). Wisdom of the crowds: Decentralized knowledge construction in Wikipedia. In Proceedings of the 16th Annual Workshop on Information Technologies & Systems (WITS). https://doi.org/10.2139/ssrn.1025624
  • Association of College and Research Libraries. (1989, January 10). Presidential committee on information literacy: Final report. American Library Association. https://www.ala.org/acrl/publications/whitepapers/presidential
  • Association of College and Research Libraries. (2015, February 2). Framework for information literacy for higher education. American Library Association. https://www.ala.org/sites/default/files/acrl/content/issues/infolit/framework1.pdf
  • Belcher, Diane D. (2023). Digital genres: What they are, what they do, and why we need to better understand them. English for Specific Purposes, 70, 33–43. https://doi.org/10.1016/j.esp.2022.11.003
  • Bergen, Nicole, & Labonté, Ronald. (2020). "Everything is perfect, and we have no problems": Detecting and limiting social desirability bias in qualitative research. Qualitative Health Research, 30(5), 783–792. https://doi.org/10.1177/1049732319889354
  • Bonus, James Alex. (2019). The impact of pictorial realism in educational science television on U.S. children's learning and transfer of biological facts. Journal of Children and Media, 13(4), 433–451. https://doi.org/10.1080/17482798.2019.1646295
  • boyd, danah. (2014). It's complicated: The social lives of networked teens. Yale University Press.
  • Breakstone, Joel, Smith, Mark, Connors, Priscilla, Ortega, Teresa, Kerr, Darby, & Wineburg, Sam. (2021). Lateral reading: College students learn to critically evaluate internet sources in an online course. Misinformation Review. https://misinforeview.hks.harvard.edu/article/lateral-reading-college-students-learn-to-critically-evaluate-internet-sources-in-an-online-course/
  • Brownell, Sara E., Price, Jordan V., & Steinman, Lawrence. (2013). Science communication to the general public: Why we need to teach undergraduate and graduate students this skill as part of their formal scientific training. Journal of Undergraduate Neuroscience Education, 12(1), E6–E10.
  • Burns, T. W., O'Connor, D. J., & Stocklmayer, S. M. (2003). Science communication: A contemporary definition. Public Understanding of Science, 12(2), 183–202. https://doi.org/10.1177/09636625030122004
  • Buturoiu, Raluca, Durach, Flavia, Udrea, Georgiana, & Corbu, Nicoleta. (2017). Third-person perception and its predictors in the age of Facebook. Journal of Media Research, 10(2), 18–36. https://doi.org/10.24193/jmr.28.2
  • Calvert, Sandra L., & Kotler, Jennifer A. (2003). Lessons from children's television: The impact of the Children's Television Act on children's learning. Journal of Applied Developmental Psychology, 24(3), 275–335. https://doi.org/10.1016/S0193-3973(03)00060-1
  • Caulfield, Mike. (2019, June 19). SIFT (the four moves). Hapgood. https://hapgood.us/2019/06/19/sift-the-four-moves/
  • Chang, Chingching. (2021). Fake news: Audience perceptions and concerted coping strategies. Digital Journalism, 9(5), 636–659. https://doi.org/10.1080/21670811.2021.1923403
  • Corbu, Nicoleta, Oprea, Denisa-Adriana, Negrea-Busuioc, Elena, & Radu, Loredana. (2020). "They can't fool me, but they can fool the others!" Third person effect and fake news detection. European Journal of Communication, 35(2), 165–180. https://doi.org/10.1177/0267323120903686
  • Davison, W. Phillips. (1983). The third-person effect in communication. The Public Opinion Quarterly, 47(1), 1–15.
  • Doyle, Brandon. (2024, May 7). TikTok statistics—Updated May 2024. Wallaroo Media. https://wallaroomedia.com/blog/tiktok-statistics/
  • Gee, James Paul. (2015). The new literacy studies. In Jennifer Rowsell & Kate Pahl (Eds.), The Routledge handbook of literacy studies (pp. 35–48). Routledge.
  • Gigante, Maria E. (2018). Introducing science through images: Cases of visual popularization. University of South Carolina Press. https://doi.org/10.2307/j.ctv6sj8kf
  • Gondwe, Gregory. (2023). Audience perception of fake news in Zambia: Examining the relationship between media literacy and news believability. International Communication Research Journal, 57(3), 47–61. https://scholar.harvard.edu/files/ggondwezunda/files/audience_perception_of_fake_news2.pdf
  • Gormally, Cara, Brickman, Peggy, & Lutz, Mary. (2012). Developing a Test of Scientific Literacy Skills (TOSLS): Measuring undergraduates' evaluation of scientific information and arguments. CBE—Life Sciences Education, 11(4), 364–377. https://doi.org/10.1187/cbe.12-03-0026
  • Harding, Lindsey, King, Joshua, Bonanno, Anya, & Powell, Joseph. (2022). Feedback as boundary object: Intersections of writing, response, and research. Journal of Response to Writing, 8(2), 73–105. https://scholarsarchive.byu.edu/journalrw/vol8/iss2/4/
  • Huang, Chiou-yan, Liao, Hsin-ya, & Chang, Sue-Hwang. (1998). Social desirability and the clinical self-report inventory: Methodological reconsideration. Journal of Clinical Psychology, 54(4), 517–528. https://doi.org/10.1002/(SICI)1097-4679(199806)54:4%3C517::AID-JCLP13%3E3.0.CO;2-I
  • Huttner-Koros, Adam. (2015, August 21). The hidden bias of science's universal language. The Atlantic. https://www.theatlantic.com/science/archive/2015/08/english-universal-language-science-research/400919/
  • Ito, Mizuko, Baumer, Sonja, Bittanti, Matteo, boyd, danah, Cody, Rachel, Herr-Stephenson, Becky, & Tripp, Lisa. (2009). Hanging out, messing around, and geeking out: Kids living and learning with new media. MIT Press.
  • Jenkins, Henry. (2006). Convergence culture: Where old and new media collide. New York University Press. http://www.jstor.org/stable/j.ctt9qffwr
  • Jenkins, Henry, Purushotma, Ravi, Clinton, Katherine, Weigel, Margaret, & Robison, Alice J. (n.d.). Confronting the challenges of participatory culture: Media education for the 21st century. MacArthur Foundation.
  • Kruger, Justin, & Dunning, David. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. https://psycnet.apa.org/doi/10.1037/0022-3514.77.6.1121
  • Learn about Consensus. (2025). Consensus. https://web.archive.org/web/20250115000136/https://consensus.app/home/about-us/
  • Lindquist, Julie. (2015). Literacy. In Paul Heilker & Peter Vandenberg (Eds.), Keywords in writing studies (pp. 99–102). Utah State University Press; University Press of Colorado.
  • Loroño-Leturiondo, Maria, & Davies, Sarah R. (2018). Responsibility and science communication: Scientists' experiences of and perspectives on public communication activities. Journal of Responsible Innovation, 5(2), 170–185. https://doi.org/10.1080/23299460.2018.1434739
  • Martin, Bella, & Hanington, Bruce. (2012). Universal methods of design: 100 ways to research complex problems, develop innovative ideas, and design effective solutions. Rockport.
  • Meyer, Jan H. F., & Land, Ray (Eds.). (2006). Overcoming barriers to student understanding: Threshold concepts and troublesome knowledge. Routledge.
  • Miller, Jon D. (1998). The measurement of civic scientific literacy. Public Understanding of Science, 7(3), 203–223. https://doi.org/10.1088/0963-6625/7/3/001
  • National Academies of Sciences, Engineering, and Medicine. (2017). Communicating science effectively: A research agenda. National Academies Press. https://doi.org/10.17226/23674
  • Odell, Lee, Goswami, Dixie, & Herrington, Anne. (2022). The discourse-based interview: A process for exploring the tacit knowledge of writers in nonacademic settings. Composition Forum, 49. https://compositionforum.com/issue/49/discourse-based-interview.php (Original work published 1983)
  • Papacharissi, Zizi. (2002). The virtual sphere: The internet as a public sphere. New Media & Society, 4(1), 9–27. https://doi.org/10.1177/14614440222226244
  • Pérez-Llantada, Carmen, & Luzón, María-José. (2023). Genre networks: Intersemiotic relations in digital science communication. Routledge.
  • Perloff, Richard M. (1999). The third person effect: A critical review and synthesis. Media Psychology, 1(4), 353–378. https://doi.org/10.1207/s1532785xmep0104_4
  • Pew Research Center. (2019, March 26). For local news, Americans embrace digital but still want strong community connection. https://www.pewresearch.org/journalism/2019/03/26/for-local-news-americans-embrace-digital-but-still-want-strong-community-connection/
  • Pew Research Center. (2022, June 6). Americans' views of government: Decades of distrust, enduring support for its role. https://www.pewresearch.org/politics/2022/06/06/americans-views-of-government-decades-of-distrust-enduring-support-for-its-role/
  • Pew Research Center. (2024a, September 17). News platform fact sheet. https://www.pewresearch.org/journalism/fact-sheet/news-platform-fact-sheet/
  • Pew Research Center. (2024b, September 17). Social media and news fact sheet. https://www.pewresearch.org/journalism/fact-sheet/social-media-and-news-fact-sheet/
  • Priest, Susanna. (2014). Critical science literacy: What citizens and journalists need to know to make sense of science. Bulletin of Science, Technology & Society, 33(5–6), 138–145. https://doi.org/10.1177/0270467614529707
  • Propsom, Pamela M., Tobin, William M., & Roberts, Jacqueline R. (2023). Test of Scientific Literacy Skills (TOSLS) indicates limited scientific thinking gains as a result of science and mathematics general education. Interdisciplinary Faculty Scholarship. https://scholarship.depauw.edu/interdisciplinary_facpubs/1/
  • Reuter, Christian, Hartwig, Katrin, Kirchner, Jan, & Schlegel, Noah. (2019). Fake news perception in Germany: A representative study of people's attitudes and approaches to counteract disinformation. In Wirtschaftsinformatik Proceedings 2019 (pp. 1069–1083). Association for Information Systems. https://aisel.aisnet.org/wi2019/track09/papers/5/
  • Scheufele, Dietram A., Hoffman, Andrew J., Neeley, Liz, & Reid, Czerne M. (2021). Misinformation about science in the public sphere. PNAS, 118(15), e2104068118. https://doi.org/10.1073/pnas.2104068118
  • Sunstein, Cass R. (2006). Infotopia: How many minds produce knowledge. Oxford University Press.
  • Tully, Melissa, Maksl, Adam, Ashley, Seth, Vraga, Emily K., & Craft, Stephanie. (2022). Defining and conceptualizing news literacy. Journalism, 23(8), 1589–1606. https://doi.org/10.1177/14648849211005888
  • United Nations Educational, Scientific and Cultural Organization Institute for Statistics. (2018). A global framework of reference on digital literacy skills for indicator 4.4.2. https://unesdoc.unesco.org/ark:/48223/pf0000265403.locale=en
  • Van Noorden, Richard. (2023). How big is science's fake-paper problem? Nature, 623, 466–467. https://doi.org/10.1038/d41586-023-03464-x
  • Vosoughi, Soroush, Roy, Deb, & Aral, Sinan. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. https://doi.org/10.1126/science.aap9559
  • Wardle, Elizabeth. (2007). Understanding "transfer" from FYC: Preliminary results of a longitudinal study. WPA: Writing Program Administration, 31(1–2), 65–85. https://associationdatabase.co/archives/31n1-2/31n1-2wardle.pdf
  • Weinberger, David. (2011). Too big to know: Rethinking knowledge now that the facts aren't the facts, experts are everywhere, and the smartest person in the room is the room. Basic Books.
  • Weingart, Peter, & Guenther, Lars. (2016). Science communication and the issue of trust. Journal of Science Communication, 15(5), C01. https://doi.org/10.22323/2.15050301
  • Williams, Matt N., Marques, Matthew D., Hill, Stephen R., Kerr, John R., & Ling, Mathew. (2022). Why are beliefs in different conspiracy theories positively correlated across individuals? Testing monological network versus unidimensional factor model explanations. British Journal of Social Psychology, 61(3), 1011–1031. https://doi.org/10.1111/bjso.12518
  • Wineburg, Sam, Breakstone, Joel, McGrew, Sarah, Smith, Mark D., & Ortega, Teresa. (2022). Lateral reading on the open Internet: A district-wide field study in high school government classes. Journal of Educational Psychology, 114(5), 893–909. https://doi.org/10.1037/edu0000740
  • Wineburg, Sam, & McGrew, Sarah. (2019). Lateral reading and the nature of expertise: Reading less and learning more when evaluating digital information. Teachers College Record, 121(11), 1–40. https://doi.org/10.1177/016146811912101102
  • Yang, Fan, & Horning, Michael. (2020). Reluctant to share: How third person perceptions of fake news discourage news readers from sharing "real news" on social media. Social Media + Society, 6(3), 1–11. https://doi.org/10.1177/2056305120955173