A Multisensory Literacy Approach to Biomedical Healthcare Technologies: Aural, Tactile, and Visual Layered Health Literacies

by Kristin Marie Bivens, Lora Arduser, Candice A. Welhausen, & Michael J. Faris

Healthcare Technology Echoing: An Aural Health Literacy
Kristin Marie Bivens

Exposing Expectations of Sound in an Intensive Care Unit

On a surprisingly warm January day in 2016, I visited the Library of Congress in Washington, D.C. After filling out some paperwork, providing my identification, and finding the entrance (after checking my belongings in a cloakroom), I was allowed to enter the reading room.

As you can imagine, it was quite an amazing experience to be in the Library of Congress.

After I entered the reading room, I sat down behind a computer, not planning on researching. Instead, I thought I would observe.

I closed my eyes to heighten my sense of hearing. And I listened.

I heard the quiet tap tap-tap-tap of people typing on laptops.

I heard a patron cough.

It was quiet. As you know, libraries tend to be quiet spaces. Quiet in libraries is the expectation. But what about quiet in Neonatal Intensive Care Units (NICUs)? What level of quiet is expected in those spaces? To offer a comparison, please listen to the sounds I recorded on one day during data collection in a NICU in the southwestern United States.1

Note: This recording was from an empty pod; all sounds and noises are heard in spaces adjacent to the empty pod. Additionally, any clearly heard voices have been edited out. Throughout, the "beep beep beep" from a physiological monitor can be heard.
[Small metal objects clinking together] / [Louder ding, ding, ding] / [Rustling metal—a screw and a screwdriver] / [Rustling metal] / [Background knocking around noise like metal on metal] / [Metal clinking and sounds of clothes shifting with movement] / [Muffled voices in the background] / [Sounds of people moving] / [Door creaking and closing] / [Ping-ding, ping-ding, ping-ding, ping-ding from a monitor] / [Muffled voice over walkie-talkie] / [A man's voice, but words are muffled] / [Ding ding ding ding ding ding ding ding (fast-paced) of a healthcare monitor]

The Library of Congress Reading Room is shown again to juxtapose the sounds heard in the NICU recording. Both spaces are expected to be quiet. The library is quieter than the NICU.
Library of Congress Reading Room (photograph by Carol M. Highsmith, 2009, in the public domain)
An area of a NICU that's empty, showing empty pods with incubators and a desk with a computer and chair
Empty pod in a NICU (photograph by author)

I recorded almost 25 minutes of audio. I was not in an occupied nursery or pod. Instead, I sat by myself. In fact, I was granted permission to record audio only from an empty pod. So, there were no babies or nurses or anyone in the nursery with me.

I juxtapose the expectation of sound at the Library of Congress and the actual sounds and noises recorded in a NICU in the United States because assumptions about quiet in both a library and a NICU can lay bare our expectations regarding reasonable levels of quiet or sound. Patients may expect quiet in a NICU, but my experiences show these spaces are full of sounds. Why are expectations of quiet met in a library, yet violated in a NICU?

Further, since intensive care units (of any kind) are filled with stimulation, like noises and sounds, the juxtaposition helps us think about how a non-expert user might experience sound and noise in an intensive care unit, like a NICU. And finally, using an echo as methodology, I listen to two NICUs within that echo: one in the United States and the other in Denmark. Through the case study I present, I use the research question "what user is privileged by physiological monitors in both NICUs?" to guide my analysis of healthcare technologies—specifically physiological monitors—at these two research sites.

Using Aural Communication and Healthcare Technologies

Aural communication (or, for this case study, the sounds produced in a NICU) is an underexamined layer of healthcare communication. Aural communication includes both sound and noise. Sound has a purpose, such as alerting healthcare professionals—expert users—to an important clinical event. Noise has no purpose; it disrupts the space and the care provided in that space. However, as the research I present highlights, noise is often indistinguishable from purposeful sound to the non-expert user or listener.

As healthcare research has shown, intentional sounds can be interpreted as noise, even by those who are explicitly trained to listen for those intentional sounds. Studies of the effects of auditory alarms from physiological monitors on nurses show constant alarming leads to alarm fatigue, in which nurses experience "sensory overload when… exposed to an excessive number of alarms, which can result in desensitization to alarms and missed alarms"—a problem that has led to patients' deaths (Sendelbach & Funk, 2013, p. 378; see also Sowan, Tarriela, Gomez, Reed, & Rapp, 2015). Other studies have shown that constant alarms from physiological monitors lead to infants' hearing loss and sleep disruption (Antonucci, Porcella, & Fanos, 2009; Hsin-Li Chen et al., 2009; Krueger, Horesh, & Crossland, 2012). These studies, which are all based in U.S. NICUs, suggest negative impacts of alarming physiological monitors on these users.

Technical communicators and rhetoricians specifically design for users' experiences. My intention is to bring further awareness to the experiences of users, especially non-expert users, with physiological monitors and other healthcare technologies in NICU settings.2 I argue that aural communication is a layer of health literacy—a multisensorial health literacy that offers a new consideration of healthcare communication more closely linked to non-experts' experiences with healthcare technologies. This multisensory awareness suggests that embodied health literacy practices are necessary for non-experts to navigate healthcare technologies—an argument explored in this piece.

Using research from a NICU in the United States and a NICU in Denmark, I sonically compare and analyze instances of aural literate practices from these two NICU settings. To do so, I use echoes metaphorically as a methodology to analyze the aurality of healthcare technologies in these NICUs. Sonar involves original sounds echoing off objects in order to sense the shape and form of those objects at a distance. Take, for example, sonar as navigation: when animals like bats, dolphins, and whales use echolocation to navigate, they emit purposeful sounds that bounce off objects, allowing the animals to detect those objects and determine their distance within sonar range. Echolocation is used when other senses—like the visual—cannot contribute to navigation.
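As a brief aside on the physics being invoked here (a general acoustic relation, not data from either research site): if $v$ is the speed of sound in the medium and $\Delta t$ is the time between emitting a sound and hearing its echo, the distance to the reflecting object is

$$ d = \frac{v \, \Delta t}{2}, $$

where the factor of two accounts for the sound's round trip. In air, where $v \approx 343$ m/s, an echo returning after a tenth of a second places the reflecting object roughly 17 meters away.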

Methodologically, then, I use echoes metaphorically to listen to one NICU through the sounds of another NICU. The sounds originate from the NICU in the southwest United States. As a researcher, I found those sounds—the physiological monitors—echo (or resonate) in the Danish Neonatalklinikken. Thus, my interpretations of sounds in one NICU are influenced by my experiences in another, comparatively different, NICU. As an aural method to compare and contrast two seemingly disparate intensive care units, sound reflection—the primary echoic property—helps me to consider how one NICU sounds to and within the other, specifically via healthcare technologies.

The echo methodology is also appropriate for comparing two distinct healthcare spaces in two different countries. Although comparisons are helpful to delineate and explore similarities and differences, finding comparable measures to place side-by-side is challenging under such healthcare research circumstances. Specifically, because each NICU exists within its own culturally distinct healthcare system—sustained, enabled, and constrained by a different economic framework—the variables needed to contextualize a traditional comparative analysis are more numerous and, in turn, more onerous. As such, a traditional comparison is unlikely to yield a compelling analysis of such differently geographically situated NICUs. In fact, upon first glance, these healthcare spaces appear to be incongruent.

However, it is this seeming incongruence or incommensurability (Derkatch, 2016) that provided the opportunity to determine what these NICUs had in common: the physiological monitors. In both NICUs, the physiological monitors produced similar sounds and noises. The physiological monitors' sounds and noises were heard by nurses, parents, and infants. Finally, those nurses, parents, and infants have different levels of expertise for determining what those physiological monitors' sounds and noises mean, as the two cases below demonstrate. Once the physiological monitors were identified as a common element between these two seemingly incongruent research sites, the research question became, "What user is privileged by healthcare technologies in both NICUs?"

Returning to the echo methodology, I also suggest that aural health literacies are echoic in nature, and I use this echoic property to attend to the sounds in these healthcare technologies. As Anne Frances Wysocki (2010) has argued, "our sensuous perceptions of the world do not just happen 'naturally' but come to their shape in our varying, complex, and socially embedded environments" (p. 104). This "sensuous training" (p. 104) plays out in how nurses learn to interpret alarms from physiological monitors as noise and how non-experts respond to alarms with unnecessary urgency. And, as I show, our interpretations of sounds and noises are informed by echoes of prior aural experiences—akin to how literacy is "haunted" by prior experiences with technologies, as Sarah J. Sloane (1999) has shown. In other words, our interpretations of sounds in a NICU are directly tied to expertise (i.e., "sensuous training") and are influenced (i.e., "haunted") by our prior aural experiences.

Physiological monitors track vital signs.

Vital signs are constantly surveilled.

The physiological monitors make sounds and noise.

This discussion contributes to further understanding the role of healthcare technologies in the care of premature and sick infants and those technologies' effects on non-expert users. In the cases I examine, the biomedical interpretation of bodies through technology-enabled physiological monitors is misunderstood by non-expert users. This misunderstanding suggests an inattention to embodied, sensuous dimensions of health literacy. To that end, I contribute to our argument for layered healthcare communication for non-expert users that reflects embodied, multisensory health literate practices, specifically aurality.

A Baby's Sneeze as a Biomedical Event

During data collection in the United States NICU, I specifically sought to observe communicative exchanges between parents and nurses. Most of the observations included communication around the routine care of babies: changing diapers, collecting vital sign information, and feeding or nursing. The routine care was clustered, and it was dependent on the baby’s eating schedule, which was normally every 3 hours.

In between these clustered care routines, which typically lasted 1–2 hours, I waited in a NICU pod in the southwest United States. One day as I waited, a baby—a grower—who was surveilled with a physiological monitor was sleeping in an open crib and sneezed. Her physiological monitor alarmed. As the monitor alarmed, she stirred and briefly woke.

In the United States, alarm fatigue research has suggested that alarm settings for clinical events are too sensitive (Sendelbach & Funk, 2013; Sowan et al., 2015). Moving from patient to patient for unimportant clinical events exhausts nurses. The baby's alarm was not attended to because almost as fast as it sounded, it stopped—it created distracting alarm noise for an unimportant clinical event, not purposeful alarm sound for an important clinical event.

Nurses are expert and experienced users of healthcare technologies, but not all listeners of healthcare technology alarms are experts. What happens when something as simple as a baby's sneeze is turned into a biomedical event, sounding an alarm and creating noise? How can non-expert and inexperienced users in healthcare spaces distinguish between a serious biomedical clinical event and a less serious one? What happens if users do not or cannot distinguish important from unimportant clinical events?

A Father's Concern as a Biomedical Event

On a different day of data collection in the U.S. NICU, a new father sat behind a drawn curtain with the mother and their new baby. The mother, as I observed and heard, wanted privacy while she breastfed their new baby. Since mom had not yet been discharged from the hospital, she was connected to an intravenous (IV) fluid pump, which is common for women who have recently delivered babies. Suddenly, an alarm sounded from behind the closed curtain: a sharp ding–pause, ding–pause, ding–pause filled the pod.

Not immediately, but after several minutes, the naturally concerned father responded to the alarm. Dad opened the curtain and asked me to find his baby's nurse (a man). I immediately jumped up, found his baby's nurse, and told him to come to the pod (he was helping another nurse in a different pod).

The father spoke quickly to his baby's nurse and asked for a woman nurse (since his wife was breastfeeding and presumably had exposed breasts). The baby's nurse located a woman nurse to attend to the IV pump's alarm—the reason for the father's concern.

Behind the closed (not soundproof) curtain that didn't reach the floor, the woman nurse explained that it was only the IV pump alerting to indicate mom's IV fluid bag would soon be empty—not an emergency that needed immediate attention—and the sound stopped. The woman nurse opened the curtain and exited. Within ten minutes, the alarm for the IV pump was sounding again. This time, after loudly complaining about the alarm to the mother, the father moved so quickly to find a nurse that he tripped and fell to the ground (as I witnessed through the gap between the curtain and the floor).

I Heard It in the Echo: Two Countries' Use of Healthcare Technologies

I hesitate to draw direct comparisons between the two NICUs in my study.3 As you can imagine, each context was different, with one NICU in Copenhagen, Denmark and the other in the southwestern United States.4 The languages were different, the healthcare systems the NICUs functioned within were different, the hospitals were different, and the cultures were different.

Comparing two dissimilar and nuanced healthcare spaces was challenging. The analytical movement I used to compare both research sites can be understood in the work of echoes. Echoes involve sound bouncing off material and filling a space with the resonances of an originary sound. If the sound is not reflected off a material, then the original sound cannot be heard as an echo. The echo is not a replication of the original sound, but a reflection of the original sound with a difference that can be heard.

Using sound (and echoes) in healthcare technologies is not a new phenomenon. Medical sonography and ultrasound technologies have been used in healthcare for decades. The acoustic sciences, rooted in physics, have helped to develop sonar, Doppler, echolocation, and ultrasound technologies. Sonography and ultrasound use echolocation to locate a fetus and project a digitized visualization of the fetus outside a mother's womb onto a screen. Echoes, and echolocation technologies, require an attunement in listeners to location—to attending to the architecture of spaces and to the original sources of the echoes.

As a methodological framework, echoes, too, offer positioning benefits. In other words, the United States NICU can be more loudly and more clearly heard in the Danish NICU. Consequently, without as much aural strength or loudness, the Danish NICU can be more clearly understood in the United States NICU as a quieter space. Each research site echoes in the other. Thus, each NICU can be heard more clearly, and in one case, more loudly, in the echo. In the echo framework, as noted earlier, I heard a similarity: the healthcare technologies and physiological monitors.

The United States NICU Soundscape

Before answering my research question—"what user is privileged by healthcare technologies in both NICUs?"—I need to more deeply contextualize each of the research sites.

These signs greet you as you enter the NICU in the southwestern United States.

A large poster with a photo of a young boy with one finger to his lips gesturing shhh. Identifying marks (names, faces) are blurred out. Text on the poster reads: Shhh! (NICU name) is committed to creating a quiet environment for you to rest and heal.
A sign that greets visitors right before they enter the United States NICU (photograph by author)
A small sign has a notice that reads: Attention!! Daily from 1:30-3:00 PM is Siesta Time. This is a time when the lights will be dimmed and NICU staff and visitors will limit noise levels to simulate daily nap time in the home environment. It will provide a deep, quiet sleep time for the babies to help them achieve a day-night sleep schedule.
A sign explaining quiet time that greets visitors when they check into the United States NICU (photograph by author)

It is clear that quiet is understood as essential for the healing and growing of infants in the NICU environment (even by non-experts, regardless of geography). Siesta time is designated as quiet time, and "NICU staff and visitors will limit noise levels to simulate daily nap time in the home environment," according to the sign.

One day when I entered the NICU, the soundscape was dominated by the testing of the unit's alarms throughout the entire NICU. I noted the time of my arrival: 1:40 p.m. Although I am not sure what time the audible alarm testing began, it did not end until after siesta time had begun.

What message is sent by the alarm testing during siesta time? In other words, if hospital practice does not abide by or respect siesta or quiet time, what message does that send to the non-expert parent or other legal caregivers who are being trained aurally to exist—even temporarily—in this space? How are these non-expert users being "sensuously trained," and how will they be haunted by their NICU experience?

I recorded this audio during an interview with Cassie (a pseudonym), a respected and clinically skilled transport nurse. It should be noted that there was an extremely premature baby in the room during the interview, and the baby was on high-frequency oscillatory ventilation (HFOV). You can hear the HFOV throughout the interview.

Note: The audio file has been edited to remove the nurse's voice. [High-frequency oscillatory ventilator (HFOV) heard in the background throughout the entire audio recording]
Bivens: … busy, so I really appreciate you taking the time. … I wanted to ask, and if I'm in the way at any point just please tell me to move. … Um, … [Healthcare monitor beeps] the other day, …
Nurse: Uh huh
Bivens: … in Pod C,
Nurse: Mmmhmm
Bivens: the mom and dad that you got the consent for [Monitor beeps] for the PICC [Monitor beeps] …
Nurse: Uh huh
Bivens: so I was curious about that process. [Monitor beeps]. Like, I couldn't hear everything, … [Monitor beeps]
Nurse: uh huh
Bivens: um, but when you get the consent from parents, for the PICC line, what do, I mean what do you—do you go off the sheet, or do you talk about something …

It is apparent that the sounds of biomedicine, heard through healthcare technology in the United States NICU, like the HFOV and physiological monitor, present healthcare information aurally. Can the non-expert understand those sounds? Can the non-expert distinguish between important sounds and unimportant noise from healthcare technologies?

The Danish Neonatalklinikken Soundscape

Although at times it was noisy, the NICU in the southwestern United States was mostly quiet. But comparatively, the Danish neonatal ward was noticeably quieter. Almost hourly in my field notes I recorded how quiet it was in the Danish Neonatalklinikken, or neonatal ward. I was sensitive to the non-language sounds in the Neonatalklinikken because I did not speak Danish and thus attended to the non-discursive: visuals, gestures, and sounds.

I had begun my research project in Denmark and then moved to the NICU in the southwestern United States. It hadn't occurred to me to audio record sounds in the Danish neonatal ward because the sounds seemed unremarkable—similar to the quiet you would expect to hear in a library, like the Library of Congress.

I did not realize how quiet the Danish Neonatalklinikken actually was until I listened to and transcribed the digitally recorded interviews from the United States NICU, like in the audio clip with Cassie, almost nine months after I completed data collection in the Danish hospital. In other words, I heard the echoic aurality. In Denmark, the sounds were more purposeful, the physiological monitors did not alarm as frequently, and in general, there was less echoing of the purposeful sounds or disruptive noises.

When I did note the noise produced by physiological monitors in Denmark, most of my observations included both a mom (I didn't observe any fathers) and a nurse at the baby's bedside, watching and interpreting the physiological monitor's reading of the baby's vital signs. When a monitor alarmed, the nurse immediately switched the alarm to silent.

In the Danish Neonatalklinikken, the sounds were more purposeful. I surmised the purposefulness of the sounds because I did not observe a nurse (or parent) ignoring any alarms. Comparatively, in the United States NICU, the physiological monitors' alarms often went ignored. I suspect these alarms were disruptive noises signaling unimportant clinical events, like the baby's sneeze.

From the metaphorical echo framework, I realized the physiological monitors did not alarm as frequently in the Neonatalklinikken. It might be that the ranges set to trigger an alarm were more liberal, or perhaps the babies were not as sick (or vital signs were more stable). Regardless, the physiological monitors did not alarm with the same frequency or duration in the Neonatalklinikken as they did in the United States NICU.
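To make the reasoning about alarm ranges concrete, the sketch below shows—in a minimal, hypothetical form that does not reflect either NICU's actual monitor configuration—how widening the range that triggers an alarm reduces how often the same stream of vital-sign readings alarms.

```python
# A minimal, hypothetical sketch of threshold-based alarming.
# The SpO2 limits and readings below are invented for illustration;
# they are not taken from the monitors in either NICU.

def count_alarms(readings, lower, upper):
    """Count readings that fall outside the configured alarm range."""
    return sum(1 for value in readings if value < lower or value > upper)

# A made-up stream of oxygen-saturation (SpO2) readings for one infant.
spo2_readings = [96, 94, 91, 89, 95, 97, 88, 93, 90, 92]

# Tighter (more sensitive) limits alarm more often on the same data ...
print(count_alarms(spo2_readings, lower=92, upper=100))  # 4 alarms

# ... while wider ("more liberal") limits alarm less often.
print(count_alarms(spo2_readings, lower=88, upper=100))  # 0 alarms
```

The point is only that identical physiological data can produce very different soundscapes depending on how alarm ranges are configured—one plausible reason, alongside the babies' relative stability, that the Neonatalklinikken sounded quieter.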

And the layout of the Danish NICU meant there was less echoing of both purposeful sounds and disruptive noises. Unlike the open bay design of the United States NICU, the Danish NICU was designed for privacy, not openness. In other words, there were doors that could be shut to limit the movement of both purposeful sounds and disruptive noises from one baby’s room to the next; thus, the soundscape was quieter (though, of course, not silent).

An illustration of the layout of a rectangle-shaped room showing the door directly across from infant bed 2 and parent bed 2, which are diagonal from infant bed 1 and parent bed 1. Infant bed 1 and parent bed 1 are on the same wall as the door.
Layout of a room in the Danish NICU (illustration by Gustav K. H. Wiberg, 2015, used with permission)
An infant bed and a parent bed in the Danish neonatal ward. The clear, window-transparent infant bed or incubator is covered with baby blankets to darken the baby's space
In the Danish neonatal ward, the physiological monitors above the babies' beds or incubators (photograph by author)

Vital signs, in both the United States and Denmark, are surveilled through physiological monitors—a healthcare technology. Vital sign surveillance, through physiological monitors, produces purposeful sounds, disruptive noises, and non-verbal communication in the NICU soundscape that privilege healthcare technologies' users in different ways.

Based on my research in the United States and Denmark, aural communication, and the biomedical technologies that produce it, send messages to experts and non-experts alike. But the messages to parents, other legal caregivers, nurses, and infants in NICUs are received and interpreted differently based on the frequency of the message (the alarming monitor), the programming of the physiological monitors, and their own prior, "haunted" experiences with these technologies and spaces. Additionally, there are fewer purposeful sounds and disruptive noises for users to interpret in the Danish Neonatalklinikken in comparison to the United States NICU.

Alarming Alarms: Purposeful Sounds and Disruptive Noises

What struck me, using the echo framework, is that an alarm to some users is a reason to, well, be alarmed. There is no distinction, for some users of biomedical technology like the father responding to the IV pump alarm, between important or serious, purposeful alarm sounds and unimportant, disruptive alarm noise. In other words, all alarms cause alarm.

After witnessing the father fall to the ground, I recognized aurality as a multisensorial, embodied, and technological layer of health literacy. Further, experts and non-experts use and interpret aural cues differently. Whereas nurses have been sensorially trained to ignore some alarms, non-experts consider the alarms purposeful sounds, not disruptive noises. If the father had known that the alarm noise signaled an unimportant clinical event (a soon-to-be empty IV bag), he might not have pulled two nurses from the care of other infants or fallen to the ground. In a worst-case scenario, the father's fall could have meant tripping over his wife's IV, ripping it from her body, and causing the mother to drop the baby she was breastfeeding.

Regardless of who is privileged by the aural message, the healthcare technologies—the physiological and other monitors that persistently surveil the infants' bodies—send aurally delivered messages to expert and non-expert users alike. Whatever the message and whoever the user, there is an embodied, aural layer to health literacy that needs to be attended to in healthcare communication.

It is clear that physiological monitors, while certainly lifesaving in particular instances, might be hindrances in others, especially situations involving non-expert users. Currently, when a baby is first admitted into a NICU, packets of orientation materials are provided for the parents and other legal caregivers. It might be beneficial for parents to receive auditory orientation materials, too. These materials might be mp3 recordings of the different sounds a non-expert user might hear in a NICU, along with narrated explanations. Further, I recommend re-programming physiological monitors' default factory settings with more appropriate ranges so that alarms signal important clinical events. When coupled with reconfiguring default factory settings, sensory/aural training might better prepare parents and other legal caregivers to enact aural healthcare literacy in NICUs. If our bodies are easily and readily heard, by both experts and non-experts, through physiological monitors, it is beneficial to provide aural training to distinguish purposeful sounds from disruptive noises in NICUs.

Footnotes

1 This project was approved by the Institutional Review Board at Texas Tech University Health Sciences Center.

2 "Users" are more than just those who explicitly use these healthcare technologies. Users include parents, guardians, infants, grandparents, aunts, uncles, nurses, respiratory therapists, and physicians. For consistency with technical communication terminology, I use the term "user," rather than role/relationship or profession, throughout this webtext.

3 My study examined communication between the nurses and parents (and other legal caregivers) of babies born too soon or sick (premature babies) in two cities in two countries.

4 This project was approved by the Institutional Review Board at Texas Tech University and the Data Protection Agency (Denmark).