Accessible Oceans: Exploring Ocean Data Through Sound
Building knowledge about effective design and use of auditory display for inclusive inquiry in ocean science
Data literacy heavily relies on visual learning tools, often excluding people with vision impairments and those who have trouble interpreting visual information. Using sound to explore data opens participation to these communities, increasing interest in STEM and strengthening data literacy.
Authentic datasets, like those generated by ocean observatories and research activities, provide a wealth of information about the natural world but are often difficult to engage with for those without disciplinary expertise. By scaffolding authentic data to support sensemaking in learners from a wide range of backgrounds and abilities, we aim to broaden participation in STEM and showcase sonification to all audiences as a way to perceive scientific information.
Data sonification is for our ears what data visualization is for our eyes. By mapping data to one or more parameters of sound to communicate aspects of scientific phenomena, sonification is an established method for sharing information. “Accessible Oceans” aims to develop auditory displays built around data sonification to advance data literacy in the sciences. We listen in to learn about our world.
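To make the idea of parameter mapping concrete, here is a minimal, hypothetical sketch in Python: each data value becomes a short sine tone whose pitch rises with the value, and the result is written to a WAV file. The mapping ranges and function names are illustrative assumptions, not the project’s actual Kyma designs.

```python
# Minimal parameter-mapping sonification sketch (illustrative only).
# Each data value is mapped to the pitch of a short sine tone.
import math
import struct
import wave

RATE = 44100  # samples per second

def map_to_freq(value, lo, hi, f_lo=220.0, f_hi=880.0):
    """Linearly map a data value in [lo, hi] to a frequency in [f_lo, f_hi]."""
    t = (value - lo) / (hi - lo) if hi != lo else 0.5
    return f_lo + t * (f_hi - f_lo)

def sonify(values, tone_sec=0.25):
    """Render one short sine tone per data value."""
    lo, hi = min(values), max(values)
    samples = []
    for v in values:
        freq = map_to_freq(v, lo, hi)
        n = int(RATE * tone_sec)
        for i in range(n):
            # A short linear fade in/out avoids clicks between tones.
            env = min(1.0, i / 500, (n - i) / 500)
            samples.append(env * math.sin(2 * math.pi * freq * i / RATE))
    return samples

def write_wav(path, samples):
    """Write mono 16-bit PCM audio."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(RATE)
        w.writeframes(b"".join(
            struct.pack("<h", int(s * 32767)) for s in samples))

# A rising-then-falling series becomes a rising-then-falling melody.
data = [2.0, 3.5, 5.0, 6.5, 5.0, 3.5, 2.0]
write_wav("sonification.wav", sonify(data))
```

Real auditory displays layer much more on top of this, such as context cues and labels, but the core move is the same: a perceptual dimension of sound stands in for a dimension of the data.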
To overcome barriers to perceiving, understanding, and interacting with authentic data, we must take an inclusive design approach to developing educational materials. We are undertaking a co-design process with oceanographers, educators of blind and visually impaired learners, and science communicators to understand the needs and priorities of multiple stakeholder groups. We draw on knowledge from the learning sciences and human-computer interaction to create interactions with ocean data that are accessible to learners of varying abilities and backgrounds.
- Inclusively design and pilot auditory display techniques (data sonification) to convey meaningful aspects of ocean science data.
- Empirically evaluate and validate the feasibility of integrated auditory displays to promote data literacy in informal learning environments (ILEs).
- Establish best practices and guidelines for others to integrate sonifications into their own ILEs.
- Lay the foundation for a larger research study focused on educational technology design and an innovative, immersive ocean literacy exhibit.
Based on data sonifications I made for the Daily Vertical Migration Gets Eclipsed! Data Nugget, I mocked up an audio display prototype. I still need team member reviews and feedback before creating a final version we can integrate into the evaluation phase of the pilot project. Thinking of the audio display as an…
I’ve been a fan of Audio Descriptions (AD) in film and TV for a while now. Most Netflix original content includes audio descriptions, and I used to listen to more shows than I watched. I have never seen Daredevil, but I listened to every minute of it. And Stranger Things never sounded so good – and…
Of the data sonifications I created in Kyma, one I was especially excited to synchronize with its data plot was the zooplankton daily vertical migration and solar radiation during the August 21, 2017 total solar eclipse. This post outlines the data and the solar eclipse phenomenon, the various sonifications, and the video sync process. The chart and…
As I work on data sonifications and look ahead to fitting these sounds within contextual audio displays, I recognize there could be an outlined structure for what the final result may look and sound like. From my experience creating audio displays in my Data Sonification course, and mixing dialogue and sound effects in music and media,…
Earlier in 2022, as I was working on data sonifications for several of the OOI Data Nuggets, I was becoming overwhelmed by design variation. I would create a frequency-mapped version; a filter-mapped version; a synthesized and a sample-based sonification; a sonification with all three streams of data; three sonifications with separate data streams that…
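Two of the design variants mentioned above can be sketched side by side in a few lines of Python: frequency mapping drives the pitch of an oscillator with the data, while filter mapping keeps the sound source fixed (white noise here) and drives a lowpass filter cutoff instead, so higher values sound brighter rather than higher-pitched. This is a hypothetical stdlib-only illustration, not one of the project’s actual Kyma patches, and all names and ranges are assumptions.

```python
# Frequency mapping vs. filter mapping, sketched minimally (illustrative only).
import math
import random

RATE = 44100  # samples per second

def norm(values):
    """Scale data to 0..1 for mapping."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi != lo else 0.5 for v in values]

def freq_mapped(values, tone_sec=0.2):
    """Each value sets the pitch of a sine tone (220-880 Hz)."""
    out = []
    for t in norm(values):
        freq = 220.0 + t * 660.0
        out.extend(math.sin(2 * math.pi * freq * i / RATE)
                   for i in range(int(RATE * tone_sec)))
    return out

def filter_mapped(values, tone_sec=0.2, seed=0):
    """Each value sets the cutoff of a one-pole lowpass on white noise."""
    rng = random.Random(seed)
    out, y = [], 0.0
    for t in norm(values):
        cutoff = 200.0 + t * 3800.0  # 200-4000 Hz
        a = 1.0 - math.exp(-2 * math.pi * cutoff / RATE)
        for _ in range(int(RATE * tone_sec)):
            x = rng.uniform(-1.0, 1.0)
            y += a * (x - y)  # higher data -> brighter noise
            out.append(y)
    return out

data = [1.0, 4.0, 2.0, 5.0]
melodic = freq_mapped(data)   # pitch follows the data
bright = filter_mapped(data)  # timbre follows the data
```

Even this toy version shows why the design space balloons: each mapping carries the same data but a different perceptual character, and every added variant multiplies the options to evaluate.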
Another takeaway from the co-design session with teachers from the Perkins School for the Blind involved the use of earcons and auditory icons with devices they use in the classroom. The tool that struck me most was LabQuest, a thermometer that sonifies temperature in real time and produces two beeps to signal the turning on…
From early sonification development with lead PI Amy Bower and our co-design session at Perkins School for the Blind, we discovered that sounds for location and context are essential. I started including spearcons, sped-up speech used as auditory cues, to indicate x-axis ticks in the data as part of sonification prototypes.* Beyond contextual cues, there is some…
Having taught classes on sonification prior to the NSF pilot, I have encountered many working definitions of sonification. The differences in language around the term matter, and placing definitions side by side reveals their nuances and shadings. Because this complexity is important, I thought it would be a good idea to…
While reading “Rich Screen Reader Experiences for Accessible Data Visualization” by Zong et al., two items related to data sonification struck me. First, the literature review and the study’s co-design experience amplified the message that screen reader users desire “an overview,” followed by user exploration, as part of their “information-seeking goals” (Zong et al. 2018). Even though…
As part of the NSF pilot, I am designing many of the data sonifications using Kyma by Symbolic Sound. Symbolic Sound was founded and is owned by Carla Scaletti and Kurt Hebel. I like to describe Kyma as a sound design workstation, a recombinant sound language, a live-performance machine, a data sonification toolkit, and a…