Blog
Data mapping refers to the design choices involved in applying values from a data set to any number of sound controls. Controls may include modifications of digital sound synthesis, audio samples, audio effects, and spatialization, among many others. Data mapping and the choices around mapping play a critical role in sonification. In sonification, choices in data…
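(An illustrative aside, not drawn from the post itself: the sketch below shows one common way to realize such a mapping in Python, linearly rescaling a data series onto an arbitrary control range such as an oscillator frequency. The control range and temperature values are placeholder assumptions.)

```python
import numpy as np

def map_to_control(values, ctrl_min, ctrl_max):
    """Linearly rescale a data series onto an arbitrary sound-control range."""
    values = np.asarray(values, dtype=float)
    lo, hi = values.min(), values.max()
    if hi == lo:
        # Flat data: park the control at mid-range.
        return np.full_like(values, (ctrl_min + ctrl_max) / 2.0)
    normalized = (values - lo) / (hi - lo)  # scale to 0..1
    return ctrl_min + normalized * (ctrl_max - ctrl_min)

# Hypothetical example: map water temperatures (deg C) onto a 200-2000 Hz oscillator range.
temps = [4.1, 4.5, 5.2, 6.8, 7.0, 6.1]
print(map_to_control(temps, 200.0, 2000.0))
```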
Data sonification commonly involves taking data (numeric values) and assigning those values to sound parameters to highlight particular aspects of the data. Making choices about the ordering and controlling of sound parameters over time is as old as music notation. In fact, standard music notation highlights certain aspects of sound in its writing. For example, music notation…
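(Another illustrative aside, with invented data and scale choices rather than anything from the post: this sketch quantizes a data series onto a discrete set of pitches, much as notation fixes sound into an ordered sequence of note events.)

```python
import numpy as np

def map_to_midi_notes(values, low_note=48, high_note=84):
    """Quantize a data series onto discrete MIDI note numbers (C3-C6 by default)."""
    values = np.asarray(values, dtype=float)
    span = values.max() - values.min()
    normalized = (values - values.min()) / span if span else np.zeros_like(values)
    return np.round(low_note + normalized * (high_note - low_note)).astype(int)

def midi_to_hz(notes):
    """Convert MIDI note numbers to frequencies in Hz (A4 = 440 Hz)."""
    return 440.0 * 2.0 ** ((np.asarray(notes) - 69) / 12.0)

# Hypothetical data series rendered as an ordered sequence of pitches.
data = [0.2, 0.9, 0.4, 1.3, 1.1]
notes = map_to_midi_notes(data)
print(notes, midi_to_hz(notes))
```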
Sonification involves the field of sound design, which is the craft of developing sounds to meet a variety of needs. Sound design is common in the video game and media industries, which require the creation of sound effects, human sounds, ambiance, and dialogue. Data sonification involves mapping data values to sound parameters, and we often have…
I held a basic assumption that sonifications should consist only of “non-speech sound” (Sterne and Akiyama 2012), based upon a few definitions of and experiences listening to sonification. So, I made every effort to avoid the use of speech in sonification. While working on the initial grant with PI Amy Bower, however, I questioned…
In February 2022, our team held a co-design session with two teachers from Perkins School for the Blind. While many great topics and items were discussed, we had a few key takeaways from the session that are included below. Students use silence. The space of silence can become as important as the sounds themselves.…
In January and February 2022, the entire team gave a presentation to and heard from students in Jon’s MUS 479/579 Data Sonification course at the University of Oregon. The Data Sonification course involves undergraduate and graduate students and explores data sonification broadly, including authentic datasets, sonification methods, auditory display, accessibility, data mapping, and even grant…
One of the most direct ways to sonify a data stream is audification, which is “a direct translation of a data waveform to the audible domain” (Kramer 1994). Turning data into a waveform consists of treating data points as amplitude values in an audio signal. The direct correlation of data-as-audio-signal has advantages…
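(A minimal Python sketch of that idea, assuming the data arrive as a one-dimensional array; the playback rate and placeholder signal below are assumptions, not values from the post.)

```python
import wave
import numpy as np

def audify(data, out_path="audification.wav", sample_rate=8000):
    """Treat data points directly as amplitude samples and write a mono WAV file."""
    x = np.asarray(data, dtype=float)
    x = x - x.mean()                      # center the waveform around zero
    peak = np.max(np.abs(x)) or 1.0
    samples = np.int16(x / peak * 32767)  # normalize into the 16-bit sample range
    with wave.open(out_path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)               # 16-bit samples
        wav.setframerate(sample_rate)
        wav.writeframes(samples.tobytes())

# Placeholder signal; real sensor data (e.g., bottom pressure samples) would go here.
audify(np.random.default_rng(0).normal(size=80000))
```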
As part of the NSF pilot, I am designing many of the data sonifications using Kyma by Symbolic Sound. Symbolic Sound was founded and is owned by Carla Scaletti and Kurt Hebel. I like to describe Kyma as a sound design workstation, a recombinant sound language, a live-performance machine, a data sonification toolkit, and a…
While reading “Rich Screen Reader Experiences for Accessible Data Visualization” by Zong et al., two items related to data sonification struck me. First, the literature review and study’s co-design experience amplified the message that screen reader users desire “an overview,” followed by user exploration as part of “information-seeking goals” (Zong et al. 2018). Even though…
Having taught classes on sonification prior to the NSF pilot, I have found many working definitions for sonification. The differences in language around the term matter, and placing definitions side-by-side reveals some of their nuances and shading. The complexity and nuances are important, so I thought it would be a good idea to…
From early sonification development with lead PI Amy Bower and our co-design session at Perkins School for the Blind, we discovered that sounds for location and context are essential. I started including spearcons (sped-up speech used as auditory icons) to indicate x-axis ticks in the data as part of sonification prototypes.* Beyond contextual cues, there is some…
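(As an illustrative sketch only, not the project's actual workflow: one common way to produce a spearcon is to time-compress a pre-recorded speech label. The file names, speed factor, and use of the librosa library here are all assumptions.)

```python
import librosa
import soundfile as sf

def make_spearcon(speech_wav, out_wav, speed=2.5):
    """Time-compress a recorded speech label into a spearcon-style cue."""
    y, sr = librosa.load(speech_wav, sr=None, mono=True)
    y_fast = librosa.effects.time_stretch(y, rate=speed)  # shorter duration, same pitch
    sf.write(out_wav, y_fast, sr)

# Hypothetical axis-tick label recording sped up for use as an x-axis marker.
make_spearcon("label_march.wav", "spearcon_march.wav", speed=2.5)
```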
Another takeaway from the co-design session with teachers from the Perkins School for the Blind involved the use of earcons and auditory icons with devices they use in the classroom. The main tool that struck me was LabQuest, a thermometer that sonifies temperature in real time and that produces two beeps to signal the turning on…
Earlier in 2022, as I was working on data sonifications for several of the OOI Data Nuggets, I was becoming overwhelmed by design variation. I would create a frequency-mapped version; a filter mapping version; a synthesized and a sample-based sonification; a sonification with all three streams of data; three sonifications with separate data streams that…
As I work on data sonifications and look ahead to fitting these sounds within contextual audio displays, I recognize there could be an outlined structure for what the final result may look and sound like. From experience with my Data Sonification course in creating audio displays, mixing dialogue and sound effects in music and media,…
From creating data sonifications in Kyma, one sonification whose audio I was excited to synchronize with its data plot was the Zooplankton daily vertical migration and solar radiation during the August 21, 2017 total solar eclipse. This post outlines the data and solar eclipse phenomenon, various sonifications, and the video sync process. The chart and…
I’ve been a fan of Audio Descriptions (AD) in film and TV for a while now. Most Netflix original content includes audio descriptions, and I used to listen to more shows than I watched. I have never seen Daredevil, but I listened to every minute of it. And Stranger Things never sounded so good – and…
Based upon data sonifications I made for the Daily Vertical Migration Gets Eclipsed! Data Nugget, I mocked up an audio display prototype. I still need to get team member reviews and feedback before creating a final version we can integrate into the evaluation phase of the pilot project. Thinking of the audio display as an…
This post outlines the creation of sonification click tracks for synchronizing additional media in sonification design. The development and use of click tracks provide accurate yet flexible mixing, especially for related media events like axis markers, spearcons, and event-based earcons. After creating a lot of sonifications in Kyma for this project, I started…
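(A rough Python sketch, with invented durations and click shape rather than the Kyma design the post describes: it writes a WAV file with a short click at each evenly spaced axis-marker time, which could then serve as a sync reference in a mix.)

```python
import wave
import numpy as np

def click_track(duration_s, n_markers, out_path="clicks.wav", sample_rate=44100):
    """Write a mono WAV with a short click at each evenly spaced axis-marker time."""
    track = np.zeros(int(duration_s * sample_rate))
    t = np.linspace(0, 0.01, int(0.01 * sample_rate), endpoint=False)
    click = np.sin(2 * np.pi * 1000 * t) * np.linspace(1, 0, t.size)  # 10 ms decaying 1 kHz blip
    for i in range(n_markers):
        start = int(i * (duration_s / n_markers) * sample_rate)
        track[start:start + click.size] += click[: track.size - start]
    samples = np.int16(track / np.max(np.abs(track)) * 32767)
    with wave.open(out_path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(sample_rate)
        wav.writeframes(samples.tobytes())

# Hypothetical: a 60-second sonification with 12 axis markers (e.g., monthly ticks).
click_track(60, 12)
```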
This post is the second of two that outline the design of sonification click tracks for synchronizing additional media in sonification design. Sonification click tracks provide accurate yet flexible sound design, especially for related media events like axis markers, spearcons, and event-based earcons. Feel free to continue reading about an example sonification mix…
Based on data sonifications I made for the 2015 Axial Seamount Eruption Data Nugget, I mocked up an audio display prototype, which I have included below. The audio display has three tracks that roughly span four minutes in length. I included the graph of the 35 days of bottom pressure sensor data alongside a playlist…
Based on core lessons from the 2015 Axial Seamount Eruption Data Nugget, I created data sonifications and mocked up an audio display prototype, which I have included below. Please note that the Ocean Labs data nugget does not include the graph (included below) that we sonified and developed an audio display for; yet, the graph…
This post outlines lessons learned from creating an accessible Qualtrics survey involving media players. While we set out to learn about general user impressions of sounds to help guide our sonification design, we also learned a good deal about building an accessible Qualtrics survey with audio, and we wanted to share those lessons. Feel…
Based on core lessons from the Extratropical Storm Hermine Data Nugget, I created data sonifications of the various meteorological instruments and mocked up an audio display prototype, which I have included below. Please note that the Ocean Labs data nugget does not include sea wave height as part of its graph (below). Storms also impact…
We are fortunate to have an amazing advisory board filled with diverse expertise across many sectors, whose members bring diverse lived experiences. At a recent advisory board meeting, I learned about a new term for how we use and define spearcons – an invented term called “nearcons.” To back up and provide context, a spearcon “is…
The Accessible Oceans project was featured in a February 3, 2023 LA Times article, “The Sounds of Science!” The article includes interviews with Co-PIs Jon and Amy as well as a clip of one of our data sonifications. Several other sonification projects were also described in the article, including the work of Carla Scaletti, one…
The Accessible Oceans project was featured on the Ocean Observatories Initiative website in March 2023. The “Making the Ocean Accessible Through Sound” article includes an interview with Co-PI Amy as well as a few clips of our data sonifications. Check out the full article here – https://oceanobservatories.org/2023/03/making-the-ocean-accessible-through-sound/ “Adding sound to science allows more people to experience…
Based on input from our team, research with blind and low-vision (BLV) students in schools, and insights learned over the past year, we have updated our auditory display for the Extratropical Storm Hermine Data Nugget. We updated data sonifications of some meteorological instruments, re-recorded narration audio, and mixed underlying music, sound effects, and data sonification…
Based on input from our team, research with blind and low-vision (BLV) students in schools, and insights learned over the past year, we have updated our auditory display for the 2015 Axial Seamount Eruption Data Nugget. We updated data sonifications, re-recorded narration audio, and mixed underlying music, sound effects, and data sonification snippets as part…
Given the cognitive load in informal learning environments, we knew we needed to reduce the length of our auditory displays. We started this process with the Long term Seafloor Inflation Record. We reduced the script and tightened up the auditory key, or legend, for the data sonification. The audio media player is below. For more…
Based on input from our team, research with blind and low-vision (BLV) students in schools, and insights learned over the past year, we have updated our auditory display for the 2015 Axial Seamount Eruption Data Nugget. We added start and end beep earcons to the data sonifications, re-recorded narration audio, and mixed underlying music, sound…
Based on input from our team, research with blind and low-vision (BLV) students in schools, and insights learned over the past year, we have updated our auditory display for the Daily Vertical Migration Gets Eclipsed! Data Nugget. I re-recorded narration audio, and mixed underlying music, sound effects, and data sonification snippets as part of the…
Based on input from our team, research with blind and low-vision (BLV) students in schools, and insights learned over the past year, we have updated our auditory display for the Carbon Net Flux Between Ocean and Atmosphere data nugget. I re-recorded narration audio, and mixed underlying music, sound effects, and data sonification snippets as part…
We invite you to participate in a survey for the “Accessible Oceans” research project, funded by the National Science Foundation (NSF 2115751). Our project focuses on designing effective auditory displays to enhance the perception and understanding of ocean data for visitors to museums, aquariums and science centers, including those who are blind or have low…