Georgia Tech Sonification Lab
School of Psychology - Georgia Institute of Technology

The Accessible Aquarium Project


[Photo: Professors Bruce Walker and Gil Weinberg examine fish in a research aquarium]

The goal of the GT Accessible Aquarium Project is to make dynamic exhibits such as those at museums, science centers, zoos, and aquaria more engaging and accessible for visitors with vision impairments by providing real-time interpretations of the exhibits using innovative tracking, music, narration, and adaptive sonification. It is a truly interdisciplinary collaboration between GT researchers Bruce Walker, Tucker Balch, Gil Weinberg, Carrie Bruce, Jon Sanford, and Aaron Bobick, bringing together the fields of Psychology, Computing, Music, and Assistive Technology. See the Georgia Tech Press Release and Project Overview Videos.

Zoos and aquaria are in the business of educating and entertaining the visiting public. However, as the number of people with disabilities living in the community has grown, and as public environments have become more accessible to them, such informal learning environments (ILEs) are faced with accommodating an increasingly diverse visitor population with varying physical and sensory needs. This is even more challenging for ILEs with dynamic exhibits, where the movements, changes, and interactions are extremely difficult to describe to individuals who lack vision. The behavior of fish and other sea life in an aquarium, the play of monkeys or lion cubs in a zoo habitat, and the cosmic dance of the planets in a science center's model of the solar system are all examples of dynamic exhibits that have until now been completely inaccessible to visitors with vision impairments.

[Photo: GT Accessible Aquarium Project tracking and music team members]

In this project, we are developing cutting-edge bio-tracking and behavior-analysis techniques that can provide input for sophisticated, informative, and compelling multimedia auditory displays, music, and sonifications (sounds used to convey information about data to a listener). We are focusing first on the aquarium domain, since almost every exhibit in an aquarium is dynamic. Further, the aquatic nature of these exhibits poses unique and interesting challenges for typical sensing and tracking technologies. The principles and techniques we develop will be immediately applicable to zoos, museums, and other ILEs with dynamic exhibits, leading to a dramatic increase in the opportunities for people with vision impairments to experience these types of exhibits. The developments in tracking and multimedia display will also have broader applicability in a range of fields, including tracking people for many purposes.

Dynamic exhibits, such as those in aquaria, zoos, and science museums, are exciting to a visitor largely because of the active, changing nature of the exhibit. However, a visually impaired or blind visitor generally has a very difficult time knowing what is in the exhibit and, more importantly, what is happening. Are the big fish chasing the little fish? Are the monkeys playing or sleeping? Are all the critters in one corner, or are they running around the entire exhibit?

The aim of this project is to track the movements of the objects (fish, monkeys, planets) in a dynamic exhibit, and then to convey to the visitor some sense of what is happening via sound (and possibly other senses, such as touch).


The Accessible Aquarium Project is a truly interdisciplinary collaboration:

Faculty Researchers
PI: Bruce N. Walker, Associate Professor in the School of Psychology and School of Interactive Computing
Tucker Balch, Associate Professor in the School of Interactive Computing
Aaron F. Bobick, Professor and Chair of the School of Interactive Computing
Carrie M. Bruce, Research Scientist in the Center for Assistive Technology and Environmental Access (CATEA) and PhD student in the School of Interactive Computing
Gil Weinberg, Associate Professor in the School of Music

Graduate Students
Psychology: Myounghoon "Philart" Jeon

Computing: Carrie M. Bruce, Jinhan Lee

Music: Ryan Nikolaidis, Sriram Viswanathan, Mark Godfrey, Jonathan Kim, Jason Orlosky

HCI: Mary Frances Jones, Stephen Garrett, Anandi Pendse, Michael Pate

Sample Videos

To give a general idea of what we are trying to do, and a sort of chronological story of the project, the following videos show different fish tanks, each with different audio associated with the fish movement. Note that even though the project has already come a long way, these are still just the beginning, and we have lots of different things to try before we actually implement this in a real setting.

Computer Generated Fish, version 1. CGFish-Version1 (QuickTime 54MB)
Three fish are animated (a shark, a puffer fish, and a small school) with different paths and behaviors. In version 1 the audio is based on "The Blue Danube Waltz". Each fish is mapped onto a separate musical instrument. As a fish moves toward the front of the tank, the corresponding instrument gets louder; as it recedes into the back of the tank, the instrument gets softer. As the fish moves left or right in the tank, the instrument is panned to stereo left or right. Finally, as the fish moves up or down, the sound is filtered with a high- or low-pass filter.
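The per-fish mappings described above (front-back to loudness, left-right to stereo pan, up-down to filter cutoff) can be sketched as a single function. This is an illustrative sketch only; the function name, parameter ranges, and specific values are assumptions, not the project's actual implementation.

```python
def fish_to_audio_params(x, y, z):
    """Map a fish's normalized tank position to audio parameters.

    x: 0 (left wall) .. 1 (right wall)  -> stereo pan
    y: 0 (bottom)    .. 1 (surface)     -> filter cutoff (low -> bright)
    z: 0 (back wall) .. 1 (front glass) -> loudness

    All names and ranges here are illustrative.
    """
    pan = 2.0 * x - 1.0                        # -1 = hard left, +1 = hard right
    cutoff_hz = 200.0 * (2000.0 / 200.0) ** y  # 200 Hz at bottom, 2 kHz at top
    gain = 0.2 + 0.8 * z                       # receding into the back -> softer
    return {"pan": pan, "cutoff_hz": cutoff_hz, "gain": gain}
```

A synthesizer or mixer would then apply these three parameters to the instrument assigned to that fish on every tracking update.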

Computer Generated Fish, version 2. CGFish-Version2 (QuickTime 28MB)
The exact same fish video is used as in version 1. Instead of the MIDI-based waltz, the audio is based on an auto-generated music pattern. Again, the front-back, left-right, and top-bottom mappings are used, but the notes that are played are generated from the speed of the fish. Slow fish have a "normal," natural chord progression; faster-moving fish have more irregular note changes.

Computer Generated Fish, version 3. CGFish-Version3 (QuickTime 28MB)
The exact same fish video is used as in version 1. This time, the tempo is driven by the speed of the fish. Slow fish have "regular" tempos, while faster fish have more irregular, syncopated tempos.

Real Fish Video, version 1. RealFish-Version1 (QuickTime MP4 32MB)
Video of real fish at the Georgia Aquarium is used in this demo. In this case, small, medium, and large fish are defined. As a fish of a given category enters the scene, it starts to play a computer-generated sound pattern. For example, if a medium-sized fish enters the scene, one of the music patterns corresponding to medium fish starts to play. The front-back, left-right, and top-bottom mappings are used, as in the previous demos. If another fish of the same size enters the scene, a second sound pattern plays, making for a more complex "duet" in that frequency range. If a fish of a different size enters the scene, a sound plays in a different part of the scale. Generally, big fish are represented by low-pitched sounds, and smaller fish by higher-pitched sounds.
Note that this demo shows "local tracking," where we are only concerned with what is visible in the local window. This is in contrast to "global tracking," in which we track (and sonify) a fish at all times, in all parts of the environment. In the local case, a whale shark appears and its sound starts playing; when it goes out of view, its sound fades. In the global case, the sound for the whale shark would presumably always be playing, but might be very soft if the fish were far away. These two modes (local and global) provide different kinds of information to the visitor.
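The size-to-register mapping and the local/global distinction described in this demo can be sketched in a few lines. The size categories, MIDI note numbers, and gain curve below are assumptions chosen for illustration, not the project's actual values.

```python
def register_for_size(size):
    """Map a fish size category to a base MIDI note: big fish sound low,
    small fish sound high. Categories and note numbers are illustrative."""
    return {"large": 36, "medium": 60, "small": 84}[size]

def local_gain(in_view, distance):
    """'Local tracking': a fish sounds only while visible in the local
    window, attenuated by its distance from the viewer; out of view,
    its gain is zero (the sound fades out)."""
    return 1.0 / (1.0 + distance) if in_view else 0.0

def global_gain(distance):
    """'Global tracking': the fish always sounds, however far away,
    just increasingly softly with distance."""
    return 1.0 / (1.0 + distance)
```

Under local tracking a whale shark leaving the frame silences its pattern; under global tracking the same shark at a large distance still contributes a faint sound, giving the visitor a sense of the whole environment.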

Real Fish Video, version 2. RealFish-Version2 (QuickTime MP4 31MB)
The same real fish video is used, and largely the same mappings. Different (more pleasant) sounds are used, and some tweaks are made to the tracking data.

Real Ants, version 1. Ants-Version1 (QuickTime MP4 4.5MB)
This video shows live ants moving around in an observation chamber, being tracked automatically by the computer vision system. There are 1-7 ants at a time (they can come and go). Each ant is represented by an instrument (e.g., piano, trumpet, hi-hat), and the movements of the ants result in changes to the music. Left-right movement is mapped to left-right stereo panning; up-down movement is mapped to a filter, such that the top of the screen has a "bright" sound and the bottom of the screen has a lower, bass-heavy sound; and the speed of an ant is mapped to the tempo of its music. This video has a jazzy soundscape.

Real Ants, version 2. Ants-Version2 (QuickTime MP4 4.5MB)
The same ants video as the previous one. This video has a rock and roll soundscape.

Screen Saver, Multiple Fish. ScreenSaverFish (QuickTime MOV 12MB)
This video is based on a video created from a screen saver. There are several fish and a turtle. The music is based on a canon, with each creature represented by a different instrument. The left-right position is mapped to stereo location, and vertical location is mapped onto high- or low-pass filters on the instrument timbre. Note that the music is complex and aesthetically rich, but it is somewhat harder to track any specific fish because so many instruments play simultaneously.

Live Fish, GT Research Aquarium, Multiple Fish. GT Research Aquarium 01, Small size (QuickTime MOV 5.2MB) -- GT Research Aquarium 01, Large size (QuickTime MOV 70MB)
This video is a live capture from the 65-gallon marine aquarium that has been established at Georgia Tech to support this research. We track yellow tangs, blue chromis, and clownfish. The underlying musical structure is a Bach chorale. All fish of a given type share the same instrument, but each individual fish has a different register. The left/right and up/down mappings are as above, and movement speed is mapped onto note density (tempo).


Publications Relating to the Research

(See the Publications page for all Sonification Lab publications.)


Bruce, C. M., & Walker, B. N. (2009). Developing Effective Real-time Audio Interpretation to Enhance Accessibility of Dynamic Zoo and Aquaria Exhibits. Proceedings of the Association for the Advancement of Assistive Technology in Europe Conference (AAATE 2009), Florence, Italy (31 August - 2 September). pp. TBD. <PDF>


Pendse, A., Pate, M., & Walker, B. N. (2008). The Accessible Aquarium: Identifying and evaluating salient creature features for sonification. Proceedings of the Tenth International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS2008), Halifax, Canada (13-15 October, 2008). pp. 297-298. DOI: 10.1145/1414471.1414546 <PDF>


Walker, B. N., Kim, J., & Pendse, A. (2007). Musical soundscapes for an accessible aquarium: Bringing dynamic exhibits to the visually impaired. Proceedings of the International Computer Music Conference (ICMC 2007), Copenhagen, Denmark (27-30 August). <PDF>


Walker, B. N., Godfrey, M. T., Orlosky, J. E., Bruce, C., & Sanford, J. (2006). Aquarium sonification: Soundscapes for accessible dynamic informal learning environments. Proceedings of the International Conference on Auditory Display (ICAD 2006), London, England (20-24 June). pp. 238-241. <PDF>


Press Coverage:

Georgia Tech Press Release, and Project Overview Videos (Nov, 2008)

Associated Press Video, via Atlanta Journal Constitution, (Dec 17, 2008)

Associated Press Story, via Atlanta Journal Constitution, (Dec 17, 2008)

Associated Press Story, via [PDF] (Dec 19, 2008)

Bruce Walker discusses the project on The Takeaway, a BBC/NY Times radio show [Download mp3] (Dec 24, 2008)

Carrie Bruce discusses CATEA and the Accessible Aquarium Project on Eye on Vision, from WYPL-FM 89.3 in Memphis, TN [Download mp3] (Jan 17, 2009)

Bruce Walker discusses the Accessible Aquarium Project on Eye on Vision, from WYPL-FM 89.3 in Memphis, TN [PART 1: Download mp3] [PART 2: Download mp3] (Feb 7, 2009)

Atlanta Journal Constitution Story, [PDF] (Feb 22, 2009)



This research has been supported, in part, by grants from the US Department of Education through the National Institute on Disability and Rehabilitation Research (NIDRR) via the Wireless RERC, and the National Science Foundation (NSF). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the funding agencies or sponsors.


