The Sonification Lab is growing and is looking for graduate students
interested in doing research in sonification, auditory displays, and
human-computer interaction. The lab is housed in the School of Psychology, but research opportunities are available for graduate students from other areas as well, including (but not limited to) the College of Computing, College of Design, and College of Engineering.
The Sonification Lab also has many opportunities for undergraduates. Positions as research assistants and programmers are available for either pay or course credit. Tasks for undergraduate research assistants include participating in experiment design, running subjects, and gathering data. Programmers are needed to write the software that runs the experiments and stores the data.
We have a variety of projects that need programmers of all types for implementation, HCI students to lead projects, and grad students to manage programs of investigation. Some projects have funding, so paid work is a possibility; other opportunities are intended to be completed first for course credit (special topics, etc.) in CS, Psych, or HCI.
The Georgia Tech Sonification Lab is always looking for programmers. We are primarily
seeking students who will work for credit, at least at first. There is a possibility of research
funding in subsequent semesters.
The Sonification Lab is located in the Coon Psychology building (near Tech Tower). We
have several projects requiring a variety of skills. All students with programming and/or
IT experience are encouraged to apply. Version control and bug tracking experience a plus.
To apply, send an email expressing your interest and your skills, as well as a resume, to Dr. Bruce Walker (office: J.S. Coon Psychology Building, room 230). If possible, indicate which kinds of projects most interest you.
Sample Active Projects
Here are some current or recent examples of projects:
- STING 2.0. Developing middleware that takes telemetry data from our NADS driving simulator, and makes it available to third-party programs. See Sting 1.0, which will basically be re-built to work with the latest version of NADS.
- VRlandia. We are developing a new "maker space" shared lab for virtual reality, augmented reality, and extended reality (VR/AR/XR) that will be available to researchers and students across campus, similar to how the Invention Studio operates. We need help setting up the space, deploying the large (!) amount of hardware and software we have acquired, programming in Unity, and mentoring students who are new to the field.
- VR training for automated vehicles. When someone buys a new car that has automated driving capabilities, how can we train the buyer to use those features...before they use them out on the street? We plan to use VR for training.
- Accessible Maps. We are making maps of buildings, of campus, and of the city accessible to people who cannot see traditional maps. A mix of GIS, CAD, databases, and programming. Plus accessibility and assistive technologies.
- AccessCORPS. We are developing an on-campus student-led club/organization in which trained students will work with faculty to make GT courses more accessible to students with disabilities. We need tools to support the organization, and students who will become trained experts.
- Accessible Safari. How can we give blind tourists the experience of a safari--lions and giraffes and rhinos, oh my!? Advanced sensors, computer vision, animal detection and tracking, mapping, and more. Plenty of AI/ML in this, along with next-gen user interfaces.
- Accessible Weather. We are making weather information more accessible to people who cannot look at traditional weather maps or forecasts.
- Sonification Studio. We have partnered with Highcharts to develop the Sonification Studio, a replacement for the aging Sonification Sandbox. The new Studio is completely modern, all online, and highly extensible; for instance, it can connect to Google Sheets. We want to extend the Sonification Studio to support more of the features of the old Sandbox.
- Robotic Guide Dog. (Real) guide dogs are amazing support animals for persons with vision loss. There are, however, some drawbacks. We are starting a research program to explore the development of a robotic guide dog that can help provide many of the capabilities of canines.
- Connect Outdoors: Accessible Archery. Many veterans enjoy the outdoors, including camping and activities such as archery and marksmanship. For veterans with disabilities, especially vision loss, these pursuits can be challenging. We are partnering with connect-outdoors.org to find technology that can support a return to the great outdoors.
- Affect Detection System. An ADS gives a computer information about a
participant's emotional state. Your task will be to install a third-party ADS. You
will then build in hooks for the experimenters to use the ADS for experiments.
- Auditory Graphs and Statistics. We are currently creating experiments for
evaluating the use of accessible graphs in statistics courses. Your task will be
centered on creating the Director and E-Prime programs that run the experiment.
- Auditory Graphs in the Classroom. This project explores what sort of graphs
visually impaired high school students use in their math classroom. It depends on
using and prototyping a variety of graphing tools.
- Auditory Use in Everyday Software. This project aims to determine the
prevalence of sound use in everyday software. It involves a large dataset, so strong
SQL and database architecture skills are required. Java and bash experience a plus.