Immersed in the Sounds of Space

NASA offers a variety of “sonifications” – translations of 2D astronomical data into sound – that provide a new way to experience imagery and other information from space. Advanced instruments currently provide hyperspectral (many-color) images from space that are 3D (two spatial dimensions and one color dimension), and sophisticated techniques can be used to enhance 2D astronomical images to make video representations called “fly-throughs” that allow viewers to experience what it would look like to move among space objects in 3D (three simulated spatial dimensions).

Your challenge is to design a method to create sonifications of these 3D NASA space datasets to provide a different perceptual path that can help us understand and appreciate the wonders of the universe!


BACKGROUND


The Hubble Space Telescope provides us with cosmic sights, but these astronomical marvels can be experienced with other senses as well. For example, have you ever wondered what space might sound like? Through data sonification, NASA’s images can be transformed into sound. Sound cannot travel through the vacuum of space, but sonifications provide a new way of experiencing and conceptualizing data. Sonifications allow the audience, including blind and visually impaired communities, to “listen” to astronomical images and explore their content. Using a new sense can broaden the appeal of NASA data, and can also provide a different perceptual path to help us understand and appreciate the wonders of the universe.

In the past few years, NASA has been producing “sonifications” of 2D astronomical data. These projects take the digital data captured by telescopes in space such as the Hubble Space Telescope and the Chandra X-ray Observatory and translate it into musical notes and sounds so it can be heard rather than seen. Because different telescopes can detect different types of light, each telescope provides unique information about whatever it observes. Each layer of sound in a sonification can represent particular wavelengths of light detected by a different telescope. The resulting sonification is similar in some ways to how different musical instruments can be played together to create harmonies that are impossible to achieve with single notes alone.
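To make the layering idea concrete, here is a minimal Python sketch. It is not NASA’s actual pipeline: the brightness arrays are synthetic stand-ins for data from two telescopes, and each layer is given its own waveform so the two “instruments” stay distinguishable when mixed.

```python
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44100
NOTE_SECONDS = 0.25

def tone(freq_hz, volume, waveform):
    """Render one note: a sine for one layer, a square wave for the other."""
    t = np.linspace(0, NOTE_SECONDS, int(SAMPLE_RATE * NOTE_SECONDS), endpoint=False)
    wave = np.sin(2 * np.pi * freq_hz * t)
    if waveform == "square":
        wave = np.sign(wave)
    return volume * wave

# Synthetic per-pixel brightness (0..1) standing in for an X-ray layer
# (e.g., Chandra) and an optical layer (e.g., Hubble).
rng = np.random.default_rng(0)
xray_brightness = rng.random(16)
optical_brightness = rng.random(16)

notes = []
for b_xray, b_opt in zip(xray_brightness, optical_brightness):
    # Brighter pixels map to higher pitch; each layer keeps its own timbre.
    xray_voice = tone(220 + 660 * b_xray, volume=0.4, waveform="sine")
    optical_voice = tone(220 + 660 * b_opt, volume=0.4, waveform="square")
    notes.append(xray_voice + optical_voice)  # play the layers together

audio = np.concatenate(notes)
wavfile.write("layers.wav", SAMPLE_RATE, (audio * 32767).astype(np.int16))
```

Played back, the sine and square voices remain easy to tell apart even when they sound simultaneously, which is exactly the property that lets a listener attribute each layer to its telescope.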

Advanced instruments are currently producing 3D “hyperspectral” data from space (two spatial dimensions and one color dimension). These images contain many more than the three colors (red, green, and blue) that computer monitors produce. In some cases, the images from these instruments contain hundreds of colors! In addition, advanced imaging techniques can be used to enhance astronomical 2D images to make “fly-through” videos that allow viewers to experience what it would look like to move around space objects in 3D (three simulated spatial dimensions). These new 3D NASA space datasets cry out for sonification so they can be perceived by a broader audience!
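In code, such a “data cube” is simply a 3D array. The sketch below uses synthetic values (real cubes are typically distributed as FITS files, readable in Python with astropy.io.fits) to show how the two spatial axes and the wavelength axis relate:

```python
import numpy as np

# A hyperspectral data cube: two spatial axes plus one wavelength axis.
ny, nx, n_wavelengths = 64, 64, 200   # hundreds of "colors" per pixel
cube = np.random.rand(ny, nx, n_wavelengths)

# Fixing one wavelength gives an ordinary 2D image of the sky.
image_at_band_50 = cube[:, :, 50]     # shape (64, 64)

# Fixing one spatial pixel gives a full spectrum at that point.
center_spectrum = cube[ny // 2, nx // 2, :]   # shape (200,)
```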


OBJECTIVES


Your challenge is to design a method to create sonifications of 3D NASA space datasets. What are some ways that 3D “data cubes” could be converted into sounds to convey the richness of the data? If you want to create a 3D sonification fly-through, how can you convert the video image into sounds that accurately represent what is in the visualization? Do you want to develop a method to sonify a certain data set? Or maybe you want to develop an app that allows people to select data from an archive and then sonifies the selected data according to your prescription? Or perhaps you can develop a way to explore 3D data that merges visual and audio elements?

Think about your audience: how could experiencing your solution enhance their understanding of NASA data? Your audience could be a scientist at a university studying her data about the surface of Mars, a college student at a science museum exploring multi-color images of star-forming clouds, a class of high school students learning about Earth’s climate as measured from space, or someone else. Can your sonification tool be accessible to people with low vision, and also enhance visualizations for sighted people? Will you develop a smartphone app, a website, or simply a description of how to sonify 3D data for certain NASA data sets?

Don’t forget to clearly explain how your solution sonifies the data!


POTENTIAL CONSIDERATIONS


You may (but are not required to) consider the following as you develop your solution to this challenge:

  • The spirit of this challenge is about innovatively communicating three-dimensional information! How can you creatively make 3D data more accessible by incorporating audio representations?
  • It’s okay to develop a hybrid visual/sonified product, as long as the audio portion conveys enough of the data content to be comparable to the visual portion.
  • Some of these datasets can be challenging to deal with! It’s okay if your solution includes only a description of how your app could automatically pull data from NASA’s archives, but don’t forget to also provide an example of how your app sonified some manually downloaded data.
  • A strong solution would include at least one example of an original sonification. The sonification doesn’t need to be perfect or complete, and could just be a brief segment in a submission video.
  • Don’t forget to clarify who your app is designed for, and how they will benefit from it.
  • There are several approaches that could be used to sonify the data. For instance, for every slice of a data cube or frame in a video, elements of the image (like brightness and position) could be assigned pitches and volumes. You could include different audio waveforms/instruments; use effects like surround sound, echoes, and modulation; or incorporate other means to achieve more aural complexity. (A minimal sketch of one such mapping appears after this list.)
  • If you include technical information or code in your solution, don’t forget to clearly explain the equations that convert data units (e.g., the brightness of a pixel at a certain location in a data cube) into sound (e.g., the pitch/volume/timbre/duration of a note).
  • An interesting add-on could be a visualization of the sonified data itself! For example, if you convert an image to a sound wave, the visualization could overlay the sound wave in some manner (colors, brightness, waveform, spectrum), superimposed on the data it represents. In this way, sonification can benefit the sighted as much as those with low vision!
  • You are welcome to think outside the box! For example, you could gamify your solution so that sonified images are part of a more extensive app. Or you could develop an app that teachers could use in a classroom situation. Or you could expand your app to sonify generic 3D data (for instance, medical computerized tomography scans).
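As a concrete starting point for the mapping and conversion equations mentioned in the bullets above, here is a minimal Python sketch. Everything in it is an illustrative assumption: the image is synthetic, the pitch range is arbitrary, and the chosen mapping (the brightest row in each column sets pitch on a logarithmic scale, and its brightness sets volume) is only one of many reasonable designs.

```python
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44100
COLUMN_SECONDS = 0.05
F_MIN, F_MAX = 110.0, 1760.0     # assumed audible pitch range in Hz

# Synthetic 2D slice (rows x columns); swap in a real slice of a data cube.
image = np.random.rand(64, 128)
n_rows = image.shape[0]

t = np.linspace(0, COLUMN_SECONDS, int(SAMPLE_RATE * COLUMN_SECONDS), endpoint=False)
segments = []
for column in image.T:           # scan the image left to right
    y = int(np.argmax(column))   # row of the brightest pixel in this column
    brightness = column[y]
    # Conversion equations, stated explicitly:
    #   height(y) = 1 - y / (n_rows - 1)            (row 0 is the top of the image)
    #   pitch(y)  = F_MIN * (F_MAX / F_MIN) ** height(y)   [logarithmic]
    #   volume(b) = b                                      [linear in brightness]
    height = 1.0 - y / (n_rows - 1)
    pitch = F_MIN * (F_MAX / F_MIN) ** height
    segments.append(brightness * np.sin(2 * np.pi * pitch * t))

audio = np.concatenate(segments)
wavfile.write("scan.wav", SAMPLE_RATE, (audio * 32767).astype(np.int16))
```

The logarithmic pitch mapping reflects the fact that human pitch perception is roughly logarithmic in frequency; a linear mapping would compress the perceptible differences at the high end of the range. Extending this loop over every slice of a cube, or every frame of a fly-through video, yields a full 3D sonification.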

    For data and resources related to this challenge, refer to the Resources tab at the top of the page. More resources may be added before the hackathon begins.


