It takes a lot of energy to make an X-ray wave or photon. NASA’s Chandra X-ray Observatory detects X-ray emission from the scenes of explosions and collisions in space: colliding black holes, galaxies, and neutron stars; matter falling into black holes; and exploding stars. This presentation would cover how we take our astronomical data from binary code into 2D images, 3D maps and prints, and, eventually, virtual reality.
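The first step of that pipeline, going from binary data to a 2D image, can be sketched in a few lines. This is an illustrative assumption rather than the actual Chandra workflow (real Chandra data arrives as FITS event lists processed with dedicated tools): each detected photon is treated as a binary record of its position, and the records are binned into a pixel grid.

```python
import struct

def write_events(path, events):
    """Write (x, y) photon positions as packed little-endian floats.

    Hypothetical format for illustration; real event files are FITS.
    """
    with open(path, "wb") as f:
        for x, y in events:
            f.write(struct.pack("<ff", x, y))

def events_to_image(path, width, height):
    """Bin binary photon events into a 2D intensity grid (the image)."""
    image = [[0] * width for _ in range(height)]
    with open(path, "rb") as f:
        while chunk := f.read(8):          # one 8-byte (x, y) record at a time
            x, y = struct.unpack("<ff", chunk)
            col, row = int(x), int(y)
            if 0 <= col < width and 0 <= row < height:
                image[row][col] += 1       # one photon = one count in that pixel
    return image
```

Brighter pixels in the resulting grid are simply the places where more photons landed, which is all an X-ray "image" of an exploded star fundamentally is.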

Talking points:
To show how we apply virtual and augmented reality techniques to scientific data, I would showcase the inside of a star that exploded in our Galaxy over 300 years ago (as seen from Earth). This virtual/augmented reality application is the first to let participants walk inside real NASA data of an exploded star and see the locations of the elements necessary for life on Earth, which are made in stars and then distributed throughout the universe. Additionally, I would discuss the ramifications of visual technologies such as VR for underserved populations, and possible solutions.

Participants would walk away with an understanding of how scientific data can be used with emerging technologies such as 3D printing and VR.

Participants would hear about some of the technical challenges involved in 3D printing such data and translating it into VR, and what our solutions have been.
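One of those challenges is that an image is flat while a printer needs geometry. A common workaround, sketched below under assumed simplifications (this is not the team's actual toolchain), is to extrude a 2D intensity grid into a heightmap mesh, here written as ASCII STL, the plain-text format most slicers accept:

```python
def image_to_stl(image, scale=1.0):
    """Turn a 2D intensity grid into an ASCII STL heightmap surface.

    Illustrative sketch only: it emits just the top surface, whereas a
    real printable model must be a closed, watertight solid.
    """
    rows, cols = len(image), len(image[0])

    def vertex(r, c):
        # Pixel brightness becomes height (z); pixel position becomes (x, y).
        return (float(c), float(r), image[r][c] * scale)

    facets = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            a, b = vertex(r, c), vertex(r, c + 1)
            d, e = vertex(r + 1, c), vertex(r + 1, c + 1)
            facets.append((a, b, d))   # split each grid cell into
            facets.append((b, e, d))   # two triangles

    lines = ["solid heightmap"]
    for tri in facets:
        lines.append("  facet normal 0 0 1")
        lines.append("    outer loop")
        for x, y, z in tri:
            lines.append(f"      vertex {x} {y} {z}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append("endsolid heightmap")
    return "\n".join(lines)
```

The same triangle mesh, handed to a game engine instead of a slicer, is also the starting point for a VR scene, which is why the two problems travel together.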

Participants would learn how we are working to create ADA/Section 508-compliant versions of the data in VR/AR using techniques of sonification and haptic technology.
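One common sonification recipe, given here as an illustrative assumption rather than the exact mapping used for the Chandra data, is to scan across the image and map each pixel's brightness to a pitch, so that brighter regions sound higher:

```python
def brightness_to_hz(value, v_min=0, v_max=255, f_min=220.0, f_max=880.0):
    """Linearly map a brightness value onto an audible frequency range.

    Defaults span two octaves, A3 (220 Hz) to A5 (880 Hz); the
    endpoints are arbitrary choices for this sketch.
    """
    if v_max == v_min:
        return f_min
    t = (value - v_min) / (v_max - v_min)   # normalize brightness to [0, 1]
    return f_min + t * (f_max - f_min)

def sonify_row(row):
    """Turn one image row into a sequence of tone frequencies (Hz)."""
    return [brightness_to_hz(v) for v in row]
```

Playing those frequencies in order gives a blind or low-vision participant a left-to-right "sweep" of the image; haptics can carry the same signal as vibration intensity instead of pitch.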

Ideally, participants would also get to experience the apps directly, if Google Cardboard or similar viewers could be made available.