Document Type

Student Presentation

Presentation Date

2015

Faculty Sponsor

Josh Johnston

Abstract

This research investigates methods for interacting with 3D visualizations of science data. Even with higher-resolution, large-format, and stereoscopic displays, most visualization still involves the user looking at a result rendered on a flat panel. Changing perspective, zooming, and interpreting depth are often disorienting and frustrating. Specialized hardware and software solutions such as large-format displays and CAVEs address these issues, but the required infrastructure is limited by cost, complexity, and size.

We investigate low-cost commercial hardware for its potential application to this problem. The Leap Motion Controller and the Kinect motion sensor are assessed for gesture-based visualization control. The Oculus Rift is considered for immersive virtual reality, combining head tracking with a close-to-eye, wide-angle display. Finally, Android devices are used for augmented reality, overlaying rendered 3D objects on a camera video stream so that they react to the user's perspective. These devices are integrated with the Unity 3D game engine, which connects input from the sensors to both the Oculus Rift and flat-panel displays. The visualizations use example models created from scientific data.
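To illustrate the kind of mapping involved in gesture-based visualization control, the sketch below shows one plausible way a hand-position delta from a motion sensor could drive an orbiting camera around a model. It is a minimal, SDK-independent Python example; the function names, the sensitivity value, and the orbit parameterization are assumptions for illustration and are not taken from the presentation, which used the vendor SDKs inside Unity.

```python
import numpy as np

def orbit_camera(yaw_deg, pitch_deg, radius, target=np.zeros(3)):
    """Return a camera position orbiting `target` at the given yaw/pitch angles."""
    yaw, pitch = np.radians([yaw_deg, pitch_deg])
    offset = radius * np.array([
        np.cos(pitch) * np.sin(yaw),   # x
        np.sin(pitch),                 # y (height above the target)
        np.cos(pitch) * np.cos(yaw),   # z
    ])
    return target + offset

def update_from_gesture(state, hand_delta, sensitivity=90.0):
    """Map a normalized hand-position delta (dx, dy) to new orbit angles.

    `hand_delta` is assumed to be the sensor's reported hand movement,
    scaled to the range [-1, 1] per axis; `sensitivity` is degrees of
    rotation per full-range sweep (an illustrative choice).
    """
    state["yaw"] += hand_delta[0] * sensitivity
    state["pitch"] = float(np.clip(state["pitch"] + hand_delta[1] * sensitivity,
                                   -85.0, 85.0))  # avoid flipping over the poles
    return orbit_camera(state["yaw"], state["pitch"], state["radius"])

# Example: a rightward hand sweep of 10% of the sensor's range pans the view.
state = {"yaw": 0.0, "pitch": 20.0, "radius": 5.0}
print(update_from_gesture(state, (0.10, 0.0)))
```

In a Unity-based setup like the one described, the same idea would live in a per-frame script that reads the sensor's hand or head pose and updates the camera transform accordingly.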
