CS 294-137 Immersive Computing Projects
Fall 2020 Showcase
AR for Emergency Medical Training
by Josh Mao, Alex Wu, and Arjun Sarup
Description: An AR application that helps produce better training materials and assess an EMT trainee’s performance.
Opensource VR Cardboard
by Yizhou Wang, Kailun Wan, Yanda Li
Description: With our open-source Cardboard VR system, you can build your own hand controllers from low-cost wireless gamepads and enjoy six degrees of freedom in fascinating VR scenes on a recent iPhone. Developers can even use our system to build their own VR games for this audience.
ROAR VR Project Page
by Sam Moturi, Jiajian Lu, Hankai Zhou, Xinyun Cao
Description: 1. Reduce the cost of racing sports – our car is 1:10 the size of a real car.
2. Build a platform for testing autonomous vehicles.
3. Facilitate research in the transportation field.
Testing Product Prototypes in VR
by Bethany Lu, Trista Hu, Joshua Yang, Frederick Kim, Xinwei Zhuang
Description: A pipeline for testing CAD models and Figma prototypes in a virtual 3D world quickly and cost-efficiently.
AR for Healthcare
by Yuhan Xie, Zhirong Lin, Erin Kraemer, Omotara Oloye
Description: The COVID-19 pandemic brings stress to people of all ages. This project aims to provide children in isolation with an exercise routine to follow at home so they can stay physically and mentally healthy.
by Flaviano Christian Reyes, Ritika Shrivastava, Songwen Su
Description: Thanks to recent developments in both software and hardware, augmented reality is becoming increasingly popular, with applications in many fields such as gaming, healthcare, and education. One problem encountered when designing these AR and VR applications is the variance that arises when the same application is used in different environments.
by Michael Khorram, Tiantian Wang
Description: OpenARK is an open source augmented reality SDK written in C++, and its goal is to allow developers to rapidly prototype AR applications. A crucial aspect of any AR system is to ensure that rendered objects appear as if they are a part of the user’s world. Implementing this feature typically requires simultaneous localization and mapping (SLAM) which is a technique used to keep track of a device’s location relative to its environment. Our main task is to handle a very specific edge case in OpenARK’s SLAM component which involves the merging of the independent maps generated by this system.
HyperBlend
by Keming Gao, Yudi Tan, Janaki Vivrekar, Yuhan Yang
Description: HyperBlend enables users to engage in a unique hyperspectral painting experience in virtual reality. With HyperBlend, users can draw on a canvas using a hyperspectral color palette, which renders slightly varying colors to each eye in a virtual reality interface. Through binocular fusion (the physiological process of combining signals from both eyes into a single blended image), users interpret the slightly varying colors shown to both eyes as a lustrous, hyperspectral color. By providing a flexible and accessible interface for painting with hyperspectral colors, HyperBlend prototypes a novel interaction with hyperspectral colors while creating art.
UI for Bespoke Scenes
by William Dai, Weili Liu
Description: This semester, we explored a user interface for scene augmentation apps that could allow users to customize their own scenes.
We were inspired to do this because current scene augmentation apps generally rely on a set of rules or known relationships between the objects in a room to calculate the most likely placements for a user’s situation. This means that creative designers with exotic scenes or unusual objects not found in common datasets like Matterport3D would have to build scale models or create a set like those used in filmmaking. We wanted to see whether we could extend AR-based scene augmentation apps to such situations; this user interface is the result.