Research
Vive Center Projects
OpenARK
OpenARK is an open-source wearable augmented reality (AR) system founded at UC Berkeley in 2016. The C++-based software provides core functionality to power a wide range of off-the-shelf AR components, including see-through glasses, depth cameras, and IMUs. The platform offers fundamental tools such as camera calibration and SLAM, along with higher-level functions that aid human-computer interaction, such as 3D gesture recognition and multi-user collaboration.
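To give a flavor of the hand-centric interaction OpenARK enables, the sketch below classifies a simple 3D pinch gesture from tracked fingertip positions. It is a minimal illustration in standard C++, not OpenARK's actual API; the Point3 type, the 2 cm threshold, and the fingertip coordinates are all assumptions made for the example.

```cpp
#include <cmath>
#include <iostream>

// A 3D point in camera coordinates (metres). Hypothetical type for
// this example, not a type from the OpenARK codebase.
struct Point3 { float x, y, z; };

float distance(const Point3& a, const Point3& b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) +
                     (a.y - b.y) * (a.y - b.y) +
                     (a.z - b.z) * (a.z - b.z));
}

// Classify a "pinch": thumb and index fingertips closer than a small
// threshold (2 cm here, an assumed value).
bool isPinch(const Point3& thumbTip, const Point3& indexTip,
             float thresholdMetres = 0.02f) {
    return distance(thumbTip, indexTip) < thresholdMetres;
}

int main() {
    Point3 thumb{0.10f, 0.05f, 0.40f};
    Point3 index{0.11f, 0.05f, 0.41f};  // about 1.4 cm away: a pinch
    std::cout << (isPinch(thumb, index) ? "pinch" : "open") << '\n';
}
```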
OpenARK Atlas
Vive Center faculty from the EECS and Architecture departments are pursuing an ambitious project to create a high-resolution 3D digital model of the entire Berkeley campus. We use a hybrid hardware platform that fuses LIDAR, depth cameras, and RGB cameras to record multi-modal 3D data for both indoor and outdoor environments throughout Berkeley, and we carefully label the semantics of both 3D structural primitives and high-level object shapes. We aim to open-source both the raw multi-modal data and the semantic labels, and to encourage the community at large to use this campus-wide data for larger-scale AR/VR research and applications.
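Fusing LIDAR with RGB imagery generally relies on calibrated extrinsics and intrinsics to project each LIDAR point into the camera image, where its color can be sampled. The sketch below shows that standard projection step; the row-major matrix layout and the numeric calibration values are illustrative assumptions, not the project's actual calibration.

```cpp
#include <iostream>

struct Point3 { double x, y, z; };
struct Pixel  { double u, v; };

// Pinhole intrinsics: focal lengths and principal point (pixels).
struct Intrinsics { double fx, fy, cx, cy; };

// Rigid extrinsics from the LIDAR frame to the camera frame:
// p_cam = R * p_lidar + t, with R row-major.
struct Extrinsics { double R[9]; Point3 t; };

// Project a LIDAR point into the RGB image so its colour can be sampled.
bool project(const Point3& p, const Extrinsics& e, const Intrinsics& k,
             Pixel& out) {
    Point3 c{e.R[0]*p.x + e.R[1]*p.y + e.R[2]*p.z + e.t.x,
             e.R[3]*p.x + e.R[4]*p.y + e.R[5]*p.z + e.t.y,
             e.R[6]*p.x + e.R[7]*p.y + e.R[8]*p.z + e.t.z};
    if (c.z <= 0.0) return false;  // point is behind the camera
    out.u = k.fx * (c.x / c.z) + k.cx;
    out.v = k.fy * (c.y / c.z) + k.cy;
    return true;
}

int main() {
    Extrinsics e{{1,0,0, 0,1,0, 0,0,1}, {0.05, 0.0, 0.0}};  // assumed 5 cm offset
    Intrinsics k{600, 600, 320, 240};                       // assumed intrinsics
    Pixel px;
    if (project({1.0, 0.5, 4.0}, e, k, px))
        std::cout << "pixel (" << px.u << ", " << px.v << ")\n";  // (477.5, 315)
}
```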
ISAACS
ISAACS is an open-source project to envision new ways for human users to intuitively interface and collaborate with aerial drones using augmented reality (AR) technologies. ISAACS utilizes real-time SLAM solutions to localize the 3D coordinates of the aerial vehicles and of operators wearing HoloLens headsets. The first version of the platform will be developed in collaboration with DJI, based on the Matrice-100 platform.
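Co-localizing the drone and the HoloLens wearer in one SLAM world frame lets the system express the vehicle's position relative to the headset for rendering. The sketch below shows that standard frame change, p_head = Rᵀ(p_world − t); the pose representation and the numbers are illustrative assumptions, not code from ISAACS.

```cpp
#include <cmath>
#include <iostream>

struct Vec3 { double x, y, z; };

// Headset pose in the shared SLAM world frame: a world point is
// R * p_head + t, with R row-major.
struct Pose { double R[9]; Vec3 t; };

// Express a world-frame drone position in the headset frame,
// p_head = R^T * (p_world - t), so it can be rendered as a hologram.
Vec3 worldToHead(const Vec3& p, const Pose& head) {
    Vec3 d{p.x - head.t.x, p.y - head.t.y, p.z - head.t.z};
    return {head.R[0]*d.x + head.R[3]*d.y + head.R[6]*d.z,
            head.R[1]*d.x + head.R[4]*d.y + head.R[7]*d.z,
            head.R[2]*d.x + head.R[5]*d.y + head.R[8]*d.z};
}

int main() {
    const double a = std::acos(-1.0) / 2;   // headset yawed 90 degrees
    Pose head{{std::cos(a), -std::sin(a), 0,
               std::sin(a),  std::cos(a), 0,
               0,            0,           1},
              {1.0, 2.0, 0.0}};
    Vec3 drone{1.0, 7.0, 3.0};              // drone position from SLAM
    Vec3 p = worldToHead(drone, head);
    std::cout << p.x << ' ' << p.y << ' ' << p.z << '\n';  // ~ (5, 0, 3)
}
```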
Robot Open Autonomous Racing (ROAR)
Led by faculty members with deep expertise in AI and autonomous driving, Berkeley is proud to announce and host a new AI racecar competition in 2020. The Robot Open Autonomous Racing (ROAR) competition will pit student racing teams against one another, competing for speed and vehicle handling skill at the heart of the iconic Berkeley campus.
Tele-Immersion
This research group is interested in computer vision, robotics, tele-immersive communications, and modeling of cyber-physical systems. Our ongoing research activities entail human-robot cooperation, human activity recognition from multi-modal data, development of individualized musculoskeletal models, quantification of human performance, remote monitoring in health care and its privacy and security considerations, and modeling of driver interaction in semi-autonomous vehicles.
RADVR
We introduce RadVR, a 6DOF virtual reality daylight analysis tool for daylighting-based design and simulation that allows qualitative immersive renderings to be analyzed alongside quantitative, physically correct daylighting calculations.
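RadVR's simulation engine is not described here, but the daylight factor is a representative example of the physically based quantities such tools report: indoor horizontal illuminance at a point as a percentage of the simultaneous outdoor illuminance under an unobstructed overcast sky. A minimal sketch, with assumed illuminance values:

```cpp
#include <iostream>
#include <vector>

// Daylight factor: DF = 100 * E_in / E_out, comparing indoor
// illuminance to the simultaneous outdoor illuminance under an
// unobstructed overcast sky.
double daylightFactor(double indoorLux, double outdoorLux) {
    return 100.0 * indoorLux / outdoorLux;
}

int main() {
    const double outdoorLux = 10000.0;                 // overcast-sky reference
    std::vector<double> indoorLux{450.0, 220.0, 80.0}; // assumed sensor grid
    for (double e : indoorLux)
        std::cout << "DF = " << daylightFactor(e, outdoorLux) << " %\n";
}
```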
BAMPFA AR - Augmented Time
BAMPFA AR—Augmented Time explores new modes of narrative and storytelling using augmented reality as a medium. The project unfolds the history of the new BAMPFA, from the building’s inauguration in 1940 as the university printing press, to an abandoned structure covered with graffiti by local artists, to its transformation into a contemporary cultural hub. A narrator guides the user through an augmented timeline, following the history of the museum’s construction. The AR experience further renders visible hidden narratives such as the building’s role during World War II, the architectural concepts behind the Diller Scofidio + Renfro design, and the workers who built the building.
XR Maps of Berkeley
XR Maps of Berkeley investigates the potential of VR to make visible and accessible the wealth of research and creativity happening across the Berkeley campus. The project consisted of creating six immersive environments that uniquely convey ongoing research or creative work at Berkeley, using a combination of scientific and technical information, experiential environments, and game play. Student groups were responsible for creating each environment in the context of a graduate seminar. Results from this project are intended to act as demonstrators for the future expansion of VR/AR work horizontally across campus.
Generative Design In Augmented Reality
Generating dynamic spatial environments can improve productivity and efficiency in future AR/VR workplaces by providing custom working spaces for various tasks and settings, all adapted to the real-world constraints of the user’s surrounding environment. By taking advantage of generative design and 3D recognition methodologies, we envision the proposed system playing a major role in future augmented and virtual reality spatial interaction workflows and facilitating automated space layouts.
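As a toy illustration of the idea, the sketch below assigns virtual work panels to free cells of an occupancy grid derived from the user's surroundings, skipping cells blocked by detected real-world geometry. The grid size, the scan-order placement strategy, and the "detected desk" are all assumptions for the example, far simpler than a real generative-design system.

```cpp
#include <array>
#include <iostream>

// Occupancy grid of the user's surroundings: true = blocked by
// detected real-world geometry. Grid dimensions are assumed.
constexpr int W = 5, H = 4;

int main() {
    std::array<std::array<bool, W>, H> blocked{};  // all cells free initially
    blocked[1][2] = blocked[2][2] = true;          // e.g. a detected desk

    // Greedily assign virtual work panels to free cells in scan order.
    int panelsToPlace = 3;
    for (int r = 0; r < H && panelsToPlace > 0; ++r)
        for (int c = 0; c < W && panelsToPlace > 0; ++c)
            if (!blocked[r][c]) {
                std::cout << "panel at cell (" << r << ", " << c << ")\n";
                --panelsToPlace;
            }
}
```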
energyVR
EnergyVR is a virtual reality energy analysis tool that uses EnergyPlus to help designers make informed decisions about low-energy buildings. Users can analyze their design in real time based on changes made spatially or to properties of the building envelope, all within an immersive experience. The first prototype is based on a simplified model using the San Francisco weather file, where the user can try alternative window-to-wall ratios (WWR), visualize the corresponding design alterations, and run the energy simulations. The tool represents a next step for designers to understand the impact of their design decisions on the energy consumption of a building in an immersive setting.
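The window-to-wall ratio the user manipulates is simply glazed area divided by gross exterior wall area (windows included); the energy model itself runs in EnergyPlus. A minimal sketch of the ratio calculation, with assumed facade dimensions:

```cpp
#include <iostream>

// Window-to-wall ratio: glazed area divided by gross exterior wall
// area, including the windows.
double windowToWallRatio(double windowArea, double grossWallArea) {
    return windowArea / grossWallArea;
}

int main() {
    double wall   = 3.0 * 10.0;  // assumed 3 m tall, 10 m long facade
    double window = 1.5 * 6.0;   // assumed 1.5 m x 6 m glazing band
    std::cout << "WWR = " << windowToWallRatio(window, wall) << '\n';  // 0.3
}
```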
ViewVR
The use of VR with 3D-scanned and 360° fisheye High Dynamic Range (HDR) photography is a relatively new research approach for the built environment. To study the impact of outside views on occupants, we reproduce the 3D scans and HDR images as immersive VR scenes, where research participants can experience various view conditions using a 6DOF VR headset.
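Quantitative analysis of HDR captures typically converts linear RGB radiance to photometric luminance. The sketch below uses the Rec. 709 luminance weights and the 179 lm/W luminous efficacy factor used by the Radiance system; treating the captures as Radiance-style linear HDR is an assumption, as the pipeline is not specified here.

```cpp
#include <iostream>

// Photometric luminance (cd/m^2) from a linear HDR pixel: weight the
// RGB radiance channels by the Rec. 709 luminous-efficiency
// coefficients, then scale by Radiance's 179 lm/W efficacy factor
// (an assumption about the capture pipeline).
double luminance(double r, double g, double b) {
    return 179.0 * (0.2126 * r + 0.7152 * g + 0.0722 * b);
}

int main() {
    // One linear radiance sample (W/(sr*m^2)) from a fisheye HDR capture.
    std::cout << luminance(12.0, 14.0, 9.0) << " cd/m^2\n";
}
```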
Co.DesignX
Co.DesignX in VR-AR explores a novel methodology for using immersive environments and experiences within the current design workflow of healthcare facilities, enabling collaborative design with stakeholders. The research uses an evidence-based design approach to evaluate the impact of design on health in healthcare spaces. It is proposed as participatory design research involving patients, staff, and professionals as co-designers who collaborate to create and deliver an experience.