FHL Vive Center for Enhanced Reality
Temporal IK: Data-Driven Pose Estimation for Virtual Reality

James Lin

Abstract – High-quality human avatars are an important part of compelling virtual reality (VR) experiences. Animating an avatar to match the movement of its user, however, is a fundamentally difficult task, as most VR systems only track the user’s head and hands, leaving the rest of the body undetermined. In this report, we introduce Temporal IK, a data-driven approach to predicting full-body poses from standard VR headset and controller inputs. We describe a recurrent neural network that, given a sequence of positions and rotations from VR tracked objects, predicts the corresponding full-body poses in a manner that exploits the temporal consistency of human motion. To train and evaluate this model, we recorded several hours of motion capture data of subjects using VR. For ease of use, the model is integrated into an end-to-end solution within Unity, a popular game engine. We find that the model generates natural-looking motion in the upper body.
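To make the input/output setup concrete, the sketch below shows one plausible shape for the recurrent mapping the abstract describes: per-frame features from the three tracked devices (headset and two controllers, each a 3-D position plus a unit quaternion) feed a recurrent state that is decoded into per-joint rotations. All dimensions, the joint count, and the Elman-style recurrence are illustrative assumptions, not the report's actual architecture, and the randomly initialized weights stand in for a trained network.

```python
import numpy as np

# Assumed dimensions (not from the report): 3 tracked devices
# (headset + 2 controllers), each with a 3-D position and a 4-D quaternion.
IN_DIM = 3 * (3 + 4)            # 21 input features per frame
HIDDEN = 64                     # recurrent state size (assumed)
N_JOINTS = 20                   # full-body joints to predict (assumed)
OUT_DIM = N_JOINTS * 4          # one quaternion per joint

rng = np.random.default_rng(0)
# Random weights stand in for the trained recurrent network.
W_in = rng.normal(0.0, 0.1, (HIDDEN, IN_DIM))
W_h = rng.normal(0.0, 0.1, (HIDDEN, HIDDEN))
W_out = rng.normal(0.0, 0.1, (OUT_DIM, HIDDEN))

def predict_sequence(frames):
    """Elman-style recurrence: the hidden state carries motion history,
    which is what lets a model exploit temporal consistency."""
    h = np.zeros(HIDDEN)
    poses = []
    for x in frames:                      # x: (21,) tracker features
        h = np.tanh(W_in @ x + W_h @ h)   # update temporal state
        q = (W_out @ h).reshape(N_JOINTS, 4)
        q /= np.linalg.norm(q, axis=1, keepdims=True)  # unit quaternions
        poses.append(q)
    return np.stack(poses)                # (T, N_JOINTS, 4)

seq = rng.normal(size=(90, IN_DIM))       # ~1 s of tracking at 90 Hz
out = predict_sequence(seq)
print(out.shape)                          # (90, 20, 4)
```

Because the hidden state is threaded through consecutive frames, each predicted pose depends on the motion leading up to it rather than on the current tracker sample alone.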

Download: Temporal_IK
