Building on the Introduction to Virtual Reality, OpenXR Hand-Tracking, and Gesture Recognition in Unreal Engine tutorial, this slightly more advanced tutorial dives deeper into the following topics:
- Transitioning seamlessly between motion controller and hand-tracking modes in Unreal Engine.
- Adding custom debugging gizmos to improve development and testing workflows.
- Visualizing debug virtual hands by incorporating the custom gizmos.
- Animating virtual hand meshes with OpenXR hand-tracking data, moving beyond basic joint representation with cubes.
- Reusing and adapting the gesture recognition code from the introductory tutorial to work with the new animated virtual hand meshes.
This guide will help you take your VR projects to the next level with polished and practical implementations.
English Video Tutorials
- The English video tutorial part 1:
- The English video tutorial part 2:
Farsi Video Tutorials
- The Farsi video tutorial part 1:
- The Farsi video tutorial part 2:
Project Source Files
- The project source files on Microsoft Azure DevOps Repositories.
- The project source files on GitLab.
See also
- Unreal Engine OpenXR Hand-Tracking on Android with Meta XR (Quest 3S/3/Pro/2) and HTC VIVE OpenXR (Focus Vision/XR Elite/Focus 3) Plugins
- Deploy Unreal Engine Projects to Android and Meta Quest 3S/3/Pro/2 in Standalone Mode
- WebRTC IP Leak Demonstration
- Introduction to Virtual Reality, OpenXR Hand-Tracking, and Gesture Recognition in Unreal Engine