Building on the Introduction to Virtual Reality, OpenXR Hand-Tracking, and Gesture Recognition in Unreal Engine tutorial, this slightly more advanced tutorial will dive deeper into the following topics:
- Transitioning seamlessly between motion controller and hand-tracking modes in Unreal Engine.
- Adding custom debugging gizmos to improve development and testing workflows.
- Visualizing debug virtual hands using those custom gizmos.
- Animating virtual hand meshes with OpenXR hand-tracking data, moving beyond the basic cube-based joint representation.
- Reusing and adapting the gesture recognition code from the introductory tutorial so it works with the new animated virtual hand meshes.
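The first bullet, switching between motion controllers and hand tracking, usually boils down to polling the XR runtime each frame and debouncing the result so brief tracking loss does not cause flicker. Below is a minimal, engine-free sketch of that decision logic; in an actual Unreal project the `bHandDataValid` flag would come from querying the runtime's motion controller data each tick (everything here, including `FModeSwitcher` and `DebounceFrames`, is an illustrative assumption, not code from the tutorial project).

```cpp
#include <cassert>

// Which input visualization is active for one hand.
enum class EInputMode { MotionController, HandTracking };

// Debounced mode switcher: only changes mode after the runtime has
// consistently reported the other state for DebounceFrames ticks.
struct FModeSwitcher
{
    EInputMode Mode = EInputMode::MotionController;
    int FramesInOtherState = 0;
    static constexpr int DebounceFrames = 10; // illustrative threshold

    // bHandDataValid stands in for "the runtime reported valid hand joints".
    EInputMode Tick(bool bHandDataValid)
    {
        const EInputMode Desired = bHandDataValid ? EInputMode::HandTracking
                                                  : EInputMode::MotionController;
        if (Desired == Mode)
        {
            FramesInOtherState = 0;           // stable: reset the counter
        }
        else if (++FramesInOtherState >= DebounceFrames)
        {
            Mode = Desired;                   // state held long enough: switch
            FramesInOtherState = 0;
        }
        return Mode;
    }
};
```

Per-frame, the game code would call `Tick()` and show or hide the controller and hand meshes based on the returned mode.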
This guide will help you take your VR projects to the next level with polished and practical implementations.
Table of Contents
English Video Tutorials
- The English video tutorial, part 1
- The English video tutorial, part 2
Farsi Video Tutorials
- The video tutorial in Persian, part 1
- The video tutorial in Persian, part 2
Project Source Files
- The project source files on Microsoft Azure DevOps Repositories.
- The project source files on GitLab.
See also
- WebRTC IP Leak Demonstration
- Introduction to Virtual Reality, OpenXR Hand-Tracking, and Gesture Recognition in Unreal Engine
- A quick workaround for Unreal Engine Modeling Tools Editor Mode plugin not showing up on Linux and macOS
- Host Unreal Engine 4 projects on Microsoft Azure DevOps with unlimited cost-free Git LFS quota