Deploy Unreal Engine Projects to Android and Meta Quest 2/3 in Standalone Mode

All right… so… you might be wondering—where have I been? Did I fall into a virtual void or some digital black hole? Did I get sucked into a never-ending loading screen? Did Unreal Engine finally crash me for good? Well, almost.

But I’m back! Back from the digital abyss. And let me tell you, I was this close to naming this video: ‘How to Deploy Your Soul to Standalone Mode’… because burnout is real, my friends.

Jokes aside, YOU—yes, you awesome people—kept me going. I read your comments. I saw your feedback. And honestly, some of those messages… they were half encouragement, half passive-aggressive ‘Bro, where’s the next tutorial?!’ You all have zero chill. But… I like that. That’s what got me here today.

So, by popular demand and an ungodly amount of coffee, here we are. Today, we will deploy your Unreal Engine projects to Meta Quest. We’re talking Android standalone mode, baby!

And if you’ve been struggling with this process, don’t worry. I’ve suffered so you don’t have to. You’re welcome!

But, before we dive into the screen-sharing and the real stuff, you know the drill. Smash that like button, subscribe, and drop a comment. If you don’t, I’ll deploy you into an infinite loop of beginner VR setup tutorials. Trust me, you don’t want that. So, it’s your call!

All right! Let’s get into it.

[Read More...]

Procedural Virtual Hand Mesh Animation Using OpenXR Hand-Tracking in Unreal Engine

Building on the ‘Introduction to Virtual Reality, OpenXR Hand-Tracking, and Gesture Recognition in Unreal Engine’ tutorial, this slightly more advanced tutorial will dive deeper into the following topics:

  • Transitioning seamlessly between motion controller and hand-tracking modes in Unreal Engine.
  • Adding custom debugging gizmos to improve development and testing workflows.
  • Visualizing debug virtual hands by incorporating the custom gizmos.
  • Animating virtual hand meshes with OpenXR hand-tracking data, moving beyond basic joint representation with cubes.
  • Reusing and adapting the gesture recognition code from the introductory tutorial to integrate with the new animated virtual hand meshes.

This guide will help you take your VR projects to the next level with polished and practical implementations.

[Read More...]

Introduction to Virtual Reality, OpenXR Hand-Tracking, and Gesture Recognition in Unreal Engine

Here’s a lesser-known fact about me. Once upon a time, when it was just me and the dinosaurs… Okay, maybe not that far back, but long before VR headsets, the Metaverse, and the social media craze, I taught people for eight years! I loved it so much that, honestly, I could talk for hours (don’t worry, I won’t in this video)!

Anyway, enough of my rambling! Recently, I rediscovered that passion for teaching, and I’m thrilled to kick off a VR tutorial series in Unreal Engine. That’s why, as the introductory step into this series, we’re diving hands-first into the fascinating world of virtual reality.

But, before we jump in, make sure to hit that subscribe button so you don’t miss any of my VR adventures.

Ready? Let’s bring those virtual hands to life!

Find the link to the video tutorial and the project repository down below.

[Read More...]