Haptics in XR, Apple’s Hand Tracking, and DIY Haptics Peripherals


Akin to a tree falling in a forest with no one around to hear it—if you can’t touch what you see, does that even make it real?

The rapidly evolving landscape of Extended Reality (XR) is marked by emerging technologies that increasingly blur the lines between the physical and digital.

Haptics has long been an integral part of gaming, VR, and XR. Haptic feedback deepens immersion, enhancing the realism and engagement of virtual worlds.

With major announcements like Apple’s new Vision Pro, which forgoes controllers entirely, users are understandably curious about the future direction of the XR industry. Thankfully, relying fully on hand tracking doesn’t necessarily mean losing the realism we’ve built.


Industry leaders like Sony and Meta (formerly Oculus) have each launched their own XR devices, with each new model improving on existing technology. Most XR devices follow a similar structure: a headset and two controllers (or, in the case of Sony’s newly announced, as-yet-unnamed MR headset, a controller and a ring).

Moreover, many VR games on the market are meticulously designed around controller-based interactions. It can be argued, therefore, that many of these games will require adaptation to maintain the same level of engagement with hands-only tracking.

Hand Tracking Only

Apple’s initial venture into extended reality, the Vision Pro, deviates from this norm by forgoing controllers entirely in favor of hand tracking. This raises questions about the implications for existing titles, the impact on future technologies, the direction of the XR landscape, and whether the two approaches can coexist.

Meta, which also supports hand tracking on its Quest devices, claims that hand tracking reflects a stride toward more natural and intuitive interaction (source). But full reliance on hand tracking for XR experiences means losing a third dimension of presence: our sense of touch.

High-profile VR enthusiasts such as Denny Unger from Cloudhead Games have been vocal about the need for haptics and tactility in the digital sphere, arguing that haptics enhances the depth of engagement for a more immersive experience.

Flexible Use-Case Specific Form Factors

The question, therefore, is how we bridge this gap between intuition and realism. What about the information and feedback we used to get from controller vibration?

DIY haptic peripherals can be a solution.

By building custom, cost-effective peripherals with vibration motors and sensors, and pairing them with hands-only tracking, users can enjoy more natural and intuitive interactions while still receiving the tactile feedback that is crucial for immersion.
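To make this concrete, here is a minimal sketch of the software side of such a peripheral. All names, the serial protocol, and the velocity range are illustrative assumptions, not any particular product's API: the idea is simply to map a collision event from the XR runtime (e.g. an impact velocity) to a motor duty cycle and pack it into a command byte pair for the device.

```python
# Hypothetical sketch of a DIY haptic peripheral's host-side logic.
# The 2-byte command format, motor IDs, and 10 m/s velocity cap are
# assumptions for illustration, not a real device protocol.

def impact_to_duty_cycle(impact_velocity_ms, max_velocity_ms=10.0):
    """Map an impact velocity (m/s) to a PWM duty cycle in [0, 255]."""
    normalized = min(max(impact_velocity_ms / max_velocity_ms, 0.0), 1.0)
    return int(round(normalized * 255))

def encode_command(motor_id, duty_cycle):
    """Pack a simple 2-byte command: motor index, then duty cycle."""
    return bytes([motor_id & 0xFF, duty_cycle & 0xFF])

# Example: a 4 m/s impact (say, a racket hit) driving motor 0.
cmd = encode_command(0, impact_to_duty_cycle(4.0))

# In practice you would write `cmd` to the peripheral, e.g. over a
# serial link (port name is a placeholder):
#   import serial
#   serial.Serial("/dev/ttyUSB0", 115200).write(cmd)
```

A microcontroller on the peripheral would then read each pair and apply the duty cycle to the corresponding vibration motor via PWM.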

An example of a DIY Haptic Peripheral is the Haptic Saber Experience created by BSDXR. Utilizing a TITAN Core Haptics Dev Kit, BSDXR designed a 3D-printed saber hilt capable of providing haptic feedback. When connected to an XR device, the user can actually feel the distinctive hum of the blade within VR.

Imagine the same principle in other use cases: the impact of a ball on your racket in a VR tennis simulation, or the feel of a brushstroke when painting in XR. The shift to hand tracking doesn’t mean we lose the tactile side of our digital experiences entirely; in reality, the two can exist simultaneously.

As we anticipate the release of Apple’s headset, the intersection of hands-only interaction and haptic feedback opens up both exciting possibilities and challenges for the XR community. DIY haptic peripherals present an accessible way to merge the benefits of both worlds, ensuring users can enjoy a rich and immersive experience regardless of the interaction paradigm. We can only wait and see what else the future of XR holds.