I am surprised that, at least according to all reporting I’ve seen, Apple isn’t planning to build haptic integrations between its existing Watch product and the upcoming Vision Pro headset.
It seems like such a great way to offer additional value to owners of both products.
I have no doubt Apple could create an uncannily convincing sensory feedback experience for hand tracking using Watch haptics alone. For example, think about the haptic feedback you get when using the Digital Crown to scroll through a list on the Watch. Imagine applying that to the act of turning a virtual dial in AR.
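Here is a minimal sketch of how the Watch side of that might look, assuming a hypothetical channel that streams the virtual dial’s angle from the headset to a watchOS app; `WKHapticType.click` stands in for the Crown’s detent tick, and the class and threshold are mine, not anything Apple has announced:

```swift
import WatchKit

// Hypothetical watchOS-side handler: plays a Crown-style "tick" each time
// a virtual dial (tracked on the headset) rotates past a detent.
final class VirtualDialHaptics {
    /// Degrees of rotation between haptic detents (arbitrary choice).
    private let detentSpacing: Double = 10
    private var lastDetentIndex = 0

    /// Called with the dial's current angle, streamed from the headset
    /// over an assumed companion channel (e.g. WatchConnectivity).
    func dialAngleDidChange(to angle: Double) {
        let detentIndex = Int((angle / detentSpacing).rounded())
        guard detentIndex != lastDetentIndex else { return }
        lastDetentIndex = detentIndex

        // .click is the closest built-in WatchKit haptic to the
        // Digital Crown's scroll tick.
        WKInterfaceDevice.current().play(.click)
    }
}
```

The point isn’t the specific plumbing, it’s that the haptic vocabulary already exists on the wrist; the headset would only need to tell it when to fire.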
Since 2015, Apple has been shipping solid-state trackpads in its laptops: the “click” is simulated by a haptic motor, and there are no moving parts. They have arguably never felt better. In a sense, they are better than the genuine thing. More real.
Adding physicality to the visionOS interface would ground it in reality and deepen immersion, while giving newcomers to the platform an instant sense of familiarity.