The UX of UIs in VR
XR is coming and it's coming fast. There is little doubt that in a few years' time people will use AR glasses alongside, or even instead of, their smartphones.
For my bachelor's thesis I took a look at this future and built a working 3D environment in Unity for the Oculus Quest.
In several experiments, various forms of hand-tracking interaction were put into practice and the findings documented. In the final product you can start, land, and otherwise control a hot-air balloon solely with hand tracking, without any controllers.
Hand tracking has been possible on the Oculus Quest since the beginning of 2020 and brings countless possibilities, but also various problems, with it. This thesis came into being to analyze these possibilities and problems, and to find out how interfaces in VR can work and what needs to be considered.
Differences from 2D Design
Interface design in 2D has been explored for so many years now that countless guidelines, best practices, and use cases exist.
But in the world of mixed reality (XR), a lot is still unexplored and untested, and different things need to be considered. For example,
if you build an interface in VR, you don't just think about what is on that interface, but also where it sits, how big it is, and how far away.
You have to make sure it doesn't block things behind it, and decide how to guide the user's gaze,
since things can always happen outside their field of view.
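One consequence of "how big and how far away": a VR interface has a physical size and distance, and what the user actually perceives is the visual angle it covers. A minimal sketch of that relationship (the function name, the 30-degree target, and the 2 m distance are my own illustrative choices, not values from the thesis):

```python
import math

def panel_width(distance_m: float, angular_size_deg: float) -> float:
    """Physical width a flat panel needs in order to span a given
    visual angle when placed distance_m from the viewer's eyes."""
    return 2 * distance_m * math.tan(math.radians(angular_size_deg) / 2)

# A menu meant to cover roughly 30 degrees of the view, floating 2 m away:
print(round(panel_width(2.0, 30.0), 2))  # → 1.07 (meters wide)
```

Moved to 1 m, the same panel would only need to be about 0.54 m wide to look identical, which is why VR UI sizes are often reasoned about in degrees rather than in meters or pixels.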
Some of the things I've learned: