January 4, 2024
Exploring XR UI interactions in "Idea Engine"

Idea Engine lets you create and share VR and mixed reality experiences. Building such a general tool requires complex user interfaces. In this guest article, developer Brett Jackson shares his methods for user interface interaction.

Guest article by Brett Jackson

Brett Jackson has been developing VR projects since 2015 and is the head of the new UK-based company X82 Ltd. His previous releases include: Dimensional (PC VR), Breath Tech (PC VR), Jigsaw 360 (PC VR & mobile VR) and 120Hz (SideQuest).

It is common to present a user interface through interactive 2D panels in XR. It’s not an exciting prospect, but it’s familiar and effective. Yet even if we accept this 2D intrusion into our XR worlds, there are still new considerations, and opportunities to break away from the 2D paradigm.

I quickly grew tired of laser pointers that exaggerated my hand movement on distant panels, along with their inconsistent target vectors and intermittent pinch detection. I prefer to reach out and interact with the world. I want the panel right in front of me so I can place it comfortably and use it like an actual device.

My latest project, Idea Engine, is developed using StereoKit, an open-source OpenXR library. StereoKit has a hands-first philosophy, with out-of-the-box support for hand tracking as well as controllers, and it enables efficient creation of dynamic windows with typical UI controls. It’s a great tool for quickly creating XR projects and has many other benefits.

Panels

So my starting point is a UI panel that we can grab at any time (no special handles or edges to find) with a nice aura that appears when we’re within reach. Now let’s add more XR considerations.
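
To make that concrete, here is a minimal sketch of how such a proximity aura can be driven, written in Python for illustration rather than taken from the project's actual StereoKit/C# code; the reach and fade distances are invented:

```python
import math

REACH = 0.45      # assumed arm-reach threshold, in meters
FADE_BAND = 0.15  # aura fades in over the last 15 cm of approach

def aura_alpha(hand_pos, panel_pos):
    """Return aura opacity in [0, 1] based on hand-to-panel distance."""
    dist = math.dist(hand_pos, panel_pos)
    if dist >= REACH:
        return 0.0  # out of reach: no aura
    # Ramp the aura up as the hand closes the final gap
    return min(1.0, (REACH - dist) / FADE_BAND)
```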

In XR, it’s easy for a user to end up behind a UI panel. Instead of showing a blank back or a mirrored interface, I flip the interface to whichever side the user is looking from – simple. It sounds trivial, but it’s worth thinking through XR-specific scenarios like this. Another approach is to automatically rotate the panel to face the player at all times, but that takes control away from the user. If they want the panel at a weird angle, let them; they may have a good reason.
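
A sketch of that flip logic (my illustration, not the project's source): the sign of the dot product between the panel's forward normal and the direction from the panel to the user's head tells us which face to render.

```python
def facing_front(head_pos, panel_pos, panel_normal):
    """True if the user's head is on the front side of the panel."""
    # Vector from the panel to the viewer's head
    to_head = tuple(h - p for h, p in zip(head_pos, panel_pos))
    dot = sum(n * t for n, t in zip(panel_normal, to_head))
    return dot > 0.0  # positive: render the front UI; negative: flip it
```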

An individual panel should be kept small (roughly page or screen sized) so that the user can easily absorb the content without having to turn their head, but XR gives us an abundance of space, and I like to look for opportunities to break out of the page limit. My scrollable areas have a handle to grab and move the content. When you grab it, you see a greatly expanded view of the content area, and you can drag and drop in this mode, giving a greater placement range.

I show tooltips next to panels, with a line to the UI component they describe. This reduces the amount of text on the panel. Users can scroll through tips and hide the ones they know.

In another project, I prototyped a 3D Gantt chart that scrolled off the page horizontally and faded into the distance. The user’s main focus was still on the full-sized central panel, but they could optionally take in the wider context.

Although panels are comfortable and familiar, we shouldn’t feel constrained by their boundaries and it’s fun to look for ways to break out.

Menus

StereoKit introduced me to the radial hand menu, which I then expanded upon. I like this idea because you use it with one hand, so it’s convenient and accessible. I make the same menu system available on both the right and left hands, and use the same approach for popup menus on panels, for consistency.
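
One common way to implement radial selection, shown here as an assumed sketch rather than StereoKit's or Idea Engine's implementation, is to map the hand's offset from the menu center to a wedge index, with a dead zone so a resting hand selects nothing:

```python
import math

def radial_pick(hand_xy, center_xy, item_count, dead_zone=0.03):
    """Return the index of the highlighted wedge, or None in the dead zone."""
    dx = hand_xy[0] - center_xy[0]
    dy = hand_xy[1] - center_xy[1]
    if math.hypot(dx, dy) < dead_zone:
        return None  # hand is still near the center: nothing selected
    angle = math.atan2(dy, dx) % (2 * math.pi)
    wedge = 2 * math.pi / item_count
    return int(angle // wedge)
```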

My volumetric menu takes things a step further and was driven purely by a desire to use the third dimension. I use it to select teleport destinations (with a pointer to each destination) and to select nearby nodes to edit. I also use it for keyboard input when browsing metaverse addresses. This is quite experimental. It has the advantage that all symbols are the same distance from the center, and you see your input without having to look away (a common problem with virtual keyboards). The downside is that it’s unfamiliar to users, so I expect some resistance to it. Notice in the video that the letters spiral from front to back in alphabetical order, so before long their positions should become familiar.
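
That layout can be reconstructed from the description: every key sits on a sphere of fixed radius (equal distance from the center) and winds from the front pole to the back in alphabetical order. A sketch, with the radius and number of turns being my own guesses:

```python
import math
import string

def spiral_layout(symbols=string.ascii_uppercase, radius=0.12, turns=3.0):
    """Place symbols on a sphere, spiraling from front pole to back pole."""
    positions = {}
    n = len(symbols)
    for i, sym in enumerate(symbols):
        t = i / (n - 1)                    # 0 = front of sphere, 1 = back
        polar = t * math.pi                # sweep from front pole to back
        azimuth = t * turns * 2 * math.pi  # wind around as we go
        x = radius * math.sin(polar) * math.cos(azimuth)
        y = radius * math.sin(polar) * math.sin(azimuth)
        z = -radius * math.cos(polar)      # -z points toward the viewer
        positions[sym] = (x, y, z)
    return positions
```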

You’ll soon be able to add menus like these to your own Idea Engine projects.

3D widgets

A color picker offered an ideal opportunity for experimentation, with three values (hue, saturation, and value) that map naturally to three dimensions. In my 3D color picker, you can change all three values at once, or set the hue, saturation, or value individually. I find it more interesting to interact with than sliders on a 2D page.
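
The core mapping is simple enough to sketch; which axis drives which channel is an assumption here, as is the size of the picker volume:

```python
import colorsys

PICKER_SIZE = 0.20  # assumed edge length of the picker volume, in meters

def hand_to_color(offset):
    """Map a hand offset from the picker's center to an RGB color."""
    def norm(v):
        # Normalize one axis of the offset into [0, 1]
        return min(1.0, max(0.0, v / PICKER_SIZE + 0.5))
    h, v, s = norm(offset[0]), norm(offset[1]), norm(offset[2])
    return colorsys.hsv_to_rgb(h, s, v)  # RGB components in [0, 1]
```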

Similarly with locomotion: I want to move in 3D, so I made a 3D joystick for smooth hand-tracked movement. Simply drag the sphere in the direction you want to travel, and roll your wrist to snap or smoothly rotate. It works in walking or flying mode, and rotation can be disabled if the user finds it too much to think about in one control. I still support traditional controller-based movement, but this one-handed control duplicates the functionality of multiple joysticks and buttons, and is an interesting example of how 3D hand movements can meet demands in new ways.
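
A rough sketch of such a one-handed control (the constants are illustrative, not Idea Engine's tuning): the sphere's drag offset becomes a velocity, and wrist roll past a threshold triggers a snap turn.

```python
SPEED_SCALE = 8.0     # assumed m/s of travel per meter of sphere drag
ROLL_SNAP_DEG = 25.0  # assumed wrist roll needed to trigger a snap turn
SNAP_ANGLE_DEG = 30.0

def locomotion_step(drag_offset, wrist_roll_deg, flying=False):
    """Turn a sphere drag and wrist roll into a velocity and a yaw step."""
    vx, vy, vz = (c * SPEED_SCALE for c in drag_offset)
    if not flying:
        vy = 0.0  # walking mode ignores the vertical component
    yaw = 0.0
    if abs(wrist_roll_deg) > ROLL_SNAP_DEG:
        yaw = SNAP_ANGLE_DEG if wrist_roll_deg > 0 else -SNAP_ANGLE_DEG
    return (vx, vy, vz), yaw
```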

Hands

In all of my example videos, you’ll see that I hide the user’s hand as soon as they start interacting with the UI. A lot of developers put effort into carefully crafting grip poses for different purposes, and it looks nice, but to me a well-positioned hand that doesn’t reflect my own hand position is more distracting than no hand at all. A hand can also be a visual barrier once the interaction has begun.

With the hand out of the way, I’m also free to tone down or exaggerate hand movements without any visual conflict. I dampen hand movements in the color picker to lower sensitivity, and exaggerate them when scrolling through a lot of content.
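
With no visible hand to contradict, the technique reduces to a per-interaction gain on the hand's movement delta. A minimal sketch with made-up gain values:

```python
GAINS = {
    "color_picker": 0.4,  # dampened for fine control
    "scroll": 3.0,        # exaggerated for long content
}

def apply_gain(hand_delta, interaction):
    """Scale a hand movement delta by the current interaction's gain."""
    g = GAINS.get(interaction, 1.0)
    return tuple(c * g for c in hand_delta)
```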

Text

While Idea Engine supports Sketchfab for downloading 3D models, AI for generating images, and photo/audio import, it’s hard to beat the ease and accessibility of text and the spoken word for conveying complex stories. With this in mind, I needed decent text support so that users could merge all the available formats to tell their stories.

Text generally doesn’t look good in VR, so I fade it out as you walk away to remove unsightly artifacts, and text panels close down as well. Users will be keener to explore the environment than to read text, so I offer the option of having a narrator automatically read out any block of text you encounter.
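
Distance-based fading like this is typically a linear ramp between two thresholds; the distances below are assumptions for illustration:

```python
FADE_START = 1.5  # meters: text is fully opaque closer than this
FADE_END = 3.0    # meters: text is fully invisible beyond this

def text_alpha(distance):
    """Opacity for a block of text at the given viewing distance."""
    if distance <= FADE_START:
        return 1.0
    if distance >= FADE_END:
        return 0.0
    return 1.0 - (distance - FADE_START) / (FADE_END - FADE_START)
```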

Text input was a challenge without a single good solution. I created mobile-style text input with support for cut and paste and automatic pagination, using a virtual keyboard. When I finished, I thought it was OK, but I wouldn’t want to write a long passage in XR. Then I added voice-to-text support. It helped, but I found I needed to do a lot of editing after dictation, and it was still slower than traditional methods. Now I also let users connect to their headset from a browser on any device they own and import text via a web page. I regularly use all three techniques, with the browser handling long text input.
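
The browser route could be as simple as a small HTTP endpoint running on the headset; the sketch below is purely illustrative, since the article doesn't describe Idea Engine's actual transport, port, or page:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class TextImportHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the posted text body and hand it to the application
        length = int(self.headers.get("Content-Length", 0))
        text = self.rfile.read(length).decode("utf-8")
        print("Imported text:", text[:80])
        self.send_response(204)  # accepted, nothing to return
        self.end_headers()

if __name__ == "__main__":
    # A browser on the same network POSTs text to the headset's IP
    HTTPServer(("0.0.0.0", 8080), TextImportHandler).serve_forever()
```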

My lesson here was that you don’t always have to solve everything in XR. Sometimes it is better to use a more appropriate device and then import the results.

Try it

From educational mind maps to interactive stories and games, you can leverage CC assets and import your own photos, audio, and text to build your idea. Then bring it to life by adding states, events, and high-level scripts, and share it on our X82 metaverse. Idea Engine aims to be a feature-packed end-user tool for exploring the possibilities of XR.

The public alpha is now available and free to download on App Lab, so you can come try any of the features discussed and give me your feedback.
