VR Wheelchair: DevLog
About the Project
VR Wheelchair is a manual wheelchair locomotion system for virtual reality applications. The system will be packaged as a complete player controller rig which can be easily imported into an existing Unity project. VR Wheelchair is motivated by a general interest in representation within gaming. It also addresses a key strength and weakness in the medium of VR — immersion and locomotion, respectively.
A well-implemented wheelchair controller provides an opportunity for both representation and eroding implicit ableism. In the U.S. alone, over 3.5 million people use a wheelchair.¹ As much as 30% of gamers have some form of disability, and yet disability is rarely represented in games.² This package hopes to encourage the representation of wheelchair users in games, not just as NPCs but as the player protagonist.
For able-bodied gamers, representation presents an opportunity for growth in understanding and empathy. This is particularly relevant in the VR medium as total encapsulation of one’s vision, varying degrees of movement tracking, and other features make it highly immersive. Studies have shown that VR experiences provide a heightened sense of embodiment, or “body ownership”, over other digital mediums and that this quality can be taken advantage of to grow empathy.³ For example, a light-skinned user who spends a period of time immersed within a dark-skinned avatar may exhibit reduced racial bias when administered a test in the days or weeks following the experience.⁴
Lastly, a core motivation for this project is to develop a locomotion method that is innovative, intuitive, and comfortable. Locomotion is a notorious design problem in VR. Due to discrepancies of perceived movement and orientation between the eyes and other sensory organs, motion sickness is commonly experienced by VR users. Nothing ruins immersion like an upset stomach, and so comfort is top priority in developing this package. It’s my hope that the physical act of moving one’s arms to generate movement within the virtual space will have a grounding effect and ultimately offset the dissonance of the eyes registering movement while the body remains static. I’m aiming for a comfort level somewhere between teleportation and smooth movement — two of the most prominent locomotion systems in VR. In the case that testing shows my system to be significantly uncomfortable, I’m prepared to pursue another system of locomotion while remaining aligned with the other project motivations.
03 28 21
I spend the first day or so of the project doing research and watching tutorials. I’ve worked with the Oculus Integration package in the past, but I want my wheelchair rig to be applicable across VR platforms and am hoping to take advantage of Unity’s XR toolkit. I also look into using the OpenXR framework but, after reading some conflicting sources (Oculus 👀), come to understand Unity’s OpenXR plugin does not yet support deployment on Quest devices. We should get support later this year, and I’m considering updating the package for OpenXR at that time.
In the end, I decide to work with Unity’s built-in XR module and XR Interaction Toolkit. I’ll use their Action-Based Room-scale XR Rig as a foundation for my wheelchair rig, in order to provide flexibility across target platforms. This requires I implement using Unity’s new action-based input system, which I somehow didn’t know existed until now… but it looks super promising. It’s also worth noting that I’ll be working in Unity’s Universal Render Pipeline, for optimization, and, really, just because customizable pipelines are the future of Unity development. Using a high-level, device-agnostic XR framework, as well as URP and the new input system, sets my package up for a longer lifespan of compatibility.
03 30 21
I begin prototyping a physics-based rig. I’ve also planned to try a simulated rig, where movement is handled via translation and wheel motion is animated, but if this first method works out well, I don’t think I’ll have to. For the physics rig, the concept is simple: torque is applied to either wheel. If the torque applied to each wheel is equal, the chair moves forward; if only the right wheel is spun, the chair pivots around the stationary left wheel, much like you might know or imagine an actual wheelchair to function. The rig should also include a set of front wheels to keep the chair from tipping forward or backward.
The rig is built around two axles with two wheels attached to each. All six objects are siblings in the hierarchy, and the wheels are attached to the axles using Hinge Joints. These joints allow each wheel to freely rotate along its local Y-axis (which has been rotated to lie along the wheel axle) while restricting rotation along the other axes.
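In practice this joint setup lives in the Inspector, but the equivalent code makes the constraint explicit. A minimal sketch (component name and serialized field are my own stand-ins, not the package’s actual code):

```csharp
using UnityEngine;

// Hypothetical runtime equivalent of the Inspector setup described above:
// attach a wheel to its axle with a HingeJoint that only permits rotation
// around the wheel's local Y-axis.
public class WheelJointSetup : MonoBehaviour
{
    [SerializeField] private Rigidbody axle; // the axle this wheel hangs on

    private void Awake()
    {
        var hinge = gameObject.AddComponent<HingeJoint>();
        hinge.connectedBody = axle;
        hinge.anchor = Vector3.zero;   // pivot at the wheel's center
        hinge.axis = Vector3.up;       // free spin around local Y only
    }
}
```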
Before I implemented actual VR controls, I wanted to test this out in editor using keyboard controls, so I built a very simple controller to apply torque to either wheel.
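The keyboard tester amounts to a couple of AddTorque calls per physics step. A minimal sketch of that controller, assuming the wheels are the rigidbodies from the rig above (field names and key bindings are my own placeholders):

```csharp
using UnityEngine;

// Hypothetical in-editor test driver: spins each wheel's rigidbody from the
// keyboard so the physics rig can be tuned without a headset.
public class WheelTestController : MonoBehaviour
{
    [SerializeField] private Rigidbody leftWheel;
    [SerializeField] private Rigidbody rightWheel;
    [SerializeField] private float torque = 10f;

    private void FixedUpdate()
    {
        // Q/A spin the left wheel forward/backward; E/D the right.
        if (Input.GetKey(KeyCode.Q)) Spin(leftWheel,   1f);
        if (Input.GetKey(KeyCode.A)) Spin(leftWheel,  -1f);
        if (Input.GetKey(KeyCode.E)) Spin(rightWheel,  1f);
        if (Input.GetKey(KeyCode.D)) Spin(rightWheel, -1f);
    }

    private void Spin(Rigidbody wheel, float direction)
    {
        // Torque is applied around the wheel's local Y-axis (the hinge axis),
        // so equal input drives forward and one-sided input pivots the chair.
        wheel.AddTorque(wheel.transform.up * torque * direction, ForceMode.Force);
    }
}
```

Holding both forward keys drives the chair straight; holding only one pivots it around the opposite wheel, matching the differential-drive behavior described above.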
The results are surprisingly accurate to my expectations, given the simplicity of the rig at this point. Much tweaking of the rig’s physics properties (mass, drag, etc.) will be needed to achieve realism, but before I get into that, I need to develop interaction between the wheels and the actual VR hand controllers.
04 11 21
After several days of trial and error, I’ve managed to implement a VR interaction scheme allowing users to grab and rotate the rig’s wheels using hand controllers.
The locomotion model is still completely physics-based, and the rig is pieced together using rigidbodies, as well as fixed and hinged joints. The wheels themselves are sphere colliders. The XR camera/controller rig is also fused to the wheelchair rig using a fixed joint.
In an effort to make my wheelchair package as flexible and accessible as possible, I’m trying to layer my interaction system squarely on top of Unity’s XR Interaction Toolkit. The wheel system is built on top of the toolkit’s XRBaseInteractable class. This class provides some handy hover/select event callbacks and object coupling (with some help from the XRInteractionManager). The derived class XRGrabInteractable provides interaction with the XRDirectInteractor component I’ve assigned to both hand controllers. Unfortunately, just slapping an XRGrabInteractable component on my wheels wasn’t providing the behavior I need, even after tweaking attach positions. The wheel would stutter when selected and rotate around its anchor unnaturally. I found I could produce the desired behavior by attaching a separate interactable to the wheel with a fixed joint; I could then drag this “handle” to produce a smooth rotation of the wheel.
Currently, my wheel uses a custom component which inherits from XRBaseInteractable. This allows me to hook into the toolkit’s built-in hover/select events. When the wheel is selected, the interaction manager immediately cancels the grab and instantiates a second interactable at the position where the interactor made the selection. This second interactable (visualized as the white sphere in the above GIF) functions as a handle for the wheel. It uses the default XRGrabInteractable component, as well as a fixed joint fusing it to the wheel. Once this new interactable is created, the interaction manager forces a selection between it and the acting interactor. And that’s it — the user can now freely grab and rotate the wheel. Upon releasing the selection, the handle interactable destroys itself.
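A sketch of that select flow might look like the following. This is my own reconstruction, not the package’s actual code: the handle prefab and field names are hypothetical, and the exact toolkit signatures (event args, interactor types, manager methods) vary between XR Interaction Toolkit versions.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical wheel interactable: on select, cancel the grab on the wheel
// and hand the selection off to a temporary "handle" interactable spawned
// at the grab point and fused to the wheel with a fixed joint.
public class WheelInteractable : XRBaseInteractable
{
    [SerializeField] private XRGrabInteractable handlePrefab; // sphere + FixedJoint

    protected override void OnSelectEntered(SelectEnterEventArgs args)
    {
        base.OnSelectEntered(args);
        var interactor = args.interactorObject;

        // Spawn the handle where the hand grabbed the wheel and fuse it on.
        var handle = Instantiate(handlePrefab,
            interactor.transform.position, Quaternion.identity);
        handle.GetComponent<FixedJoint>().connectedBody = GetComponent<Rigidbody>();

        // Cancel the grab on the wheel itself, then force-select the handle,
        // so the hand ends up dragging the handle rather than the wheel.
        interactionManager.SelectExit(interactor, this);
        interactionManager.SelectEnter(interactor, handle);
    }
}
```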
While this achieves the basic behavior that I need, there are some remaining issues, as demonstrated below.
During selection, the user can drag the wheelchair one way or another (even toppling it) by extending the controllers beyond the bounds of the wheel. Try sitting in a chair and pulling its seat out to the right with all your strength. Not much happens. In this scenario, you’re effectively one object with the chair; you can’t leverage yourself against yourself.
Unfortunately, this is not the case with the wheelchair rig at the moment. As is, I can magically drag or lift myself sideways, upwards, etc. I’m not sure what the fix is, but as I move forward, I’ll be looking into rigidbody constraints and object relationships within the rig. I’ll also be trying out some physics materials to simulate friction.
04 11 21
It’s been a frustrating week coming to terms with Unity’s physics system. I’ve been trying to solve an issue with Unity’s joints exhibiting some unwanted elasticity (pictured below). When the user grabs the wheel, inertia pulls the grab point forward (this is natural); however, because the hand isn’t actually moving with the grab point (like it would IRL), the wheel swings back, like a pendulum, until eventually settling back at the position of the hand. I’ve played around with different combinations of mass and angular drag, but at the end of the day, this is a problem inherent to the virtual nature of the experience. In other words, the user’s hand has no physical relationship to the wheel, so the behavior is sort of unavoidable.
Desired behavior should look something like what’s pictured below, where I simply release my grab as the wheel’s angular velocity first approaches zero. I’m exploring approaches to faking a deceleration on grab, so that the wheel doesn’t swing backward after slowing. However, building a solution that works across a myriad of complex cases might introduce more problems, and it’s possible this elastic behavior would pass unnoticed by users in the virtual space.
04 20 21
My current solution to the rubbery, pendulum-like behavior of my braking system is surprisingly simple and required only a line or two of code (a fraction of some of the other solutions I’d tried). As I explained in a previous entry, when the user selects a wheel, a grab point is generated; this acts as a joint between the controller and the wheel’s rigidbody. I was able to achieve smoother, more natural braking behavior by repeatedly respawning this grab point for as long as the controller is actively selecting and in a braking state.
The controller is considered to be “braking” when its forward/backward velocity sits at 0 (+/- 0.05). As the braking wheel’s inertia pulls the grab point around its axis, away from the actual hand position, the grab point is destroyed and spawned again back at the position of the hand. Basically, every frame the grab point diminishes the wheel’s angular velocity slightly before resetting to the controller position. Given that the grab point and controller are never apart for more than a frame, there is no opportunity for the wheel to pull back toward the controller during a brake. This method is vaguely reminiscent of the anti-lock braking systems on most modern vehicles.
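The per-frame brake loop can be sketched roughly as follows. Everything here beyond the velocity threshold from my testing is a placeholder (field names, the SpawnGrabPoint helper, and how the controller velocity is obtained are all stand-ins for the real rig):

```csharp
using UnityEngine;

// Sketch of the braking heuristic described above: while the hand is
// selecting and roughly stationary along the chair's forward axis, destroy
// the drifting grab point and respawn it back at the hand every physics
// step, bleeding off the wheel's angular velocity a little each frame.
public class WheelBrake : MonoBehaviour
{
    private const float BrakeThreshold = 0.05f; // forward speed window for "braking"

    [SerializeField] private Transform chair;        // chair root, defines forward
    [SerializeField] private Transform controller;   // actively selecting hand
    private Rigidbody currentGrabPoint;              // handle joined to the wheel
    private bool isSelecting;                        // set by select/deselect events

    private void FixedUpdate()
    {
        if (!isSelecting) return;

        // Braking = the controller's forward/backward speed is ~zero.
        Vector3 handVelocity = ControllerVelocity(); // however the rig reads it
        float forwardSpeed = Vector3.Dot(handVelocity, chair.forward);
        if (Mathf.Abs(forwardSpeed) < BrakeThreshold)
        {
            // The wheel has dragged the grab point away from the hand;
            // snap it back so the wheel never gets room to swing backward.
            Destroy(currentGrabPoint.gameObject);
            currentGrabPoint = SpawnGrabPoint(controller.position);
        }
    }

    // Placeholders for the rig's existing spawn and tracking logic.
    private Rigidbody SpawnGrabPoint(Vector3 at) { /* reuse existing spawner */ return null; }
    private Vector3 ControllerVelocity() { return Vector3.zero; }
}
```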
This method of braking works surprisingly well and required very little code, as it just reuses the function I already had in place to spawn grab points. Unfortunately, it does not work completely on slopes: gravity causes the wheel to roll continuously a small amount each frame. I’ve explored a few fixes, such as constraining rotation beneath a certain angular velocity or increasing angular drag exponentially, but nothing so far has yielded satisfactory results. Also, spawning/destroying grab points (albeit simple objects) on a frame-by-frame basis is not optimal. I’d like to implement object pooling or simply find a way to cleanly reuse the same grab point each frame.
05 01 21
I’ve achieved base functionality on my wheelchair controller, so I’ve switched gears temporarily to cosmetics. I spent a day in Blender modeling a wheelchair to be used in the package demo.
I found some great free textures at https://cc0textures.com/, and I think it’s looking pretty good.
Next, I plan on building out a small demo scene to demonstrate some different uses of my rig for a presentation later this week.
¹ U.S. Wheelchair User Statistics, Pants Up Easy
² You Can Take an Arrow to the Knee and Still Be an Adventurer, Cherry Thompson, GDC
³ Virtual reality can help make people more compassionate compared to other media, Stanford News, 2018
⁴ The impact of virtual reality on implicit racial bias and mock legal decisions, J Law Biosci, US National Library of Medicine, 2018