Extended Reality Teleoperation and Simulation of Space Robotics Using Unity

Files

ARVR_URD_2024.pdf (1.18 MB)

Date

2024-04

Abstract

The use of extended reality (XR) to enable “telepresence” for robotic systems has become more prevalent in the past decade. However, the technology leap from commercially available robots to extreme-environment robotic systems, such as industrial and space robots, has yet to be widely demonstrated. The Space Systems Laboratory at the University of Maryland is developing XR control interfaces using the Unity game engine to command dexterous space robots more intuitively. Unity offers several benefits, such as high-fidelity dynamic physics simulation, customizability through C# scripting, and the built-in robotics features provided by the Unity Robotics Hub. As a game engine, Unity can simulate environments such as microgravity and the surfaces of celestial bodies like the Moon or Mars, allowing robots to be placed in immersive scenes that demonstrate how the system interacts with the desired environment. Accurate robot models can be generated in Unity from ROS’s Unified Robot Description Format (URDF) files using Unity’s “ArticulationBody” component for joint simulation, which provides state feedback such as joint position, velocity, and torque, as well as the forces and torques applied to the robot body. The robot model can then be used either to visualize a robot, by sending state data from a control program to Unity, or to command a robot, by sending state data from Unity to the control program. Incorporating XR devices such as the Microsoft HoloLens or the Oculus Quest allows for immersive control and visualization of space robotic systems in the desired environment.
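
As a concrete illustration of the ArticulationBody state feedback described in the abstract, the following minimal C# sketch reads reduced-coordinate joint position, velocity, and effort from every movable joint in a URDF-imported hierarchy. The component and its fields (ArticulationBody, jointPosition, jointVelocity, jointForce) are Unity's; the script name and the choice to log to the console are illustrative assumptions.

```csharp
using System.Linq;
using UnityEngine;

// Hypothetical helper: attach to the root link of a URDF-imported robot.
public class JointStateLogger : MonoBehaviour
{
    private ArticulationBody[] joints;

    void Start()
    {
        // Collect every articulated link under the root, skipping the
        // immovable root body and any fixed joints.
        joints = GetComponentsInChildren<ArticulationBody>()
            .Where(j => !j.isRoot && j.jointType != ArticulationJointType.FixedJoint)
            .ToArray();
    }

    void FixedUpdate()
    {
        foreach (ArticulationBody joint in joints)
        {
            // Reduced-coordinate state: radians for revolute joints,
            // meters for prismatic joints.
            float position = joint.jointPosition[0];
            float velocity = joint.jointVelocity[0];
            float effort   = joint.jointForce[0];

            Debug.Log($"{joint.name}: pos={position:F3} vel={velocity:F3} effort={effort:F3}");
        }
    }
}
```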
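One way to realize the Unity-to-control-program direction mentioned in the abstract is the ROS-TCP-Connector package from the Unity Robotics Hub. The sketch below publishes the simulated joint states as a standard sensor_msgs/JointState message; the topic name and the choice to publish every physics step are assumptions.

```csharp
using System.Linq;
using RosMessageTypes.Sensor;
using Unity.Robotics.ROSTCPConnector;
using UnityEngine;

// Hypothetical publisher: streams simulated joint states to a control program.
public class JointStatePublisher : MonoBehaviour
{
    const string TopicName = "unity_joint_states"; // assumed topic name
    private ROSConnection ros;
    private ArticulationBody[] joints;

    void Start()
    {
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<JointStateMsg>(TopicName);

        joints = GetComponentsInChildren<ArticulationBody>()
            .Where(j => !j.isRoot && j.jointType != ArticulationJointType.FixedJoint)
            .ToArray();
    }

    void FixedUpdate()
    {
        // Pack the current state of every movable joint into one message.
        var msg = new JointStateMsg
        {
            name     = joints.Select(j => j.name).ToArray(),
            position = joints.Select(j => (double)j.jointPosition[0]).ToArray(),
            velocity = joints.Select(j => (double)j.jointVelocity[0]).ToArray(),
            effort   = joints.Select(j => (double)j.jointForce[0]).ToArray()
        };
        ros.Publish(TopicName, msg);
    }
}
```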
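The reverse, visualization direction can follow the same pattern: subscribe to joint states reported by the control program and drive the Unity model's ArticulationDrive targets to match. The topic name is again an assumption, and only the drive position target is set here; a fuller implementation might also forward velocities.

```csharp
using System.Collections.Generic;
using RosMessageTypes.Sensor;
using Unity.Robotics.ROSTCPConnector;
using UnityEngine;

// Hypothetical subscriber: mirrors a real robot by driving the Unity model
// to the joint positions reported by the control program.
public class JointStateVisualizer : MonoBehaviour
{
    const string TopicName = "joint_states"; // assumed topic name
    private readonly Dictionary<string, ArticulationBody> jointsByName =
        new Dictionary<string, ArticulationBody>();

    void Start()
    {
        foreach (var joint in GetComponentsInChildren<ArticulationBody>())
        {
            if (!joint.isRoot && joint.jointType != ArticulationJointType.FixedJoint)
                jointsByName[joint.name] = joint;
        }
        ROSConnection.GetOrCreateInstance()
            .Subscribe<JointStateMsg>(TopicName, UpdateTargets);
    }

    void UpdateTargets(JointStateMsg msg)
    {
        for (int i = 0; i < msg.name.Length; i++)
        {
            if (!jointsByName.TryGetValue(msg.name[i], out var joint)) continue;

            // ArticulationDrive targets are in degrees for revolute joints,
            // while JointState positions are in radians.
            var drive = joint.xDrive;
            drive.target = Mathf.Rad2Deg * (float)msg.position[i];
            joint.xDrive = drive;
        }
    }
}
```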

Rights

Attribution 3.0 United States
http://creativecommons.org/licenses/by/3.0/us/