Cross-site Human-human Collaboration Demonstrator Using a UR-10 Robot as Embodiment for the Remote Worker

The RICAIP system demonstrates a multi-site, distributed manufacturing scenario in which a robotic arm is intuitively controlled remotely in real time using VR glasses. The scenario covers the distributed, collaborative assembly of a cased Raspberry Pi consisting of four parts: the CPU board, the upper and lower halves of the case, and a fan.

This joint DFKI-ZeMA RICAIP demonstrator shows cross-site human-human collaboration using a UR-10 robot as embodiment for the remote worker. The demonstrator was shown at the DFKI booth at Hannover Fair 2022 and is the basis for one of RICAIP’s collaborative use cases as well as for Wizard-of-Oz-style studies in the context of human-robot collaboration in RICAIP. As such, the demonstrator will be extended in cooperation with the testbeds in Prague, Brno and Saarbrücken. A user study is planned to gain better insight into the planning space of the Raspberry Pi case-assembly task as well as into the desired human-robot interaction modalities and verbal/non-verbal communication acts.


The parts are placed in a grid-like bin structure on a workbench together with a UR10e cobot. While a human worker, located at the robot site, mounts the parts, another human worker (the VR operator) controls the robot remotely in real time through its digital 3D representation using VR glasses, and can, for example, hand over parts to their colleague.

To provide the VR operator with the necessary information about the remote site, a digital twin is enriched with real-time data about the environment provided by two RGBD camera modules. One camera recognizes and locates the different assembly parts and transmits their positions over shared real-time data streams via ROS, which can then be consumed and displayed within VR. The other camera module sends a 3D point cloud to the VR glasses to give the VR operator a better understanding of the remote environment, e.g. unknown objects, obstacles and the actions of the co-worker. The VR interface for remote robot control was implemented based on a URDF description of the UR10e robot, which ensures that the real and virtual robots match, so that movements on one side do not lead to different results on the other.
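Because the real and virtual robots share the same URDF description, joint configurations can be mapped consistently between both sides. The following minimal sketch illustrates the general idea of deriving forward kinematics from a URDF joint chain; the URDF fragment, joint names and link dimensions are purely illustrative, not those of the actual UR10e model, and rotation is assumed to be about the z-axis for simplicity:

```python
import math
import xml.etree.ElementTree as ET
import numpy as np

# Hypothetical two-joint URDF fragment; the real UR10e URDF has six
# revolute joints plus visual/collision geometry.
URDF = """
<robot name="mini_arm">
  <joint name="shoulder" type="revolute">
    <origin xyz="0 0 0.1" rpy="0 0 0"/>
    <axis xyz="0 0 1"/>
  </joint>
  <joint name="elbow" type="revolute">
    <origin xyz="0.5 0 0" rpy="0 0 0"/>
    <axis xyz="0 0 1"/>
  </joint>
</robot>
"""

def rot_z(theta):
    """3x3 rotation matrix about the z-axis."""
    c, s = math.cos(theta), math.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def forward_kinematics(urdf_string, joint_angles):
    """Chain each joint's fixed origin offset with a rotation by the
    commanded joint angle and return the resulting tool position."""
    root = ET.fromstring(urdf_string)
    T = np.eye(4)
    for joint, theta in zip(root.findall("joint"), joint_angles):
        xyz = [float(v) for v in joint.find("origin").get("xyz").split()]
        step = np.eye(4)
        step[:3, 3] = xyz          # fixed link offset from the URDF
        step[:3, :3] = rot_z(theta)  # joint rotation (z-axis assumed)
        T = T @ step
    return T[:3, 3]

print(forward_kinematics(URDF, [0.0, 0.0]))  # zero pose: x=0.5, z=0.1
```

Applying the same joint angles to this shared model on both sites yields identical poses, which is what keeps the real and virtual robots in sync.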

To make the robot interactable in VR, inverse kinematics is computed locally rather than through common solutions such as MoveIt, which minimizes delays and enables multi-hand interaction. In this way, natural and intuitive two-handed control of the real robot was achieved. A paper about the system was submitted to and accepted at IEEE AIVR ’22.
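The published description does not specify which local IK method the demonstrator uses; a common low-latency choice for this kind of interactive control is iterative damped least-squares. The sketch below applies it to a planar two-link arm; the link lengths, damping factor and target are assumptions for illustration only:

```python
import numpy as np

# Illustrative planar 2-link arm (link lengths are assumptions).
L1, L2 = 0.6, 0.5

def fk(q):
    """End-effector position for joint angles q = [q1, q2]."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q):
    """Analytic Jacobian of fk with respect to the joint angles."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def ik_step(q, target, damping=0.1):
    """One damped least-squares update: q += J^T (J J^T + lam^2 I)^-1 e.
    The damping keeps updates stable near singular configurations."""
    e = target - fk(q)
    J = jacobian(q)
    JJt = J @ J.T + (damping ** 2) * np.eye(2)
    return q + J.T @ np.linalg.solve(JJt, e)

# Iterate toward a reachable target; each step is cheap enough to run
# inside a VR frame, which is what keeps the interaction responsive.
q = np.array([0.3, 0.3])
target = np.array([0.7, 0.4])
for _ in range(100):
    q = ik_step(q, target)
print(fk(q))  # converges close to the target
```

Because each update is a couple of small matrix operations, such a solver can run per frame inside the VR application without the round-trip a remote planning service would add.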



Publication: Caspar Jacob, Fabio Espinosa, Andreas Luxenburger, Dieter Merkel, Jonas Mohr, Tim Schwartz, Nishant Gajjar and Khansa Rekik (2022): Digital Twins for Distributed Collaborative Work in Shared Production. In Proceedings of the IEEE International Conference on Artificial Intelligence & Virtual Reality (AIVR ’22), pp. 210-212.