Remote Operations of IRB140 with Oculus Rift
This article proposes a setup that projects the presence of the user to another location. It gives a proof of concept of a new way of doing remote operations, giving the operator a better feel of presence and thus enabling safer and better decisions. The main goal is to link the movement of the human head to the movement of a 6-degrees-of-freedom robotic arm and to stream stereo video from the robotic arm back to a stereoscopic screen mounted on the operator's head. Additional objectives are to include streaming of sound, improve the video quality, study inverse kinematics, and improve how well the system performs as a whole on the ABB RobotStudio platform.

The virtual reality hardware chosen was the Oculus Rift. Software was written to extract data from the Oculus Rift and pass it on to the robotic arm. This software includes logging of data, a TCP/IP connection, a simple GUI, and multiple threads to handle the data flow, written in both C++ and RAPID. Open-source software handling a camera module, the Ovrvision, was studied and modified in order to get a camera feed from the robot to the Oculus Rift screen. Sound streaming was added using the Blue Yeti stereo condenser microphone. The IRB140 was chosen as the robot manipulator, as this was the robot arm available at the university.

In addition to the hardware and software testing and development, alternative ways of improving the system are also suggested. This includes studies on how to do inverse kinematics to translate the Oculus position into joint angles of the robot. To improve the system, adjustments of system parameters were made and the results were evaluated through both physical testing and analysis of the logged data. Some of the testing was done only in the simulator, as tests showed that this correlated well with the physical robot.
Future estimation of the head position was included in order to reduce overall latency. Predicting 40 ms into the future gave a result that balanced the need for low latency against accuracy. Latency was also reduced by changing system parameters and using other settings in the move algorithm; in the final iteration of testing and improvement, the worst-case total system latency was about 0.3 seconds, the average case about 0.2-0.25 seconds, and the best case about 0.1 seconds.