Video (Camera feed) Streaming for Teleoperation in Virtual Reality
Muruganantham Jaisankar
Description:
As the automation industry continues to evolve and expand, effective teleoperation that bridges the real and virtual worlds becomes increasingly important. However, operating devices through a virtual environment can be challenging. To address this, the aim is to develop a framework that supports effective teleoperation by addressing factors such as image distortion, smooth visuals, stream quality and codec choice, and minimized latency.
Teleoperation is the operation of machines from a remote location; it is used in situations where human intervention at a distant site is needed. An example of such an application is delivery robots, which aim to reduce human labor by allowing a single operator to control multiple robots from a control station equipped with dedicated operating facilities.
Integrating streaming from boards such as the Raspberry Pi with virtual reality can be challenging, and existing solutions primarily focus on real-time video streaming rather than telepresence. The focus here is to optimize the operator experience with the best possible streaming method without relying on a web interface; the objectives are as follows.
Plan
The plan for achieving this goal includes the following steps:
1. Develop connectivity with a fish-eye camera in order to capture and transmit real-world images with minimal distortion (a rough undistortion sketch in Python follows this list).
2. Analyze and compare different solutions for addressing the challenges of teleoperation in order to identify the most effective approach.
3. Finalize and propose a comprehensive framework that incorporates the best solutions identified in step 2, and which can be implemented in the automation industry to support effective teleoperation.
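For step 1, a rough Python/OpenCV sketch of fisheye distortion correction is shown below. This is only an illustration of the intended approach, not the project's final implementation: the camera matrix K and distortion coefficients D are placeholder values that would normally come from a calibration run (e.g. cv2.fisheye.calibrate), and the camera index is assumed to be 0.

# Sketch only: fisheye undistortion with OpenCV before transmission.
# K (intrinsics) and D (fisheye distortion coefficients) are placeholders;
# real values would come from a prior cv2.fisheye.calibrate() run.
import cv2
import numpy as np

K = np.array([[400.0, 0.0, 320.0],
              [0.0, 400.0, 240.0],
              [0.0, 0.0, 1.0]])
D = np.array([[-0.05], [0.01], [0.0], [0.0]])

def undistort(frame, balance=0.5):
    # The maps could be precomputed once for a fixed resolution.
    h, w = frame.shape[:2]
    new_K = cv2.fisheye.estimateNewCameraMatrixForUndistortRectify(
        K, D, (w, h), np.eye(3), balance=balance)
    map1, map2 = cv2.fisheye.initUndistortRectifyMap(
        K, D, np.eye(3), new_K, (w, h), cv2.CV_16SC2)
    return cv2.remap(frame, map1, map2, interpolation=cv2.INTER_LINEAR)

cap = cv2.VideoCapture(0)           # assumed camera index
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corrected = undistort(frame)    # frame that would be encoded and streamed
    cv2.imshow("undistorted", corrected)
    if cv2.waitKey(1) == 27:        # Esc quits the preview
        break
cap.release()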
Milestone 2 (18.10)
Research VR (1.5 hours)
1.1 VR conferencing system
A 360-degree capture of the user is taken, the background is removed based on depth, and the resulting user frame is displayed in virtual reality. A user study evaluated the 360-degree multiuser experience in different settings, including a roundtable, a stand-up meeting, and remote conferencing between two external companies. However, effective depth-based background removal requires users to be situated in a green-screen environment, which may not always be feasible.
1.2. Real-Time Object Tracking for Augmented Reality Combining Graph Cuts and Optical Flow
In this study, the authors propose a method for tracking an arbitrary object so that it can be augmented without artificial markers. Two techniques, optical flow and graph cut segmentation, are combined for object tracking.
1.2.1 Graph cut Segmentation
Graph cut is employed as a foreground/background segmentation technique: pixels are labeled by minimizing a cost (energy) function that combines a region term, measuring how well each pixel fits the foreground or background model, with a boundary term that penalizes label changes between similar neighboring pixels.
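The paper's exact energy formulation is not reproduced here. As an illustration of graph-cut-based foreground/background segmentation, the Python sketch below uses OpenCV's GrabCut, which minimizes a comparable energy given a rough bounding box around the object; the input file name and box coordinates are placeholders, and in a tracker the box would be propagated from the previous frame (e.g. via optical flow).

# Sketch only: graph-cut style foreground/background segmentation
# using OpenCV's GrabCut. "frame.png" and the bounding box are placeholders;
# in a tracker the box would come from the previous frame's object position.
import cv2
import numpy as np

frame = cv2.imread("frame.png")            # placeholder input frame
mask = np.zeros(frame.shape[:2], np.uint8)
bgd_model = np.zeros((1, 65), np.float64)  # GrabCut's internal model buffers
fgd_model = np.zeros((1, 65), np.float64)
rect = (50, 50, 200, 200)                  # rough box around the object

# Five iterations of the energy minimization, initialized from the rectangle.
cv2.grabCut(frame, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)

# Keep pixels labeled as definite or probable foreground.
fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0).astype("uint8")
segmented = frame * fg[:, :, None]
cv2.imwrite("segmented.png", segmented)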
1.3. Integrating virtual and augmented realities in an outdoor application
The paper presents integrated AR and VR modeling in an outdoor setting, where real-world objects are augmented in virtual reality.
1.3.1 Augmented reality meets Virtual reality
Wearable computers interact with the Modular Semi-Automated Forces (ModSAF) combat simulation system [8]. The authors use a highly modular architecture for better AR navigation.
Set up a VR scene in Unity with a basic Video stream (3 hours)
Here I used an MP4 video stream from the internet and viewed it on a Unity component.
Research on connectivity between the real world and virtual reality
Connectivity between the ESP32 and a web browser (7 hours)
There are three steps needed to stream video from the ESP32 (a minimal relay-server sketch in Python follows this list):
1. Set up a WebSocket on the ESP32
2. Set up a server to relay the stream (here, JSON-formatted messages were used)
3. Set up the Unity WebSocket package
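As an illustrative sketch of step 2, not the exact server used in this project, the following Python relay built on the websockets library accepts frames from the ESP32 on one port and rebroadcasts them to connected Unity clients on another. The ports, addresses, and the assumption that the ESP32 pushes binary JPEG frames are mine; the same pattern applies if the payloads are JSON-wrapped as mentioned above.

# Sketch only: minimal relay between the ESP32 camera and Unity viewers.
# Assumptions (not from the report): the ESP32 pushes binary JPEG frames to
# ws://<server>:8765 and Unity clients receive them from ws://<server>:8766.
# Uses a recent version of the websockets library (single-argument handlers).
import asyncio
import websockets

viewers = set()  # Unity clients currently connected

async def camera_handler(ws):
    # Forward every frame arriving from the ESP32 to all connected viewers.
    async for frame in ws:
        dead = set()
        for viewer in viewers:
            try:
                await viewer.send(frame)
            except websockets.ConnectionClosed:
                dead.add(viewer)
        viewers.difference_update(dead)

async def viewer_handler(ws):
    # Unity clients just register and wait for frames.
    viewers.add(ws)
    try:
        await ws.wait_closed()
    finally:
        viewers.discard(ws)

async def main():
    async with websockets.serve(camera_handler, "0.0.0.0", 8765), \
               websockets.serve(viewer_handler, "0.0.0.0", 8766):
        await asyncio.Future()  # run forever

if __name__ == "__main__":
    asyncio.run(main())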
Create connectivity with a fisheye or other camera in virtual reality (4 hours).
Below is step 3: setting up the Unity WebSocket package and the video stream.
Milestone 4 (01.11)
4.1 Discussion on the type of Raspberry Pi and camera (4 hours)
4.2 Setting up Raspberry Pi 4 and Zero 2 (2.5 hours)
To set up the Raspberry Pi 4, we need to understand the architecture of the board.
Source: https://projects.raspberrypi.org/
The same applies to the Raspberry Pi Zero 2.
Source: https://picockpit.com/raspberry-pi/everything-about-raspberry-pi-zero-2-w/
4.3 Installing and analyzing different operating systems on the Raspberry Pi (3 hours)
4.3.1 Raspberry Pi OS https://www.raspberrypi.com/software/
I installed Raspberry Pi OS using the Raspberry Pi Imager and verified the OS running on the Raspberry Pi 4 board, as shown below.
4.3.2 DietPi OS https://dietpi.com/
After flashing the OS with balenaEtcher, I can see it running on a monitor, as shown below.
4.3.3 IoTempower framework https://github.com/iotempire/iotempower
Milestone 5 (29.11)
5.1 Streaming video in a VR room (8 hours)
Learning and connecting the OpenXR project with SteamVR and the Vive Pro headset (4 hours)
The project was created as a Unity VR project and connected to a Vive Pro headset using SteamVR and OpenXR.
Video streaming in VR (2 hours)
The video streaming is simple: a 3D Quad with an Unlit/Texture material serves as the screen for a Video Player component.
Learning and creating a VR room (2 hours)
The VR room is made with Cube and Plane primitives and some office assets.
Milestone 6 (13.12)
6.1 Connectivity between Pi and Unity (6-8 hours)
This part of the project is still under development, with the goal of supporting efficient teleoperation; a first sketch of the Pi-to-Unity streaming path is outlined below.
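As a starting point, not the final design, the same pattern as the ESP32 setup could be reused on the Raspberry Pi: capture frames, JPEG-encode them, and push them over a WebSocket to the relay or directly to a Unity client. The server address, camera index, resolution, JPEG quality, and frame rate in the sketch below are assumptions.

# Sketch only: Raspberry Pi side of the Pi-to-Unity link.
# Captures frames with OpenCV, JPEG-encodes them, and pushes them over a
# WebSocket. The endpoint, resolution and frame rate are assumptions.
import asyncio
import cv2
import websockets

SERVER = "ws://192.168.0.10:8765"   # assumed relay/Unity endpoint

async def stream():
    cap = cv2.VideoCapture(0)       # Pi camera exposed as /dev/video0
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
    async with websockets.connect(SERVER) as ws:
        while True:
            grabbed, frame = cap.read()
            if not grabbed:
                break
            encoded, jpeg = cv2.imencode(".jpg", frame,
                                         [cv2.IMWRITE_JPEG_QUALITY, 70])
            if encoded:
                await ws.send(jpeg.tobytes())
            await asyncio.sleep(1 / 30)  # ~30 fps target
    cap.release()

if __name__ == "__main__":
    asyncio.run(stream())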