Virtual Telepresence Robot With VR Display for Real-Time Monitoring of Remote Locations

A pair of students who recently graduated from VR Siddhartha Engineering College (VRSEC) in Kanuru, Andhra Pradesh, India, has created a virtual telepresence robot that transmits its camera feed into a virtual reality environment - letting users perceive a remote location as if they were there.

The project was supervised by VRSEC professor VN Prudhvi Raj and demonstrates how robots could help advance virtual reality applications by capturing real-time video data and monitoring locations that are temporarily, or even permanently, inaccessible to humans.

VRSEC's Virtual Telepresence Robot (video via the TechXplore Twitter account)


Allowing Real-Time Monitoring of Remote Locations

"[We] have recently graduated in electronics and instrumentation engineering from VR Siddhartha Engineering College," said Mani Babu Gorantla and Grandhi Sathya Venkata Krishna, the two researchers from VRSEC who created the VR telepresence robot, in a statement to online news portal TechXplore. They explained that they developed the robot as their final project, taking inspiration from a feature on telepresence robots in the magazine Electronics For You.

The two explained that their objective in creating the robot was to "allow users to see things that are happening in remote locations in real-time." To achieve this, they designed a mobile robot equipped with an onboard camera and Wi-Fi connectivity. The robot's video feed can then be viewed on a smartphone, through an Internet browser, or on a virtual reality (VR) headset.
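The article does not say which streaming protocol the students used; a common way to make a Raspberry Pi camera feed viewable in an ordinary browser is MJPEG over HTTP (`multipart/x-mixed-replace`). The sketch below only shows how one JPEG frame would be wrapped for such a stream - the boundary name and function names are assumptions, not details from the project:

```python
# Hypothetical sketch: wrapping JPEG frames for an MJPEG HTTP stream,
# one common way to show a robot's camera feed in a web browser.
# The boundary string "frame" is an assumption.

BOUNDARY = b"frame"

def mjpeg_chunk(jpeg_bytes: bytes) -> bytes:
    """Format one JPEG frame as a multipart chunk for an MJPEG stream."""
    header = (
        b"--" + BOUNDARY + b"\r\n"
        b"Content-Type: image/jpeg\r\n"
        b"Content-Length: " + str(len(jpeg_bytes)).encode("ascii") + b"\r\n\r\n"
    )
    return header + jpeg_bytes + b"\r\n"
```

A server would send a response with `Content-Type: multipart/x-mixed-replace; boundary=frame` and then emit one such chunk per captured frame; the browser replaces the displayed image each time a new part arrives.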

The developers explain that whatever the robot's onboard camera captures can be transmitted directly to the user's device, even allowing them to view the VR-generated scene "as if they were actually navigating it." Furthermore, the onboard camera is designed to move in the same direction as the user's head movements.

Built on Readily Available Materials

In building the telepresence robot, the VRSEC students initially used an Arduino microcontroller and a Raspberry Pi, a programmable minicomputer. To make the camera's movement correspond to the user's own, they added accelerometers and gyroscopes - sensors also found in smartphones that determine position and orientation. The data from these sensors are interpreted to track the direction and shifts of the user's head. By transmitting that data to the Raspberry Pi, the system steers the onboard camera so that its movement matches the head movements of the user on the other end.
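The head-tracking step described above amounts to mapping orientation angles from the headset's sensors onto the angular range of the camera's pan/tilt servos. A minimal sketch of that mapping follows; the angle ranges, centering offset, and function names are assumptions for illustration, not values from the project:

```python
# Hypothetical sketch: map head-orientation angles (from a headset's
# gyroscope/accelerometer fusion) to camera pan/tilt servo angles.
# Assumes head yaw/pitch in -90..90 degrees and hobby servos that
# accept 0..180 degrees, centered at 90. Not taken from the project.

def clamp(value: float, low: float, high: float) -> float:
    """Limit a value to the given range."""
    return max(low, min(high, value))

def head_to_servo(yaw_deg: float, pitch_deg: float) -> tuple:
    """Convert head yaw/pitch to (pan, tilt) servo angles in degrees."""
    pan = clamp(yaw_deg, -90.0, 90.0) + 90.0    # 90 = camera facing forward
    tilt = clamp(pitch_deg, -90.0, 90.0) + 90.0  # 90 = camera level
    return round(pan), round(tilt)
```

On real hardware, the Raspberry Pi would feed the resulting angles to the servos (for example via a PWM driver) each time a new orientation sample arrives from the headset.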

"We decided to use two different controllers (Raspberry Pi and Arduino) because while Raspberry Pi can also control the robot's motor, it would have higher RAM," the students told TechXplore, adding that this division of labor frees the Raspberry Pi for data processing and streaming, reducing the burden on the minicomputer.
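In a two-controller setup like this, the Raspberry Pi typically sends short motor commands to the Arduino over a serial link while keeping the heavier video work to itself. The sketch below illustrates one plausible command format - the `M left right` syntax, speed range, and function names are assumptions, not the project's actual protocol:

```python
# Hypothetical sketch of a Pi-to-Arduino serial protocol for the task
# split described above: the Pi encodes compact motor commands, and the
# Arduino-side firmware would parse them. Format is an assumption.

def encode_motor_command(left: int, right: int) -> bytes:
    """Encode left/right motor speeds (-255..255) as a one-line command."""
    for speed in (left, right):
        if not -255 <= speed <= 255:
            raise ValueError("motor speed out of range")
    return f"M {left} {right}\n".encode("ascii")

def decode_motor_command(line: bytes) -> tuple:
    """Parse a command back into (left, right), as firmware would."""
    tag, left, right = line.decode("ascii").split()
    if tag != "M":
        raise ValueError("unknown command")
    return int(left), int(right)
```

On the actual robot, the encoded bytes would go out over a serial port (e.g. with pySerial's `Serial.write`), leaving the Pi's CPU largely free for capturing and streaming video.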

Later on, in their tests, they replaced the Arduino and the Raspberry Pi with a myRIO - an embedded evaluation board from National Instruments. The duo chose the NI device because it can handle the tasks of both the microcontroller and the minicomputer, and it offers higher processing capability, albeit at a higher price than the parts it replaced.


Check out more news and information on Virtual Reality on Science Times.
