Using smart glasses to collect floating waste from a river
Operating a crane from anywhere to remotely collect floating waste
The essence
To tackle the river pollution challenge in Belgium, DEME installed a plastic collector platform in the middle of the Scheldt to collect floating waste from the water. As the platform is only accessible by boat, the crane on it needed to be operated from the shore. We created an augmented reality solution, combined with a 3D camera, that streams stereoscopic 3D video in real time. This allows the crane operator to see the crane and the river as if he were standing on the platform.
The challenge
Operating a crane from anywhere
De Vlaamse Waterweg and DEME Environmental Contractors decided to join forces to combat river pollution. They installed a plastic collector platform in the middle of the Scheldt, where a crane transfers the collected waste into a container. Since the platform can only be reached by boat, DEME wanted to investigate a way to control the crane without being physically nearby. The project also provides valuable insight for their future projects, giving them a clear view of how to control any machinery from a safe place.
The solution
By leveraging a 3D stereoscopic camera, the app visualizes the actions of the crane and its environment in a true 3D livestream, projected on a holographic hemisphere. Being an augmented reality device, the HoloLens allows for pin-sharp stereoscopic projection of the 3D camera image while still giving the operator a view of his physical remote control and real-life surroundings.
The HoloLens is also a fully standalone, untethered headset: it can be used from any office or any location in the field, as long as good network connectivity is available, and no additional laptop or desktop computer is needed. Running this functionality on a HoloLens proves to be a very convenient and portable solution.
What makes the project truly robust is a secondary "controller" mechanism that keeps track of the latency. Since there is no room for error, the end-to-end latency of the stream (the time between capturing a video frame, streaming it, and visualizing it on the HoloLens) was critical and a determining factor in the project's go/no-go decision. By using industry-standard streaming protocols and a carefully chosen combination of video codec, container, and transport parameters, we achieved a latency of around 120 milliseconds! Generally, anything below 200 milliseconds is considered real-time. Should the stream slow down or be interrupted, for example due to network issues, a warning and, if necessary, an alarm appear on the screen, instructing the operator to pause until the latency returns to normal values.
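As an illustration of how such a latency watchdog could work, the minimal Python sketch below compares each frame's capture timestamp against the time it is about to be displayed and maps the result to a warning or alarm. The threshold values and names are assumptions made for the sake of the example, not the project's actual parameters.

```python
import time

# Hypothetical thresholds, chosen for illustration only.
WARNING_MS = 200.0  # above the commonly cited "real-time" boundary
ALARM_MS = 400.0    # latency clearly too high: the operator should pause

def check_latency(capture_timestamp_ms: float) -> str:
    """Return the latency status for a frame about to be displayed.

    `capture_timestamp_ms` is the wall-clock time (in milliseconds) at which
    the frame was captured on the platform; both clocks are assumed to be
    synchronized, e.g. via NTP.
    """
    latency_ms = time.time() * 1000.0 - capture_timestamp_ms
    if latency_ms >= ALARM_MS:
        return "ALARM"    # e.g. overlay a blocking alarm in the headset
    if latency_ms >= WARNING_MS:
        return "WARNING"  # e.g. show a warning banner to the operator
    return "OK"
```

In a setup like this, the check would run on the headset for every incoming frame, so a degrading connection becomes visible to the operator almost immediately.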
Our approach
Pin-sharp stereoscopic projection in real-time
After a few iterations, we opted for more control by pairing a minicomputer (an NVIDIA Jetson Nano) with the ZED 3D camera from Stereolabs. This was clearly the way to go: the solution is now used on a daily basis.
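For a rough idea of what the capture side of such a setup could look like, the sketch below uses Stereolabs' Python SDK (pyzed) to grab synchronized side-by-side stereo frames on the Jetson. The resolution, frame rate, and the hand-off to an encoder are illustrative assumptions, not the project's actual configuration.

```python
import pyzed.sl as sl

def main():
    zed = sl.Camera()

    # Illustrative settings: 720p per eye at 60 fps.
    init_params = sl.InitParameters()
    init_params.camera_resolution = sl.RESOLUTION.HD720
    init_params.camera_fps = 60

    if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
        raise RuntimeError("Failed to open the ZED camera")

    runtime = sl.RuntimeParameters()
    frame = sl.Mat()

    while True:
        if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
            # Retrieve both eyes as a single side-by-side image, which keeps
            # the left and right views perfectly in sync for the headset.
            zed.retrieve_image(frame, sl.VIEW.SIDE_BY_SIDE)
            data = frame.get_data()  # numpy array, H x 2W x 4 (BGRA)
            # From here the frame would be handed to a hardware encoder and
            # a low-latency transport (e.g. H.264 over RTP/UDP) -- omitted.

if __name__ == "__main__":
    main()
```

Feeding these frames into the Jetson's hardware video encoder and a low-latency transport is what keeps the end-to-end delay within the range described above; the exact pipeline used in the project is not detailed here.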
