
When a 3D machine vision system used in ITER for remote handling helps minimise blind spots in mobile machinery: Technology Transfer Award 2022

One of the key aspects for the success of a fusion reactor is to maintain high plant availability. Remote Handling (RH) will help engineers transport, maintain and replace heavy ITER components exposed to radiation in the machine core. These operations usually require millimetre-level accuracy, yet they regularly face camera occlusions and poor image quality. In this context, Tampere University, in collaboration with F4E and VTT, developed an innovative 3D Node target tracking system. The 3D Node is a camera-based system designed to optically detect the position of marked targets in remote handling operations. It consists of purpose-built optical markers designed to withstand the stringent environmental conditions of the ITER divertor, methods to consistently calibrate the different components of the camera system, and tailored software algorithms for marker detection. It offers millimetre-precision guidance to robotic manipulators performing tool insertion, object manipulation and other remote handling operations.
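For illustration only, the sketch below shows the kind of pipeline such a camera-based marker tracking system typically involves: detecting fiducial markers in a camera frame and estimating their 3D pose relative to a calibrated camera. It uses OpenCV's ArUco module and solvePnP as generic stand-ins; the marker dictionary, camera intrinsics and marker size are assumptions for the example, not details of the actual 3D Node system.

```python
# Illustrative sketch only: detect fiducial markers in a frame and estimate
# their 3D pose relative to the camera. OpenCV ArUco markers stand in for the
# purpose-built 3D Node markers; intrinsics and marker size are placeholders.
# Assumes a recent OpenCV (>= 4.7) with the ArucoDetector API.
import cv2
import numpy as np

# Assumed camera intrinsics from a prior calibration (placeholder values).
camera_matrix = np.array([[1000.0,    0.0, 640.0],
                          [   0.0, 1000.0, 360.0],
                          [   0.0,    0.0,   1.0]])
dist_coeffs = np.zeros(5)          # assume negligible lens distortion
MARKER_SIZE_M = 0.05               # assumed 5 cm square markers

# 3D corner coordinates of a square marker in its own frame (metres),
# ordered to match ArUco corner ordering (TL, TR, BR, BL).
half = MARKER_SIZE_M / 2.0
object_points = np.array([[-half,  half, 0.0],
                          [ half,  half, 0.0],
                          [ half, -half, 0.0],
                          [-half, -half, 0.0]], dtype=np.float32)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))

def track_markers(frame):
    """Return {marker_id: (rvec, tvec)} poses expressed in the camera frame."""
    corners, ids, _ = detector.detectMarkers(frame)
    poses = {}
    if ids is None:
        return poses
    for marker_corners, marker_id in zip(corners, ids.flatten()):
        ok, rvec, tvec = cv2.solvePnP(object_points,
                                      marker_corners.reshape(4, 2),
                                      camera_matrix, dist_coeffs,
                                      flags=cv2.SOLVEPNP_IPPE_SQUARE)
        if ok:
            poses[int(marker_id)] = (rvec, tvec)
    return poses
```

In a real deployment the poses returned here would still have to be mapped into the manipulator's coordinate frame, which is where the calibration methods mentioned above come in.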

"In order to maximise ITER plant availability, remote handling will perform maintenance of heavy ITER components with a high degree of accuracy, while coping with very limited visibility to localise and grasp components."

Emilio Ruiz, F4E project manager for Remote Handling Control Systems.

For example, the 3D Node system has been successfully tested on a 1:1 replica of the divertor Cassette Locking System (CLS) for the accurate manipulation of the pin tools and jack tools used in CLS operations for cassette locking and unlocking, and for cassette compression, respectively.

Easier calibration of 3D camera systems to tackle the challenges of visibility and blind spots in the mobile machinery industry – The reach stacker case

The heavy mobile work machine industry produces vehicles used in earthmoving, harbor logistics, mining, forestry and other sectors. These vehicles are typically large and commonly suffer from visibility issues: while visibility is essential for efficient and safe operation, mobile machines are big and bulky, creating numerous blind spots and difficult-to-see areas. The issue is taken to the extreme when operators are moved out of the cabin and into remote operation stations, potentially kilometers away from the machine's actual location.

To tackle this challenge, Tampere University launched a project to adapt the technology developed in prior research projects and to identify the best market access strategy. €490,000 was co-invested by the public innovation agency Business Finland (70%) and the university (30%) in a “Research to Business” initiative, through which relevant contacts with mobile machinery OEMs have been established. The scientists also initiated several pilot programs in which the technology was deployed on real equipment to validate its technical feasibility and confirm the interest of prospective customers.

In particular, the team implemented and tested a visualization system on a reach stacker, a vehicle manufactured by Cargotec and designed for transporting and stacking shipping containers in harbor logistics. The massive containers carried by stackers cause significant visual obstructions. In this pilot, the hand-eye calibration system developed with F4E was applied to calibrate a 3D camera system attached to the end of the boom and to detect optical markers for live position tracking. This set-up successfully allowed the operator to see, projected onto the windshield of the vehicle, what stood behind the container blocking the view.
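As a rough illustration of what hand-eye calibration does here, the sketch below estimates the fixed camera-to-boom transform from paired boom and marker poses, using OpenCV's generic calibrateHandEye solver as a stand-in. The solver choice, data layout and variable names are assumptions for the example, not the procedure actually developed with F4E.

```python
# Illustrative sketch only: estimate the fixed camera-to-boom transform from
# paired pose observations, using OpenCV's generic hand-eye solver as a
# stand-in for the calibration method developed with F4E. Data is placeholder.
import cv2
import numpy as np

def calibrate_hand_eye(boom_poses, target_poses):
    """boom_poses:   list of (R, t) of the boom tip expressed in the vehicle frame.
    target_poses:    list of (R, t) of a calibration target expressed in the
                     camera frame, recorded at the same instants.
    Returns (R, t) of the camera expressed in the boom-tip frame."""
    R_boom, t_boom = zip(*boom_poses)
    R_cam, t_cam = zip(*target_poses)
    R_cam2boom, t_cam2boom = cv2.calibrateHandEye(
        list(R_boom), list(t_boom), list(R_cam), list(t_cam),
        method=cv2.CALIB_HAND_EYE_TSAI)
    return R_cam2boom, t_cam2boom

def marker_in_boom_frame(R_cam2boom, t_cam2boom, rvec, tvec):
    """Map a marker pose seen by the camera into the boom-tip frame,
    e.g. so it can be rendered in the right place on the windshield display."""
    R_marker, _ = cv2.Rodrigues(rvec)              # rotation vector -> matrix
    R = R_cam2boom @ R_marker
    t = R_cam2boom @ np.asarray(tvec).reshape(3, 1) + t_cam2boom
    return R, t
```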

Massive market potential and a spin-off company expected in Q4 2022

The global mobile machinery industry is massive, spanning dozens of sectors, machine sizes and form factors, with different drivers motivating unmanned operation. In total, around 3 million new work machines are produced annually, all of which can benefit from some level of vision support. Work machines operate in environments that are usually far less structured than ITER. However, when the type of operation is known beforehand, methods developed for remote handling can be applied. Examples of such common tasks include interaction between vehicles (such as a combine harvester offloading harvested crops into a tractor trailer), multi-purpose vehicles changing tools between tasks (like an excavator switching from a digging to a drilling tool) or vehicles backing up to be attached to a trailer. These tasks do not differ significantly from accurately determining the location of a divertor cooling pipe flange so that a robotic manipulator can grab it.

"It offers millimeter precision guidance to robotic manipulators to perform tool insertion, object manipulation and other remote handling operations."

Olli Suominen, researcher at Tampere University.

This massive market potential, and the opportunity to build a sustainable business around it, is fostering the creation of a spin-off company. The spin-off is expected at the end of 2022, with the ambition of creating highly qualified jobs, first in the company itself and then, through a knock-on effect, along the value chain. Some use cases can already be addressed with the existing solutions within 6 to 12 months of spinning out, while more complex ones will become reality over time. In terms of impact on society and industry, better vision systems will bring higher efficiency and increased safety to the day-to-day operations of the company's customers (OEMs and end users).

In recognition of the effort invested and the results achieved in taking a fusion innovation into a new application outside the ITER project, Tampere University received the Technology Transfer Award 2022.
