IIT’s AMI Lab adopts TouchDIVER for teleoperation applications

AMI Lab is the Artificial and Mechanical Intelligence laboratory within the Italian Institute of Technology (IIT). The Lab has adopted the TouchDIVER to manipulate the hands of the ergoCub humanoid robot, leveraging full haptic feedback for a more natural teleoperation experience.

Case study

01. Goals

ergoCub is a humanoid robot endowed with embodied intelligence and designed for ergonomic interaction with humans. The robot is intended to relieve workers of repetitive and excessive physical exertion.

02. TouchDIVER adoption

AMI Lab adopted the TouchDIVER mainly for teleoperation applications. The avatar system is based on ergoCub: by wearing the device, a human operator manipulates the hands of the humanoid robot.
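To illustrate the master-side direction of this setup, the sketch below shows one plausible way to map tracked finger closures from the glove onto commands for the robot hand. It is only a conceptual Python sketch under stated assumptions: HandPose, ErgoCubHandClient, the finger layout, and the port string are hypothetical names introduced here for illustration, not the AMI Lab code or the WEART SDK.

```python
"""Conceptual sketch of the master-to-avatar direction of the teleoperation
loop. All names (HandPose, ErgoCubHandClient, the port string) are
illustrative assumptions, not the AMI Lab stack or the WEART SDK."""

from dataclasses import dataclass
from typing import Dict


@dataclass
class HandPose:
    """Normalized finger closures (0.0 = open, 1.0 = fully closed),
    as a glove-side hand-tracking algorithm might report them."""
    closures: Dict[str, float]  # e.g. {"thumb": 0.2, "index": 0.8, ...}


class ErgoCubHandClient:
    """Hypothetical client that forwards finger commands to the robot hand."""

    def __init__(self, port: str = "/ergocub/right_hand/command"):
        self.port = port  # placeholder for a real transport (e.g. middleware port)

    def send_closures(self, closures: Dict[str, float]) -> None:
        # A real implementation would publish joint targets on the transport;
        # here we only print the mapped command for illustration.
        print(f"[{self.port}] closures -> {closures}")


def operator_pose_to_robot_command(pose: HandPose) -> Dict[str, float]:
    """Clamp the tracked closures to [0, 1] and map them 1:1 onto the robot fingers."""
    return {finger: max(0.0, min(1.0, value)) for finger, value in pose.closures.items()}


if __name__ == "__main__":
    glove_pose = HandPose(closures={"thumb": 0.15, "index": 0.85, "middle": 0.80,
                                    "ring": 0.75, "pinky": 0.70})
    robot = ErgoCubHandClient()
    robot.send_closures(operator_pose_to_robot_command(glove_pose))
```

The 1:1 closure mapping is the simplest possible retargeting choice; a real system would account for differences between human and robot hand kinematics.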

"It makes the experience more real"

"The overall experience is highly satisfactory. The TD stands out due to its ease of wear and lightweight design, making it exceptionally convenient for prolonged usage. Its diverse haptic feedback contributes to a more realistic experience. What I appreciate most about the TD is the flexibility of the SDK. It enabled me to develop and implement my own hand tracking algorithm, significantly enhancing its functionality.” –

Ehsan Ranjbari, PhD Student Fellow at AMI Lab.

Application context

TouchDIVER can serve as an exceptional human-machine interface on the master side. Operators gain a natural and comfortable means of controlling the slave system while receiving valuable feedback on the robot’s interactions with the environment. Through haptic feedback, operators can feel the shape, texture, and temperature of the objects manipulated by the robot, enhancing their overall sense of presence in the application.
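To make the feedback path concrete, here is a minimal Python sketch of how robot-side contact readings could be turned into force, texture, and thermal cues on the glove. ContactReading, GloveActuator, and the scaling constants are illustrative assumptions rather than the actual WEART SDK API.

```python
"""Conceptual sketch of the avatar-to-master feedback direction: robot-side
contact readings rendered as force, texture, and thermal cues on the glove.
All names and constants here are illustrative assumptions, not the WEART SDK."""

from dataclasses import dataclass


@dataclass
class ContactReading:
    """What a fingertip sensor on the robot hand might report."""
    normal_force_n: float    # contact force in newtons
    slip_speed_mm_s: float   # relative sliding speed, used as a texture proxy
    surface_temp_c: float    # object surface temperature in degrees Celsius


class GloveActuator:
    """Hypothetical per-finger actuation endpoint on the glove."""

    def apply(self, force: float, texture: float, temperature_c: float) -> None:
        # A real SDK call would drive the thimble's actuators; we only log here.
        print(f"force={force:.2f}  texture={texture:.2f}  temp={temperature_c:.1f}C")


def render_contact(reading: ContactReading, actuator: GloveActuator,
                   max_force_n: float = 10.0, max_slip_mm_s: float = 50.0) -> None:
    """Normalize sensor values to 0..1 actuation levels plus a temperature setpoint."""
    force = min(reading.normal_force_n / max_force_n, 1.0)
    texture = min(reading.slip_speed_mm_s / max_slip_mm_s, 1.0)
    actuator.apply(force, texture, reading.surface_temp_c)


if __name__ == "__main__":
    render_contact(ContactReading(normal_force_n=3.2, slip_speed_mm_s=12.0,
                                  surface_temp_c=31.5),
                   GloveActuator())
```

The linear normalization keeps the example simple; the essential point is that each robot-side sensing channel maps to a distinct haptic channel on the operator's hand.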

About

IIT AMI

AMI Lab combines Artificial Intelligence with Mechanics to provide mankind with the next generation of Humanoid Robots, using iCub as a development platform.

TouchDIVER Pro


The TouchDIVER Pro glove features six actuation points, covering all five fingers and the palm, and delivers WEART’s unique multimodal haptic feedback, providing lifelike force feedback, texture rendering, and thermal cues alongside precise hand tracking.