Soft Robotic Hand Sees & Feels Like a Human Hand

by Priyanka Patel

Zhejiang University researchers have unveiled a robotic hand capable of “seeing” around corners, a breakthrough that could dramatically improve how robots handle delicate tasks and objects. The innovation, detailed in a paper published in Nature Communications, mimics the dexterity and sensory perception of the human hand.

A New Grip on Robotic Dexterity

The new “FlexiRay” hand uses a unique combination of flexible materials and deep learning to overcome limitations in existing robotic grippers.

Illustration comparing the FlexiRay sensor with human hand perception modalities and its application scenarios. It introduces the design inspiration, working principle, and layered structure. Credit: Nature Communications (2025). DOI: 10.1038/s41467-025-67148-y.

Current robotic hands often rely on visual-tactile sensors—essentially small cameras—to “see” and feel objects. However, these sensors typically require stiff materials to capture clear images, limiting the robot’s flexibility and making it difficult to handle fragile or irregularly shaped items. FlexiRay tackles this problem with a novel approach.

“The inspiration for this work came from the remarkable capabilities of the human hand, which combines soft, compliant skin with a complex sensory system capable of perceiving pressure, texture, and temperature simultaneously,” explained Huixu Dong, senior author of the paper, in an interview. “Our primary objective was to solve the ‘blind spot’ problem in soft sensors.”

How FlexiRay Works

FlexiRay’s design is based on the “Fin Ray Effect,” a bio-inspired structure that allows it to passively wrap around objects, much like a fish fin. The hand incorporates an internal camera and a unique “multi-mirror” optical system. Unlike traditional soft robotic fingers that block the camera’s view when bent, FlexiRay dynamically redirects the field of view, maintaining a continuous, unobstructed view of the contact surface.

“Essentially, the mechanical deformation of the finger drives the optical system to ‘look around’ the corners,” Dong said. “It transforms the structural obstruction from a hindrance into a functional mechanism for full-coverage sensing.” The finger’s “skin” is a multi-layered pad containing thermochromic (heat-sensitive, color-changing) and reflective materials.
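The “look around the corners” behavior rests on a basic law of reflection: rotating a mirror by an angle φ deflects the reflected ray by 2φ. A toy sketch of that relationship (the specific angles and the mechanical coupling are illustrative assumptions, not the paper’s actual design):

```python
def mirror_rotation_for_bend(bend_deg: float) -> float:
    """Mirror rotation needed to redirect a camera's line of sight
    by the finger's bend angle.

    Law of reflection: rotating a mirror by phi deflects the
    reflected ray by 2 * phi, so phi = bend / 2.
    """
    return bend_deg / 2.0

# Hypothetical example: a finger passively bent 40 degrees around an
# object. A mirror driven by the structure's deformation only needs
# to rotate 20 degrees to keep the contact surface in view.
print(mirror_rotation_for_bend(40.0))  # 20.0
```

The factor of two is why a small, passively driven mirror motion can compensate for a much larger structural bend.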

Human-Robot Interactions. Credit: Nature Communications (2025). DOI: 10.1038/s41467-025-67148-y

What Makes FlexiRay Different?

The researchers found that existing visual-tactile sensors struggle with flexibility, as surface deformations can compromise image quality. FlexiRay, however, can reliably collect visual and tactile information even when significantly bent or twisted. In tests, the hand achieved over 90% effective sensing coverage during large deformations, according to Dong. “The most notable achievement of our work is the paradigm shift from ‘avoiding deformation’ to ‘leveraging deformation’,” he stated.
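“Effective sensing coverage” can be read as the fraction of the contact surface that remains imageable by the internal camera during deformation. A minimal sketch of that metric (the grid size and occlusion pattern are invented for illustration):

```python
import numpy as np

def sensing_coverage(visible_mask: np.ndarray) -> float:
    """Fraction of contact-surface patches the internal camera can
    still image (True = visible, False = occluded)."""
    return float(visible_mask.mean())

# Hypothetical 10x10 grid of surface patches; 8 patches occluded
# while the finger is under a large bend.
mask = np.ones((10, 10), dtype=bool)
mask[0, :8] = False
print(sensing_coverage(mask))  # 0.92, i.e. 92% coverage
```

Under this reading, the reported >90% figure means that even at large deformations, only a small fraction of the contact surface falls outside the redirected field of view.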

Demonstration of FlexiRay in a remote tea-making task. This experiment tests how the sensor’s multimodal perception capabilities enable the robot to perform long-sequence teleoperation while providing multi-type feedback. Credit: Nature Communications (2025). DOI: 10.1038/s41467-025-67148-y.

The team believes FlexiRay has the potential to improve robotic manipulation across a range of applications, from handling delicate agricultural products to grasping irregularly shaped packages. Its softer materials also make it safer for use around humans and easily damaged objects.

Looking ahead, the researchers plan to expand the technology to create fully multi-fingered robotic hands and integrate FlexiRay with imitation learning frameworks, allowing robots to learn dexterity and safety from human demonstrations.

More information:
Yanzhe Wang et al, Flexible robotic hand harnesses large deformations for full-coverage human-like multimodal haptic perception, Nature Communications (2025). DOI: 10.1038/s41467-025-67148-y.

© 2026 Science X Network
