• Fingertip sensitivity for robots

    From ScienceDaily@1:317/3 to All on Thu Feb 24 21:30:40 2022
    Fingertip sensitivity for robots

    Date:
    February 24, 2022
    Source:
    Max Planck Institute for Intelligent Systems
    Summary:
    Striving to improve touch sensing in robotics, scientists
    developed a thumb-shaped sensor with a camera hidden inside
    and trained a deep neural network to infer its haptic contact
    information. When something touches the finger, the system
    constructs a three-dimensional force map from the visible
    deformations of its flexible outer shell. This invention significantly
    improves a robot finger's haptic perception, bringing it ever closer
    to the sense of touch of human skin.



    FULL STORY ==========================================================================
    In a paper published on February 23, 2022 in Nature Machine Intelligence,
    a team of scientists at the Max Planck Institute for Intelligent Systems
    (MPI-IS) introduces a robust soft haptic sensor named "Insight" that uses
    computer vision and a deep neural network to accurately estimate where
    objects come into contact with the sensor and how large the applied
    forces are. The research project is a significant step toward robots
    being able to feel their environment as accurately as humans and animals.
    Like its natural counterpart, the fingertip sensor is highly sensitive,
    robust, and high-resolution.


    ==========================================================================
    The thumb-shaped sensor is made of a soft shell built around a
    lightweight stiff skeleton. This skeleton holds up the structure much
    like bones stabilize the soft finger tissue. The shell is made from an
    elastomer mixed with dark but reflective aluminum flakes, resulting in
    an opaque greyish color that prevents any external light from finding
    its way in. Hidden inside this finger-sized cap is a tiny 160-degree
    fish-eye camera that records colorful images illuminated by a ring
    of LEDs.

    When an object touches the sensor's shell, the appearance of the color
    pattern inside the sensor changes. The camera records images many
    times per second and feeds a deep neural network with this data. The
    algorithm detects even the smallest change in light in each pixel.
    Within a fraction of a second, the trained machine-learning model can
    map out where exactly the finger is contacting an object, determine how
    strong the forces are, and indicate the force direction. The model
    infers what scientists call a force map: it provides a force vector
    for every point on the three-dimensional fingertip.
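
    As a concrete illustration of this inference step, here is a minimal,
    hypothetical sketch in Python/PyTorch. It is not the authors' published
    architecture: the class ForceMapNet, the layer sizes, the point count
    N_POINTS, and the image resolution are all illustrative assumptions.

        # Minimal sketch (not the authors' published code) of the inference
        # step: a small CNN maps one internal camera image to a 3D force
        # vector at each of N sample points on the fingertip surface.
        import torch
        import torch.nn as nn

        N_POINTS = 1024  # hypothetical number of surface sample points

        class ForceMapNet(nn.Module):
            def __init__(self, n_points: int = N_POINTS):
                super().__init__()
                self.n_points = n_points
                # Compact CNN encoder for the internal fisheye camera image.
                self.encoder = nn.Sequential(
                    nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
                    nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
                    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                )
                # Regression head: (Fx, Fy, Fz) for every surface point.
                self.head = nn.Linear(64, n_points * 3)

            def forward(self, image: torch.Tensor) -> torch.Tensor:
                # image: (batch, 3, H, W) -> force map: (batch, n_points, 3)
                feats = self.encoder(image)
                return self.head(feats).view(-1, self.n_points, 3)

        model = ForceMapNet()
        frame = torch.rand(1, 3, 240, 320)  # one camera frame (toy size)
        force_map = model(frame)            # per-point 3D force vectors
        print(force_map.shape)              # torch.Size([1, 1024, 3])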

    "We achieved this excellent sensing performance through the innovative mechanical design of the shell, the tailored imaging system inside,
    automatic data collection, and cutting-edge deep learning," says Georg
    Martius, Max Planck Research Group Leader at MPI-IS, where he heads the Autonomous Learning Group. His Ph.D. student Huanbo Sun adds: "Our unique hybrid structure of a soft shell enclosing a stiff skeleton ensures high sensitivity and robustness.

    Our camera can detect even the slightest deformations of the surface from
    one single image." Indeed, while testing the sensor, the researchers
    realized it was sensitive enough to feel its own orientation relative
    to gravity.

    The third member of the team is Katherine J. Kuchenbecker, the Director
    of the Haptic Intelligence Department at MPI-IS. She confirms that the
    new sensor will be useful: "Previous soft haptic sensors had only small
    sensing areas, were delicate and difficult to make, and often could not
    feel forces parallel to the skin, which are essential for robotic
    manipulation like holding a glass of water or sliding a coin along
    a table."

    But how does such a sensor learn? Huanbo Sun designed a testbed to
    generate the training data needed for the machine-learning model to
    understand the correlation between the change in raw image pixels and
    the forces applied. The testbed probes the sensor all around its surface
    and records the true contact force vector together with the camera
    image inside the sensor. In this way, about 200,000 measurements were
    generated. It took nearly three weeks to collect the data and one more
    day to train the machine-learning model.
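
    The pairing of ground-truth forces with camera images can be sketched
    as follows. This is a hedged, simulated stand-in, not the testbed's
    real software: probe_and_measure and collect_dataset are hypothetical
    names, and both the image and the force vector are faked here because
    the article does not describe the hardware interface.

        # Sketch of the automated data collection (all functions are
        # hypothetical stand-ins; on the real rig the camera frame comes
        # from the sensor's internal camera and the ground-truth force
        # from a calibrated force/torque sensor).
        import numpy as np

        rng = np.random.default_rng(0)

        def probe_and_measure():
            """Stub for one testbed probe: returns a simulated camera
            frame and the simulated 3D contact force applied by the
            probe at that pose."""
            image = rng.random((240, 320, 3), dtype=np.float32)  # camera frame
            force = rng.normal(size=3).astype(np.float32)        # (Fx, Fy, Fz)
            return image, force

        def collect_dataset(n_samples):
            images = np.empty((n_samples, 240, 320, 3), dtype=np.float32)
            forces = np.empty((n_samples, 3), dtype=np.float32)
            for i in range(n_samples):
                images[i], forces[i] = probe_and_measure()
            return images, forces

        # The article reports about 200,000 such measurements; a tiny run:
        imgs, f = collect_dataset(100)
        print(imgs.shape, f.shape)  # (100, 240, 320, 3) (100, 3)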

    Surviving this long experiment with so many different contact forces
    helped prove the robustness of Insight's mechanical design, and tests
    with a larger probe showed how well the sensing system generalizes.

    Another special feature of the thumb-shaped sensor is that it possesses
    a nail-shaped zone with a thinner elastomer layer. This tactile fovea
    is designed to detect even tiny forces and detailed object shapes. For
    this super-sensitive zone, the scientists chose an elastomer thickness
    of 1.2 mm rather than the 4 mm they used on the rest of the finger
    sensor.

    "The hardware and software design we present in our work can be
    transferred to a wide variety of robot parts with different shapes and precision requirements.

    The machine-learning architecture, training, and inference process are
    all general and can be applied to many other sensor designs," Huanbo
    Sun concludes.

    Video: https://youtu.be/lTAJwcZopAA

    ==========================================================================
    Story Source: Materials provided by Max Planck Institute for
    Intelligent Systems. Note: Content may be edited for style and length.


    ==========================================================================
    Journal Reference:
    1. Huanbo Sun, Katherine J. Kuchenbecker, Georg Martius. A soft
       thumb-sized vision-based sensor with accurate all-round force
       perception. Nature Machine Intelligence, 2022; 4 (2): 135. DOI:
       10.1038/s42256-021-00439-3
    ==========================================================================

    Link to news story: https://www.sciencedaily.com/releases/2022/02/220224112625.htm

    --- up 11 weeks, 5 days, 7 hours, 13 minutes
    * Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1:317/3)