AI/ML

Generation of Tactile Data from 3D Vision and Target Robotic Grasps.



IEEE Trans Haptics. 2020 Jul 24. Epub ahead of print.

Authors: Zapata-Impata BS, Gil P, Mezouar Y, Torres F

Abstract
Tactile perception is a rich source of information for robotic grasping: it allows a robot to identify a grasped object and assess the stability of a grasp, among other things. However, the tactile sensor must come into contact with the target object in order to produce readings, so tactile data can only be obtained once real contact is made. We propose to overcome this restriction with a method that models the behaviour of a tactile sensor using 3D vision and grasp information as the stimulus. Our system regresses the quantified tactile response that would be experienced if a given grasp were performed on the object. We experiment with 16 items and 4 tactile data modalities to show that our proposal learns this task with low error.
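
To make the idea concrete, below is a minimal sketch of a vision-to-touch regressor of the kind the abstract describes: a point-cloud encoder combined with a grasp-parameter vector feeding an MLP that predicts a fixed-size tactile response. The architecture, layer sizes, taxel count and grasp encoding are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: regress a tactile response vector from a partial 3D
# point cloud of the object plus a candidate grasp configuration.
# All names and dimensions here are assumptions for illustration only.
import torch
import torch.nn as nn


class TactileRegressor(nn.Module):
    def __init__(self, num_taxels: int = 24, grasp_dim: int = 7):
        super().__init__()
        # Shared per-point MLP (PointNet-style) over (x, y, z) coordinates.
        self.point_encoder = nn.Sequential(
            nn.Conv1d(3, 64, kernel_size=1), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=1), nn.ReLU(),
            nn.Conv1d(128, 256, kernel_size=1), nn.ReLU(),
        )
        # Regression head over the global point feature concatenated with the
        # grasp parameters (assumed here: 6-DoF pose + gripper opening = 7).
        self.head = nn.Sequential(
            nn.Linear(256 + grasp_dim, 128), nn.ReLU(),
            nn.Linear(128, num_taxels),
        )

    def forward(self, points: torch.Tensor, grasp: torch.Tensor) -> torch.Tensor:
        # points: (batch, 3, num_points); grasp: (batch, grasp_dim)
        per_point = self.point_encoder(points)      # (batch, 256, num_points)
        global_feat = per_point.max(dim=2).values   # symmetric max-pooling
        return self.head(torch.cat([global_feat, grasp], dim=1))


if __name__ == "__main__":
    model = TactileRegressor()
    cloud = torch.randn(2, 3, 1024)  # two example partial point clouds
    grasp = torch.randn(2, 7)        # two example grasp configurations
    pred = model(cloud, grasp)       # predicted tactile readings, shape (2, 24)
    print(pred.shape)
```

Trained with a standard regression loss (e.g. mean squared error) against recorded sensor readings, such a model would let a planner score candidate grasps by their expected tactile outcome before any contact is made, which is the restriction the paper aims to lift.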

PMID: 32746383 [PubMed – as supplied by publisher]
