Tactile Sketch Saliency

Jianbo Jiao, Ying Cao, Manfred Lau, Rynson W. H. Lau

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


In this paper, we aim to understand the functionality of 2D sketches by predicting how humans would interact with the objects depicted by sketches in real life. Given a 2D sketch, we learn to predict a tactile saliency map for it, which represents where humans would grasp, press, or touch the object depicted by the sketch. We hypothesize that understanding the 3D structure and category of the sketched object would help such tactile saliency reasoning. We thus propose to jointly predict the tactile saliency, depth map and semantic category of a sketch in an end-to-end learning-based framework. To train our model, we propose to synthesize training data by leveraging a collection of 3D shapes with 3D tactile saliency information. Experiments show that our model can predict accurate and plausible tactile saliency maps for both synthetic and real sketches. In addition, we demonstrate that our predicted tactile saliency is beneficial to sketch recognition and sketch-based 3D shape retrieval, and enables us to establish part-based functional correspondences among sketches.
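The abstract describes a multi-task setup: one shared representation of the sketch feeding three prediction heads (tactile saliency, depth, category), trained jointly. The paper's actual architecture is not given here; the following is a minimal pure-Python sketch of that joint-prediction idea, with all weights, shapes, and the loss weighting being illustrative assumptions.

```python
import math

# Toy stand-in for an input sketch: a flattened 4-"pixel" binary image.
# (Hypothetical; the real model would consume a full raster sketch.)
SKETCH = [1.0, 0.0, 1.0, 1.0]

def shared_encoder(x):
    """Hypothetical shared backbone: a fixed linear projection to 2 features."""
    w = [[0.5, -0.2, 0.1, 0.3],
         [0.1, 0.4, -0.3, 0.2]]
    return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

def saliency_head(f):
    # Per-pixel tactile saliency in [0, 1] (sigmoid of a linear map).
    w = [[0.3, -0.1], [0.2, 0.5], [-0.4, 0.1], [0.6, 0.2]]
    return [1.0 / (1.0 + math.exp(-sum(wi * fi for wi, fi in zip(row, f))))
            for row in w]

def depth_head(f):
    # Per-pixel depth (unbounded linear map).
    w = [[0.2, 0.1], [-0.3, 0.4], [0.5, -0.2], [0.1, 0.3]]
    return [sum(wi * fi for wi, fi in zip(row, f)) for row in w]

def category_head(f):
    # Class logits -> softmax probabilities over 3 hypothetical categories.
    w = [[0.4, 0.1], [-0.2, 0.3], [0.1, -0.5]]
    logits = [sum(wi * fi for wi, fi in zip(row, f)) for row in w]
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def joint_loss(x, sal_gt, depth_gt, cls_gt):
    """End-to-end joint objective: sum of the three per-task losses."""
    f = shared_encoder(x)
    sal, dep, cls = saliency_head(f), depth_head(f), category_head(f)
    l_sal = sum((p - t) ** 2 for p, t in zip(sal, sal_gt)) / len(sal)   # MSE
    l_dep = sum((p - t) ** 2 for p, t in zip(dep, depth_gt)) / len(dep) # MSE
    l_cls = -math.log(cls[cls_gt])  # cross-entropy at the true class index
    return l_sal + l_dep + l_cls

loss = joint_loss(SKETCH, [1, 0, 0, 1], [0.5, 0.2, 0.9, 0.4], 0)
print(loss > 0.0)
```

During training, gradients from all three losses would flow back into the shared encoder, which is how the depth and category tasks can inform the tactile saliency prediction.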
Original language: English
Title of host publication: MM '20
Publication status: Published - Oct 2020
