Color Image-Guided Boundary-Inconsistent Region Refinement for Stereo Matching

Jianbo Jiao, Ronggang Wang*, Wenmin Wang, Dagang Li, Wen Gao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

Cost computation, cost aggregation, disparity optimization, and disparity refinement are the four main steps of stereo matching. While the first three steps have been widely investigated, little effort has been devoted to disparity refinement. In this paper, we propose a color image-guided disparity refinement method to further remove boundary-inconsistent regions from the disparity map. First, the origins of boundary-inconsistent regions are analyzed. Then, these regions are detected with the proposed hybrid-superpixel-based strategy. Finally, the detected boundary-inconsistent regions are refined by a modified weighted median filtering method. Experimental results under various stereo matching conditions validate the effectiveness of the proposed method. Furthermore, depth maps obtained by active depth acquisition devices such as Kinect can also be well refined with the proposed method.
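The core refinement step described above, weighted median filtering guided by the color image, can be sketched as follows. This is a generic illustration of the technique, not the paper's exact formulation: for each detected pixel, neighbors within a window vote with their disparities, each vote weighted by color similarity to the center pixel (a Gaussian on color distance, with an assumed parameter `sigma_c`), and the refined disparity is the weighted median of those votes.

```python
import numpy as np

def color_guided_weighted_median(disparity, color, mask, radius=3, sigma_c=10.0):
    """Refine masked disparity pixels with a color-guided weighted median.

    disparity : (H, W) float array of disparities
    color     : (H, W, 3) color image used as guidance
    mask      : (H, W) bool array marking boundary-inconsistent pixels
    radius    : half-size of the filtering window
    sigma_c   : bandwidth of the Gaussian color-similarity weight
    """
    h, w = disparity.shape
    out = disparity.astype(np.float64).copy()
    guide = color.astype(np.float64)
    ys, xs = np.nonzero(mask)
    for y, x in zip(ys, xs):
        # Clip the window to the image borders.
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        patch = out[y0:y1, x0:x1].ravel()
        # Weight each neighbor by its color similarity to the center pixel,
        # so votes from the other side of an object boundary are suppressed.
        cdiff = guide[y0:y1, x0:x1] - guide[y, x]
        wts = np.exp(-np.sum(cdiff ** 2, axis=-1).ravel() / (2.0 * sigma_c ** 2))
        # Weighted median: sort disparities, pick the value where the
        # cumulative weight first reaches half of the total weight.
        order = np.argsort(patch)
        cum = np.cumsum(wts[order])
        idx = np.searchsorted(cum, 0.5 * cum[-1])
        out[y, x] = patch[order][idx]
    return out
```

For example, a pixel near an object boundary that wrongly carries the background disparity is pulled back to the foreground value, because same-colored (foreground) neighbors dominate the weighted vote while differently-colored (background) neighbors receive near-zero weight.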

Original language: English
Article number: 7368917
Pages (from-to): 1155-1159
Number of pages: 5
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Volume: 27
Issue number: 5
Publication status: Published - May 2017

Bibliographical note

Funding Information:
This work was supported in part by the National Science Foundation of China under Grant 61370115 and Grant 61402018, in part by the China 863 Project under Grant 2015AA015905, and in part by Shenzhen Peacock Plan under Grant JCYJ20150331100658943.

Publisher Copyright:
© 1991-2012 IEEE.

Keywords

  • Boundary
  • disparity refinement
  • Kinect
  • stereo matching

ASJC Scopus subject areas

  • Media Technology
  • Electrical and Electronic Engineering
