BASS: Boundary-Aware Superpixel Segmentation

Antonio Rubio, Longlong Yu, Edgar Simo-Serra, Francesc Moreno-Noguer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

15 Citations (Scopus)


We propose a new superpixel algorithm that exploits the boundary information of an image, since objects in images can generally be described by their boundaries. Our approach first estimates the boundaries and uses them to place superpixel seeds in the areas where boundaries are denser. Afterwards, we minimize an energy function to expand the seeds into full superpixels. In addition to standard terms such as color consistency and compactness, we propose using the geodesic distance, which concentrates small superpixels in regions of the image with more information while letting larger superpixels cover more homogeneous regions. By improving both the initialization (using boundaries) and the coherency of the superpixels (using geodesic distances), we are able to maintain the coherency of the image structure with fewer superpixels than other approaches. We show that the resulting algorithm yields smaller Variation of Information metrics on seven different datasets while maintaining Undersegmentation Error values similar to those of state-of-the-art methods.
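The boundary-driven seeding idea from the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: it stands in a plain gradient magnitude for the paper's boundary estimator, and the function name `place_seeds` and the sampling strategy are assumptions made for illustration. Seeds are drawn with probability proportional to local boundary strength, so they cluster where boundaries are dense.

```python
import numpy as np

def place_seeds(image, n_seeds, rng=None):
    """Boundary-density-driven seed placement (illustrative sketch only).

    Approximates the seeding idea in the abstract: estimate boundaries,
    then draw more seeds where boundary responses are dense. A simple
    finite-difference gradient replaces the paper's boundary estimator.
    """
    rng = np.random.default_rng(rng)
    gray = image.mean(axis=2) if image.ndim == 3 else image
    gy, gx = np.gradient(gray)            # crude boundary estimate
    density = np.hypot(gx, gy) + 1e-6     # avoid a degenerate all-zero map
    prob = (density / density.sum()).ravel()
    idx = rng.choice(prob.size, size=n_seeds, replace=False, p=prob)
    rows, cols = np.unravel_index(idx, gray.shape)
    return np.stack([rows, cols], axis=1)  # (n_seeds, 2) pixel coordinates

# Usage: on a synthetic image with one bright square, seeds concentrate
# along the square's edges, where the gradient (boundary) response lives.
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0
seeds = place_seeds(img, n_seeds=20, rng=0)
```

In the full method these seeds would then be grown into superpixels by minimizing an energy with color-consistency, compactness, and geodesic-distance terms; that optimization is beyond the scope of this sketch.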

Original language: English
Title of host publication: 2016 23rd International Conference on Pattern Recognition, ICPR 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Electronic): 9781509048472
Publication status: Published - 2016 Jan 1
Event: 23rd International Conference on Pattern Recognition, ICPR 2016 - Cancun, Mexico
Duration: 2016 Dec 4 - 2016 Dec 8

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651



ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition


