Unsupervised video object segmentation by supertrajectory labeling

Masahiro Masuda, Yoshihiko Mochizuki, Hiroshi Ishikawa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

We propose a novel approach to unsupervised video segmentation based on the trajectories of Temporal Superpixels (TSPs). We cast the segmentation problem as a trajectory-labeling problem and define a Markov random field (MRF) on a graph in which each node represents a trajectory of TSPs, whose energy we minimize using a new two-stage optimization method we developed. The adoption of trajectories as basic building blocks brings several advantages over conventional superpixel-based methods, such as more expressive potential functions, temporal coherence of the resulting segmentation, and a drastically reduced number of MRF nodes. The most important effect, however, is that it allows more robust segmentation of foreground objects that are static in some frames. The method is evaluated on a subset of the standard SegTrack benchmark and yields results competitive with state-of-the-art methods.
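
The trajectory-labeling formulation described in the abstract corresponds to a pairwise MRF energy over the graph of supertrajectory nodes. As a minimal sketch of the generic form of such an energy (the paper's specific unary and pairwise potentials and its notation are not reproduced here; the symbols below are illustrative placeholders):

    E(\ell) = \sum_{i \in V} \phi_i(\ell_i) + \lambda \sum_{(i,j) \in E} \psi_{ij}(\ell_i, \ell_j)

where V is the set of trajectory nodes, E the set of edges between neighboring trajectories, \ell_i the foreground/background label of trajectory i, \phi_i a unary potential, \psi_{ij} a pairwise potential encouraging label consistency between connected trajectories, and \lambda a balancing weight.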

Original language: English
Title of host publication: Proceedings of the 15th IAPR International Conference on Machine Vision Applications, MVA 2017
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 448-451
Number of pages: 4
ISBN (Electronic): 9784901122160
DOIs
Publication status: Published - 2017 Jul 19
Event: 15th IAPR International Conference on Machine Vision Applications, MVA 2017 - Nagoya, Japan
Duration: 2017 May 8 - 2017 May 12

Publication series

Name: Proceedings of the 15th IAPR International Conference on Machine Vision Applications, MVA 2017

Other

Other: 15th IAPR International Conference on Machine Vision Applications, MVA 2017
Country/Territory: Japan
City: Nagoya
Period: 17/5/8 - 17/5/12

ASJC Scopus subject areas

  • Computer Science Applications
  • Computer Vision and Pattern Recognition
