Efficient motion vector prediction algorithm using pattern matching

Zhenxing Chen*, Satoshi Goto

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

The state-of-the-art median prediction scheme is widely used for predicting motion vectors (MVs) in recent video coding standards. By exploiting the spatial correlations among MVs, the median prediction scheme predicts the MV of the current block from three neighboring blocks. Once the MV is obtained from motion estimation, the MV difference (MVD) is calculated and transmitted. This process of predicting the MV and calculating the MVD is known as MV coding. The performance of MV coding depends on how efficiently both the spatial and the temporal correlations among MVs are exploited. The median prediction scheme exploits the spatial correlations through a sophisticated set of special rules; however, the temporal correlations among successive MVs are left unexploited. In this paper, a new algorithm named MV pattern matching (MV-PM), which exploits both the spatial and the temporal correlations, is proposed. Extensive experimental results show that the proposed MV-PM algorithm outperforms median prediction and other related prediction schemes.
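For context, the baseline that MV-PM improves upon can be illustrated concretely. The sketch below shows component-wise median prediction of an MV from three spatial neighbors and the resulting MVD computation, in the style of the H.264/AVC median predictor; it is a minimal illustration, not the paper's MV-PM algorithm, and all function and variable names are hypothetical.

    def median_mv_prediction(mv_left, mv_top, mv_topright):
        # Component-wise median of the three spatially neighboring MVs
        # (left, top, top-right), as in the H.264/AVC median predictor.
        def median3(a, b, c):
            return max(min(a, b), min(max(a, b), c))
        return (median3(mv_left[0], mv_top[0], mv_topright[0]),
                median3(mv_left[1], mv_top[1], mv_topright[1]))

    def mv_difference(mv, mv_pred):
        # MVD = MV obtained from motion estimation minus the predicted MV;
        # only this difference is entropy-coded and transmitted.
        return (mv[0] - mv_pred[0], mv[1] - mv_pred[1])

    # Hypothetical example: current block's MV and its three neighbors.
    mvp = median_mv_prediction((4, -1), (6, -2), (3, 0))   # -> (4, -1)
    mvd = mv_difference((5, -2), mvp)                      # -> (1, -1)

As the abstract notes, this baseline uses only spatial neighbors; the proposed MV-PM additionally matches patterns across successive MVs to exploit temporal correlation, a step specific to the paper and not reproduced here.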

Original language: English
Pages (from-to): 727-733
Number of pages: 7
Journal: Journal of Visual Communication and Image Representation
Volume: 22
Issue number: 8
DOIs
Publication status: Published - Nov 2011

Keywords

  • Median prediction
  • Mode indicator
  • Motion estimation
  • Motion vector coding
  • Motion vector pattern matching (MV-PM)
  • Motion vector prediction (MVP)
  • Motion vector spatial correlation
  • Motion vector temporal correlation

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Media Technology
  • Computer Vision and Pattern Recognition
  • Signal Processing
