Novel scene generation, merging and stitching views using the 2D affine space

Kuntal Sengupta*, Jun Ohya

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



In this paper we present a unified theoretical framework for novel scene synthesis, merging real and virtual worlds, and view stitching. We start with a set of real images from weakly calibrated cameras, for which we compute dense point match correspondences. For applications like novel view synthesis, one may first solve the 3D scene reconstruction problem and then render the new view. However, errors in 3D scene reconstruction usually get reflected in the quality of the generated scene, so we seek a more direct method. In this paper, we use the dense point matches and their affine coordinate values to estimate the corresponding affine coordinate values in the new scene. Our reprojection technique extends to other applications such as merging real and synthetic worlds, and view stitching.
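The reprojection idea sketched in the abstract — recovering a point's affine coordinates from its matches in two reference views and reusing those coordinates in a new view — can be illustrated as follows. This is a minimal sketch of classical affine transfer, not the paper's exact algorithm: it assumes an affine camera model and four non-degenerate basis-point correspondences; the function names are hypothetical.

```python
import numpy as np

def affine_coords(p_a, p_b, basis_a, basis_b):
    """Recover the 3D affine coordinates (alpha, beta, gamma) of a point
    from its projections p_a, p_b in two views, given the projections of
    four affine basis points (rows of basis_a / basis_b, shape (4, 2)).

    Under an affine camera, an image point is the same affine combination
    of the projected basis points in every view, so two views give a
    4x3 linear system in the three unknown coordinates."""
    A = np.vstack([(basis_a[1:] - basis_a[0]).T,   # 2x3 block from view a
                   (basis_b[1:] - basis_b[0]).T])  # 2x3 block from view b
    rhs = np.hstack([p_a - basis_a[0], p_b - basis_b[0]])
    coords, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return coords

def reproject(coords, basis_new):
    """Transfer the point into a new view: apply the same affine
    combination to the basis points as they appear in that view."""
    return basis_new[0] + coords @ (basis_new[1:] - basis_new[0])
```

Because the affine coordinates are view-independent, no explicit 3D reconstruction is needed: specifying where the four basis points fall in the new view determines where every matched point falls.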

Original language: English
Pages (from-to): 39-53
Number of pages: 15
Journal: Signal Processing: Image Communication
Issue number: 1-2
Publication status: Published - 1998 Nov 6
Externally published: Yes


Keywords

  • Image based rendering
  • Reprojection
  • View generation

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering


