2-D mesh-based video object segmentation and tracking with occlusion resolution


Celasun I., Tekalp A., Gokcetekin M., Harmanci D.

SIGNAL PROCESSING-IMAGE COMMUNICATION, vol.16, no.10, pp.949-962, 2001 (Journal Indexed in SCI)

  • Publication Type: Article
  • Volume: 16 Issue: 10
  • Publication Date: 2001
  • DOI: 10.1016/s0923-5965(00)00055-2
  • Journal: SIGNAL PROCESSING-IMAGE COMMUNICATION
  • Page Numbers: pp.949-962

Abstract

This paper integrates fully automatic video object segmentation and tracking including detection and assignment of uncovered regions in a 2-D mesh-based framework. Particular contributions of this work are (i) a novel video object segmentation method that is posed as a constrained maximum contrast path search problem along the edges of a 2-D triangular mesh, and (ii) a 2-D mesh-based uncovered region detection method along the object boundary as well as within the object. At the first frame, an optimal number of feature points are selected as nodes of a 2-D content-based mesh. These points are classified as moving (foreground) and stationary nodes based on multi-frame node motion analysis, yielding a coarse estimate of the foreground object boundary. Color differences across triangles near the coarse boundary are employed for a maximum contrast path search along the edges of the 2-D mesh to refine the boundary of the video object. Next, we propagate the refined boundary to the subsequent frame by using motion vectors of the node points to form the coarse boundary at the next frame. We detect occluded regions by using motion-compensated frame differences and range filtered edge maps. The boundaries of detected uncovered regions are then refined by using the search procedure. These regions are either appended to the foreground object or tracked as new objects. The segmentation procedure is re-initialized when unreliable motion vectors exceed a certain number. The proposed scheme is demonstrated on several video sequences.
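The multi-frame node motion analysis described above labels each mesh node as moving (foreground) or stationary. A minimal sketch of that idea, assuming node trajectories are available as per-frame positions and using a hypothetical displacement threshold (the paper does not specify the exact decision rule):

```python
import math

def classify_nodes(trajectories, motion_threshold=1.0):
    """Label mesh nodes as moving (foreground) or stationary (background).

    trajectories: {node_id: [(x, y) position per frame]}.
    A node whose mean per-frame displacement exceeds motion_threshold
    (a hypothetical pixel threshold, not taken from the paper) is
    labelled "moving"; otherwise "stationary".
    """
    labels = {}
    for node, points in trajectories.items():
        # Displacement between consecutive frames for this node.
        steps = [math.dist(points[i], points[i + 1])
                 for i in range(len(points) - 1)]
        mean_step = sum(steps) / len(steps) if steps else 0.0
        labels[node] = "moving" if mean_step > motion_threshold else "stationary"
    return labels
```

The union of the triangles attached to the "moving" nodes then gives the coarse foreground boundary that the contrast search refines.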
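The boundary refinement is posed as a constrained maximum contrast path search along mesh edges, where each edge is weighted by the color difference across the two triangles sharing it. A sketch under simplifying assumptions: the paper constrains the search to edges in a band around the coarse boundary, which keeps the graph small, so this illustration simply enumerates simple paths by depth-first search (the paper's actual constrained search strategy is not reproduced here):

```python
def max_contrast_path(adj, start, goal):
    """Find the simple path from start to goal maximizing summed contrast.

    adj: {node: {neighbor: contrast}} restricted to mesh edges in the
    search band; contrast is the color difference across the two
    triangles adjacent to the edge. Brute-force enumeration is viable
    only because the band keeps the candidate graph small.
    Returns (best_score, best_path).
    """
    best = (float("-inf"), None)

    def dfs(node, visited, path, score):
        nonlocal best
        if node == goal:
            if score > best[0]:
                best = (score, list(path))
            return
        for nxt, contrast in adj[node].items():
            if nxt not in visited:
                visited.add(nxt)
                path.append(nxt)
                dfs(nxt, visited, path, score + contrast)
                path.pop()
                visited.remove(nxt)

    dfs(start, {start}, [start], 0.0)
    return best
```

High-contrast edges separate triangles with dissimilar colors, so the maximizing path tends to trace the true object boundary between the foreground and background triangles.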
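Occluded and uncovered regions are detected with motion-compensated frame differences: pixels that no motion vector can explain leave a large residual. A minimal sketch assuming dense per-pixel motion vectors and a hypothetical residual threshold; the paper additionally intersects the result with range-filtered edge maps, which is omitted here:

```python
def detect_uncovered(prev, curr, motion, diff_threshold=20):
    """Flag pixels with a large motion-compensated frame difference.

    prev, curr: 2-D grey-level frames (lists of lists of ints).
    motion: per-pixel (dx, dy) vectors mapping curr back to prev.
    Returns a boolean mask of candidate uncovered/occluded pixels.
    diff_threshold is a hypothetical value, not taken from the paper.
    """
    h, w = len(curr), len(curr[0])
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dx, dy = motion[y][x]
            px, py = x - dx, y - dy  # motion-compensated reference pixel
            if 0 <= px < w and 0 <= py < h:
                residual = abs(curr[y][x] - prev[py][px])
            else:
                residual = 255  # reference falls outside the frame
            mask[y][x] = residual > diff_threshold
    return mask
```

In the paper's pipeline, connected regions of this mask are boundary-refined with the same contrast search and then either appended to the foreground object or tracked as new objects.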