A confident scale-space shape representation framework for cell migration detection

K. Zhang, H. Xiong, X. Zhou, L. Yang, Y. L. Wang, S. T. C. Wong

Research output: Contribution to journal › Article › peer-review


Abstract

Automated segmentation of time-lapse images facilitates the understanding of intricate biological processes such as cancer cell migration. To address this problem, we introduce a shape representation that enhances popular snake models in the context of a confident scale-space, so that a higher level of interpretation can be achieved. The proposed system is a hierarchical analytic framework with feedback loops and self-adaptive, demand-adaptive adjustment, incorporating a steerable boundary detail constraint based on multiscale B-spline interpolation. To minimize noise inherited from microscopy acquisition, the coarse boundary derived from an initial segmentation with refined watershed lines is coupled with microscopy compensation using mean shift filtering. A progressive approximation is then applied to achieve a balance between the relief function of the watershed algorithm and local minima, subject to multiscale optimality, convergence and robustness constraints. Experimental results show that the proposed method overcomes problems with spurious branches, arbitrary gaps, low-contrast boundaries and low signal-to-noise ratios. The proposed system has the potential to serve as an automated data-processing tool for cell migration applications.
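As a rough illustration of the pipeline summarized in the abstract (mean shift smoothing, watershed-based coarse segmentation, snake refinement of each boundary), the sketch below strings together off-the-shelf OpenCV and scikit-image routines. It is not the authors' confident scale-space framework or their multiscale B-spline detail term; the function name, the parameter values, and the use of a standard active contour as the refinement step are illustrative assumptions only.

```python
# Illustrative sketch only: a simplified coarse-to-fine pipeline in the spirit of
# the abstract (mean shift smoothing -> marker-based watershed -> snake refinement),
# built from standard OpenCV / scikit-image calls rather than the authors' own
# confident scale-space formulation.  All parameter values are placeholders.
import cv2
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, measure, segmentation


def coarse_to_fine_cell_boundaries(frame_bgr):
    """Return refined cell contours for one 8-bit BGR time-lapse frame."""
    # 1) Suppress acquisition noise with mean shift filtering
    #    (spatial radius 10 and range radius 20 are arbitrary placeholders).
    smoothed = cv2.pyrMeanShiftFiltering(frame_bgr, 10, 20)
    gray = cv2.cvtColor(smoothed, cv2.COLOR_BGR2GRAY).astype(float) / 255.0

    # 2) Coarse segmentation: Otsu threshold for markers, then watershed on the
    #    gradient magnitude (the "relief" image).
    binary = gray > filters.threshold_otsu(gray)
    markers, _ = ndi.label(binary)
    relief = filters.sobel(gray)
    labels = segmentation.watershed(relief, markers, mask=binary)

    # 3) Refine each coarse watershed boundary with an edge-driven active
    #    contour (snake); this stands in for the paper's multiscale B-spline
    #    boundary detail term.
    contours = []
    for region in measure.regionprops(labels):
        coarse = measure.find_contours((labels == region.label).astype(float), 0.5)
        if not coarse:
            continue
        init = coarse[0]  # (row, col) coordinates of the coarse boundary
        snake = segmentation.active_contour(
            gray, init, alpha=0.015, beta=10.0, w_edge=1.0, gamma=0.001
        )
        contours.append(snake)
    return contours
```

In practice the snake weights (alpha, beta, gamma) and the mean shift radii would need tuning to the contrast and noise level of a given time-lapse acquisition.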

Original language: English (US)
Pages (from-to): 395-407
Number of pages: 13
Journal: Journal of Microscopy
Volume: 231
Issue number: 3
DOIs
State: Published - Sep 2008

Keywords

  • 3T3 cell
  • Cellular image segmentation
  • Mean shift filtering
  • Multiscale detail detection
  • Snake model
  • Time-lapse microscopy

ASJC Scopus subject areas

  • Instrumentation
