Coding theoretic approach to SAR image segmentation

U. Ndili, R. Nowak, R. Baraniuk, H. Choi, M. Figueiredo

Research output: Conference contribution (Chapter in Book/Report/Conference proceeding)


Abstract

In this paper, a coding-theoretic approach is presented for the unsupervised segmentation of SAR images. The approach implements Rissanen's concept of Minimum Description Length (MDL) for estimating piecewise homogeneous regions. Our image model is a Gaussian random field whose mean and variance functions are piecewise constant across the image. The model is intended to capture variations in both mean value (intensity) and variance (texture). We adopt a multiresolution/progressive encoding approach to this segmentation problem and use MDL to penalize overly complex segmentations. We develop two different approaches, both of which achieve fast unsupervised segmentation. One algorithm is based on an adaptive (greedy) rectangular recursive partitioning scheme. The second algorithm is based on an optimally pruned wedgelet-decorated dyadic partition. We present simulation results on SAR data to illustrate the performance obtained with these segmentation techniques.
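The core idea of the first algorithm can be illustrated with a minimal sketch: each candidate region is coded as a single Gaussian (piecewise-constant mean and variance), its description length is the negative log-likelihood at the maximum-likelihood parameters plus a complexity penalty per region, and a region is split only when the best rectangular split lowers the total cost. This is an assumption-laden illustration of the MDL principle described above, not the authors' exact algorithm; the function names, `min_size` guard, and choice of penalty are all hypothetical.

```python
import numpy as np

def region_cost(block, penalty):
    """MDL cost of coding a block as one Gaussian region: negative
    log-likelihood at the ML mean/variance plus a fixed complexity
    penalty for the region's two parameters (illustrative choice)."""
    n = block.size
    var = max(block.var(), 1e-12)  # guard against zero variance
    nll = 0.5 * n * (np.log(2.0 * np.pi * var) + 1.0)
    return nll + penalty

def segment(img, penalty, min_size=4):
    """Greedy recursive rectangular partitioning with an MDL stopping
    rule: recurse on a split only if it lowers the description length."""
    labels = np.zeros(img.shape, dtype=int)
    next_label = [0]

    def recurse(r0, r1, c0, c1):
        keep = region_cost(img[r0:r1, c0:c1], penalty)
        best = None
        # try every horizontal and vertical split position
        for r in range(r0 + min_size, r1 - min_size + 1):
            cost = (region_cost(img[r0:r, c0:c1], penalty)
                    + region_cost(img[r:r1, c0:c1], penalty))
            if cost < keep and (best is None or cost < best[0]):
                best = (cost, 'h', r)
        for c in range(c0 + min_size, c1 - min_size + 1):
            cost = (region_cost(img[r0:r1, c0:c], penalty)
                    + region_cost(img[r0:r1, c:c1], penalty))
            if cost < keep and (best is None or cost < best[0]):
                best = (cost, 'v', c)
        if best is None:
            # splitting does not pay for itself: emit one region
            labels[r0:r1, c0:c1] = next_label[0]
            next_label[0] += 1
        elif best[1] == 'h':
            recurse(r0, best[2], c0, c1)
            recurse(best[2], r1, c0, c1)
        else:
            recurse(r0, r1, c0, best[2])
            recurse(r0, r1, best[2], c1)

    recurse(0, img.shape[0], 0, img.shape[1])
    return labels
```

On a synthetic image with two constant-mean halves plus noise, the MDL rule accepts the vertical split separating the halves and then stops, because further splits reduce the likelihood term by less than the added model penalty. The paper's second algorithm replaces the rectangular split set with a dyadic partition whose leaves may carry wedgelets, pruned optimally rather than greedily.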

Original language: English (US)
Title of host publication: Proceedings of SPIE - The International Society for Optical Engineering
Editors: E.G. Zelnio
Pages: 103-111
Number of pages: 9
Volume: 4382
DOIs
State: Published - 2001
Event: Algorithms for Synthetic Aperture Radar Imagery VIII - Orlando, FL, United States
Duration: Apr 16 2001 - Apr 19 2001

Other

Other: Algorithms for Synthetic Aperture Radar Imagery VIII
Country/Territory: United States
City: Orlando, FL
Period: 4/16/01 - 4/19/01

Keywords

  • Adaptive Recursive Partitioning
  • CART
  • Minimum Description Length
  • Multiscale
  • SAR
  • Segmentation
  • Wedgelets

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Condensed Matter Physics
