Automatic segmentation of abdominal fat from CT data

Amol Pednekar, Alok N. Bandekar, Ioannis A. Kakadiaris, Morteza Naghavi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Abdominal visceral fat accumulation is one of the most important cardiovascular risk factors. Currently, computed tomography (CT) and magnetic resonance (MR) images are segmented manually to quantify abdominal fat distribution. Manual delineation of subcutaneous and visceral fat is labor intensive, time consuming, and subject to inter- and intra-observer variability. An automatic segmentation method would eliminate this variability and provide more consistent results. In this paper, we present a hierarchical, multi-class, multi-feature, fuzzy affinity-based computational framework for tissue segmentation in medical images, and we apply it to the automatic segmentation of abdominal fat. An evaluation of the accuracy of our method indicates bias and limits of agreement comparable to the inter-observer variability inherent in manual segmentation.
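
The abstract describes the framework only at a high level; the Python sketch below is a rough, hypothetical illustration of one ingredient of such an approach, a fuzzy intensity membership for the fat class on a calibrated CT slice, and is not the authors' implementation. The Hounsfield-unit model for fat (adipose tissue falls roughly between -190 and -30 HU), the Gaussian parameters, and the function names are all illustrative assumptions.

import numpy as np
from scipy import ndimage

def fat_membership(slice_hu, mu=-100.0, sigma=40.0):
    # Fuzzy membership of each pixel in the "fat" class, modeled as a
    # Gaussian over Hounsfield units (mu, sigma are assumed values).
    return np.exp(-0.5 * ((slice_hu - mu) / sigma) ** 2)

def segment_fat(slice_hu, threshold=0.5):
    # Threshold the fuzzy map into a hard fat mask, then label its
    # connected components; a later stage could classify components
    # as subcutaneous vs. visceral fat.
    membership = fat_membership(slice_hu)
    mask = membership > threshold
    labels, n = ndimage.label(mask)
    return membership, labels, n

if __name__ == "__main__":
    # Synthetic 2D "slice": air background (-1000 HU), soft tissue
    # (40 HU), and a fat ring (-100 HU) standing in for subcutaneous fat.
    slice_hu = np.full((128, 128), 40.0)
    yy, xx = np.ogrid[:128, :128]
    r = np.hypot(yy - 64, xx - 64)
    slice_hu[r > 60] = -1000.0                # air outside the body
    slice_hu[(r > 45) & (r <= 60)] = -100.0   # fat ring
    membership, labels, n = segment_fat(slice_hu)
    print(f"{n} fat component(s); fat pixels: {(labels > 0).sum()}")

A full fuzzy affinity (fuzzy connectedness) segmentation would additionally weigh spatial adjacency and propagate affinities between neighboring voxels per tissue class; the simple hard threshold above stands in for that step.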

Original language: English (US)
Title of host publication: Proceedings - Seventh IEEE Workshop on Applications of Computer Vision, WACV 2005
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 308-315
Number of pages: 8
ISBN (Print): 0769522718, 9780769522715
DOIs
State: Published - 2005
Event: 7th IEEE Workshop on Applications of Computer Vision, WACV 2005 - Breckenridge, CO, United States
Duration: Jan 5 2005 - Jan 7 2005

Publication series

Name: Proceedings - Seventh IEEE Workshop on Applications of Computer Vision, WACV 2005

Conference

Conference: 7th IEEE Workshop on Applications of Computer Vision, WACV 2005
Country/Territory: United States
City: Breckenridge, CO
Period: 1/5/05 - 1/7/05

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Software
