Consistent segmentation using a Rician classifier

Snehashis Roy, Aaron Carass, Pierre Louis Bazin, Susan Resnick, Jerry L. Prince

Research output: Contribution to journal › Article › peer-review



Several popular classification algorithms used to segment magnetic resonance brain images assume that the image intensities, or log-transformed intensities, satisfy a finite Gaussian mixture model. In these methods, the parameters of the mixture model are estimated, and the posterior probabilities for each tissue class are used directly as soft segmentations or combined to form a hard segmentation. This paper suggests and shows that a Rician mixture model fits the observed data better than a Gaussian model. Accordingly, a Rician mixture model is formulated and used within an expectation-maximization (EM) framework to yield a new tissue classification algorithm called Rician Classifier using EM (RiCE). Using both simulated and real data, it is shown that RiCE performs comparably to or better than algorithms based on the finite Gaussian mixture model. We also show that RiCE yields more consistent segmentation results when applied to images of the same individual acquired with different T1-weighted pulse sequences. RiCE therefore has the potential to stabilize segmentation results in brain studies involving heterogeneous acquisition sources, as is typical of both multi-center and longitudinal studies.
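To illustrate the classification step the abstract describes, the sketch below computes the E-step of a two-class Rician mixture: the posterior probability of each tissue class given an intensity, which serves directly as a soft segmentation (and, via argmax, a hard one). This is a minimal illustration, not the authors' implementation; the mixture weights and Rician parameters (`pi`, `nu`, `sigma`) are invented for the example, and the M-step parameter updates are omitted.

```python
import numpy as np
from scipy.stats import rice

# Hypothetical two-class Rician mixture; the parameter values below are
# illustrative only, not estimates from the paper.
# SciPy parameterization: rice.pdf(x, b, scale=sigma) with b = nu / sigma
# gives the Rician density with noncentrality nu and noise level sigma.
params = [
    {"pi": 0.4, "nu": 50.0,  "sigma": 10.0},  # e.g. one tissue class
    {"pi": 0.6, "nu": 120.0, "sigma": 15.0},  # e.g. another tissue class
]

def e_step(x, params):
    """E-step: posterior probability of each class (soft segmentation)."""
    x = np.asarray(x, dtype=float)
    # Weighted class likelihoods: pi_k * f_Rice(x | nu_k, sigma_k)
    lik = np.stack([
        p["pi"] * rice.pdf(x, p["nu"] / p["sigma"], scale=p["sigma"])
        for p in params
    ])
    return lik / lik.sum(axis=0)  # normalize columns to posteriors

post = e_step([40.0, 80.0, 130.0], params)  # soft segmentation
hard = post.argmax(axis=0)                  # hard segmentation
```

In a full EM classifier, these posteriors would also drive an M-step that re-estimates each class's mixture weight and Rician parameters, iterating until convergence.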

Original language: English (US)
Pages (from-to): 524-535
Number of pages: 12
Journal: Medical Image Analysis
Issue number: 2
State: Published - Feb 2012


Keywords

  • Biomedical imaging
  • Medical image segmentation
  • Rician distribution
  • Tissue classification

ASJC Scopus subject areas

  • Radiological and Ultrasound Technology
  • Radiology Nuclear Medicine and imaging
  • Computer Vision and Pattern Recognition
  • Health Informatics
  • Computer Graphics and Computer-Aided Design


