Liana Rocha1, Caroline Amoedo2, Fernanda Mago2, Marcio Reis2, Ralph Strecker3, Xiaodong Zhong4, Stephan A.R. Kannengiesser5, Ronaldo Baroni2
1Imaging Department, Hospital Israelita Albert Einstein, São Paulo, Brazil; 2Imaging Department, Hospital Israelita Albert Einstein, São Paulo, Brazil; 3Healthcare MR, Siemens Ltda, São Paulo, Brazil; 4MR R&D Collaborations, Siemens Healthcare, Atlanta, Georgia, United States; 5Healthcare MR, Siemens, Erlangen, Bavaria, Germany
Magnetic resonance imaging (MRI) is recognized as a non-invasive method for the detection and quantification of fat and iron deposition in the liver. Recently, a two-point automated dual-ratio Dixon discrimination technique with automatic liver segmentation, dubbed screening Dixon (SD), has emerged as a potential method for screening and discrimination of fat and iron signals. The purpose of this study was to evaluate the accuracy of the SD sequence, with automatic estimation of fat and iron content, in a population with chronic liver disease (presence of siderotic lesions and steatosis) and to compare its results with those of our routine quantitative sequences used as reference standards: multi-echo GRE for iron deposition and 3-echo GRE Dixon for fat. Seventy abdominal MRI examinations performed at 1.5 T were compared. Considering only the presence (altered) or absence (normal) of disease, SD demonstrated a sensitivity of 100%, a specificity of 86.96%, and an accuracy of 91.43%. Considering only the presence or absence of fat deposition (FD), SD was more specific (90.38%) than sensitive (50%), with an accuracy of 80%. Considering only the presence or absence of iron deposition (ID), SD was 100% sensitive, with a specificity of 82.76% and an accuracy of 85.71%. Although the type of deposition is not perfectly discriminated, SD is an accurate method for detecting the presence or absence of fat or iron deposition in the liver and is adequate as a screening technique in general abdominal MRI studies.
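The abstract does not describe the dual-ratio discrimination in detail, but the basic two-point Dixon reasoning can be illustrated. The sketch below is a minimal simplification under stated assumptions: it uses plain opposed-phase/in-phase magnitude ratios and arbitrary thresholds (fat_thresh, iron_thresh) chosen for illustration, and is not the vendor's actual SD algorithm or its thresholds.

```python
# Illustrative sketch only: a simplified per-voxel fat/iron discrimination
# rule loosely based on two-point (in-phase/opposed-phase) Dixon behavior.
# Thresholds and function names are assumptions, not the SD implementation.
import numpy as np

def classify_liver_signal(s_op: np.ndarray, s_ip: np.ndarray,
                          fat_thresh: float = 0.1,
                          iron_thresh: float = 0.1) -> np.ndarray:
    """Label each voxel as 'normal', 'fat', or 'iron'.

    s_op : magnitude at the opposed-phase echo (shorter TE at 1.5 T, ~2.3 ms)
    s_ip : magnitude at the in-phase echo (longer TE, ~4.6 ms)

    Simplified rationale:
      * Fat reduces the opposed-phase signal relative to the in-phase signal
        (water/fat cancellation), so s_op << s_ip suggests steatosis.
      * Iron shortens T2*, so the later in-phase echo loses signal relative
        to the earlier opposed-phase echo: s_ip << s_op suggests iron.
    """
    eps = 1e-6
    drop_op = (s_ip - s_op) / (s_ip + eps)   # relative opposed-phase signal loss
    drop_ip = (s_op - s_ip) / (s_op + eps)   # relative in-phase signal loss

    labels = np.full(s_op.shape, "normal", dtype=object)
    labels[drop_op > fat_thresh] = "fat"
    labels[drop_ip > iron_thresh] = "iron"
    return labels
```

In practice the published dual-ratio approach also accounts for confounders such as T2* decay between echoes and operates on an automatically segmented liver region, which this voxelwise sketch omits.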
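The reported sensitivity, specificity, and accuracy follow their standard definitions, computed against the reference-standard sequences on a per-examination presence/absence basis. The snippet below only shows that arithmetic; the counts are hypothetical placeholders, since the abstract does not report the underlying confusion matrix.

```python
# Minimal sketch of the diagnostic metrics reported above.
# tp/fp/tn/fn values in the example are hypothetical, not study data.
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int):
    sensitivity = tp / (tp + fn)                 # true-positive rate
    specificity = tn / (tn + fp)                 # true-negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)   # overall agreement
    return sensitivity, specificity, accuracy

sens, spec, acc = diagnostic_metrics(tp=12, fp=3, tn=20, fn=0)
print(f"sensitivity={sens:.2%}, specificity={spec:.2%}, accuracy={acc:.2%}")
```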