Abstract #3403

Semi-Automated Tracking of Tongue Movements in Dynamic MRI of Speech

Bradley P. Sutton1, Andrew Naber1, Jason Wang1, Jamie L. Perry2, David P. Kuehn3

1Bioengineering Department, University of Illinois at Urbana-Champaign, Urbana, IL, United States; 2Department of Communication Sciences and Disorders, East Carolina University, Greenville, NC, United States; 3Department of Speech and Hearing Sciences, University of Illinois at Urbana-Champaign, Champaign, IL, United States

As the frame rate of dynamic speech imaging with MRI increases, automated extraction of frame-by-frame soft tissue movements becomes critical for evaluating large studies of pathology or of cultural differences in movement. This is challenging because dynamic MR images are noisy and offer poor contrast between soft-tissue structures. We present a semi-automated algorithm that tracks two tongue positions (tip and dorsum) and compare its output with manual tracings from three trained speech scientists. The semi-automated traces correlate well with the manual tracings on data from four study participants.
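To make the general idea concrete, here is a minimal, hypothetical sketch of frame-by-frame edge localization and validation against manual tracings. This is not the authors' algorithm: the gradient-based edge picker, the threshold value, and all function names are illustrative assumptions; the comparison metric (Pearson correlation) mirrors the abstract's claim of correlating automated and manual traces.

```python
import numpy as np

def track_edge_column(frame, col, threshold=0.2):
    """Toy edge localizer: return the row of the strongest vertical
    intensity gradient in one image column, a stand-in for finding an
    air/tissue boundary such as the tongue surface.
    Returns None for weak edges, which a user would correct manually
    (the 'semi-automated' part). Threshold is an illustrative value."""
    grad = np.abs(np.diff(frame[:, col].astype(float)))
    if grad.max() < threshold:
        return None  # weak edge: defer to manual input
    return int(np.argmax(grad))

def correlate_with_manual(auto_trace, manual_trace):
    """Pearson correlation between an automated position-vs-frame trace
    and a manual tracing of the same structure."""
    auto = np.asarray(auto_trace, dtype=float)
    manual = np.asarray(manual_trace, dtype=float)
    return float(np.corrcoef(auto, manual)[0, 1])

# Synthetic example: a frame with a sharp intensity step at row 6.
frame = np.zeros((10, 5))
frame[6:, 2] = 1.0
row = track_edge_column(frame, 2)   # strongest gradient between rows 5 and 6
```

In a real pipeline the per-column search would be constrained to a neighborhood of the previous frame's position, and frames where the edge is too weak to localize would be flagged for the human tracer.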
