Speaker: Dr. Ming Li (Duke Kunshan University)
Time: Friday, September 21, 2018, 15:00-16:00
Venue: Lecture Hall, 2nd Floor, Jingzheng Building, Soochow University (Main Campus)
Abstract:
Speech signals not only contain lexical information but also convey various kinds of paralinguistic speech attribute information, such as speaker, language, gender, age, emotion, channel, voicing, and psychological states. The core technical question behind this is utterance-level supervised learning from text-independent speech signals of flexible duration. I will use speaker verification as an example to introduce the framework of paralinguistic speech attribute recognition. Moreover, we can extend the signal from speech to multimodal human-centered behavior data. We apply signal processing and machine learning technologies to human behavior signals, such as audio, visual, physiological, and eye-tracking data, to provide objective and quantitative behavior measurements or codes. Example studies include autism spectrum disorder, obesity, biometrics, and piano learning.
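To make the utterance-level idea concrete, the following is a minimal illustrative sketch, not taken from the talk: frame-level features of flexible duration are pooled into one fixed-dimensional utterance embedding, and two embeddings are compared with a cosine score for speaker verification. The feature dimensions, durations, and the simple mean/std pooling and cosine scoring here are assumptions for illustration, not the speaker's actual system.

# Minimal sketch (illustrative assumptions only): pool variable-length
# frame features into a fixed-length utterance embedding, then score
# two embeddings with cosine similarity for speaker verification.
import numpy as np

def utterance_embedding(frames: np.ndarray) -> np.ndarray:
    """Pool a (num_frames, feat_dim) array into one fixed-length vector
    using mean and standard deviation statistics over time."""
    mean = frames.mean(axis=0)
    std = frames.std(axis=0)
    return np.concatenate([mean, std])

def verification_score(emb_a: np.ndarray, emb_b: np.ndarray) -> float:
    """Cosine similarity between two utterance embeddings; thresholding
    this score accepts or rejects the claimed speaker identity."""
    denom = np.linalg.norm(emb_a) * np.linalg.norm(emb_b) + 1e-12
    return float(emb_a @ emb_b / denom)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two text-independent utterances of different durations.
    utt1 = rng.normal(size=(300, 40))   # e.g. 3.0 s of 40-dim filterbank features
    utt2 = rng.normal(size=(520, 40))   # e.g. 5.2 s from the same or another speaker
    e1, e2 = utterance_embedding(utt1), utterance_embedding(utt2)
    print("embedding dim:", e1.shape[0], "score:", verification_score(e1, e2))

Both embeddings have the same dimension regardless of utterance length, which is what allows a single supervised model to handle flexible-duration input.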
About the Speaker:
Ming Li received his Ph.D. in Electrical Engineering from the University of Southern California in May 2013. He is currently an associate professor of Electrical and Computer Engineering at Duke Kunshan University and a research scholar in the ECE department of Duke University. His research interests are in the areas of speech processing and multimodal behavior signal analysis, with applications to human-centered behavioral informatics, notably in health, education, and security. He has published more than 90 papers and has served as a member of the CCF speech, language and hearing committee, the CAAI artificial psychology and affective computing committee, and the APSIPA speech and language processing committee, and as an area chair for Interspeech 2016 and 2018. Works co-authored with his colleagues have won first prizes at the Body Computing Slam Contest 2009, the Interspeech 2011 Speaker State Challenge, and the Interspeech 2012 Speaker Trait Challenge, as well as best paper awards at IEEE DCOSS 2009, ISCSLP 2014, and IEEE CPTECE 2018. He received the IBM Faculty Award in 2016 and the ISCA Computer Speech and Language best journal paper award in 2018.
All are welcome!