Speech recognition under noisy environments using multiple microphones based on asynchronous and intermittent measurements

Kohei Machida, Akinori Ito

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We propose a robust speech recognition method for noisy environments that uses multiple microphones based on asynchronous and intermittent observation. In this scheme, the noise spectrum is estimated from environmental noise observed in fragments by the multiple microphones, and spectral subtraction is performed using the estimated noise spectrum. In this paper, we consider estimating the noise spectrum from noise observed by another microphone just before the speech input. In this case, however, the noise spectrum must be compensated because the microphones are at different locations. We therefore examined compensating the noise spectrum using the LSFL estimated on the log spectrum. With this compensation, the recognition rate improved compared with the case without compensation.
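The pipeline described in the abstract — estimate a noise spectrum from noise-only fragments observed by another microphone, compensate it for the difference in microphone location, then apply spectral subtraction — can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: all function names are hypothetical, and the simple per-bin `log_offset` stands in for the paper's LSFL-based compensation, whose exact form is not given in the abstract.

```python
import numpy as np

def stft_mag(x, frame_len=256, hop=128):
    """Magnitude spectrogram via a Hann-windowed short-time FFT."""
    win = np.hanning(frame_len)
    n_frames = 1 + (len(x) - frame_len) // hop
    frames = np.stack([x[i * hop : i * hop + frame_len] * win
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1))

def estimate_noise_spectrum(noise_mag):
    """Average the magnitude spectra of noise-only frames, e.g. fragments
    observed by another microphone just before the speech input."""
    return noise_mag.mean(axis=0)

def compensate_log_spectrum(noise_spec, log_offset=0.0):
    """Apply a per-bin offset on the log spectrum to account for the
    different microphone location (a stand-in for the paper's
    LSFL-based compensation)."""
    return np.exp(np.log(noise_spec + 1e-12) + log_offset)

def spectral_subtraction(speech_mag, noise_spec, alpha=1.0, floor=0.05):
    """Subtract the (compensated) noise spectrum from each frame,
    flooring the result to avoid negative magnitudes."""
    cleaned = speech_mag - alpha * noise_spec
    return np.maximum(cleaned, floor * speech_mag)

# Illustrative usage on synthetic data (not data from the paper).
rng = np.random.default_rng(0)
noise = 0.1 * rng.standard_normal(4000)              # noise-only fragment
t = np.arange(8000)
speech = np.sin(2 * np.pi * 440 * t / 16000) + 0.1 * rng.standard_normal(8000)

noise_spec = compensate_log_spectrum(
    estimate_noise_spectrum(stft_mag(noise)), log_offset=0.0)
speech_mag = stft_mag(speech)
cleaned = spectral_subtraction(speech_mag, noise_spec)
```

In a real system the `log_offset` would itself be estimated (the abstract fits it on the log spectrum), and the cleaned magnitudes would be recombined with the noisy phase for resynthesis or fed directly to the recognizer's feature extractor.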

Original language: English
Title of host publication: 2013 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA 2013
DOIs
Publication status: Published - 2013
Event: 2013 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA 2013 - Kaohsiung, Taiwan, Province of China
Duration: 2013 Oct 29 - 2013 Nov 1

Publication series

Name: 2013 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA 2013

Conference

Conference: 2013 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA 2013
Country/Territory: Taiwan, Province of China
City: Kaohsiung
Period: 13/10/29 - 13/11/1

