Low Resource ASR: The Surprising Effectiveness of High Resource Transliteration

Khare, Shreya; Mittal, Ashish; Diwan, Anuj; Sarawagi, Sunita; Jyothi, Preethi; Bharadwaj, Samarth (2021). Low Resource ASR: The Surprising Effectiveness of High Resource Transliteration. In: Interspeech.


Official URL: http://doi.org/10.21437/Interspeech.2021-2062


Abstract

Cross-lingual transfer of knowledge from high-resource languages to low-resource languages is an important research problem in automatic speech recognition (ASR). We propose a new strategy of transfer learning by pretraining using large amounts of speech in the high-resource language but with its text transliterated to the target low-resource language. This simple mapping of scripts explicitly encourages increased sharing between the output spaces of both languages and is surprisingly effective even when the high-resource and low-resource languages are from unrelated language families. The utility of our proposed technique is more evident in very low-resource scenarios, where better initializations are more beneficial. We evaluate our technique on a transformer ASR architecture and the state-of-the-art wav2vec2.0 ASR architecture, with English as the high-resource language and six languages as low-resource targets. With access to 1 hour of target speech, we obtain relative WER reductions of up to 8.2% compared to existing transfer-learning approaches.
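The core data-preparation idea — relabeling high-resource (English) transcripts in the target low-resource script so both languages share one output vocabulary — can be sketched as below. This is a minimal toy illustration, not the paper's actual transliteration tooling; the character map is hypothetical and deliberately tiny.

```python
# Toy sketch of the pretraining data prep idea: map English transcripts into
# the low-resource target script so high- and low-resource data share a single
# output vocabulary. The Latin -> Devanagari map below is a hypothetical,
# illustrative fragment, not a linguistically complete transliteration scheme.
LATIN_TO_DEVANAGARI = {
    "a": "अ", "b": "ब", "k": "क", "m": "म", "n": "न",
    "o": "ओ", "r": "र", "s": "स", "t": "त", " ": " ",
}

def transliterate(transcript: str) -> str:
    """Map each Latin character to a target-script character,
    leaving unmapped characters unchanged."""
    return "".join(LATIN_TO_DEVANAGARI.get(ch, ch) for ch in transcript.lower())

# English audio would keep its waveform but take transliterated labels like
# this before pretraining, so the ASR output layer matches the target script.
print(transliterate("no rat"))
```

In practice, a full transliteration system (rather than a character table) would map English text into the target script before pretraining, and the low-resource fine-tuning data would then reuse the same output units.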

Item Type: Conference or Workshop Item (Paper)
Source: Copyright of this article belongs to ResearchGate GmbH
ID Code: 129254
Deposited On: 14 Nov 2022 12:12
Last Modified: 14 Nov 2022 12:12
