Azure Speech API language
Question
I have implemented a chat on a web page, with the possibility to use speech-to-text via the Azure Speech API. It works fine, but I don't understand where I can set the language understood by the API. I want it to understand French, but when I talk in French, it transcribes English words with a similar sound. How / where can I set the language? Note that I'm not the one who set up the service on the Azure dashboard.
Answer
There is an optional locale parameter that you can use, as in the following example:
export interface ICognitiveServicesSpeechRecognizerProperties {
  locale?: string,
  subscriptionKey?: string,
  fetchCallback?: (authFetchEventId: string) => Promise<string>,
  fetchOnExpiryCallback?: (authFetchEventId: string) => Promise<string>
}
If you don't provide a value, the following default is used:
const locale = properties.locale || 'en-US';
You can find the possible values for the locale parameter here
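Putting the two snippets together, a minimal self-contained sketch of the locale fallback behavior (the interface is the one quoted above; the resolveLocale helper and the 'fr-FR' example are illustrative additions, not part of the library):

```typescript
// Properties accepted by the speech recognizer (interface from the answer above).
interface ICognitiveServicesSpeechRecognizerProperties {
  locale?: string,
  subscriptionKey?: string,
  fetchCallback?: (authFetchEventId: string) => Promise<string>,
  fetchOnExpiryCallback?: (authFetchEventId: string) => Promise<string>
}

// Hypothetical helper mirroring the library's fallback shown above:
// 'en-US' is used unless a locale is explicitly provided.
function resolveLocale(properties: ICognitiveServicesSpeechRecognizerProperties): string {
  return properties.locale || 'en-US';
}

// Passing 'fr-FR' tells the recognizer to expect French speech.
const frenchProps: ICognitiveServicesSpeechRecognizerProperties = {
  locale: 'fr-FR',
  subscriptionKey: 'YOUR_SUBSCRIPTION_KEY' // placeholder, not a real key
};

console.log(resolveLocale(frenchProps)); // 'fr-FR'
console.log(resolveLocale({}));          // 'en-US'
```

So the fix for the question is simply to pass locale: 'fr-FR' in the properties object when constructing the recognizer, instead of relying on the en-US default.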