Meta claims its AI improves speech recognition quality by reading lips

People perceive speech both by listening to it and by watching the lip movements of speakers. In fact, studies show that visual cues play a key role in language learning. By contrast, AI speech recognition systems are built mostly or entirely on audio. And they require a substantial amount of training data, typically tens of thousands of hours of recordings.
