Why Indian Languages Are Among the Hardest to Lip‑Read
“Lipreading has an almost mystical pull on the hearing population,” begins Dan Nosowitz’s exhaustive deep dive into lip-reading for Atlas Obscura. In an attempt to find which language is the hardest to lip-read, Nosowitz reveals that Indian languages — mainly Hindi, Tamil, and Gujarati — are among the most difficult for lip-readers to decipher.
First, South Asian languages contain some of the hardest-to-decipher phonemes (the distinct sounds of a language), specifically “th” and “d,” according to a 2006 Swedish study published in the Journal of Speech, Language, and Hearing Research. These phonemes, along with “g,” “n,” and “k,” do not require the speaker to move their lips much, which throws off a lip-reader.
Anything else that conceals the lips also proves an impediment to lip-readers; in India’s case, mustaches. “A mustache that comes down over your upper lip can affect being able to see what’s going on,” professional lip-reader Consuelo González tells Atlas Obscura. And in India, the ubiquity of mustaches can be explained by their inherent ties to manliness.
But cultural and linguistic quirks in other parts of the world also make local languages difficult to lip-read. Mandarin, for example, is a tonal language (with four tones plus a fifth, neutral tone) whose meaning depends on sounds produced deep in the throat; a lip-reader cannot decipher tone from how a Mandarin speaker’s lips are moving. Similarly, Welsh and Dutch have guttural sounds that arise in the larynx, which again makes it difficult for lip-readers to make out words.
In Japan, the difficulty in lip-reading arises not from the pronunciation of the language but from people’s body language while speaking it. Japanese people tend not to display much emotion while speaking; they make less eye contact, which could otherwise help a lip-reader gauge tone or emotion, and tend to cover their mouths while laughing (especially women), Nosowitz writes.
But in oralist regions — places where lip-reading is more common than sign language — such as India, people tend to learn to gauge all the facial muscles (not just the lips), head movement, and body language to decipher what a person is saying. Deciphering phonemes and correctly interpreting lip movements are an essential, but ultimately small, part of the process. Since it draws on the entire body, a more accurate term for the phenomenon is speech-reading.
Unfortunately, Nosowitz surmises, we don’t know much about speech-reading yet; most research on the practice studies subjects with normal hearing, not people who are hard of hearing and have more practice with speech-reading. And with artificial intelligence swooping in soon (new technology can employ lip-reading techniques to help solve crimes), it’s time the research caught up with humans before we standardize the process across technology.