How auditory-visual synaesthesia gave Kim Elms a unique gift for languages
Seeing sounds: How synaesthesia shapes a language expert's world

For Kim Elms, a car journey with loud music isn't just noisy – it's a visual spectacle of static images and flashing lights in her mind's eye. The 44-year-old speech pathologist has auditory-visual synaesthesia, a neurological condition where hearing sounds automatically triggers visual experiences.

A World Painted in Sound

Kim describes the sounds she hears as shapes, akin to sound waves on a screen, or like "little neurons connecting and space nebulas exploding." For her, no sound arrives without an image. This unique sensory blending, however, remained a mystery until her thirties. Before she could put a name to it, she simply knew she had an uncanny gift for languages.

Her academic journey was marked by effortless success in language learning. She excelled in Japanese at school, where words and sounds appeared to her as memorable images. At university, her majors in Spanish, Korean, and Indonesian required "no effort at all." This innate talent led to a remarkable performance on a military language aptitude test after she joined the air force as an intelligence officer. "No one's ever managed to get every answer right," she was told when she received her results, even though she had not tried particularly hard.

From Discovery to a Career with Siri

The term 'synaesthesia' entered her vocabulary while she was training as a speech pathologist after leaving the military. While learning about neurodivergence, she read about the condition but didn't immediately connect it to her own life. The pivotal moment came during computational linguistics work on speech-to-text systems, when she consciously realised that the shapes she saw were linked to sounds and phonemes.

Connecting with others online, she found that most saw sounds as colours, whereas her shapes are primarily black and white. The exception is high-frequency sounds, which appear as bright white before shifting through yellows and oranges to reds. This visual sensitivity allows her to pass hearing tests at "ridiculously quiet decibels," seeing pure tones as coloured flashes.

Her specialised skill in phonetics led to a career-defining opportunity. While she was transcribing Indigenous creole languages for the University of Wollongong, her supervisor connected her with Apple's head linguist. This resulted in a 90-day project in Japan, where on her first day she discovered she would be helping to develop Siri. She has since contributed to speech-to-text projects for major companies including TomTom GPS and Bank of America.

Living with a Noisy World

While her synaesthesia has been a professional boon, it presents daily challenges. "My brain feels busy pretty much constantly thanks to the noisy world we live in," she admits, often resorting to earplugs for relief. Running to music is her preferred escape, as it's the only time her brain is quiet and she doesn't visually perceive sound.

Despite the sensory overload, Kim wouldn't change her condition. She views her work with words and sounds as her ikigai – a Japanese concept meaning her reason for being. Confident in the enduring need for human nuance in language, she isn't worried about AI, saying she "wouldn't trust AI" to analyse subtle accents such as the western Sydney Lebanese dialect she can break down instantly.

Professor Anina Rich, a cognitive neuroscientist and chair of the Synaesthesia Research Group at Macquarie University, contributed expert insight to this report on Kim Elms's extraordinary sensory world.