Popular social app Snapchat has launched a new feature to help Australians learn more than 170 Indigenous words.

The social media company partnered with First Languages Australia to bring the Aboriginal and Torres Strait Islander languages to users' screens through the app's language-learning lenses.

Each lens uses augmented reality and machine learning to identify objects and display their name in Wiradjuri (central New South Wales), Yugambeh (south-east Queensland), Wakka Wakka (central Queensland) and Yawuru (Broome in Western Australia) languages.

Yugambeh descendant Shaun Davies said social media was the modern “campfire” where stories were shared.


“In the old days, our Elders taught lingo by the campfire,” he said.

“But the camp has changed, and the fire that people stare every day at is not the same.

“Technology has become a central place in the home and now our lingo needs to go there if it is to survive for mobo jahjum (future generations).”

First Languages Australia chief executive Beau Williams said the lenses would boost recognition of the languages.

“We know millions of young Aussies use Snapchat every day – so this is an incredible opportunity for them to experience our First Nations’ languages in a fun and interactive way,” he said.

Among the objects are ear (wudha in Wiradjuri), spider (wanggarranggarra in Yawuru) and
hat (binka in Yugambeh).


Snapchat APAC general manager Kathryn Carter said it was important to share Indigenous languages with young Australians.

“We’re thrilled to collaborate with First Languages Australia, and hope these Lenses represent our small part in supporting Australia’s Aboriginal and Torres Strait Islander communities in a unique way,” she said.

The lenses are accessible by searching Learn Wiradjuri, Learn Yugambeh, Learn Wakka Wakka or Learn Yawuru, or scanning the Snapcodes below.

Users point their cameras at an object to scan it, and the lens automatically displays the object’s English and Indigenous language names in real time, along with an audible clip.