Goŋ dalu lakaramirr (2020)
Goŋ dalu lakaramirr is an experiment: a prototype that makes it possible to communicate with an operating system using hand gestures. It grew out of a conversation with Graham Wilfred Jnr, a Yolŋu artist and designer currently living in Mparntwe/Alice Springs, who asked, "Why couldn't I communicate with my computer using hand signs?" The system is trained using Google's machine learning platform Teachable Machine to recognise and respond to a set of Yolŋu hand signs, part of an endemic sign language of Graham's people, the Yolŋu of North East Arnhem Land.
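In outline, a system like this takes the label-and-confidence output of an image classifier (such as a Teachable Machine export) and maps the most confident sign to an operating-system action. The sketch below is a minimal illustration of that dispatch step only; the sign labels, the confidence threshold, and the action names are all hypothetical placeholders, not details from the actual prototype.

```python
# Hypothetical dispatch step: classifier output -> OS action.
# Sign labels, threshold, and action names are illustrative assumptions,
# not taken from the Goŋ dalu lakaramirr prototype itself.

CONFIDENCE_THRESHOLD = 0.8  # assumed cut-off; tune per model

# Assumed mapping from recognised sign labels to OS actions
SIGN_ACTIONS = {
    "open": "launch_browser",
    "close": "close_window",
    "stop": "lock_screen",
}

def dispatch(predictions):
    """Given a {label: confidence} dict from an image classifier,
    return the action for the most confident sign, or None when no
    sign is recognised confidently enough."""
    label, confidence = max(predictions.items(), key=lambda kv: kv[1])
    if confidence < CONFIDENCE_THRESHOLD:
        return None
    return SIGN_ACTIONS.get(label)

# Example: a confident "open" sign triggers its mapped action
print(dispatch({"open": 0.95, "close": 0.03}))  # -> launch_browser
# A low-confidence frame triggers nothing
print(dispatch({"open": 0.40, "close": 0.30}))  # -> None
```

In a live setting the `predictions` dict would come from classifying each webcam frame; thresholding matters because the classifier emits a guess for every frame, including frames with no sign present.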
Indigenous sign languages exist across Australia in different states of use. Yolŋu people describe theirs as "Dhiyal djorra'ŋur ga lakaram dhärukpuy ga nhatha ŋuli limurr bäki", or "the language we use to communicate with each other when we don't want to speak" (James, 2019). Sign languages like this one are stand-alone languages, meaning they are not signed versions of a spoken language, in this case Yolŋu Matha, but complete, alternate languages traditionally used in a range of contexts, such as hunting and communicating over distances (Bentley, 2019). These languages are evidence of the complex "bimodal-bilingualism" these cultures possess: the ability to be fluent in both auditory and visual languages, an ability which is rare globally (Adone and Maypilama, 2014).
Goŋ dalu lakaramirr is a Yolŋu phrase roughly meaning "contains the attribute of the hand, in action, telling". It was suggested as an appropriate name for this prototype by SAVE Yolŋu Sign Language, a team of elders collaborating with the linguist Dr Bentley James, whose ongoing work aims to preserve Yolŋu hand signs. While there are at least 1800 signs in use, these languages are severely endangered due to their highly contextual nature and the ongoing forces of colonisation that are changing people's lives so dramatically.