Maybe it's because I am under the influence of Spoken Here, but one of the key messages coming out of the GK3 conference last week was the increasing importance of promoting local content and language to address the digital divide.
It is well known that out of the 6,000 languages spoken on the planet, only a tiny percentage is represented on the web. Perhaps less intuitive are the factors that preclude multilingual digitization of content. They range from the lack of official recognition of minority languages and of local-language computing capacity, through the plethora of internet governing bodies involved in encoding projects, to the missing interface between linguistic and IT expertise.
History didn't help either.
When a language's borders don't match political ones, which government is responsible for codifying it? Far from being confined to the realm of computer geeks, these obstacles have a direct impact on development.
Take for instance the case of Bhutan. Until recently, there was no way the local government could use computers in Dzongkha: this hampered the development of e-government. Or think of emergency preparedness, when local communities, often illiterate, need to be warned quickly in their local language about an imminent threat. The lack of codification means that translation and speech software is not an option, even for English-speaking overseas research centers.
Projects like PAN Localization and Translate.org.za are trying to bridge the gap. But there's still a long way to go.
As the president of the African Academy of Languages noted, isn't it ironic that Africa, home to incredible linguistic diversity, is still conventionally divided into English-, Spanish-, French- or Portuguese-speaking countries - the languages of the colonizers?