ABENA
  • Project : ABENA
  • Company : Algorine & GhanaNLP
  • Status : Work in progress
  • Date : 2021-07-15

Interested in seeing how it works? Try ABENA here:

Explore

We named our main model ABENA, for "A BERT model Now in Akan".

This model, which we have shared in our Kasa Library repo, enables a computer to begin to reason in Twi computationally. However, it is "static" in the sense that its vectors do not change with different contexts. State-of-the-art NLP in high-resource languages such as English has largely moved away from static embeddings to more sophisticated "dynamic" embeddings capable of understanding changing contexts. The most prominent example of such a dynamic embedding architecture is BERT (Bidirectional Encoder Representations from Transformers).
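To make the static/dynamic distinction concrete, here is a minimal sketch showing that a contextual model assigns the same word different vectors in different sentences, something a static embedding table cannot do. It uses multilingual BERT purely as a stand-in checkpoint for illustration, not ABENA itself.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Stand-in BERT-style checkpoint for the demo (not the ABENA model).
name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name)

def contextual_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the vector assigned to the first subword of `word`
    as it appears in `sentence`."""
    encoded = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**encoded).last_hidden_state[0]
    first_subword = tokenizer.convert_tokens_to_ids(tokenizer.tokenize(word)[0])
    position = encoded.input_ids[0].tolist().index(first_subword)
    return hidden[position]

# The same surface word gets a different vector in each context; a static
# embedding would return one fixed vector for "bank" in both sentences.
v1 = contextual_vector("He deposited cash at the bank.", "bank")
v2 = contextual_vector("They camped on the bank of the river.", "bank")
print(torch.cosine_similarity(v1, v2, dim=0).item())  # noticeably below 1.0
```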

We share all models through the Hugging Face Model Hub, allowing you to begin executing modern NLP on your Twi data in just a few lines of Python code. Concepts are linked to Kaggle notebooks illustrating them whenever possible, so that you can test out the ideas in code very quickly. To use ABENA to fill in the blanks, for instance, it is sufficient to execute the following Python code.
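The sketch below uses the Hugging Face fill-mask pipeline. The checkpoint name is an assumption about the published Hub ID, and the Twi sentence is only an example; verify both against the Model Hub and the linked notebooks.

```python
from transformers import pipeline

# Fill-mask with ABENA; the model ID is an assumed Hub checkpoint name.
fill_mask = pipeline(
    "fill-mask",
    model="Ghana-NLP/abena-base-akuapem-twi-cased",
)

# An example Twi sentence with one word masked out ([MASK] is the
# standard BERT mask token); the model proposes candidate fillers.
for prediction in fill_mask("Eyi de ɔhaw kɛse baa [MASK] hɔ."):
    print(prediction["token_str"], round(prediction["score"], 3))
```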


Features:

  • Addresses both the Asante and Akuapem Twi dialects (see the sketch after this list for switching between dialect checkpoints).
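Assuming a separate checkpoint is published per dialect (the IDs below are guesses to be verified on the Model Hub), switching dialects would only require changing the model name:

```python
from transformers import pipeline

# Assumed per-dialect checkpoint IDs; confirm on the Hugging Face Model Hub.
akuapem_fill = pipeline("fill-mask", model="Ghana-NLP/abena-base-akuapem-twi-cased")
asante_fill = pipeline("fill-mask", model="Ghana-NLP/abena-base-asante-twi-uncased")
```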