MaLLaM – Malaysia Large Language Model Promises Better Results for Malaysian-Language AI, With Multi-Language Support




Today, there are several large language models (LLMs) on the market, trained for various purposes using different architectures. One of them is a local offering called MaLLaM. Developed by Mesolitica, it has several advantages, among them the ability to understand local Malay, Jawi, Manglish, Mandarin and Indonesian as input.


This makes MaLLaM particularly well suited to powering chat-based services aimed at local users.



MaLLaM was trained in three sizes, with 1.1 billion, 3 billion and 5 billion parameters, on a 349GB dataset, equivalent to roughly 90 billion training tokens. According to its developers, although this dataset is relatively small, the model delivers better results than other models on input from Malaysian users, including multilingual use. This should allow more applications to be built on it, especially for Malaysian users.


MaLLaM is offered as open source and is available through Hugging Face. You can also find more details on Mesolitica's official site.
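
For developers who want to try it, here is a minimal sketch of loading a MaLLaM checkpoint with the Hugging Face transformers library. The model identifier mesolitica/mallam-1.1B-4096 is an assumption based on Mesolitica's naming; check the organisation's Hugging Face page for the exact repository names.

# A minimal sketch: loading a MaLLaM checkpoint from Hugging Face.
# The model ID below is an assumption -- verify it on Mesolitica's
# Hugging Face page before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mesolitica/mallam-1.1B-4096"  # hypothetical ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Prompt in local Malay; per the article, the model also accepts
# Jawi, Manglish, Mandarin and Indonesian input.
inputs = tokenizer("Apa itu MaLLaM?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))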
