In Honolulu last year, Qualcomm demonstrated the Snapdragon 8 Gen 3's ability to run large language models (LLMs) and image generation on devices without an internet connection. This capability has already begun appearing on Samsung, Xiaomi, Oppo, and Honor devices launched this year. Qualcomm is now announcing that it has successfully run Large Multimodal Models (LMMs) directly on Android devices, and even on Windows laptops powered by its chips.
The Large Language and Vision Assistant (LLaVA) model, with more than 7 billion parameters, runs directly on an Android device. It is an LMM that can understand not only text input but also images supplied by the user. Because the model runs entirely on device, the user's personal data never leaves the phone and no cloud processing costs are incurred. It also opens the door to a personal AI fine-tuned on the device owner's own data.
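To illustrate how an LMM like LLaVA combines the two input types, here is a toy NumPy sketch of the general architecture: a vision encoder's per-patch features are projected into the language model's token-embedding space and concatenated with the text tokens. All dimensions below are made up for illustration; the real model uses a CLIP-style vision encoder and a 7B-parameter language model.

```python
import numpy as np

rng = np.random.default_rng(0)

d_vision, d_llm = 64, 128   # hypothetical embedding widths
n_patches, n_text = 16, 8   # image patches and text tokens

# 1. Vision encoder output: one feature vector per image patch.
image_features = rng.standard_normal((n_patches, d_vision))

# 2. A learned projection maps vision features into the LLM's
#    token-embedding space (in LLaVA this is a small MLP).
W_proj = rng.standard_normal((d_vision, d_llm)) * 0.02
image_tokens = image_features @ W_proj

# 3. Text tokens are embedded as usual.
text_tokens = rng.standard_normal((n_text, d_llm))

# 4. The LLM processes image and text tokens as one sequence,
#    which is how the model "sees" the user's image alongside
#    the user's question about it.
sequence = np.concatenate([image_tokens, text_tokens], axis=0)
print(sequence.shape)  # (24, 128): 16 image tokens + 8 text tokens
```

This is only the fusion step; the heavy lifting happens inside the language model that consumes the combined sequence.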
Qualcomm also successfully demonstrated Low Rank Adaptation (LoRA), a technique for customizing a model by training only a small set of additional parameters, using a Stable Diffusion model with 1 billion parameters on an Android device. The device can generate images in seconds, with no need to rely on online generative AI services.
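The idea behind LoRA can be sketched in a few lines of NumPy. Instead of updating a large frozen weight matrix W during fine-tuning, LoRA trains two small matrices A and B whose low-rank product is added to W. The sizes below are made up for illustration; real diffusion and LLM layers are far larger.

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, rank = 512, 512, 4   # rank is much smaller than d_in

W = rng.standard_normal((d_out, d_in))        # frozen base weights
A = rng.standard_normal((rank, d_in)) * 0.01  # trainable, small init
B = np.zeros((d_out, rank))                   # trainable, starts at zero

# Effective weights at inference time: W + B @ A.
# Because B starts at zero, the adapted model initially behaves
# exactly like the base model; training only touches A and B.
W_adapted = W + B @ A

full_params = W.size            # what full fine-tuning would update
lora_params = A.size + B.size   # what LoRA actually trains
print(f"trainable params: {lora_params} vs {full_params}")
```

Here LoRA trains 4,096 parameters instead of 262,144, which is why a LoRA adapter is small enough to train and swap on a phone-class device.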
Finally, Qualcomm also managed to run an LMM on a Windows PC powered by the Snapdragon X Elite chip. Like its smartphone counterparts, the Snapdragon chip for Windows computers is equipped with an NPU powerful enough to run LMMs that accept both text and audio input.
Also announced today is the Qualcomm AI Hub, which offers more than 75 AI models optimized to run on Snapdragon and other Qualcomm chips. Developers can use the AI Hub to build AI-powered applications for these devices. Among the available models are Whisper, ControlNet, Stable Diffusion, and Baichuan 7B.