Meta's Llama-2 large language model (LLM) is available to researchers and commercial users starting today. Coinciding with the launch, Qualcomm says Llama-2 will be supported natively on its chips starting in 2024.
Chips from the Snapdragon family are confirmed to support the model in smartphones as well as in laptops and AR/VR headsets. With the model running directly on the device, processing can be faster and no internet connection is required. More importantly, the data being processed stays on the device, sidestepping the privacy concerns that have become a major issue for the industry.
This opens up opportunities for future products with better artificial intelligence (AI) capabilities. With generative AI now drawing more and more consumer attention, the ability to run models on-device will accelerate its integration into everyday life. Meta has already confirmed that its Quest headsets ship with Qualcomm chips, so when Quest 3 launches this September, Llama-2 integration will likely be in place.