Elon Musk had previously promised to release Grok as open source, and xAI has now begun offering the base model weights and network architecture of Grok-1. According to xAI, the released Grok-1 base model is a raw checkpoint from the pre-training phase, which concluded in October 2023.
This means the model has not been fine-tuned for any particular application, such as dialogue. xAI describes Grok-1 as a Mixture-of-Experts model with 314 billion parameters.
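In a Mixture-of-Experts design, only a subset of those 314 billion parameters is active for any given token, because a small router network sends each token to a few "expert" sub-networks. As a rough, hypothetical sketch with toy dimensions (this is not Grok-1's actual routing code, just an illustration of the general technique):

```python
import numpy as np

def moe_layer(x, gate_w, expert_ws, top_k=2):
    """Toy Mixture-of-Experts layer: route each token to its top-k experts.

    x         : (tokens, d_model) input activations
    gate_w    : (d_model, n_experts) router weights
    expert_ws : list of (d_model, d_model) per-expert weight matrices
    """
    logits = x @ gate_w                              # router score per expert
    top = np.argsort(logits, axis=-1)[:, -top_k:]    # indices of the top-k experts
    sel = np.take_along_axis(logits, top, axis=-1)   # softmax over selected experts only
    weights = np.exp(sel - sel.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)

    out = np.zeros_like(x)
    for t in range(x.shape[0]):                      # each token runs only k experts
        for k in range(top_k):
            e = top[t, k]
            out[t] += weights[t, k] * (x[t] @ expert_ws[e])
    return out

# Tiny demo: 4 tokens, model width 8, 4 experts, 2 active per token
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
gate_w = rng.normal(size=(8, 4))
expert_ws = [rng.normal(size=(8, 8)) for _ in range(4)]
print(moe_layer(x, gate_w, expert_ws).shape)  # (4, 8)
```

The practical upshot is that inference cost scales with the number of active experts per token rather than the full parameter count, although loading the complete checkpoint still requires a very large amount of memory.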
The release does not include the training data corpus. Even so, the move is notable, not least because Elon Musk is currently suing OpenAI for not releasing its models as open source as originally planned.
For those of you who are interested, the code is available on GitHub today.
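As a rough sketch of what fetching the checkpoint might look like, assuming the weights are mirrored on Hugging Face under the repository id "xai-org/grok-1" (an assumption; follow the instructions in xAI's own README for the real steps):

```python
# Hypothetical sketch: download the Grok-1 checkpoint with huggingface_hub.
# The repo id "xai-org/grok-1" and the local layout are assumptions;
# the authoritative instructions live in xAI's GitHub repository.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="xai-org/grok-1",   # assumed location of the released weights
    local_dir="./grok-1",       # where to place the (very large) checkpoint files
)
print("Checkpoint downloaded to:", local_dir)
```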