March 17, 2024

Open Release of Grok-1

We are releasing the base model weights and network architecture of Grok-1, our large language model. Grok-1 is a 314 billion parameter Mixture-of-Experts model trained from scratch by xAI. This is the raw base model checkpoint from the Grok-1 pre-training phase, which concluded in October 2023.
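For readers unfamiliar with the Mixture-of-Experts design, the sketch below shows the core idea in JAX: a router scores each token against a set of expert MLPs, and only the top-scoring experts are mixed into the output. This is purely illustrative and is not the Grok-1 implementation; the layer sizes are toy values, and the expert count and top-2 routing here are assumptions for the example.

```python
# Minimal sketch of top-2 Mixture-of-Experts routing in JAX.
# NOT the Grok-1 code; shapes and expert count are illustrative assumptions.
import jax
import jax.numpy as jnp

NUM_EXPERTS = 8   # assumed for the sketch
TOP_K = 2         # two experts mixed per token
D_MODEL = 64      # toy hidden size
D_FF = 256        # toy feed-forward size

def init_params(key):
    k_router, k_w1, k_w2 = jax.random.split(key, 3)
    return {
        # Router maps each token to one score per expert.
        "router": jax.random.normal(k_router, (D_MODEL, NUM_EXPERTS)) * 0.02,
        # Each expert is a small two-layer MLP.
        "w1": jax.random.normal(k_w1, (NUM_EXPERTS, D_MODEL, D_FF)) * 0.02,
        "w2": jax.random.normal(k_w2, (NUM_EXPERTS, D_FF, D_MODEL)) * 0.02,
    }

def moe_layer(params, x):
    """x: [tokens, D_MODEL] -> [tokens, D_MODEL] via top-2 routing."""
    logits = x @ params["router"]                     # [tokens, NUM_EXPERTS]
    top_vals, top_idx = jax.lax.top_k(logits, TOP_K)  # pick 2 experts per token
    gates = jax.nn.softmax(top_vals, axis=-1)         # weights over chosen experts

    # For clarity, run every expert densely and then select; real MoE systems
    # dispatch each token only to its chosen experts to save compute.
    def expert_fn(w1, w2):
        return jax.nn.gelu(x @ w1) @ w2               # [tokens, D_MODEL]

    all_out = jax.vmap(expert_fn)(params["w1"], params["w2"])  # [E, tokens, D]
    chosen = jnp.take_along_axis(
        jnp.swapaxes(all_out, 0, 1),                  # [tokens, E, D]
        top_idx[:, :, None], axis=1)                  # [tokens, TOP_K, D]
    return jnp.sum(gates[:, :, None] * chosen, axis=1)

key = jax.random.PRNGKey(0)
params = init_params(key)
tokens = jax.random.normal(key, (4, D_MODEL))
print(moe_layer(params, tokens).shape)  # (4, 64)
```

The appeal of this design at Grok-1's scale is that only the experts selected per token contribute to the forward pass, so active compute per token is a fraction of the full 314 billion parameters.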