
PyTorch + HuggingFace version of Grok-1 is now available!

Grok-1, the 314-billion-parameter Mixture-of-Experts (MoE) model open-sourced by Musk's xAI, is the largest open-source large language model to date; its license permits free distribution and commercial use of modified versions.
 
Grok-1 has attracted significant attention in the open-source community since its release and reached No. 1 on GitHub Trending worldwide.


However, Grok-1 is built with Rust + JAX, which raises the barrier to entry for users accustomed to mainstream software ecosystems such as Python + PyTorch + HuggingFace.
 
The Colossal-AI team promptly followed up, providing an easy-to-use Python + PyTorch + HuggingFace version of Grok-1 for all AI developers.
 
Grok-1 greedy search test
 
HuggingFace download: https://huggingface.co/hpcaitech/grok-1
 

Tutorial

 
Download the tokenizer from the official grok-1 repository:
 
wget https://github.com/xai-org/grok-1/raw/main/tokenizer.model
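 
The downloaded file is a standard SentencePiece model. As a quick sanity check, it can be loaded and round-tripped directly; this is a minimal sketch assuming the sentencepiece package is installed (pip install sentencepiece):
 
import sentencepiece as spm

# Load the tokenizer model downloaded above.
sp = spm.SentencePieceProcessor(model_file="tokenizer.model")

# Round-trip a prompt to confirm the tokenizer works.
ids = sp.encode("The answer to life the universe and everything is", out_type=int)
print(ids)             # token ids the model will consume
print(sp.decode(ids))  # should reproduce the original prompt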
 
Run the inference script:
 
./run_inference_fast.sh hpcaitech/grok-1 tokenizer.model
 
Model weights will be downloaded and loaded automatically.
More details can be found at:
https://github.com/hpcaitech/ColossalAI/tree/main/examples/language/grok-1
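 
For reference, the script's greedy-search behavior can be approximated in plain HuggingFace code. The following is a minimal sketch, not the official entry point: it assumes the hpcaitech/grok-1 checkpoint exposes a standard causal-LM interface via trust_remote_code, and that sufficient GPU memory is available (314 billion parameters occupy several hundred GB even in bf16):
 
import torch
import sentencepiece as spm
from transformers import AutoModelForCausalLM

# Tokenize the prompt with the SentencePiece model downloaded earlier.
sp = spm.SentencePieceProcessor(model_file="tokenizer.model")
prompt = "The answer to life the universe and everything is"
input_ids = torch.tensor([sp.encode(prompt, out_type=int)])

# Download and load the PyTorch weights from HuggingFace; trust_remote_code
# pulls in the custom Grok-1 modeling code shipped with the checkpoint.
model = AutoModelForCausalLM.from_pretrained(
    "hpcaitech/grok-1",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # shard the weights across all visible GPUs
)
model.eval()

# Greedy search (do_sample=False), matching the test shown above.
with torch.no_grad():
    output = model.generate(input_ids.to(model.device),
                            max_new_tokens=100, do_sample=False)
print(sp.decode(output[0].tolist()))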
 
Colossal-AI will soon introduce further optimizations for Grok-1, such as parallel acceleration and quantization to reduce deployment cost. Stay tuned.
 
Colossal-AI open-source repository:
https://github.com/hpcaitech/ColossalAI

 
