ITT nobody remembers gpt2 anymore and that makes me sad
This model was trained on 6T tokens and has a 256k-token vocabulary (so a 256k-row embedding table), quite different from a gpt2 model of comparable size.
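Rough back-of-the-envelope sketch of what the vocab difference alone means for parameter count; GPT-2 small's 50,257-token vocab and 768-dim embeddings are real, the 2048 hidden size for the 256k-vocab model is just an assumed illustrative value:

```python
# embedding parameters = vocab_size * hidden_dim
def embedding_params(vocab_size: int, hidden_dim: int) -> int:
    return vocab_size * hidden_dim

# GPT-2 small: 50,257-token vocab, 768-dim embeddings (~38.6M params)
gpt2_small = embedding_params(50_257, 768)

# Hypothetical modern small model: 256k vocab, assumed 2048-dim hidden size (~524M params)
modern_256k = embedding_params(256_000, 2_048)

print(f"GPT-2 small embedding table: {gpt2_small / 1e6:.1f}M params")
print(f"256k-vocab model (assumed 2048 dim): {modern_256k / 1e6:.1f}M params")
```

Point being, in a "gpt2-sized" model a 256k embedding table can eat a large fraction of the total parameters before you even count the transformer blocks, so same headline size doesn't mean same model.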