erwald 3 hours ago
Where did you read that it was trained on Ascends? I've only seen information suggesting that you can run inference with Ascends, which is obviously a very different thing. The source you link also just says: "The latest model was developed using domestically manufactured chips for inference, including Huawei's flagship Ascend chip and products from leading industry players such as Moore Threads, Cambricon and Kunlunxin, according to the statement."
cherryteastain 2 hours ago | parent
I read the "for inference" bit in that sentence you quoted as a qualifier on the chips, i.e. the chips were originally developed for inference but are now being used for training too. Note that Z.ai also publicly announced a month ago that they trained another model, GLM-Image, entirely on Huawei Ascend silicon [1].

[1] https://www.scmp.com/tech/tech-war/article/3339869/zhipu-ai-...