Aerolfos | 9 hours ago

> They are retrained every 12-24 months and constantly getting new/updated reinforcement learning layers

This is true now, but it can't stay true, given the enormous cost of training. Inference is expensive enough as it is, and the training runs are funded entirely by venture-capital "startup" money that pretty much everyone expects to dry up sooner or later. You can't plan a business around something that volatile.
manmal | 1 hour ago

GPT-5.1 was based on data that was over 15 months old, IIRC, and it wasn't that bad. Adding new layers isn't that expensive.
0x696C6961 | 8 hours ago
You don't need to retrain the whole thing from scratch every time. | ||
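A minimal sketch of what "not retraining from scratch" usually looks like in the open ecosystem: parameter-efficient fine-tuning, where the base weights stay frozen and only small adapter matrices are trained. This assumes the Hugging Face transformers and peft libraries; the model name and hyperparameters are purely illustrative, not any lab's actual setup.

```python
# Sketch: LoRA fine-tuning updates a tiny fraction of parameters,
# so refreshing a model is far cheaper than a full pretraining run.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # stand-in base model

lora = LoraConfig(
    r=8,                        # low-rank adapter dimension
    lora_alpha=16,
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora)
model.print_trainable_parameters()  # typically well under 1% of total params
```

The frontier labs' internal pipelines are obviously more involved than this, but the cost asymmetry is the same idea: the expensive pretraining run is amortized, and the periodic updates ride on top of it.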
astrange | 3 hours ago
Google's training runs aren't funded by VC. The Chinese models probably aren't either. | ||