Grosvenor an hour ago
Could this generate pressure to produce less memory-hungry models?
hodgehog11 an hour ago | parent | next
There has always been pressure to do so, but there are fundamental performance bottlenecks tied to model size. What I can see happening is a push toward training exclusively for search-based rewards, so that the model isn't required to compress a large proportion of the internet into its weights. But this is likely to be much slower and to come with initial performance costs that frontier model developers will not want to incur.
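
(A rough, hypothetical sketch of what a "search-based reward" could look like. None of this is an established recipe; the function names, scoring rule, and toy data are purely illustrative. The idea is to reward an answer only when it is grounded in passages the model actually retrieved, rather than recalled from its weights.)

    # Hypothetical sketch: reward answers grounded in retrieved evidence,
    # not parametric memorization. All names and values are illustrative.
    from dataclasses import dataclass

    @dataclass
    class Episode:
        question: str
        answer: str
        retrieved_passages: list[str]  # passages the model fetched via a search tool

    def search_grounded_reward(ep: Episode, gold_answer: str) -> float:
        """Reward correct answers, but mainly when supported by retrieval."""
        correct = gold_answer.lower() in ep.answer.lower()
        supported = any(gold_answer.lower() in p.lower() for p in ep.retrieved_passages)
        if correct and supported:
            return 1.0   # correct and backed by retrieved evidence
        if correct:
            return 0.2   # correct but apparently from memorization alone
        return 0.0       # incorrect

    # Toy usage
    ep = Episode(
        question="What year was the transistor invented?",
        answer="The transistor was invented in 1947.",
        retrieved_passages=["Bell Labs demonstrated the first transistor in 1947."],
    )
    print(search_grounded_reward(ep, gold_answer="1947"))  # -> 1.0

Under a reward like this, the cheapest way for the model to score well is to lean on the search tool instead of storing the facts in its weights, which is the trade-off the comment above is pointing at.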
lofaszvanitt 41 minutes ago | parent | prev
Of course, and then watch those companies get reined in.