colinplamondon 17 hours ago
Sure - and the people responsible for a whole new era of computing are the ones who asked, "given how incredible it is that this works at all at 0.5b params, let's scale it up." It's not hyperbole - that the description was accurate at small scale was the core insight that enabled the large scale.
staticman2 16 hours ago | parent
Well, it's obviously hyperbole, because "all human thought" is neither in a model's training data nor available in a model's output. If your gushing fits a 0.5b model, it probably doesn't tell us much about A.I. capabilities.