pagekicker 2 days ago
I asked Grok to visualize this: https://grok.com/share/bGVnYWN5_463d51c8-d473-47d6-bb1f-6666...

*Caption for the two images:* Artistic visualization of the universal low-parameter subspaces discovered in large neural networks (as described in “The Unreasonable Effectiveness of Low-Rank Subspaces,” arXiv:2512.05117). The bright, sparse linear scaffold in the foreground represents the tiny handful of dominant principal directions (often ≤16 per layer) that capture almost all of the signal variance across hundreds of independently trained models. These directions form a flat, low-rank “skeleton” that is remarkably consistent across architectures, tasks, and random initializations. The faint, diffuse cloud of connections fading into the dark background symbolizes the astronomically high-dimensional ambient parameter space (billions to trillions of dimensions), almost all of whose directions carry near-zero variance and can be discarded with negligible loss in performance. The sharp spectral decay creates a dramatic “elbow,” leaving trained networks effectively confined to this thin, shared, low-dimensional linear spine floating in an otherwise vast and mostly empty void.
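For anyone who wants to see the “elbow” the caption describes rather than an artistic rendering of it, here is a minimal sketch (not from the paper) of the kind of spectral analysis involved: take a layer’s weight matrix, compute its singular value spectrum, and count how few directions are needed to capture most of the variance. The synthetic rank-16 signal and the 99% threshold are illustrative assumptions, not the paper’s actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a trained layer: a low-rank "signal" plus small noise,
# mimicking the sharp spectral elbow described in the caption.
d_out, d_in, signal_rank = 512, 512, 16
signal = rng.normal(size=(d_out, signal_rank)) @ rng.normal(size=(signal_rank, d_in))
weights = signal + 0.01 * rng.normal(size=(d_out, d_in))

# Singular value spectrum of the layer; squared singular values give the
# variance carried by each principal direction.
singular_values = np.linalg.svd(weights, compute_uv=False)
variance = singular_values**2
cumulative = np.cumsum(variance) / variance.sum()

# How many principal directions are needed to explain 99% of the variance?
k = int(np.searchsorted(cumulative, 0.99)) + 1
print(f"top {k} of {len(singular_values)} directions explain 99% of the variance")
```

On a matrix like this, the count lands near the planted rank (here 16), far below the ambient dimension, which is the “thin spine in a vast void” picture the caption is getting at.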
100721 2 days ago
Acting as a pass-through for LLMs is logically equivalent to wiring up a bot account.