aleph_minus_one 4 hours ago

Couldn't you simply increase the temperature of the model to somewhat mitigate this effect?

lbrito 3 hours ago | parent | next [-]

I kind of think of that as just increasing the standard deviation. It's been a while since I experimented with this, but I remember trying a temperature of 1 and the output was gibberish, like base64 gibberish. So something like 0.5 doesn't necessarily solve this problem: it just flattens the distribution and makes the output less coherent, with rarer tokens, but the underlying distribution stays the same.
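(A minimal sketch of what temperature does to the sampling distribution; the function name and example logits here are illustrative, not from any particular model:)

```python
import math

def softmax_with_temperature(logits, temp):
    # Divide logits by temperature before the softmax.
    # temp > 1 flattens the distribution toward uniform;
    # temp < 1 sharpens it. Either way the ranking of
    # tokens is unchanged, since dividing by a positive
    # constant is monotonic.
    scaled = [x / temp for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
p_sharp = softmax_with_temperature(logits, 0.5)
p_flat = softmax_with_temperature(logits, 2.0)
# At the higher temperature the top token's probability shrinks and
# rare tokens gain mass, but the order of the tokens is the same.
```

This is the sense in which temperature only rescales the existing distribution rather than changing which outputs the model considers likely.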

swyx 3 hours ago | parent | prev | next [-]

you have to know that your "simply" is carrying too much weight. here are some examples of why temperature alone is not enough and you need to run active world models: https://www.latent.space/p/adversarial-reasoning

mannykannot 3 hours ago | parent | prev [-]

When applied to insightful writing, that is much more likely to dull the point rather than preserve or sharpen it.