lm28469 2 days ago

Given an input, clang will always give the same output; not quite the same for LLMs. Also, nobody ever claimed compilers were intelligent or that they "understood" things.

conradev 2 days ago | parent | next [-]

The determinism depends on the architecture of the model!

Symbolica is working on more deterministic/quicker models: https://www.symbolica.ai

I also wish it were that easy, but compiler determinism is hard too: https://reproducible-builds.org

9rx 2 days ago | parent | prev | next [-]

An LLM will also give the same output for the same input when the temperature is zero[1]. It only becomes non-deterministic if you choose for it to be, which is equally true of a C compiler: you can add as many random conditionals as you please.

But there is nothing about a compiler that implies determinism. A compiler is defined by function (taking input on how you want something to work and outputting code), not design. Implementation details are irrelevant. If you use a neural network to compile C source into machine code instead of more traditional approaches, it most definitely remains a compiler. The function is unchanged.

[1] "Faulty" hardware found in the real world can sometimes break this assumption. But a C compiler running on faulty hardware can change the assumption too.
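A minimal sketch of the claim, assuming a simplified sampler (the function names here are illustrative, not any provider's API): at temperature 0, the sampling step collapses to an argmax over the logits, which is a pure function of its input.

```python
import math
import random

def sample(logits, temperature):
    """Pick a token index from logits. At temperature 0 this reduces
    to argmax, a pure function of the logits: same input, same output,
    no randomness involved."""
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    # Otherwise scale the logits and sample from the softmax distribution.
    m = max(l / temperature for l in logits)
    weights = [math.exp(l / temperature - m) for l in logits]
    return random.choices(range(len(logits)), weights=weights)[0]

logits = [1.2, 3.4, 0.7]
# Greedy decoding is deterministic: repeated calls always agree.
assert all(sample(logits, 0) == sample(logits, 0) for _ in range(100))
```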

whimsicalism 2 days ago | parent | next [-]

Currently, LLMs from major providers are not deterministic even with temp=0; there are startups focusing on this issue (among others): https://thinkingmachines.ai/blog/defeating-nondeterminism-in...
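One commonly cited mechanism behind this (a sketch with illustrative numbers, not taken from any real model): floating-point addition is not associative, so the same logical sum can change when a GPU kernel groups the operations differently, e.g. when batch size alters the reduction order.

```python
# Floating-point addition is not associative: grouping matters.
a, b, c = 1e16, -1e16, 1.0

left = (a + b) + c   # cancellation first, then add 1.0 -> 1.0
right = a + (b + c)  # the 1.0 is absorbed into the huge sum -> 0.0

assert left == 1.0
assert right == 0.0
assert left != right
```

Run the same reduction in two different orders and even temperature-0 argmax can flip when two logits are nearly tied.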

lm28469 2 days ago | parent | prev [-]

You can test that yourself in 5 seconds and see that, even at a temperature of 0, you never get the same output.

9rx 2 days ago | parent [-]

Works perfectly fine for me.

Did you do that stupid HN thing where you failed to read the entire comment and then went off to try it on faulty hardware?

lm28469 2 days ago | parent [-]

No, I did that HN thing where I went to an LLM, set the temperature to 0, pasted your comments in, and got wildly different outputs every single time I did so.

9rx 2 days ago | parent | next [-]

"Went" is a curious turn of phrase, but I take it to mean that you used an LLM on someone else's hardware of unknown origin? How are you ensuring that said hardware isn't faulty? It is a known condition. After all, I already warned you of it.

Now try it on deterministic hardware.

lm28469 a day ago | parent [-]

Feel free to share your experiments. I cannot reproduce them, but you seem very sure about your stance, so I am convinced you gave it a try, right?

9rx a day ago | parent [-]

Do you need to reproduce them? You can simply look at how an LLM is built, no? It is not exactly magic.

But what are you asking for, exactly? Do you want me to copy and paste the output (so you can say it isn't real)? Are you asking for access to my hardware? What does sharing mean here?

2 days ago | parent | prev | next [-]
[deleted]
NewsaHackO 2 days ago | parent | prev [-]

Was the seed set to the same value every time?
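The point about seeding, sketched with Python's stdlib PRNG (real inference stacks seed their own samplers, but the principle is the same): with a fixed seed, pseudo-random sampling is reproducible run to run.

```python
import random

def sample_run(seed, n=5):
    """Draw n pseudo-random values from a freshly seeded PRNG.
    The same seed reproduces the same sequence exactly."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(n)]

# Identical seeds give identical draws; different seeds diverge.
assert sample_run(42) == sample_run(42)
assert sample_run(42) != sample_run(43)
```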

whimsicalism 2 days ago | parent [-]

https://thinkingmachines.ai/blog/defeating-nondeterminism-in...

bewo001 2 days ago | parent | prev [-]

Hm, some things compilers do during optimization would have been labelled AI during the last AI bubble.