jmclnx 7 days ago

I am not a fan of today's concept of "AI", but to be fair, building today's software is not for the faint of heart; very few people get it right on try 1.

Years ago I gave up compiling these large applications altogether. I once compiled Firefox via FreeBSD's (v8.x) ports system, and that alone was a nightmare.

I cannot imagine what it would be like to compile GNOME 3, KDE, or LibreOffice. Emacs is the largest thing I compile now.

anotherhue 7 days ago | parent [-]

I suggest trying Nix: because builds are reproducible, those nasty compilation demons get solved once and for all (and usually by someone else).
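
As a concrete starting point (a minimal sketch; the package here is just an illustration, and it assumes the public binary cache is reachable), dropping into a shell with a prebuilt Firefox is one command:

    nix-shell -p firefox

If that derivation has already been built by the cache, nothing gets compiled locally at all.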

trod1234 7 days ago | parent [-]

The problem with Nix is that it's often claimed to be reproducible, but the proof isn't really there because of the existence of hash collisions. The definition of "reproducible" is taken in such an isolated context as to be almost absurd.

While a collision hasn't yet been found for a SHA-256 package hash on Nix, by the pigeonhole principle collisions must exist, and the computer would not be able to decide between the two packages involved in such a collision, leading to system-level failure with errors that have no visible link to their cause (due to the properties involved, and longstanding CS problems in computation).
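
To make the pigeonhole point concrete, here is a toy Python sketch with a deliberately truncated hash (not a claim about full SHA-256): once more distinct inputs are hashed than the output space can hold, a repeat is guaranteed, and the birthday effect makes one show up far sooner.

    # toy demonstration: truncate SHA-256 to 16 bits so collisions are
    # guaranteed past 2**16 inputs and typically appear within a few hundred
    import hashlib

    seen = {}
    for i in range(100_000):
        msg = f"package-{i}".encode()
        digest = hashlib.sha256(msg).hexdigest()[:4]  # keep 16 bits
        if digest in seen:
            print(f"collision: {seen[digest]!r} and {msg!r} -> {digest}")
            break
        seen[digest] = msg

The full 256-bit output space is astronomically larger, of course; the sketch only illustrates the guarantee, not the odds.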

These things, generally speaking, contain properties of mathematical chaos: a state that is inherently unknowable/unpredictable, and one that no admin would ever approach or touch because it's unmaintainable. The normally tightly coupled error-handling code is no longer tightly coupled, because it requires matching a determinable state (CS computation problems, halting/decidability).

Non-deterministic failure domains are the most costly problems to solve, because troubleshooting, which leverages properties of determinism, won't work.

This leaves you with only a guess-and-check strategy, which requires intimate knowledge of the entire system stack without abstractions present.

anotherhue 6 days ago | parent [-]

Respectfully, you sound like AI. I expect you don't trust git either, especially as its hash is weaker.

A cursory look at a Nix system would also show you that the package name, version, and derivation SHA are all concatenated together.
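
For illustration (the layout is the point; any concrete hash would be made up), a store path has roughly this shape, so two packages would have to collide on the hash and share the same name and version before the system could confuse them:

    /nix/store/<hash>-<name>-<version>
    e.g. /nix/store/<32-char base-32 hash>-firefox-128.0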

trod1234 6 days ago | parent [-]

Respectfully, I sound like a Computer Engineer because I've worked alongside quite a number of them, and the ones I've worked with had this opinion too.

> A cursory look at a Nix system would show ... <three things concatenated together>

This doesn't negate or refute the pigeonhole principle. Following pigeonhole, there is some likelihood that a collision will occur, and that probability trends to 1 given sufficient iterations (time).
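
For reference, the likelihood being argued about is the standard birthday-bound approximation (written here for a 256-bit hash, with n the number of distinct items hashed):

    p(n) ~= 1 - exp(-n*(n-1) / (2 * 2^256))

which does trend toward 1 as n grows without bound; the whole question is how large n must become before that matters.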

The only argument you have is a measure of likelihood and probability, which is a streetlight-effect cognitive bias, or intelligence trap. There's a TED talk on YouTube by an ex-CIA officer that discusses these types of traps.

Likelihood and probability are heavily influenced by the priors they are measured against, and without perfect knowledge (which no one has today) those priors may deviate significantly, or be indeterminable.

Imagine for a second that a general method for rapidly predicting collisions, regardless of algorithm, is discovered and released, which may not be far off given current advances in quantum computing.

All the time and money cumulatively spent on Nix becomes wasted cost, and you are suddenly left in a position of complete compromise, without a sound pivot available at comparable cost to what came before.

With respect, if you can't differentiate basic a priori reasoned logic from AI, I would question your perceptual skills and whether they are degrading. There is a growing body of evidence that exposure to AI may cause such degradation, as appears to be emerging with regard to doctors and their diagnostic performance after AI use in various studies (1).

1: https://time.com/7309274/ai-lancet-study-artificial-intellig...