dijit 3 hours ago

I agree with every point made, except there are two caveats:

1) I am a bona fide systemd hater, and I am bent out of shape about the fact that other init systems (more akin to SMF) were (and are) routinely ignored in discussions of what was available. But I feel like Linux desktops are better now for systemd, even if I personally can’t tolerate how it spiders into everything.

2) Wayland was a “We have pushed X as far as it will go, and now we’re going to have to pay down our tech debt” decision by the X11 developers themselves.

I know it was a “baby with the bathwater” situation, but in theory we don’t need to do that again for the next 50 years, because we have a significantly better baseline for how computers are actually used. The performance ceiling has been lifted, and consistent support for multiple monitors and fractional scaling are things we have today, because of Wayland.

I won’t argue about security, because honestly most people seem to want as little security as possible if it infringes on software that used to work a certain way. But it should be mentioned at some point that a lack of security posture eventually leads to a pretty terrible experience.

So, yes, Wayland was worth the 10y cost, because the debt was due and with interest. Kicking the can down the road would most likely kill desktop Linux eventually.

hulitu 3 hours ago | parent [-]

> because we have a significantly better baseline for how computers are actually used.

Except we don't. X was device-agnostic. Wayland makes some assumptions which will be wrong in 10 years. And being a monolith does not help.

dijit 30 minutes ago | parent [-]

This "device-agnosticism" is also the source of many of X11's modern problems. Because the X server has to handle all rendering and input, it acts as a middleman. For every frame, an application has to send rendering commands to the X server, which then composites the scene and sends it to the display. This extra step introduces latency and makes it difficult to implement modern features like smooth animations, variable refresh rates, and HDR.

In contrast, Wayland's design is based on the idea of direct rendering. The application renders a frame directly to a buffer in memory, and then the Wayland compositor takes this finished buffer and displays it. This approach is highly efficient for modern GPUs and display technology.

The trade-off is that it ties the display protocol more closely to the graphics hardware, but this is a necessary step to achieve the high performance and low latency that modern users expect.
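To make the difference concrete, here is a toy model (plain Python, not real protocol code) of the per-frame path under each design; the step names are illustrative labels, not actual API calls:

```python
# Toy sketch of the per-frame path in each model. The labels are
# illustrative, not real X11/Wayland API calls.

def x11_frame_path():
    # X11: the server sits between the client and the display.
    return [
        "client: send rendering commands to X server",
        "x server: rasterize into the window's pixmap",
        "compositor: copy the pixmap and composite the scene",
        "display: scan out the composited frame",
    ]

def wayland_frame_path():
    # Wayland: the client renders straight into a buffer and hands
    # the finished buffer to the compositor.
    return [
        "client: render frame directly into a buffer",
        "compositor: composite the finished buffer",
        "display: scan out the composited frame",
    ]

print(len(x11_frame_path()), "hops vs", len(wayland_frame_path()))
```

The point of the sketch is just that the X11 path has an extra hop (and an extra copy) on every single frame, which is where the latency and tearing complaints come from.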

> Wayland makes some assumptions which will be wrong in 10 years.

This is a fair and common criticism. Yes, Wayland assumes a graphics stack that is based on OpenGL/Vulkan and a kernel with a Direct Rendering Manager (DRM). This works well today because modern Linux graphics drivers are built around this model.

However, an X11 advocate might argue that this tight coupling could be a problem if a new, fundamentally different type of display technology or graphics hardware emerges. With its modular design, X11 could theoretically adapt by adding new extensions.

Wayland developers have addressed this by keeping the core protocol simple and extensible. New features, like HDR or adaptive sync, are implemented as extensions to the base protocol. The hope is that this design allows Wayland to evolve without the bloat and complexity that burdened X11. While it's impossible to predict the future, Wayland's developers believe that its modular design is flexible enough to handle future changes in display technology.
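For what "implemented as extensions" means in practice: Wayland extensions are declared as XML protocol files that `wayland-scanner` turns into language bindings. The fragment below is a made-up, heavily simplified example in that format (the interface and request names are invented for illustration; real extensions like `wp_presentation` follow the same structure):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical extension protocol; names are invented for
     illustration, but the structure mirrors real Wayland
     protocol XML (protocol / interface / request / event). -->
<protocol name="example_hdr_hints_v1">
  <interface name="example_hdr_hints_v1" version="1">
    <request name="set_colorspace">
      <arg name="surface" type="object" interface="wl_surface"/>
      <arg name="colorspace" type="uint"/>
    </request>
    <event name="supported">
      <arg name="colorspace" type="uint"/>
    </event>
  </interface>
</protocol>
```

Compositors advertise such interfaces through the registry, and clients that don't bind them simply ignore them, which is how the core protocol stays small.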

Which I think is fair.