Zed for Windows: What's Taking So Long? (zed.dev)
83 points by janjones a day ago | 80 comments
mxhwll a day ago | parent | next [-]

This is why you don’t make your own cross platform toolkit.

WD-42 a day ago | parent | next [-]

You'd rather see another crappy, slow editor packaging an entire browser? Because that seems like what people are using for "cross platform toolkits" these days. I'm glad Zed is being ambitious, it's truly a joy to use because it feels native. And to be honest, it's Windows, who cares. If you are a developer you should have switched to Linux years ago anyway.

perching_aix a day ago | parent | next [-]

> If you are a developer you should have switched to Linux years ago anyway.

This is so often repeated, but I genuinely don't understand why. Could you try selling me on it? I ended up going the sysadmin/devops route instead after college, but the more I learn about Linux, the less I understand why anyone would choose it for personal, active manual use.

I can understand server deployments; it works well enough there. It's available at no cost, Windows Server is way off at the other end of the spectrum in terms of currently desired behavior, and whatever pains Linux has, you get paid to put up with them. None of which applies at the personal-device level.

The most common selling points I see are more performance and less "spying". I find neither of these very persuasive, and I'm not interested in ideological rationales either (supporting free software). If you have anything else, I'm all ears.

skydhash a day ago | parent | next [-]

Not selling you on it, as I find that, if you're using IDEs, OSes don't really matter. And Windows can actually be beneficial, as you'll get prime support from most vendors. Where Unix shines is ad hoc automation. Almost everything is fully hackable, and that makes some solutions easier to implement.

As for the desktop, you can switch out your audio stack, alter the display of any element, and do many other things. Using Windows is like borrowing someone else's shoes, while Linux can be your favorite slipper.

perching_aix a day ago | parent | next [-]

Do you have any easy automations in mind that would be broadly appealing and one really needs to go out of their way to implement on Windows / is impossible to do so?

I have a few things here and there, but it's more a scheduled script or two than anything more elaborate, and I don't think they were difficult to make and deploy.

> you can switch out your audio stack

Why would I want that? Isn't this more for someone doing live audio production (e.g. due to latency concerns)?

In general, the customizability angle is also another that doesn't resonate with me much. It's less that I want to customize my stuff, and more that I want my stuff to be to my liking from the get-go.

skydhash a day ago | parent [-]

> Do you have any easy automations in mind that would be broadly appealing and one really needs to go out of their way to implement on Windows / is impossible to do so?

There's no broad stroke here. It's more about the possibility of adjusting something here and there. I don't have anything against Windows technically (I'll use it with no complaints if it's work-provided).

When I notice something I don't like (workflow mostly, rather than appearance), I want to be able to fix it instead of suffering through it (unless I'm being paid).

wolvesechoes a day ago | parent | prev [-]

> if you're using IDEs, OSes don't really matter

Unless you are using Visual Studio, which blows every other IDE out of the water when it comes to the debugging and profiling experience.

> As in case for the desktop, you can switch out your audio stack, alter the display of any element and many other things. Using windows is borrowing some shoes while Linux can be your favorite slipper.

Ah yes, the biggest Linux advantage: being a mud hut.

mudkipdev a day ago | parent | prev | next [-]

Try compiling anything in C

wolvesechoes a day ago | parent [-]

Are you saying that it is somehow complicated to press Build in the VS window?

Aeolun a day ago | parent | prev [-]

Well, the thing I absolutely love about my linux setup, is that I can literally leave my computer running for a year, come back after that one year, and find it in exactly the state I left it. The system will never attempt to do anything for my own protection. No updates without me confirming them, never suddenly having it shut down because there’s a “critical” vulnerability. When it updates it never magically reverts a setting I had set.

Everything it does or doesn’t do is my responsibility.

steve_adams_86 a day ago | parent | prev | next [-]

No, there are good reasons developers are on Windows. Industrial and embedded systems are very often Windows-based, for better or worse. Heaps of games are developed on Windows. Windows-based software itself is developed on Windows.

com2kid a day ago | parent | next [-]

Being ~2 weeks into migrating from Windows to Linux for my dev machine, there are a lot of good reasons why people use Windows, and I keep learning more each and every day!

From a lock screen that appears ~3 seconds after my desktop does (during which time I can interact with my desktop...), to getting Nvidia GPU passthrough working in Docker being harder on native Linux than it was on WSL (...), to the absurd amount of time it takes my machine to come out of sleep.

Oh also the popping and clicking over my BT headset every time someone speaks in a meeting. That was wonderful.

Despite using an older-model motherboard, I needed to install some extra kernel modules to get system temperatures working.

Also if I want to develop desktop software, I'm going to be writing against Windows anyway because at least that is somewhat documented, vs the ever changing landscape of Linux desktop software development. (Windows used to be the OS for desktop software, but Microsoft shot themselves in that foot, then removed the entire leg, long ago, by constantly changing and deprecating frameworks, ugh, 20+ years of API stability down the drain...)

saghm a day ago | parent [-]

> to getting Nvidia GPU passthrough working in Docker being harder on native Linux than it was on WSL

To be fair, assuming you're using WSL2, you're running Docker in a VM, so it doesn't sound that crazy that it might be more work without the abstraction around the hardware that the VM provides. If there were a built-in VM for your Linux distro, it might end up being easier to expose the GPU through that to things running on it than directly, too. I can't say I've ever had any need to access a GPU from a container running on a VM, though, so this is just conjecture.

inetknght a day ago | parent | prev | next [-]

> Industrial and embedded systems are very often Windows-based

I find Windows to be the outlier against a sea of embedded Linux devices.

> Heaps of games are developed on Windows

Inertia.

> Windows-based software itself is developed on Windows.

Plenty of Windows-based software is developed on Linux with Wine.

delta_p_delta_x a day ago | parent | next [-]

> Plenty of Windows-based software is developed on Linux with Wine

The overwhelming majority of software written against MinGW (or worse, Cygwin) consists of bad/lazy ports of Linux-first software. Case in point: Git and Perl, both of which drag along an entire coreutils ecosystem (each, so you have two copies of `ls`) along with the main binaries.

First-class Windows programs that are used every day like Office, Chromium and its forks, the Adobe suite, and tons and tons of internal administrative programs for HR, inventory, and more are written on Windows, for Windows, using C# or C++ and 'boring', so-called enterprisey frameworks like WPF, Windows Forms, and WinUI 2.

Anyone remotely serious about taking advantage of the large (albeit shrinking) market share of Windows users should at the very least fire up a VM to test their release binaries, rather than just 'use Wine'.

WD-42 a day ago | parent [-]

Ironically it was the act of doing that: spinning up a VM to test a release on Windows, which really turned me against it forever. During installation, I counted 4 un-skippable EULAs about "sharing data", and then it asked me what my ad preferences were. To add insult to injury, once I finally did get it installed, the start menu was full of Xbox apps and the taskbar had some news headline about the Kardashians on it.

I don't know how people put up with it. It feels disrespectful.

delta_p_delta_x 18 hours ago | parent [-]

> I don't know how people put up with it. It feels disrespectful.

I install the Enterprise/Education versions.

WD-42 14 hours ago | parent [-]

No. I’ll install Linux which doesn’t make me the product.

delta_p_delta_x 14 hours ago | parent [-]

Good for you.

steve_adams_86 a day ago | parent | prev [-]

> I find Windows to be the outlier against a sea of embedded Linux devices.

I think you're thinking of consumer devices, not industrial.

> Inertia.

I think that's a tough case to make. Windows offers legitimate technical advantages for gaming and game development. Integration with large vendors' tooling like NVIDIA and AMD is pretty huge. There are real workflow benefits.

> Windows-based software itself is developed on Windows.

You know more about this than I do. That sounds kind of wild to me, like it could be a pretty awful workflow at times for no good reason. It looks like you don't have access to native debugging tools, and Wine itself introduces potential compatibility risks. I would rather just develop on target, personally.

broodbucket a day ago | parent | next [-]

>Integration with large vendors' tooling like NVIDIA and AMD is pretty huge.

This is a product of inertia. If Windows didn't have inertia it wouldn't have these ecosystem advantages; they're not inherent to Windows itself.

rstuart4133 a day ago | parent | prev [-]

> I think you're thinking of consumer devices, not industrial.

Maybe he's thinking of more modern devices. There was a time when Microsoft flogged WinCE as an embedded solution, and yes a lot of people producing embedded stuff drank the kool aid.

I watched one instance of this happen first hand. They asked me what OS they should base their shiny new product on (I would be the first customer of it). I said I would use some 'nix, but that they should choose whatever they were comfortable with.

It turned out to be bad advice. They were comfortable with desktop Windows, of course, so they chose WinCE. WinCE is not the stable WinNT they were familiar with, despite what Microsoft's marketing said. I've used a number of WinCE-based devices in the past; they were all about as reliable as Windows 95/ME, which is to say most wouldn't last the day without rebooting.

In the end they could only get it working by shipping the product to a team in Germany that had access to the WinCE source. It cost them a small fortune, and lost them over a year. The delay lost me as a customer.

Most (I hope all, but it's never all) of today's experienced software engineers wouldn't make that mistake, but these people were (pretty good) hardware engineers, with a vision for a product they built the hardware for. Developing software was something you hired people to do for you, like plumbing and legal work. And they wanted those people to provide them with a familiar environment.

WinCE has long since been retired, of course. May its soul burn in hell. Yes, those same hardware engineers who insist on sticking to what they are familiar with might turn to Windows 11 instead. But that comes with costs: no ARM or other CPUs, huge resource requirements, insistence on TPMs, and so little control of the platform that you lose control of the USS Yorktown [0]. Those costs are large. In fact, so large that they would have overwhelmed the budget of my engineering friends years ago, and they would have just gone with Linux. I haven't seen a new embedded Windows design in quite a while, so I suspect that's true for most embedded projects now.

[0] https://archive.is/aKrml

steve_adams_86 a day ago | parent [-]

Sorry, it is my mistake. I was more so talking about software for working on industrial embedded devices (machinery, robots, or similar), which often use bespoke software for editing ladder logic or similar things for devices like PLCs.

I've never encountered a robot that didn't require Windows to program. I know they're out there, but they don't seem common in my experience. Building them yourself is possible, but you regularly encounter cases where common, well-supported components require Windows to program. It's a drag.

I'd love to see it — Windows is far from my preferred OS. But my original point was essentially that there are tons of reasons like this which make Windows a very productive and useful platform for many developers. I totally agree that there are cases where Linux or macOS are better (I prefer them both when possible) and yeah, WinCE was a total mess even by consumer standards. I had a Pocket PC (ha, I was so excited about it) and it was a tremendous letdown, largely because of the OS.

Side note, thanks for reminding me of that era. As bad as the software was, those devices were so god damn exciting. A pocket computer! I still remember how incredibly futuristic it felt.

cholantesh a day ago | parent | prev | next [-]

As someone who's ambivalent about the experience, I'd say "because that's what my employer issued to me" is perfectly acceptable.

steve_adams_86 a day ago | parent [-]

It's also probably one of the most common explanations for why anyone's using it. It's 70% of the market, and even more if you focus on enterprise. Us Linux and Mac folks are weirdos.

psyclobe a day ago | parent | prev [-]

Was a Windows dev for 20 years, but then I got a job where I didn't have to use that ad-infested joke of an operating system.

Never going back.

pjmlp a day ago | parent | prev | next [-]

I have been using computers since 1986 and UNIX variants since 1992, and yet Windows is where I spend most of my time.

I find this FOSS notion that developers only use Linux hilarious; I wonder who writes the software for all the other operating systems in the world.

wolvesechoes a day ago | parent | prev | next [-]

> You'd rather see another crappy, slow editor packaging an entire browser?

Windows still offers other options, even if MS itself tends to ignore them.

> If you are a developer you should have switched to Linux years ago anyway.

Developers are a much broader set than web developers, and even then the advantages of Linux escape me.

jamwil 13 hours ago | parent | prev | next [-]

Some people work for large corporations and can’t just use whatever computer they want.

vovavili a day ago | parent | prev [-]

>If you are a developer you should have switched to Linux years ago anyway.

These days, WSL2 effectively eliminates a need for that for most developers.

pjmlp 19 hours ago | parent | next [-]

And cheaper than VMware Workstation, and easier than VirtualBox, my solutions since 2010. I haven't dual-booted in 15 years.

LtWorf 15 hours ago | parent | prev [-]

If you write a very narrow category of high level server software or command line utilities.

Otherwise no.

pjmlp a day ago | parent | prev | next [-]

Or why you don't insist on using Khronos stuff on Windows, when most OEMs only care about the native API, DirectX.

Let's recall that the Godot developers have learned this lesson as well, regarding their backends.

The ICD mechanism is a kind of escape hatch left over for backwards compatibility, and even user-mode drivers build on top of the DirectX runtime infrastructure.

leecommamichael a day ago | parent [-]

Is it Khronos? The three issues they linked were: 1. missing ARM support for one of their Cargo crates; 2. an issue with Remote Desktop; 3. the team required dynamic_rendering, and it wasn't available for a user with an old machine on Windows 10.

It really depends how you define scope, but I don't think I would've taken on another GPU backend for that.

pjmlp a day ago | parent [-]

In the sense that OpenGL and Vulkan are paper standards that OEMs might implement, whereas they tend to design their DirectX drivers alongside Microsoft, and then their OpenGL/Vulkan drivers are mostly an afterthought.

This is especially visible when buying random Asian cards that aren't the reference designs from AMD and NVidia. Intel was never great regardless of the API.

Additionally, we have the usual extension spaghetti, which is one of the things that has bitten them here.

Require too many extensions, and coding around their absence becomes like using yet another API that is similar, but not quite.

andsoitis a day ago | parent | prev | next [-]

While I would agree in general, there could theoretically be SOME applications where the range of UI controls (and systems) is small enough that it could pay off. But things tend to expand in surface area...

So with that, this presents a HUGE opportunity for someone to build something akin to Zed, but not with the baggage that their technical strategy brings.

cosmic_cheese a day ago | parent [-]

> So with that, this presents a HUGE opportunity for someone to build something akin to Zed, but not with the baggage that their technical strategy brings.

Not sure it’s so clean-cut. More than avoiding baggage, you’re just shifting it elsewhere. The question is if you want to own (and can handle) the baggage and benefit from the control that brings.

kermatt a day ago | parent | prev | next [-]

What should they have used instead?

delta_p_delta_x 21 hours ago | parent [-]

As mentioned here... Probably Skia, which would've saved them the effort of writing any GPU backend, let alone three.

karunamurti 5 hours ago | parent [-]

Isn't Skia controlled by Google?

kermatt 5 hours ago | parent [-]

https://github.com/google/skia

Mountain_Skies a day ago | parent | prev [-]

The FOSS community has become full of ideological landmines, with projects now including clauses in their requirements, often vague and ill defined, about how those who build upon their projects must act and believe. As a result, some are now finding it less risky to roll their own base dependencies instead of using someone else's project that could at any time become problematic for non-technical reasons.

While I doubt this had anything to do with the decision by the Zed team to make their own toolkit, it is something becoming more common. Hopefully it doesn't start happening in the encryption space.

wwfn a day ago | parent [-]

I can see "ill defined" causing problems. But isn't an explicit code of conduct more defined than none? (Assuming I'm reading that correctly from your comment.)

There aren't too many epithets floating around that offend me specifically. And I haven't heard anyone say I shouldn't/don't exist. So it's hard for me personally to feel the need for CoC and the like. But I'm all for policy that protects everyone against that kind of abuse -- which seems to be on the rise. Are there better alternatives?

RattlesnakeJake a day ago | parent | prev | next [-]

Entirely unrelated, but the sections, toolbars, and controls in that RenderDoc app are so cleanly separated compared to modern dev tools. I wish more apps still looked like this.

jbverschoor a day ago | parent [-]

I have my macOS set up so that all buttons have borders, etc.

If I switch to vanilla macOS, it's basically unusable.

Clean, but unusable.

dang a day ago | parent | prev | next [-]

Related ongoing threads:

Zedless: Zed fork focused on privacy and being local-first - https://news.ycombinator.com/item?id=44964916

Sequoia backs Zed - https://news.ycombinator.com/item?id=44961172

munchler a day ago | parent | prev | next [-]

I’m sure this is a dumb question, but why does a code editor need to render on the GPU like a video game? Is it just for niceties like smooth scrolling?

delta_p_delta_x a day ago | parent | next [-]

> but why does a code editor need to render on the GPU like a video game?

It isn't just text editors—nowadays, everything renders on your GPU, even your desktop and terminal (unless you're on a tty). For example, at the bottom of Chromium, Electron, and Avalonia's graphics stack is Skia, which is a cross-platform, GPU-accelerated 2D graphics library.

GPU compositing is what allows transparency, glass effects, shadowing, and it makes actually writing these programs much easier, as everything is the same interface and uses the same rendering pipeline as everything else.

A window in front of another, or a window partially outside the display? No big deal, just set the 3D coordinates, width, and height correctly for each window, and the GPU will do hidden-surface removal and viewing frustum clipping automatically and for free, no need for any sorting. Want a 'preview' of the live contents of each window in a task bar or during Alt-Tab, like on Windows 7? No problem, render each window to a texture and sample it in the taskbar panels' smaller viewports. Want to scale or otherwise squeeze/manipulate the contents of each window during minimise/maximise, like macOS does? Easy, write a shader.
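As a purely illustrative sketch of the "windows are just textured quads" idea (the WindowQuad struct and to_vertices function here are invented for this example, not taken from any real compositor), each window becomes a quad carrying a depth value, and the GPU's depth test and frustum clipping resolve occlusion:

    // Hypothetical compositor data model: each window is a textured quad.
    struct WindowQuad {
        x: f32,          // top-left corner in screen space
        y: f32,
        width: f32,
        height: f32,
        z: f32,          // stacking depth; smaller z = closer to the viewer
        texture_id: u32, // the texture holding the window's rendered contents
    }

    /// Expand a window into two triangles (six vertices), each vertex carrying
    /// position, depth, texture coordinates, and the texture id. The GPU's
    /// depth test and clipping then handle overlap and off-screen portions.
    fn to_vertices(w: &WindowQuad) -> [[f32; 6]; 6] {
        let (x0, y0, x1, y1) = (w.x, w.y, w.x + w.width, w.y + w.height);
        let v = |x: f32, y: f32, u: f32, t: f32| [x, y, w.z, u, t, w.texture_id as f32];
        [
            v(x0, y0, 0.0, 0.0), v(x1, y0, 1.0, 0.0), v(x1, y1, 1.0, 1.0), // first triangle
            v(x0, y0, 0.0, 0.0), v(x1, y1, 1.0, 1.0), v(x0, y1, 0.0, 1.0), // second triangle
        ]
    }

    fn main() {
        let w = WindowQuad { x: 100.0, y: 50.0, width: 640.0, height: 480.0, z: 0.25, texture_id: 7 };
        println!("{:?}", to_vertices(&w)[0]); // first vertex: [x, y, z, u, v, texture]
    }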

This was a big deal in the early 2000s when GPUs finally had enough raw compute to always run everything, and basically every single OS and compositor switched to GPU rendering roughly in the same timeline—Quartz Extreme on Mac OS X, DWM.exe on Windows, and Linux's variety of compositors, including KWin, Compiz, and more.

There's a reason OSs from that time frame had so many glassy, funky effects—this was primarily to show off just how advanced their GPU-powered compositors were, and this was also a big reason why Windows Vista fell so hard on its face—its compositor was especially hard on the scrawny integrated GPUs of the time, enough that two themes—Aero Basic, and Aero Glass—had to be released for different GPUs.

munchler a day ago | parent [-]

Thanks. That explains why OSs use the GPU for rendering windows and effects, but it's still not clear to me why a code editor would do the same. The features you list (transparency, glass effects, shadowing, window management, etc.) seem to be outside the purview of a text editor.

If you're saying that Zed is built on something like Skia, then it would already be cross-platform and not have to worry about Vulkan vs. DirectX, right?

delta_p_delta_x a day ago | parent [-]

> but it's still not clear to me why a code editor would do the same.

Happy to elaborate further.

Old-school text rendering began with a table mapping character codes to actual, fixed-size bitmaps (this was a font), and rendering was straightforward: divide the framebuffer resolution by the bitmap resolution, clip/wrap the remainder, place the bitmaps into the resultant grid, and pipe the framebuffer to the display. Done.
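A toy sketch of that fixed-cell scheme, assuming made-up 8×8 glyphs, a 640×480 monochrome framebuffer, and an empty placeholder font table:

    // Toy fixed-cell text renderer: monochrome 8x8 glyphs blitted into a
    // grid of character cells. All sizes and the font table are made up.
    const GLYPH_W: usize = 8;
    const GLYPH_H: usize = 8;
    const FB_W: usize = 640;
    const FB_H: usize = 480;

    // font[c] is an 8x8 bitmap for character code c, one byte per glyph row.
    fn draw_text(fb: &mut [u8], font: &[[u8; GLYPH_H]; 256], text: &str) {
        let cols = FB_W / GLYPH_W; // how many character cells fit per row
        for (i, ch) in text.bytes().enumerate() {
            let (col, row) = (i % cols, i / cols);         // wrap to the next line
            if (row + 1) * GLYPH_H > FB_H { break; }       // clip at the bottom edge
            let (px, py) = (col * GLYPH_W, row * GLYPH_H); // cell origin in pixels
            for y in 0..GLYPH_H {
                let bits = font[ch as usize][y];
                for x in 0..GLYPH_W {
                    // Light the pixel if the corresponding bit is set in the glyph row.
                    fb[(py + y) * FB_W + px + x] = if bits & (0x80 >> x) != 0 { 0xFF } else { 0x00 };
                }
            }
        }
    }

    fn main() {
        let mut fb = vec![0u8; FB_W * FB_H];
        let font = [[0u8; GLYPH_H]; 256]; // placeholder: a real font would contain glyph bitmaps
        draw_text(&mut fb, &font, "hello, world");
    }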

Nowadays, text editors don't just have text; they have markup like highlighting and syntax colouring (with 24-bit deep-colour, rather than the ANSI 16 colour codes), go-to, version control annotations, debug breakpoints, hover annotations, and in the case of 'notebooks' like Python notebooks, may have embedded media like images, videos, and even 3D renders. Many editor features may open pop-up windows or dialogue boxes, which will probably occlude the text 'behind'.

Now, most modern text editors also expect to work with non-bitmapped, non-monospaced typefaces in OpenType or TrueType format. These are complex beasts of their own with hinting, ligatures, variable weights, and more, and may even embed entire programs. They are usually Bezier/polynomial splines that the GPU can rasterise easily in hardware (no special shader required). After this rasterisation, any reasonable text editor will apply anti-aliasing, which is also work delegated to the GPU. There is probably a different algorithm for text (which needs to account for display subpixel layouts) versus UI elements (which may not).
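For a feel of what those splines are: TrueType glyph outlines are built from quadratic Bézier segments (two on-curve points plus an off-curve control point). Below is a minimal CPU-side sketch of evaluating and flattening one such segment; real rasterisers, CPU or GPU, are far more involved, and the anti-aliasing/coverage step is omitted:

    // Evaluate a quadratic Bézier at parameter t in [0, 1].
    // p0 and p2 are on-curve points, p1 is the off-curve control point
    // (the segment form used by TrueType glyph outlines).
    fn quad_bezier(p0: (f32, f32), p1: (f32, f32), p2: (f32, f32), t: f32) -> (f32, f32) {
        let u = 1.0 - t;
        (
            u * u * p0.0 + 2.0 * u * t * p1.0 + t * t * p2.0,
            u * u * p0.1 + 2.0 * u * t * p1.1 + t * t * p2.1,
        )
    }

    // Flatten one segment into straight line segments for a scanline or
    // coverage-based rasteriser; anti-aliasing would then come from the
    // fractional pixel coverage, which is not shown here.
    fn flatten(p0: (f32, f32), p1: (f32, f32), p2: (f32, f32), steps: usize) -> Vec<(f32, f32)> {
        (0..=steps).map(|i| quad_bezier(p0, p1, p2, i as f32 / steps as f32)).collect()
    }

    fn main() {
        let pts = flatten((0.0, 0.0), (50.0, 100.0), (100.0, 0.0), 8);
        println!("{:?}", pts);
    }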

The point I am driving at is that the proliferation of features expected from a modern text editor means that using a GPU for all of this is a natural evolution. As users, we may think 'it's just text', but from the perspective of the developer or the hardware, text and a ray-traced 3D game are no different: it's one 4×4 matrix multiplied by another, one after another, and in the end reduced to a three-vector representing the colour of a pixel.

> If you're saying that Zed is built on something like Skia, then it would already be cross-platform and not have to worry about Vulkan vs. DirectX, right?

Absolutely, because Skia handles that for the developer. And I suspect the reason why Zed didn't use Skia in the first place is ideological (Skia is by Google, written in C++), together with wanting to write 'the whole world' in Rust.

pjmlp 18 hours ago | parent [-]

The last point is kind of ironic, given that Metal is Objective-C with C++14 as its shading language, Swift bindings, and a light C++ wrapper lib; Vulkan is C99 (the tutorials use the C++20 bindings); and DirectX is C++ with a COM-based API.

leecommamichael a day ago | parent | prev | next [-]

It doesn't need to. It's typical to do this these days, but they could still arrange all of the pixels on the CPU and then blit it onto the screen. There's an API to do so in every major OS.

Since it's more than quick enough to do this on the CPU, they're likely doing it for things like animations and very high quality font rendering. There's image-processing going on when you really care about quality; oversampling and filtering.

I suspect one could do most everything Zed does without a GPU, but about 10 to 20% uglier, depending on how discerning the user is on such things.

ben-schaaf a day ago | parent [-]

> Since it's more than quick enough to do this on the CPU

This is true until it isn't. On a modern-ish CPU at 1080p 60 Hz it'll be fine. At 4K 120 Hz even the fastest CPU on the market won't keep up. And then there's 8K.
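Rough back-of-envelope arithmetic behind that claim, assuming 4 bytes per pixel and counting only one write per pixel per frame (ignoring glyph blending reads, layout, and compositing):

    // Bytes written per second just to fill the framebuffer once per frame,
    // at 4 bytes (RGBA8) per pixel. Everything else a text editor does per
    // frame comes on top of this.
    fn fill_rate(width: u64, height: u64, hz: u64) -> u64 {
        width * height * 4 * hz
    }

    fn main() {
        // 1080p @ 60 Hz: ~0.5 GB/s; 4K @ 120 Hz: ~4 GB/s; 8K @ 120 Hz: ~16 GB/s.
        for (w, h, hz) in [(1920, 1080, 60), (3840, 2160, 120), (7680, 4320, 120)] {
            println!("{}x{}@{}: {:.1} GB/s", w, h, hz, fill_rate(w, h, hz) as f64 / 1e9);
        }
    }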

> they're likely doing it for things like animations and very high quality font rendering

Since they're using native render functions this probably isn't the case.

leecommamichael a day ago | parent [-]

I’m almost nerd-sniped enough to try and see exactly where it breaks down.

What’s a native render function? Do you mean just using a graphics API as opposed to an off-the-shelf UI library?

ben-schaaf 21 hours ago | parent [-]

> What’s a native render function?

As in using DirectWrite or GDI on Windows; or Core Text on macOS. As opposed to shipping your own glyph rasterizer.

hoistbypetard 15 hours ago | parent [-]

Doesn't the blog post specifically say they are shipping their own glyph rasterizer?

ben-schaaf 3 hours ago | parent [-]

No?

> To work around this limitation, we decided to stop using Direct2D and switch to rasterizing glyphs using DirectWrite instead.

starkrights 14 hours ago | parent | prev [-]

The Zed blog has an early post[0] talking a bit about their decision, mainly decrying their experience of impossible-to-meet timing deadlines for something as basic as 60 fps on Electron.

It doesn't really do a tech breakdown of why it'd be impossible CPU-side, but it mentions a couple of things about their design process.

[0]: https://zed.dev/blog/videogame

spapas82 a day ago | parent | prev | next [-]

For people who use Scoop and want to try Zed, there's a precompiled binary they can install with `scoop install zed`.

I tried it, and the experience (mainly visual: fonts, colours, etc.) wasn't very good, so I can understand why the Zed developers are reluctant to formally release Windows binaries.

Thaxll a day ago | parent | prev | next [-]

Why DX11 and not 12? No one should care about Windows 7 in 2025.

zamadatix a day ago | parent | next [-]

DX12 isn't just "newer DX11" though, so it really comes down to what makes the most sense for building Zed with.

delta_p_delta_x a day ago | parent | prev | next [-]

I've written a response below, but the summary is that mapping from Vulkan 1.3 with dynamic rendering to D3D11 is easier than targeting the lower-level D3D12. The latter does have some form of dynamic rendering too, but I suspect the authors would've had to re-think their CPU code much more than what has currently been done. Getting Windows Vista and 7 support is a freebie.

pjmlp a day ago | parent | prev | next [-]

If you aren't trying to write a games engine, DX 11 is good enough, and won't be going away any time soon.

a day ago | parent | prev | next [-]
[deleted]
shortrounddev2 a day ago | parent | prev | next [-]

DX11 is a far more ergonomic API than DX12. DX12 is what you use when you need maximum control over memory allocation and such.

Analemma_ a day ago | parent | prev [-]

Think about the customer base: the sorts of users who want a high-performance text editor are exactly the kind of people who will run Windows 7 until it's pried from their cold, dead fingers, and who will flood the support forums with complaints if you limit support to operating systems released in the last 15 years. Because of their target market, Zed probably has implicit support requirements which wouldn't apply to, e.g., the latest first-person shooter.

delta_p_delta_x a day ago | parent | next [-]

> Zed probably has implicit support requirements which wouldn't apply to e.g. the last first-person shooter.

This is incongruous given Zed uses modern frameworks (which is why they moved to D3D11 from Vulkan in the first place).

If Zed really wanted to target 'old Windows' then they might have used Win32 and GDI+, not D3D11. In fact they could've stuck to D2D (which was released with Windows 7 and back-ported to Vista), and not used their own rendering at all, since D2D is already a GPU-accelerated text-rendering API, and then used Win32 windowing primitives for everything else.

Someone a day ago | parent | prev [-]

Similarly, the Mac version is for macOS 10.15 (from 2019) or later, and has an x64 version.

zamadatix a day ago | parent [-]

Zed had already targeted macOS when 10.15 still had over a year of support left https://github.com/zed-industries/zed/commit/b400449a58507cc... and some variant of x86-64 macOS will still be supported through 2028. Neither of these was about adding support for really old things: one is current for many years to come, and there just hasn't been a reason to break 10.15 support yet, so why bother.

Meanwhile, Windows 7 was already over 2 years past the end of extra-extended support at the time this new code was written, and yet it was written with Windows 7 support still in mind. Which is nice, but a very different scenario.

delta_p_delta_x a day ago | parent | prev | next [-]

As a Windows dev...

> but we got reports from users that Zed didn't run on their machines due to the Vulkan dependency

This single sentence is abstracting a lot of detail. Vulkan runs on Windows, and quite well. Looking at the bug reports, especially the last one[1]...

> Rejected for device extension "VK_KHR_dynamic_rendering" not supported

Aha, ambitious devs >:) The dynamic rendering extension is pretty new, released with Vulkan 1.3. I suspect targeting Vulkan 1.1 or 1.2 might've been a little more straightforward than... rewriting everything to target DX11. Large games with custom engines (RDR2, Doom, Doom Eternal) were shipped before this was main-lined into Vulkan.

But thinking about it a little more, I suspect switching out the back-end to a dynamic rendering-esque one (which is why D3D11 rather than D3D12) was easier than reverting their Rust code to pre-dynamic rendering Vulkan CPU calls; the Rust code changes are comparatively light and the biggest change is the shader.

That being said, it's a bit annoying to manually write render-passes and subpasses, but it's not the worst thing, and more importantly extremely high performance is less critical here, as Zed is rendering text, not shading billions of triangles. The singular shader is also not necessarily the most complex[2]; a lot of it is window-clipping which Windows does for free.

> we had two implementations of our GPU shaders: one MSL implementation for macOS, and one WGSL implementation for Vulkan. To use DirectX 11, we had to create a third implementation in HLSL.

I wonder why HLSL wasn't adopted from the outset, given roughly 99.999% of shaders—which are mostly shipped with video games, which mostly target Windows—are written in HLSL, and then use dxc to target SPIR-V? HLSL is widely considered the best-specified, most feature-complete, and most documented shader language. I'm writing a Vulkan engine on Windows and Linux, and I only use HLSL. Additionally Vulkan runs on macOS with MoltenVK (and now 'KosmicKrisp'), but I suppose the Zed spirit is 'platform-native and nothing else'.

> symbolicating stack traces requires a .pdb file that is too large to ship to users as part of the installer.

Perhaps publishing a symbol server[3] is a good idea here, rather than users shipping dump files which may contain personally-identifiable information; users can then use WinDbg or Visual Studio to debug the release-mode Zed at their leisure.

[1]: https://github.com/zed-industries/zed/issues/35205

[2]: https://github.com/zed-industries/zed/blob/c995dd2016a3d9f8b...

[3]: https://randomascii.wordpress.com/2020/03/14/creating-a-publ...

maxbrunsfeld a day ago | parent | next [-]

The Zed spirit is definitely to prefer a platform native solution.

You're right that we may be able to get rid of our WGSL implementation, and instead use the HLSL one via SPIR-V. But also, at some point we plan to port Zed to run in a web browser, and will likely build on WebGPU, where WGSL is the native shading language. Honestly, we don't change our graphics primitives that frequently, so the cost of having the three implementations going forward isn't that terrible. We definitely would not use MoltenVK on macOS, vs just using Metal directly.

Good point that we should publish a symbol server.

jamienicol a day ago | parent | next [-]

Did you consider using wgpu instead of writing a new DX11 renderer? It has Metal, Vulkan, and DX12 backends, so it could have been used as a single renderer for macOS, Windows, and Linux. (And WebGPU in the future.)

bsder a day ago | parent | prev [-]

> But also, at some point we plan to port Zed to run in a web browser, and will likely build on WebGPU, where WGSL is the native shading language.

Except that everything has effectively converged on HLSL (via Slang, which is effectively HLSL++) and SPIR-V (coming via Shader Model 7).

So, your pipelines, shader language, and IR code would all look mostly the same between Windows and Linux if you threw in with DX12 (which looks much more like Vulkan) rather than DX11. And you'd get the ability to multi-thread through the GPU subsystem via DX12/Vulkan.

And, to be fair, we've seen that MoltenVK gets you about 80-90% of native Metal performance on macOS, so you wouldn't have to maintain a Metal backend, anymore.

And you'd gain the ability to use all the standard GPU debugging tools from Microsoft, nVidia, and AMD rather than just RenderDoc.

You'd abandon all this for some mythical future compatibility with WebGPU, which has deployment counts you can measure with a thimble?

mrpippy a day ago | parent | prev | next [-]

> Vulkan runs on Windows, and quite well.

Not everywhere. See the middle bug report, "Zed does not work in Remote Desktop session on windows" (https://github.com/zed-industries/zed/issues/26692).

Most Remote Desktop/Terminal Services environments won't have any Vulkan devices available, unless you ship your own software renderer (like SwiftShader).

Also, NVIDIA only supports Vulkan on Kepler (GTX 600 series) and newer, AMD on GCN 1.0 (Radeon HD 7000 series) and newer, and most importantly, Intel only on Skylake (6000 series) and newer. Especially on the Intel side, there are plenty of old but still-supported Windows 10 machines that lack Vulkan support. For many applications that's OK, but IMO not for a text editor.

andrewmcwatters a day ago | parent | prev | next [-]

Yeah, I maintain a Vulkan backend, and this immediately triggered my internal "what?" alarm.

Modern Direct3D is almost indistinguishable from Vulkan, on the other hand. So it shouldn't be difficult for them to add.

I also agree with your HLSL comment. It sounds like these guys don’t have much prior graphics or game development experience.

delta_p_delta_x a day ago | parent [-]

I'd been thinking about it, and I added another paragraph above. I get a feeling they've been targeting Vulkan 1.3 with dynamic rendering from the beginning, so porting to D3D12 would be roughly as complex as rewriting to target older Vulkan.

shortrounddev2 a day ago | parent | prev | next [-]

> I wonder why HLSL wasn't adopted from the outset

I suspect because a huge amount of software engineers develop on Macbooks and consider Linux second and Windows third. Culturally, I think there's a difference in tooling between Graphics developers (who would go straight for HLSL, cross-platform Vulkan, or even SDL3) and mac users (who reach for Apple tools first)

TiredOfLife a day ago | parent | prev [-]

Zed spirit is Mac first and only.

guluarte a day ago | parent | prev [-]

[flagged]

CharlesW a day ago | parent [-]

Add `--dangerously-skip-permissions` so you can sleep.