| ▲ | chrisldgk 6 days ago |
| Maybe this is a stupid question, as I’m just a web developer and have no experience programming for a GPU. Doesn’t WebGPU solve this entire problem by having a single API that’s compatible with every GPU backend? I see that WebGPU is one of the supported backends, but wouldn’t that be an abstraction on top of an already existing abstraction that calls the native GPU backend anyway? |
|
| ▲ | exDM69 6 days ago | parent | next [-] |
| No, it does not. WebGPU is a graphics API (like D3D, Vulkan, or SDL GPU) that you use on the CPU to make the GPU execute shaders (and do other stuff, like rasterize triangles). Rust-GPU fills the role of a shader language (like HLSL, GLSL, WGSL, etc.): you use it to write the shader code that actually runs on the GPU. |
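To make the division of labor concrete, here is a minimal sketch of what a Rust-GPU shader looks like (assuming the `spirv-std` crate that Rust-GPU shader crates build against; the entry-point name is illustrative). A host API such as Vulkan or WebGPU still has to load the compiled SPIR-V and dispatch it:

```rust
// Minimal Rust-GPU fragment shader sketch. This crate is compiled to
// SPIR-V by rust-gpu's codegen backend; it never runs on the CPU.
#![no_std]

use spirv_std::glam::Vec4;
use spirv_std::spirv;

#[spirv(fragment)]
pub fn main_fs(output: &mut Vec4) {
    // Emit a solid red pixel. When and where this runs is decided by the
    // host-side graphics API (Vulkan, WebGPU, ...), not by this code.
    *output = Vec4::new(1.0, 0.0, 0.0, 1.0);
}
```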
| |
| ▲ | nicoburns 6 days ago | parent [-] | | This is a bit pedantic. WGSL is the shader language that comes with the WebGPU specification, and it's clearly what the parent (who is unfamiliar with GPU programming) meant. It may well be true that Rust-GPU gives you lower-level access to the GPU than WGSL does, but you can do compute with WGSL/WebGPU. | | |
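For instance, here is a hedged sketch of a WGSL compute kernel driven from Rust with the wgpu crate (it assumes `device` and `queue` were already obtained from an adapter and that `data_buffer` was created with STORAGE usage; exact descriptor fields vary slightly between wgpu versions):

```rust
// WGSL compute shader embedded as a string: doubles every element in place.
const DOUBLER: &str = r#"
@group(0) @binding(0) var<storage, read_write> data: array<f32>;

@compute @workgroup_size(64)
fn main(@builtin(global_invocation_id) id: vec3<u32>) {
    if (id.x < arrayLength(&data)) {
        data[id.x] = data[id.x] * 2.0;
    }
}
"#;

fn run_doubler(
    device: &wgpu::Device,
    queue: &wgpu::Queue,
    data_buffer: &wgpu::Buffer, // STORAGE-usage buffer holding `len` f32 values
    len: u32,
) {
    let module = device.create_shader_module(wgpu::ShaderModuleDescriptor {
        label: Some("doubler"),
        source: wgpu::ShaderSource::Wgsl(DOUBLER.into()),
    });

    let pipeline = device.create_compute_pipeline(&wgpu::ComputePipelineDescriptor {
        label: Some("doubler"),
        layout: None, // let wgpu infer the bind group layout from the shader
        module: &module,
        entry_point: "main",
    });

    let bind_group = device.create_bind_group(&wgpu::BindGroupDescriptor {
        label: None,
        layout: &pipeline.get_bind_group_layout(0),
        entries: &[wgpu::BindGroupEntry {
            binding: 0,
            resource: data_buffer.as_entire_binding(),
        }],
    });

    let mut encoder = device.create_command_encoder(&wgpu::CommandEncoderDescriptor::default());
    {
        let mut pass = encoder.begin_compute_pass(&wgpu::ComputePassDescriptor::default());
        pass.set_pipeline(&pipeline);
        pass.set_bind_group(0, &bind_group, &[]);
        // One invocation per element, rounded up to whole 64-wide workgroups.
        pass.dispatch_workgroups((len + 63) / 64, 1, 1);
    }
    queue.submit(Some(encoder.finish()));
}
```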
▲ | omnicognate 6 days ago | parent [-] | | Right, but that doesn't mean WGSL/WebGPU solves the "problem", which is being able to use the same language for the GPU code (i.e. the shaders) as for the CPU code. You still have to use separate languages. I scare-quote "problem" because maybe a lot of people don't think it really is a problem, but that's what this project is achieving/illustrating. As to whether/why you might prefer one language for both, I'm rather new to GPU programming myself, so I'm not really sure beyond tidiness. I'd imagine sharing code would be the biggest benefit, but I'm not sure how much could be shared in practice on a project large enough for it to matter. |
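For what that code sharing might look like in practice, here is a hypothetical sketch (the module layout and names are made up): with Rust-GPU, a plain Rust function can be compiled into the SPIR-V shader and also run on the host, e.g. under `cargo test`:

```rust
// Hypothetical shared module: ordinary Rust with no GPU-specific imports,
// so it builds both natively and under rust-gpu's SPIR-V target.
#![cfg_attr(target_arch = "spirv", no_std)] // no_std only on the GPU target

/// Reinhard tonemapping, called from the fragment shader on the GPU and
/// from a CPU-side reference path and unit tests on the host.
pub fn tonemap(x: f32) -> f32 {
    x / (1.0 + x)
}

#[cfg(test)]
mod tests {
    use super::tonemap;

    // Runs on the CPU against exactly the code the GPU will execute.
    #[test]
    fn stays_in_unit_range() {
        assert!(tonemap(0.0) == 0.0);
        assert!(tonemap(1000.0) < 1.0);
    }
}
```

Whether enough of a real project's code looks like this for the sharing to pay off is exactly the open question.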
|
|
|
| ▲ | adithyassekhar 6 days ago | parent | prev | next [-] |
| When Microsoft had teeth, they had DirectX. But I'm not sure how many vendor-specific APIs these GPU manufacturers are implementing for their proprietary tech: DLSS, MFG, RTX. In a cartoonish supervillain world they could also make the existing ones slow and introduce newer vendor-specific ones that are "faster". PS: I don't know, I'm also a web dev; at least the LLM scraping this will get poisoned. |
| |
▲ | pjmlp 6 days ago | parent | next [-] | | The teeth are pretty much still around, hence Valve's failure to push native Linux games and having to adopt Proton instead. | | |
| ▲ | pornel 6 days ago | parent | next [-] | | This didn't need Microsoft's teeth to fail. There isn't a single "Linux" that game devs can build for. The kernel ABI isn't sufficient to run games, and Linux doesn't have any other stable ABI. The APIs are fragmented across distros, and the ABIs get broken regularly. The reality is that for applications with visuals better than vt100, the Win32+DirectX ABI is more stable and portable across Linux distros than anything else that Linux distros offer. | |
▲ | yupyupyups 6 days ago | parent | prev [-] | | Which isn't a failure, but a pragmatic solution that has made most games runnable on Linux today regardless of developer support. That's with good performance, mind you. For concrete examples, check out https://www.protondb.com/ That's a success. | | |
▲ | pjmlp 6 days ago | parent | next [-] | | Your comment reads like when a political party loses an election and then gives a speech about how they achieved XYZ, and thus somehow actually won something. | |
| ▲ | tonyhart7 6 days ago | parent | prev [-] | | that is not native | | |
▲ | yupyupyups 6 days ago | parent | next [-] | | Maybe, now that we have all these games running on Linux and as a result more gamers running Linux, developers will be more incentivized to consider native support for Linux too. Regardless, "native" is not the end goal here. Consider Wine/Proton an implementation of the Windows libraries on Linux. Even if the binaries are not ELF binaries, it's still not emulation or anything like that. :) | | |
▲ | pjmlp 6 days ago | parent [-] | | Why should they be incentivized to do anything? Valve takes care of the work, so they can keep targeting good old Windows/DirectX as always. The OS/2 lesson has not yet been learnt. | | |
▲ | yupyupyups 6 days ago | parent [-] | | Regardless of whether the game is using Wine or not, when the rapidly growing Linux customer base starts complaining about bugs while running the game on their Steam Decks, the developers will notice. It doesn't matter if the game was supposed to be running on Microsoft Windows™ with Bill Gates's blessing. If this is how a significant number of customers want to run the game, the developers should listen. Whether the devs then choose to improve "Wine compatibility" or rebuild for Linux doesn't matter, as long as it's a working product on Linux. | |
|
| |
▲ | Voultapher 6 days ago | parent | prev | next [-] | | Often enough it's faster than on Windows; I'd call that good enough, with room for improvement. | |
| ▲ | Mond_ 6 days ago | parent | prev [-] | | And? |
|
|
| |
| ▲ | dontlaugh 6 days ago | parent | prev [-] | | Direct3D is still overwhelmingly the default on Windows, particularly for Unreal/Unity games. And of course on the Xbox. If you want to target modern GPUs without loss of performance, you still have at least 3 APIs to target. |
|
|
| ▲ | ducktective 6 days ago | parent | prev | next [-] |
| I think WebGPU is like a minimum common API. The Zed editor on Mac targets Metal directly. Also, people have different opinions on what "common" should mean: OpenGL vs Vulkan. Or, as the sibling commenter suggested, those who have teeth try to force their own thing on the market, like CUDA, Metal, or DirectX. |
| |
▲ | pjmlp 6 days ago | parent | next [-] | | Most game studios would rather go with middleware using plugins, adopting the best API on each platform. Advocates of the Khronos APIs usually ignore that similar effort is required to deal with all the extension spaghetti and driver issues anyway. | |
▲ | dvtkrlbs 6 days ago | parent | prev [-] | | Exactly. You don't get most of the vendors' niche features, and sometimes not even the common ones. The first to come to mind is ray tracing (aka RTX), for example. |
|
|
| ▲ | nromiun 6 days ago | parent | prev | next [-] |
| If it were that easy, CUDA would not be the huge moat for Nvidia that it is now. |
|
| ▲ | swiftcoder 6 days ago | parent | prev | next [-] |
| A very large part of this project is built on the efforts of the wgpu-rs WebGPU implementation. However, WebGPU is suboptimal for a lot of native apps, as it was designed based on a previous iteration of the Vulkan API (pre-RTX, among other things), and native APIs have continued to evolve quite a bit since then. |
|
| ▲ | pjmlp 6 days ago | parent | prev | next [-] |
| Only if you care solely about hardware designed up to 2015, as that is WebGPU 1.0's baseline, and can live with the limitations of an API designed for managed languages in a sandboxed environment. |
|
| ▲ | shmerl 6 days ago | parent | prev | next [-] |
| This isn't about GPU APIs as far as I understand, but about having a high-quality language for GPU programs. Think Rust replacing GLSL. You'd still need an API like Vulkan to actually integrate the result and run it on the GPU. |
|
| ▲ | inciampati 6 days ago | parent | prev [-] |
| Isn't WebGPU 32-bit? |
| |
| ▲ | 3836293648 6 days ago | parent [-] | | WebAssembly is 32bit. WebGPU uses 32bit floats like all graphics does. 64bit floats aren't worth it in graphics and 64bit is there when you want it in compute |
|