doctorpangloss 4 days ago

I edited my comment.

> WebGPU enables compute shaders, and there are applications in anything that uses a GPU, from ML to physics to audio to … you name it.

I know. If you have to go through a giant product like Unity to use WebGPU, because Apple will essentially have its own flavor of WebGPU just like it has its own flavor of everything, is it really cross-platform? Does Apple support Vulkan? No. It was invented for middlewares!

> Apple has a flag to toggle on WebGPU on iOS today.

I know, dude. What does that really mean? They have such a poor record of support for gamey things on Mobile Safari: no immersive WebXR, a long history of breaking WASM, a long history of poor WebGL 2 and texture compression support. Why is this going to be any different?
dahart 4 days ago

I'm still not sure what the point is. WebGPU is an API; is that what you mean by middleware? What's the issue? Apple will do their own thing, and they might not allow WebGPU on Safari. What bearing does that have on what people using Linux, Windows, Firefox, and Chrome should do? And where exactly is this cross-platform claim you're referring to?
CharlesW 4 days ago

> Apple will do their own thing, and they might not allow WebGPU on Safari.

Safari has WebGPU support today, albeit behind a feature flag until it's fully baked: https://imgur.com/a/b3spVWd

Not sure if this is good, but animometer shows an avg frame time of ~25.5 ms on a Mac Studio (M1 Max) with Safari 18.2 (20620.1.16.11.6): https://webgpu.github.io/webgpu-samples/sample/animometer/
flohofwoe 4 days ago

The demo does a setBindGroup per triangle, so that's not exactly surprising, since this is a well-known bottleneck (Chrome's implementation is better optimized, but even there setBindGroup is a surprisingly slow call). But since both implementations run on top of Metal, there's no reason why Safari couldn't get at least to the same performance as Chrome.
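Concretely, the pattern being described looks roughly like the sketch below (this is not the actual animometer source; the pipeline and bind groups are assumed to be created elsewhere). The second function shows one common way to avoid the per-draw call, instancing over a shared storage buffer, which the comment doesn't prescribe but which is a typical fix.

```typescript
// Per-draw binding, as in the benchmark: one setBindGroup per triangle.
function drawPerTriangle(
  pass: GPURenderPassEncoder,
  pipeline: GPURenderPipeline,
  bindGroups: GPUBindGroup[],
): void {
  pass.setPipeline(pipeline);
  for (const bindGroup of bindGroups) {
    // Each call pays validation and state-tracking overhead in the browser's
    // WebGPU implementation; thousands of these per frame add up quickly.
    pass.setBindGroup(0, bindGroup);
    pass.draw(3); // one triangle per draw
  }
}

// Illustrative alternative: keep per-triangle data in one storage buffer,
// bind once, and draw all triangles as instances.
function drawInstanced(
  pass: GPURenderPassEncoder,
  pipeline: GPURenderPipeline,
  sharedBindGroup: GPUBindGroup, // binds the single storage buffer
  triangleCount: number,
): void {
  pass.setPipeline(pipeline);
  pass.setBindGroup(0, sharedBindGroup); // called once per frame
  pass.draw(3, triangleCount); // vertexCount = 3, instanceCount = triangleCount
}
```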
doctorpangloss 4 days ago, in reply to dahart

The issue is, it's likely that a company with $2 BILLION spent on product development and a very deep relationship with Apple, like Unity, will have success using WebGPU the way it is intended, and nobody else will. So then, in conclusion: WebGPU is designed for Unity, not you and me. Unity is designed for you and me. Are you getting it?
jms55 4 days ago

> The issue is, it's likely that a company with $2 BILLION spent on product development and a very deep relationship with Apple, like Unity, will have success using WebGPU the way it is intended, and nobody else will.

Not really. Bevy (https://bevyengine.org) uses WebGPU exclusively, and unfortunately we have little funding - definitely not $2 billion. A lot of the stuff proposed in the article (bindless, 64-bit atomics, etc.) is stuff we (and others) proposed :)

If anything, WebGPU the spec could really use _more_ funding and developer time from experienced graphics developers.
doctorpangloss 4 days ago

I agree with you, Bevy is a worthy cause. Why are people downvoting?

The idea of Great High-Performance Graphics Effortlessly on All Platforms is very appealing. It is fundamentally empathetic. It is an opium to game developers whose real antagonists are Apple and Nintendo, and who want a more organic journey in game development than Unity Learn. Is it a realizable goal? Time will tell.

Everyone should be advocating for more focused efforts, but then: are you going to say Bevy is better than Godot? It's subjective, right? Open source efforts are already spread so thin. An inability to rally behind one engine means achieving 2015's Unity 5 level of functionality is years away. Looking at it critically, much effort in the open source ecosystem is in fact even anti-games. For example, emulators used to pirate Nintendo Switch games have more mature multiplatform graphics engine implementations than Godot and Bevy do. It would be nice if that weren't true, and you might tell me that in some sense I am wrong, but c'mon. It's crazy how much community effort goes into piracy compared to the stuff that would sincerely benefit game developers.

WebGPU is authored by giant media companies, and will have purposefully hobbled support from the most obnoxious of them all, Apple - the one platform where it is kind of impractical to pirate stuff, but also where it is kind of impractical to deliver games through the browser, precisely because of the empathetic, yet ultimately false, promises of WebGPU.
dahart 4 days ago

Why are you bringing up Godot again? Are you worried Godot will be left behind or unable to compete with Unity? Are you working on Godot? Why are you focused exclusively on games? What are the 'false promises' of WebGPU, and why do you think it won't deliver compute shaders to every browser that supports it, like it says?

I'm just curious; I get the sense there's an underlying issue you feel strongly about and a set of assumptions that you're not stating here. I don't have a lot invested in whether WebGPU is adopted by everyone, and I'm trying to understand if and why you do. Seems like compute shaders in the browser will have a lot of interest considering the wild success of CUDA. Are you against people having access to compute shaders in the browser?
doctorpangloss 3 days ago

> What are the 'false promises' of WebGPU

That you can write a "compute shader" once, and it will run "anywhere." This isn't the case with any accelerated compute API, so why is WebGPU going to be different? Reality will be Chrome Windows Desktop WebGPU, Chrome Android (newish) WebGPU, Mobile Safari iOS 18 WebGPU, iPad WebGPU, macOS Safari WebGPU, macOS Chrome WebGPU, iOS default in-app browser WebGPU, Instagram and Facebook in-app browser WebGPU... This isn't complicated!

If that's reality, I'd rather have: Apple Compute Shaders for Browser, Windows Chrome Compute Shaders for Browser, Android Chrome Compute Shaders for Browser. Because I'm going to go through a middleware like Unity to deal with both situations. But look at which is simpler. It's not complicated.

> I'm trying to understand if and why you do.

I make games. I like the status quo where we get amazing game engines for free. I cannot force open source developers to do anything; they are welcome to waste their time on any effort. If Bevy has great WebGL 2 support, which runs almost without warts everywhere, even on iOS, then it makes no sense to worry about WebGPU at all, given the nature of the games that use Bevy. Because "runs on WebGPU" is make-believe that you can avoid the hard multiplatform engine bits. Engines like Construct and LOVE and whatever: 2D games don't need compute shaders, they are not very performance-sensitive, and they use the browser as the middleware. The ones that are performance-sensitive should just use a huge commercial game engine. People have choices.
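To make the fragmentation point concrete: even a "single" WebGPU code path usually has to start by probing each implementation's optional features and limits and branching on the result. The feature names below are real spec names; the specific fallback choices are just illustrative.

```typescript
// A sketch of per-implementation capability probing in browser WebGPU.
async function createDevice(): Promise<GPUDevice> {
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) throw new Error("WebGPU is not available in this browser");

  const requiredFeatures: GPUFeatureName[] = [];

  // Texture compression support differs by platform (BC on desktop,
  // ASTC/ETC2 on mobile), so asset pipelines typically ship both encodings.
  if (adapter.features.has("texture-compression-bc")) {
    requiredFeatures.push("texture-compression-bc");
  } else if (adapter.features.has("texture-compression-astc")) {
    requiredFeatures.push("texture-compression-astc");
  }

  // Optional shader features also vary per implementation.
  if (adapter.features.has("shader-f16")) {
    requiredFeatures.push("shader-f16");
  }

  return adapter.requestDevice({
    requiredFeatures,
    requiredLimits: {
      // Limits vary too; clamp to what this adapter actually supports.
      maxStorageBufferBindingSize: Math.min(
        adapter.limits.maxStorageBufferBindingSize,
        512 * 1024 * 1024, // illustrative cap
      ),
    },
  });
}
```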
dahart 3 days ago

> That you can write a "compute shader" once, and it will run "anywhere."

Can you post a link to that quote? What exactly are you quoting?
dahart 4 days ago, in reply to doctorpangloss

It seems like you've jumped to, and are stuck on, a conclusion that isn't really supported, somewhat ignoring people from multiple companies in this thread who are actively using WebGPU, and it's not clear what you want to have happen or why. Do you want WebGPU development to stop? Do you want Apple to support it? What outcome are you advocating for?

Unity spends the vast majority of its money on other things, and Unity isn't the only company that will make use of WebGPU. Saying nobody will have success with it is like saying nobody will succeed at using CUDA. We're just talking about compute shaders. What is making you think they're too hard to use without Apple's help?
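For scale, this is roughly what "compute shaders in the browser" involves end to end - a minimal sketch that doubles the values in a buffer (the WGSL kernel and sizes are illustrative; result readback and error handling are omitted).

```typescript
// Minimal WebGPU compute dispatch: multiply every element of a buffer by 2.
const kernel = /* wgsl */ `
  @group(0) @binding(0) var<storage, read_write> data: array<f32>;

  @compute @workgroup_size(64)
  fn main(@builtin(global_invocation_id) id: vec3u) {
    if (id.x < arrayLength(&data)) {
      data[id.x] = data[id.x] * 2.0;
    }
  }
`;

async function doubleOnGpu(values: Float32Array): Promise<void> {
  const adapter = await navigator.gpu.requestAdapter();
  if (!adapter) throw new Error("WebGPU is not available in this browser");
  const device = await adapter.requestDevice();

  // Upload the input data into a storage buffer.
  const buffer = device.createBuffer({
    size: values.byteLength,
    usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC,
    mappedAtCreation: true,
  });
  new Float32Array(buffer.getMappedRange()).set(values);
  buffer.unmap();

  const pipeline = device.createComputePipeline({
    layout: "auto",
    compute: {
      module: device.createShaderModule({ code: kernel }),
      entryPoint: "main",
    },
  });
  const bindGroup = device.createBindGroup({
    layout: pipeline.getBindGroupLayout(0),
    entries: [{ binding: 0, resource: { buffer } }],
  });

  // Encode and submit one compute pass covering all elements.
  const encoder = device.createCommandEncoder();
  const pass = encoder.beginComputePass();
  pass.setPipeline(pipeline);
  pass.setBindGroup(0, bindGroup);
  pass.dispatchWorkgroups(Math.ceil(values.length / 64));
  pass.end();
  device.queue.submit([encoder.finish()]);
}
```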
Eiim 4 days ago, in reply to doctorpangloss

You haven't substantiated why nobody else could make use of WebGPU. Is Google the only company that can understand Beacons because it makes $300B/year? GPU programming is hard, but it doesn't take billions to figure out.
raincole 4 days ago, in reply to doctorpangloss

There are already open source projects making use of WebGPU, e.g. wgpu.
asyx 4 days ago, in reply to doctorpangloss's first comment

Apple submitted Metal as a web spec, and that was turned into WebGPU; Apple got everything they asked for, to avoid Apple going rogue again. The fear that Apple, of all companies, is going to drop WebGPU support is really not based in reality.
flohofwoe 4 days ago, in reply to doctorpangloss's first comment

> because Apple will essentially have its own flavor of WebGPU

Apple's WebGPU implementation in Safari is entirely spec-compliant, and this time they've actually been faster than Firefox.
rwbt 4 days ago

I wish Apple made a standalone WebGPU.framework, spinning it off from WebKit, so that apps could link to it directly instead of having to link to Dawn/wgpu.
flohofwoe 4 days ago

Sounds like an interesting idea at first, until the point where they would probably create a Swift/ObjC API around it instead of the standard webgpu.h C API - and at that point you can just as well use Metal, which is actually a bit less awkward than the WebGPU API in some areas.
rwbt 4 days ago

Maybe it makes more sense as a community project. Not sure how difficult it'd be to extract it from WebKit...