dahart | 4 days ago
For that matter, why would you expose your CPU to a website? Or your monitor? It could show you anything! ;) Maybe you are not aware of the number of good web apps that use some WebGL under the hood. You might already be using office applications in your browser that use WebGL when it's available, and the reason is that it makes things faster, more responsive, more scalable, and more efficient. The same would go for WebGPU. There's no reason to imagine that the web will do bad things with your resources that you didn't ask for and don't have control over. There have been hiccups in the past, but they got fixed. Awareness is higher now, and if there are hiccups, they'll get fixed.
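The "use WebGL when it's available" pattern is ordinary feature detection with a graceful fallback. A minimal sketch, with the hypothetical helper `pickRenderer` taking the browser's `navigator` and `document` as parameters (so it can be exercised outside a browser too); in a real page you'd pass the globals directly:

```javascript
// Hedged sketch: choose the best available rendering path.
// `nav` stands in for the browser's `navigator`, `doc` for `document`.
function pickRenderer(nav, doc) {
  // WebGPU exposes itself as navigator.gpu in supporting browsers.
  if (nav && 'gpu' in nav) return 'webgpu';
  // Otherwise try to obtain a WebGL context from a throwaway canvas.
  const canvas = doc && doc.createElement ? doc.createElement('canvas') : null;
  if (canvas && canvas.getContext && canvas.getContext('webgl')) return 'webgl';
  // Last resort: plain 2D canvas / software rendering.
  return 'canvas2d';
}
```

An app calls this once at startup and routes drawing through whichever backend comes back, which is why users often never notice which API is in use.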
stackghost | 4 days ago | parent
> There's no reason to imagine that the web will do bad things with your resources that you didn't ask for and don't have control over.

The web already does exactly this, right now. Why would things magically become a utopia?
fulafel | 4 days ago | parent
> There's no reason to imagine that the web will do bad things with your resources that you didn't ask for and don't have control over.

Read some security update news from browser vendors and vulnerability researcher posts. There are some weak signals of vendors acknowledging the difficulty of securing the enormous attack surface of browsers built on unsafe foundations, e.g. Microsoft's "enhanced security mode" and Apple's "lockdown mode".
skydhash | 4 days ago | parent
I don't mind the browser using the GPU to speed up graphical operations. But I do mind random sites and apps going further than that. Native apps have better access, but the bar for installing one is higher than just opening a URL.