snickerer 10 hours ago
Allowing scripting on websites (in the mid-90s) was a completely wrong decision. And an outrage. Programs are downloaded to my computer and executed without me being able to review them first, or rely on audits by people I trust. That's completely unacceptable; it's fundamentally flawed. Of course you disable scripts on websites. But some sites are so broken that they no longer work properly, since the developers apparently assume people only view their pages with JavaScript enabled. It would have been so much better if we had simply decided back in the '90s that executable programs and HTML don't belong together. The world would be so much better today.
diacritical 8 hours ago
> Programs are downloaded to my computer and executed without me being able to review them first, or rely on audits by people I trust

Would've been cool if we could know whether site X served the same JS as before. Like a system (maybe even decentralized) where people could upload hashes of the JS files for a site. Someone could even review them and post their opinions. But mainly you'd know you're getting the same JS as before: that the site hasn't been hacked and that you're not being targeted personally. If a file needs to update, the site could note in a changelog something like "updated the JS file used for collapsing comments to fix a bug", and push that to the system. Especially important for banking sites and webmail.
coin 10 hours ago
Stepping back, it's pretty ridiculous that I need to download executable code, often bloated, solely to view read-only content. Just render the thing on the backend and send it to the client.
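The backend-rendering approach this comment advocates can be sketched in a few lines: the server assembles the complete HTML and the client receives no executable code at all. The data and markup here are made up for illustration:

```javascript
// Hypothetical read-only data that would otherwise be fetched and rendered by client-side JS.
const comments = ["First comment", "Second comment"];

// Build the full page on the backend; the browser gets plain HTML and zero script bytes.
function renderPage(items) {
  const list = items.map((c) => `<li>${c}</li>`).join("\n");
  return `<!DOCTYPE html>\n<html><body><ul>\n${list}\n</ul></body></html>`;
}
```

Any HTTP handler could return this string with `Content-Type: text/html`; the browser needs no JavaScript to display it.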
rho_soul_kg_m3 6 hours ago
Why can't MY browser send some random JS to THEIR website? If it's safe for me to run some stranger's code, should it be safe for strangers to run my code?
SchemaLoad 10 hours ago
There is obviously huge demand for scripting on websites. There is no single authority on what gets allowed on the web; if the existing orgs hadn't implemented it, someone else would have, and users would have moved over once they saw they could access new, more capable, interactive pages. The 49MB webpage just shows what our priorities are: it shows the target audience has fast internet that can load it without issues. On my average home connection in Australia, I can download a 49MB page in 0.3 seconds. We spend time optimising for what matters to the end user.
maxloh 7 hours ago
> Programs are downloaded to my computer and executed without me being able to review them first, or rely on audits by people I trust.

JavaScript and WebAssembly programs always execute in a sandboxed VM, with no read access to the host OS's files (unless, of course, you grant it). Enabling scripting was a necessary step for interactive websites; without it, a full page load would be required every time you upvote a Hacker News comment.

In my opinion, the real problem is that browsers allow too many connections to third-party domains, which are mostly ads and trackers. Those should require user-approved permissions instead of being the default.
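The upvote example above is the canonical small interaction that scripting enables: a background request updates the server while the page stays put. A hedged browser-side sketch; the `/upvote` endpoint and the `arrow-` element id are assumptions for illustration, not Hacker News's actual API:

```javascript
// Runs in the browser: send the vote in the background instead of submitting
// a form and forcing the server to re-render and re-send the entire page.
async function upvote(commentId) {
  const res = await fetch("/upvote?id=" + encodeURIComponent(commentId), {
    method: "POST",
  });
  if (!res.ok) throw new Error("vote failed: " + res.status);
  // Update just this comment's arrow in place; the rest of the page is untouched.
  document.getElementById("arrow-" + commentId).style.visibility = "hidden";
}
```

The difference is a request of a few bytes versus re-downloading the whole thread, which is the trade-off the comment is pointing at.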