urbandw311er 9 hours ago
> Tan’s website made 169 server requests (Hacker News makes 7). It shipped 28 test files to production users. It loaded 78 JavaScript controllers. Uncompressed 2MB PNGs that could’ve been 300KB. An empty 0-byte file sitting in production. A rich-text editor loaded on a read-only page.

I mean, none of this is great, but if these are the very worst examples they can find, then it feels a bit like scraping the barrel.
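For context on the "2MB PNG that could've been 300KB" point: pixel data with any redundancy shrinks dramatically under the deflate compression PNG already supports. A rough sketch (the image here is a hypothetical flat-colour banner, not an asset from the actual site, so the figures are illustrative only):

```python
import zlib

# Hypothetical 800x600 RGBA image stored as raw, uncompressed pixel bytes.
# A flat-colour banner like this is ~1.9 MB before any compression.
width, height = 800, 600
raw = bytes([230, 120, 40, 255]) * (width * height)

# PNG's own deflate (zlib) stream collapses the redundancy.
compressed = zlib.compress(raw, level=9)

print(f"raw: {len(raw) / 1e6:.1f} MB, deflated: {len(compressed) / 1e3:.1f} KB")
```

Real photographs compress far less than a flat fill, but re-encoding oversized PNGs (or serving WebP/JPEG where appropriate) is exactly the kind of low-effort fix the quoted audit is pointing at.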
JSR_FDED 5 hours ago
If that’s the front end, imagine how the back end must look. Rapid iteration on a bad codebase: what could possibly go wrong?
ricardobeat 7 hours ago
Deploying test files, 4MB of images, a 6MB homepage for a news site, and a barrage of unnecessary assets and broken code is pretty bad for a frontend. Hard to do much worse. While it's not the end of the world (we routinely deal with similar crap produced by humans), it's a clear marker of the kind of quality you get from AI without real supervision.