yoz-y · 6 hours ago:
Not parent but: the first thing that pops to mind is inadvertently downloading and hosting CSAM.
yoavm · 5 hours ago:
If you suspect AA of spreading CSAM, please don't support the project. And please do share your reasons for suspicion.
RankingMember · 5 hours ago:
This isn't TOR, but the concern isn't completely unfounded: legislators could broaden the definition of CSAM in the future to include things that, by current definitions, are not CSAM, e.g. works of fiction that include scenes of abuse.
Tepix · 5 hours ago:
Yes, your copy of your operating system could also contain CSAM. I hope you checked every single byte just to make sure.
xpe · 5 hours ago:
Please, let's be sensible and think about probabilities in the real world.
|
streetfighter64 · 5 hours ago:
[flagged]
duozerk · 5 hours ago:
So you did use LLMs to write at least part of the software. I imagine you feel no shame, but it would be nice to at least mention it on the GitHub page. It's a security risk.

As for your question, I don't know about the person you're replying to, but for me any software where part of the source was provided by an LLM is a no-go. They're credible text generators, without any understanding of, well, anything really. Using them to generate source code, and then running that code, is sheer insanity. One might suggest this means I soon won't be able to use any software; fortunately the entire fever dream that is the ongoing "AI" bubble will soon stop, so I'm hoping that won't be the case.
satvikpendem · 5 hours ago:
They literally state that they used LLMs to build it in the second sentence of their initial comment, so I'm not sure why you frame it as something they weren't upfront about. As for the bubble stopping completely: that ship has long since sailed, and you're likely already using LLM-generated code somewhere in your software stack, given that companies have reported using LLMs in their codebases.
yoavm · 5 hours ago:
I wish I could speed up time just to see how this comment would age. While I personally prefer living in a world without LLMs, I do suspect you're going to end up without any software.
dylan604 · 3 hours ago:
I'm imagining some apocalyptic, Mad Max-style world where underground groups hand-write code to avoid detection by the AI. Unfortunately, so few people can still do it, and the code is so bug-ridden, that their attempts at regaining control over the AI often end in embarrassing results. Those left in the fight often find themselves wondering why everyone just rolled over for the machines; what, because it made their lives easier? Maybe it's a scene from a show I've already seen?
duozerk · 5 hours ago:
A more reasonable response than my admittedly slightly aggressive comment deserved. Indeed, we'll see.
bigfishrunning · 4 hours ago:
I suspect we'll all end up without any software, once we've successfully gotten rid of anyone who can evaluate the output of an LLM.
satvikpendem · 3 hours ago:
There will always be a niche of people writing software. Just as today most work in web dev or backend while some work in embedded or pursue retro computing as a hobby.
|