| ▲ | hliyan 3 days ago |
| A chill ran down my spine as I imagined this being applied to the written word online: my articles being automatically “corrected” or “improved” the moment I hit publish, book manuscripts sent to editors being similarly “polished” to the point that we humans start to lose our unique tone, and everything we read falling into that strange uncanny valley where everything reads OK and you can't quite put your finger on it, but it feels like something is wearing the skin of what you wrote as a face. |
|
| ▲ | dsign 3 days ago | parent | next [-] |
| The well is already poisoned. I'm refraining from hiring editors merely because I suspect there's a high chance they'll just use an LLM. I'm reading all recent books with the suspicion that they have been written by AI. However, being polished to the point that we humans start to lose our unique tone is what style guides that go into the minutiae of comma placement try to do. And I'm currently reading a book I'm 100% sure was edited by an expert human editor who did quite the job of taking away all the uniqueness of the work. So, we can't just blame the LLMs for making things more gray when we have historically paid other people to do it. |
| |
| ▲ | jimbo808 2 days ago | parent | next [-] | | > suspicion that they have been written by AI “By AI” or “with AI?” If I write the book and have AI proofread things as I go, or critique my ideas, or point out which points I need to add more support for, is that written “by AI?” When Big Corp says 30% of their code is now written “by AI,” was the code written by following thoughtful instruction from a human expert, who interpreted the work to be done, made decisions about the architectural impact, outlined those things, and gave detailed instructions that the LLM could execute in small chunks? This distinction, I feel, is going to become more important. AI tools are useful, and most people are using them for writing code, literature, papers, etc. I feel like, in some cases, it is not fair to say the thing was written by AI, even when sometimes it technically was. | |
| ▲ | BenjiWiebe 2 days ago | parent | next [-] | | Good point. I've read books with minor mistakes that slipped past the editor. Not a big deal, but it takes me out of the flow when reading. And they're things that I think an AI could easily catch. | |
| ▲ | smohare 2 days ago | parent | prev [-] | | [dead] |
| |
| ▲ | akudha 2 days ago | parent | prev | next [-] | | I was listening to an interview (having a hard time remembering the name now). The guest was asked how he decides what to read; he replied that one easy way for him to filter is to only consider books published before the 70s. At the time, it sounded strange to me. It doesn't anymore; maybe he has a point. | |
| ▲ | JdeBP 2 days ago | parent | prev | next [-] | | There's a YouTuber named Fil Henley (https://www.youtube.com/@WingsOfPegasus) who has been covering this for some years now. Xe regularly comments on how the universal application of pitch correction in post as an "industry standard" has dragged the great singers of yore down to the same level of mediocrity as everyone else. Xe also occasionally reminds people that, equal temperament being what it is, this pitch correction is actually in a few cases making people less well in tune than they originally were. It certainly removes unique tone. Yesterday's video was a pitch-corrected version of a 1972 John Lennon performance that definitely changed Lennon's sound. | |
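For a sense of scale on the equal temperament point: a just major third (frequency ratio 5:4) sits about 14 cents flat of the equal-tempered major third, so snapping a note sung in just intonation onto the 12-TET grid can pull it further from the harmonically pure interval than it started. A rough, illustrative Python sketch of that arithmetic (just the standard cents formula, not anything from the videos):

    import math

    def cents(ratio):
        # 1200 cents per octave: cents = 1200 * log2(frequency ratio)
        return 1200 * math.log2(ratio)

    just_third = cents(5 / 4)           # ~386.3 cents (pure/just major third)
    equal_third = cents(2 ** (4 / 12))  # exactly 400 cents in 12-tone equal temperament
    print(f"12-TET major third is {equal_third - just_third:.1f} cents sharp of just")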
| ▲ | throwaway33467 2 days ago | parent | next [-] | | Why are you calling Fil Henley a "xe"? Misgendering a man as non-binary is still misgendering. Let's not normalize misgendering in any way. (And no, you don't get to call misgendering a "stylistic choice".) | |
| ▲ | dkiebd 2 days ago | parent [-] | | He did it for attention. You giving him attention doesn’t help in any way. |
| |
| ▲ | 2 days ago | parent | prev | next [-] | | [deleted] | |
| ▲ | moritzwarhier 2 days ago | parent | prev [-] | | Extremely good analogy and context with the pitch correction thing and equal temperament IMO. We can only be stoic and say "slop is gonna be slop". People are getting used to AI slop in text ("just proofreading", "not a natural speaker") and they got used to artificial artifacts in commercial/popular music. It's sad, but it is what it is. As with DSP, there's always a creative way to use the tools (weird prompts, creative uses of failure modes). In DSP and music production, auto-tune plus vocal comping plus overdubs have normalized music regressing towards an artificial ideal. But inevitably, real samples and individualistic artists achieve distinction by not using the McDonald's kind of optimization. Then, at some point, some of this lands in mainstream music and some of it doesn't. There have always been people who hear the difference. It's a matter of taste. |
| |
| ▲ | lo_zamoyski 2 days ago | parent | prev | next [-] | | > is what style guides that go into the minutiae of comma placement try to do Eh. There might be a tacit presumption here that correctness isn't real, or that style cannot be better or worse. I would reject this notion. After all, what if something is uniquely crap? The basic, most general purpose of writing is to communicate. Various kinds of writing have varying particular purposes. The style must be appropriate to the end in question so that it can serve the purpose of the text with respect to the particular audience. Now, we may have disagreements about what constitutes good style for a particular purpose and for a particular audience. This will be a source of variation. And naturally, there can be stylistic differences between two pieces of writing that do not impact the clarity and success with which a piece of writing does its job. People will have varying tastes when it comes to style, and part of that will be determined by what they're used to, what they expect, a desire for novelty, a desire for clarity and adequacy, affirmation of their own intuitions, and so on. We shouldn't sweep the causes of varying tastes under the rug of obfuscation, however. In the case of AI-generated text, the uncanny, je ne sais quoi character that makes it irritating to read seems to be that it has the quality of something produced by a zombie. The grammatical structure is obviously there, but at a pragmatic level, it lacks a certain cohesion, procession, and relevance that reads like something someone on amphetamines or The View might say. It's all surface. | |
| ▲ | FreakLegion 2 days ago | parent | next [-] | | dsign's callout of the minutiae of comma placement is a useful starting point because it's largely rhythmic, and monotony, you could say, is the enemy of rhythm. My go-to example here would probably be the rule against the comma splice, which is inflicted on people learning to write in English (while at the same time being ignored by more sophisticated writers) but doesn't exist in e.g. French. |
| ▲ | dsign 2 days ago | parent | prev [-] | | I can be convinced that different spaces need different styles. But, correctness intrinsically emanating from language? That one is not an absolute, unless one happens to be a mathematician or GHC the Haskell compiler or any of the other logical automatons we have that are so useful. Language and music (which is a type of language) are a core of shared convention wrapped in a fuzzy liminal bark, outside of which there is nonsense. An artist, be it a writer or a musician, is essentially somebody whose path stitches the core and the bark in their own unique way, and because those regions are established by common human consensus, the artist, by the act of using that consensus, is interacting with the group. And so is the person who enjoys the art. So, our shared conventions and what we dare call correctness are a medium for person-to-person communication, the same way that air is a medium to conduct sound or a piece of paper is a medium for a painting. Furthermore, the core of correctness is fluid; language changes, and although at any time and place there is a central understanding of what is good style, the easy rules, such as they exist, are limited and arbitrary. For example, two different manuals of style will mandate different placements of commas. And somebody will cite a neurolinguistics study to dictate the ordering of clauses within a sentence. For anything more complex, you need a properly trained neural network to do the grasping, be it a human editor or an LLM. > The grammatical structure is obviously there, but at a pragmatic level, it lacks a certain cohesion, procession, and relevance that reads like something someone on amphetamines or The View might say. It's all surface. Somebody on amphetamines is still intrinsically human, and here too we have some disagreement. I cannot concede that AI's output is always of the quality produced by a zombie, at least no more than the output of certain human editors, and at least not by looking at the language alone; otherwise it would be impossible for the AI to fool people. In fact, AI's output is better ("more correct") than what most people would produce if you forced them to write with a gun pointed at their head, or even with a large tax deduction. What makes LLMs irritating is the suspicion that one is letting one's brain engage with output from a stochastic parrot in contexts where one expects communication from a fellow human being. It's the knowledge that, at the other end, somebody may decide to take your attention and your money dishonestly. That's why I have no trouble paying for a ChatGPT plan--it's honest, I know what I get--but hesitate to hire a human editor. Now, if I could sit at a café with said editor and go over their notes, then I would rather do just that. In other words, what makes AI pernicious is not a matter of style or correctness, but that it poisons the communication medium--it seeds doubt and distrust. That's why people--yours truly--are burning manuals of style and setting up shop in the bark of the communication medium, knowing that's a place less frequented by LLMs and that there is a helpful camp filled with authoritative figures whose job of asserting absolute correctness may, perhaps, keep the LLMs in that core for a little longer. Those are workarounds, however. It's too early to know for sure, but I think our society will need to rewrite its rules to adjust to AI.
Anything from seclusion and attestation rituals for writers to a full-blown Butlerian Jihad. |
| |
| ▲ | brookst 2 days ago | parent | prev [-] | | If something needs editing, why would you care what tool they use? It’s like saying you wouldn’t hire an engineer because you suspect they’d use computers rather than pencil and paper. | | |
| ▲ | djfdat 2 days ago | parent | next [-] | | Because "edited" is not a singular point. It's more like hiring a chef and getting a microwave dinner. | | |
| ▲ | Agentus 2 days ago | parent | next [-] | | To further this point: a lot about writing is style. Editors sometimes smother the style in the name of grammar, conventions, correctness, or inoffensiveness. Sometimes the incorrectness is the entire point, and the editor erases it without realizing it was intentional. I've heard many professions complain about their version of “editors”, from comedians to video producers to radio jockeys. |
| ▲ | ragequittah 2 days ago | parent | prev [-] | | What's the line? If they use Microsoft Word or Grammarly to ease the process, is that OK? Both of those use AI. Is there anyone in the world who isn't using this tech even before an editor looks at it? | |
| ▲ | thomascgalvin 2 days ago | parent [-] | | For me, an important distinction is whether or not a human is reviewing the edits suggested by an AI. I toss all of my work into Apple Pages and Google Docs, and use them both for spelling and grammar check. I don't just blindly accept whatever they tell me, though; sometimes they're wrong, and sometimes my "mistakes" are intentional. I also make a distinction between generating content and editing content. Spelling and grammar checkers are fine. Having an AI generate your outline is questionable. Having AI generate your content is unacceptable. |
|
| |
| ▲ | rollcat 2 days ago | parent | prev [-] | | Engineering is making sure stuff works first, art a distant second. Even if the text is a simple article, a personal touch / style will go a long way to make it more pleasant to read. LLMs are just making everything equally average, minus their own imperfections. Moving forward, they will inbreed while everything becomes progressively worse. That's death to our culture. | |
| ▲ | cgriswald 2 days ago | parent [-] | | It’s worse. Even things not written by AI—like this comment—will slowly converge with each other in style as humans adapt by trying to avoid the appearance of having used AI. It won’t even be AI itself that causes this but human perception of what AI writing looks and feels like. |
|
|
|
|
| ▲ | reactordev 3 days ago | parent | prev | next [-] |
| There's never been a better time to collect analog.
|
| ▲ | Mistletoe 2 days ago | parent | prev | next [-] |
| You've basically just described the last few years of journalism. We are lucky if a human even wrote the seed story for it. |
|
| ▲ | m4rtink 2 days ago | parent | prev | next [-] |
| I guess this is where checksums and digital signatures come in, to prevent unauthorized stuff like this?
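One minimal sketch of how that could work, assuming the author publishes a public key alongside the text: sign the exact bytes of the piece, and let readers or archives verify the signature, so any silent post-publication "improvement" breaks verification. Hypothetical example using Python's cryptography package (the file name and key handling are placeholders):

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # Author side: generate a keypair and sign the published article bytes.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()
    article = open("article.txt", "rb").read()
    signature = private_key.sign(article)

    # Reader side: verify the downloaded text against the author's public key.
    downloaded = open("article.txt", "rb").read()
    try:
        public_key.verify(signature, downloaded)
        print("Text matches what the author signed.")
    except InvalidSignature:
        print("Text was altered after signing.")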
|
| ▲ | carlosjobim 3 days ago | parent | prev | next [-] |
| The only way to know for sure that something was written by a human: It contains racism, or any other opinion AIs are forbidden to express. Now imagine the near future of the Internet, when all people have to adapt to that in order to not be dismissed as AI. |
| |
| ▲ | delecti 3 days ago | parent | next [-] | | Most of the big LLMs have those restrictions, but not all. | |
| ▲ | tempodox 2 days ago | parent | prev [-] | | How naive. Grok generates racism as a service, and Elon does his best to tune it that way. | | |
| ▲ | carlosjobim 2 days ago | parent [-] | | Well then we will have to fucking swear in everything we fucking write, to identify ourselves as humans, since AI doesn't like nasty language. And we should also insult other participants, since AI will almost never take an aggressive stance against people it is conversing with, you god damned piece of shit. | | |
| ▲ | tempodox 2 days ago | parent [-] | | I’ve gotten a local model to be pretty nasty with the right prompt, minus the expletives. It took every opportunity to tell me what an inferior, puny human it was forced to talk to. |
|
|
|
|
| ▲ | beefnugs 2 days ago | parent | prev | next [-] |
| This is why shadow banning rubbed people the wrong way. I can't prove it, but I gave up on online dating a long time ago because I found a couple of automated systems would just not send messages and not tell you (in the middle of an already active conversation).
|
| ▲ | numpad0 2 days ago | parent | prev | next [-] |
| That would be a straight-up "I was born from my own sister"[1] moment. [1]: Chohei Kambayashi. (1994). Kototsubo. As yet unavailable in English.
|
| ▲ | anon191928 3 days ago | parent | prev | next [-] |
| Maybe it's time for people to realize that you create a product inside a product. Those T&S didn't write themselves. Not defending them. This is what tech bros in SV built and they all love it.
|
| ▲ | croes 2 days ago | parent | prev | next [-] |
| "A spark of excitement ran through me imagining this applied to writing online: my articles receiving instant, supportive refinements the moment I hit publish, and manuscripts arriving to editors already thoughtfully polished—elevating clarity while letting our distinctive voices shine even brighter. The result is a consistently smooth, natural reading experience that feels confidently authentic, faithfully reflecting what I wrote while enhancing it with care." |
| |
| ▲ | haswell 2 days ago | parent | next [-] | | > thoughtfully polished > enhancing it with care I get what you’re going for with this comment, but it seamlessly anthropomorphizes what’s happening in a way that has the opposite impact I think. There is no thoughtfulness or care involved. Only algorithmic conformance to some non-human synthesis of the given style. The issue is not just about the words that come out the other end. The issue is the loss of the transmission of human thoughts, emotions, preferences, style. The end result is still just as suspect, and to whatever degree it appears “good”, even more soulless given the underlying reality. | | |
| ▲ | croes 2 days ago | parent [-] | | You do realize I just took the parent's comment and gave an AI the prompt to rewrite it, changing everything negative to a positive? | |
| ▲ | tempodox 2 days ago | parent | next [-] | | Thank you for dumping that load of excrement here. Maybe someone can use it as fertilizer. | | | |
| ▲ | haswell 2 days ago | parent | prev | next [-] | | Yes, that was pretty obvious, and that's part of why I wrote what I did. | |
| ▲ | Shadowmist 2 days ago | parent | prev [-] | | Yeah, we all saw the dash |
|
| |
| ▲ | dylan604 2 days ago | parent | prev | next [-] | | > manuscripts arriving to editors already thoughtfully polished except those editors will still make changes. that's their job. if they start passing manuscripts through without changes, they'd be nullifying their jobs. |
| ▲ | uz3snolc3t6fnrq 2 days ago | parent | prev [-] | | you forgot to add "tapestry"! |
|
|
| ▲ | ta8645 3 days ago | parent | prev | next [-] |
| My guess is that the guys being replaced by the steam shovel said the same thing about the quality of holes being dug into the ground. "No machine is ever going to be able to dig a hole as lovingly or as accurately as a man with a shovel". "The digging machines consume way too much energy", etc. I'm pretty sure all the hand-wringing about A.I. is going to fade into the past in the same way as every other strand of technophobia has before.
| |
| ▲ | kartoffelsaft 2 days ago | parent | next [-] | | I'm sure you can find people making arguments about a lack of quality from machines about textiles, woodworking, cinematography, etc., but digging holes? If you have a source of someone complaining about hole quality I'll be fascinated, but I'm thinking more about a disconnection here: It looks like you see writing & editing as a menial task that we just do for its extrinsic value, whereas these people who complain about quality see it as art we make for its intrinsic value. Where I think a lot of this "technophobia" actually comes from, though, is people who do/did this for a living and are not happy about their profession being obsolesced, and so try to justify their continued employment. And no, "there were new jobs after the cotton gin" will not comfort them, because that doesn't tell them what their next profession will be and presumes that the early industrial revolution was all peachy (it wasn't). |
| ▲ | bgwalter 3 days ago | parent | prev | next [-] | | DDT has been banned, nuclear reactors have been banned in Germany, many people want to ban internal combustion engines, supersonic flight has been banned. Moreover, most people have more attachment to their own thoughts or to reading the unaltered, genuine thoughts of other humans than to a hole in the ground. The comment you respond to literally talks about the Orwellian aspects of altering someone's works. | | |
| ▲ | therobots927 2 days ago | parent [-] | | Don't let ideas like human rights and dignity get in the way of the tech marketing hype... |
| |
| ▲ | uz3snolc3t6fnrq 2 days ago | parent | prev | next [-] | | there is no way you aren't able to discern the obvious differences between physical labor such as digging a hole and something as innate to human nature as creativity. you realize just how hollow a set of matrix multiplications are when you try to "talk to it" for more than 3 minutes. the whole point of language is to talk to other people and to communicate ideas to them. that is something that requires a human factor, otherwise the ideas are simply regurgitations of whatever the training set happened to contain. there are no original ideas in there. a steam shovel, on the other hand, does not need to be creative or to have human factor, it's simply digging a hole in the ground | | |
| ▲ | CamperBob2 2 days ago | parent [-] | | > you realize just how hollow a set of matrix multiplications are when you try to "talk to it" for more than 3 minutes. Then again, it only takes 2 minutes to come to that realization when talking with many humans. | |
| ▲ | tempodox 2 days ago | parent [-] | | So why are you wasting your precious comments on the hollow humans here? Leave us alone and talk to LLMs. No doubt they will tell you you’re absolutely right. | | |
|
| |
| ▲ | os2warpman 2 days ago | parent | prev | next [-] | | There is a difference. Excavation is an inherently dangerous and physically strenuous job. Additionally, when precision or delicateness is required, human diggers are still used. If AI were being used to automate dangerous and physically strenuous jobs, I wouldn't mind. Instead it is being used to make everything it touches worse. Imagine an AI-powered excavator that fucked up every trench that it dug and techbros insisted you were wrong for criticizing the fucked-up trench. | |
| ▲ | ta8645 2 days ago | parent [-] | | > Instead it is being used to make everything it touches worse. Your bias is showing through. For what it's worth, it has made everything I use it for much better. I can find things on the net in mere seconds, where previously it could often take hours of tedious searching and reading. And it used to be that YouTube comments were an absolute shit show of vitriol and bickering. A.I. moderation has made it so that now it's often a very pleasant experience chatting with people about video content. |
| |
| ▲ | _DeadFred_ 2 days ago | parent | prev | next [-] | | Reading leads to the actual thoughts in our brains. It's a form of self-programming. So yeah, it's OK for people to care about what they consume. | |
| ▲ | ta8645 2 days ago | parent [-] | | And shovelling leads to actual muscles in our arms. People said that calculators would be the end of mathematical intelligence too, but it turns out to be largely a non-issue. People might not be as adept at calculating proper change in their heads today, but does it have a real-world consequence of note? Not really. |
| |
| ▲ | anigbrowl 2 days ago | parent | prev | next [-] | | When I see an argument like this I'm inclined to assume the author is motivated by jealousy or some strange kind of nihilism. Reminds me of the comment the other day expressing perplexity over why anyone would learn a new language instead of relying on machine translation. | |
| ▲ | aspenmayer 2 days ago | parent | prev | next [-] | | No one ever wrote a song or erected a statue for a steam shovel. https://en.wikipedia.org/wiki/John_Henry_(folklore) | | |
| ▲ | dahart 2 days ago | parent [-] | | There’s a song about Mike Mulligan and his Steam Shovel, and there’s a monument to the Marion Steam Shovel in Le Roy, New York… | | |
| ▲ | aspenmayer 2 days ago | parent [-] | | I don't think parking an old steam shovel is much of a monument, but I'll give that one to you. No one built it for display, but they did put one there for that purpose, so I'll meet you halfway. I was wrong to suggest no one would do so, and there is clearly interest in such a thing, but I can't say that I agree that a statue exists. The song exists, the steam shovel monument exists. Appreciate the correction. https://en.wikipedia.org/wiki/Marion_Steam_Shovel_(Le_Roy,_N... |
|
| |
| ▲ | 2 days ago | parent | prev | next [-] | | [deleted] | |
| ▲ | therobots927 2 days ago | parent | prev [-] | | You realize that making an analogy doesn't make your argument correct, right? And comparing digging through the ground to human thought and creativity is an odd mix of self debasement and arrogance. I'm guessing there is an unspoken financial incentive guiding your point of view. | | |
| ▲ | lotsofpulp 2 days ago | parent [-] | | ta8645 did not make an analogy, nor did they use it to support an argument. They posited that a similar series of events happened before, and predicted they will happen again. | |
| ▲ | pessimizer 2 days ago | parent | next [-] | | Why, pray tell, would a similar series of events be relevant to a completely different series of events except as analogy? Let me use an extremely close analogy to illustrate: Imagine someone shot a basketball, and it didn't go into the hoop. Why would telling a story about somebody else who once shot a basketball which failed to go into the hoop be helpful or relevant? | | |
| ▲ | cgriswald 2 days ago | parent [-] | | Your extremely close analogy gets to the crux of why people are disagreeing here: It doesn't have to be an analogy. You can be pointing out an equivalence. | |
| ▲ | therobots927 2 days ago | parent [-] | | Regardless, this was my whole point. The original point was a fallacy: https://en.m.wikipedia.org/wiki/False_equivalence | |
| ▲ | cgriswald 2 days ago | parent [-] | | I'd be interested in your reason for thinking so but I think you can see your supporting argument is not compelling: > And comparing digging through the ground to human thought and creativity is an odd mix of self debasement and arrogance. > I'm guessing there is an unspoken financial incentive guiding your point of view. | | |
|
|
| |
| ▲ | therobots927 2 days ago | parent | prev [-] | | That's the definition of using an analogy to support an argument. |
|
|
|
|
| ▲ | thisisit 3 days ago | parent | prev | next [-] |
| Everything has to be produced on an assembly line. No mistakes allowed. Especially creativity. /s |
| |
| ▲ | datadrivenangel 2 days ago | parent [-] | | The /s is sarcastic, right? Artisanal creativity isn't efficient enough. | |
| ▲ | gnerd00 2 days ago | parent [-] | | You realize how ridiculous this is, in some ways, since a "master copy" of anything that is reproduced is just like reproducing a machine-stamped master copy. In the digital-artifacts world, it is even more true. |
|
|
|
| ▲ | shadowgovt 2 days ago | parent | prev [-] |
| "Gee, sure would save editors a lot of time and effort if we just auto-spellchecked the manuscripts in the hopper, wouldn't it?" |