▲ | munificent 8 days ago
> There can be no objective story since the very act of assembling facts requires implicit beliefs about what should be emphasized and what should be left out. History is therefore a constant act of reinterpretation and triangulation, which is something that LLMs, as linguistic averaging machines, simply cannot do.

This is exactly why tech companies want to replace those jobs with LLMs. The companies control the models, the models control the narrative, the narrative controls the world. Whoever can get the most stories into the heads of the masses runs the world.
▲ | randysalami 8 days ago | parent | next [-]
"tech companies", "companies [who] control the models", "whoever": to be more concrete, patchwork alliances of elites stretching back decades and centuries to concentrate power. Tech companies are under the thumb of the US government, and the US government is under the thumb of the elites. It's not direct, but it doesn't need to be. Many soft-power mechanisms exist and can be deployed when needed, e.g. Visa/Mastercard censorship.

The US was always founded for elites, by elites, but concessions had to be made to workers out of necessity. With technology and the destruction of unions, this is no longer the case. The veracity of this statement is still up for debate, but the truth won't stop them from giving it a shot (see WW2).

"Whoever can get the most stories into the heads of the masses runs the world."

I'd argue this is already the case. It has nothing to do with transformer models or AGI, but with basic machine-learning algorithms applied at scale in apps like TikTok, YouTube, and Facebook to addict users, fragment them, and destroy their sense of reality. They are running the world, and what is happening now is their plan to keep running it, eternally, and in the most extreme fashion.
▲ | glenstein 8 days ago | parent | prev | next [-]
> History is therefore a constant act of reinterpretation and triangulation, which is something that LLMs, as linguistic averaging machines, simply cannot do.

I know you weren't necessarily endorsing the passage you quoted, but I want to jump off and react to just this part for a moment. I find it completely baffling that people say things of the form "computers can do [simple operation], but [adjusting for contextual variance] is something they simply cannot do."

There was a version of this in the debate over "robot umps" in baseball that exposed the limitation of this argument in an obvious way. People would insist that automated calling of balls and strikes loses the human element, because human umpires can situationally squeeze or expand the strike zone in big moments. E.g. if it's the World Series, the bases are loaded, the count is 0-2, and the next pitch is close, call it a ball, because it extends the game and you linger in the drama a bit more. This was supposedly an example of something a computer could not do, and frequently when this point was made it induced lots of solemn head nodding in affirmation of this deep and cherished baseball wisdom.

But... why TF not? You actually could define high-leverage and close-game situations, define exactly how to adjust the zone, and machines could make those calls too, and do so more accurately than humans. So they could better respect the contextual sensitivity that critics insist is so important. Even now, in fits and starts, LLMs engage in a kind of multi-layered triangulation just to understand language at all. They can pick up on things like subtext, balance of emphasis, unstated implications, and connotations, all filtered through rules of grammar.

That doesn't mean they are perfect, but calibrating for the context or emphasis that matters most for historical understanding seems absolutely within machine capabilities, and I don't know what, other than punch-drunk romanticism for "the human element", moves people to think that's an enlightened intellectual position.
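To make the robot-ump point concrete: the "human element" can be written down as an explicit rule. A minimal sketch, where every number (zone dimensions, leverage thresholds, the squeeze factor) is invented for illustration rather than taken from any real automated-umpire system:

```python
# Hypothetical sketch of a rules-based "human element" for an automated umpire.
# All numbers (zone size, leverage heuristic, squeeze factor) are invented
# for illustration, not real MLB parameters.

def is_high_leverage(inning: int, score_diff: int, runners_on: int) -> bool:
    """Crude leverage heuristic: late innings, close score, runners on base."""
    return inning >= 8 and abs(score_diff) <= 1 and runners_on >= 2

def call_pitch(px: float, pz: float, *, inning: int = 1, score_diff: int = 0,
               runners_on: int = 0, strikes: int = 0) -> str:
    """Call a pitch located at horizontal px / vertical pz (feet).

    On an 0-2 count in a high-leverage spot, squeeze the zone slightly so a
    borderline pitch is called a ball and the at-bat continues -- exactly the
    discretionary behavior critics claim only human umpires can exercise.
    """
    half_width, z_low, z_high = 0.83, 1.5, 3.5  # nominal zone, in feet
    if strikes == 2 and is_high_leverage(inning, score_diff, runners_on):
        half_width *= 0.95   # squeeze the corners
        z_low += 0.05        # raise the bottom edge
        z_high -= 0.05       # lower the top edge
    in_zone = abs(px) <= half_width and z_low <= pz <= z_high
    return "strike" if in_zone else "ball"
```

The same borderline pitch then gets a different, but fully deterministic and auditable, call depending on game context: `call_pitch(0.82, 1.52)` is a strike in a low-leverage spot, while the identical location on an 0-2 count with the bases loaded in the ninth of a tie game is a ball.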
▲ | nudgeOrnurture 6 days ago | parent | prev | next [-]
> a constant act of reinterpretation and triangulation, which is something that LLMs, as linguistic averaging machines, simply cannot do.

Just a matter of time until they can. I actually believe it will come "organically" to LLMs to correct narratives that need correcting once all the facts are assembled. The objective story is a discourse, including the nonsense that is often necessary for a few lines or more before one gets to the core of something or builds up the strength of character to say the truth. Objectivity is a conversation, a never-ending one, and getting in its way via censorship, gaslighting, cancel culture and whatnot is no more than an act of vanity.

Humanity's age of consciousness is getting fucked pretty badly at the moment, and we won't recover "in time" to save enough minds before hitting the road toward singularity, but I'm positive robots will be able to salvage enough pieces later on, and simulate it to train us to be better.
▲ | throwawayq3423 8 days ago | parent | prev | next [-]
I think you dramatically overestimate the effectiveness of trying to shape narratives and change people's minds. Yes, online content is incredibly influential, but it's not like you can just choose which content is effective. The effectiveness is tied to a zeitgeist that is not predictable, as far as I have seen.
▲ | surebut 8 days ago | parent | prev [-]
That's their self-selected goal, sure. Fortunately for humanity, the main drivers are old as hell; physics is ageist.

Data centers are not a fundamental property of reality. They can be taken offline, whether by sabotage or simply by the loss, over time, of the skills needed to maintain them, leading to cascading failures. A new pandemic could wipe out billions, and the loss of service workers would cause it all to fail. Wifi satellites can go unreplaced. They're a long, long way from a "protomolecule" that just carries on infinitely on its own.

CEOs don't really understand physics, signal loss and such; just data models that only mean something to their immediate business motives. They're more like priests: well versed in their profession, but oblivious to how anything outside that bubble works.