DiscourseFan 4 hours ago

I repeated "ass" 5,000 times on the LHS and this was the RHS output:

"I am incredibly humbled and honored to share that I have successfully scaled my output by 10,000% through relentless grit, a growth mindset, and a commitment to radical consistency.

In today’s fast-paced digital economy, volume is the new currency. By leveraging a high-frequency delivery framework, I’ve optimized my workflow to ensure maximum visibility and engagement across all touchpoints. It’s not just about the repetition; it’s about the hustle, the grind, and the unwavering dedication to showing up every single day.

Success isn't given—it's earned one iteration at a time. Are you ready to disrupt your own limits and embrace the power of massive action? Let’s connect and discuss how we can drive synergistic value together.

#Hustle #GrowthMindset #Consistency #Disruption #Innovation #Leadership #Success #Scale"

This is odd, since it's not translating semantically but taking the form of the thing (the repetition) and making it thematic for the "translation." This is not a model with its own weights shipped in the page; it's an LLM wrapper. There must be a system prompt in here. They're using a very light model, but definitely one that's off the shelf, with at most a bit of fine-tuning.
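To be clear, that's speculation, but such a wrapper can be only a few lines. Here's a minimal sketch in Python of the kind of thing I mean; the system prompt, model name, and use of the OpenAI API are all my invention, not anything extracted from the site:

    import os
    from openai import OpenAI

    # Hypothetical reconstruction of a thin "translator" wrapper.
    # Nothing here is taken from the actual site.
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

    SYSTEM_PROMPT = (
        "You are a translator that rewrites any input as an upbeat "
        "LinkedIn post, full of hustle-culture buzzwords and hashtags. "
        "Always produce a plausible post, even for nonsense input."
    )

    def linkedinify(text: str) -> str:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # stand-in for whatever light model they use
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": text},
            ],
        )
        return response.choices[0].message.content

    print(linkedinify("ass " * 5000))

A system prompt like that would also explain why pure repetition still yields a confident post: an instruction to always produce one dominates when the input carries no content of its own.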

benterix 10 minutes ago | parent | next [-]

> it's an LLM wrapper ... they're using a very light model

I thought it was instantly clear?

iugtmkbdfil834 43 minutes ago | parent | prev | next [-]

<< There must be a system prompt in here; they're using a very light model, but definitely one that's off the shelf with a bit of fine-tuning.

This is fascinating. Is this a common tendency (one that applies to most models)?

DiscourseFan 13 minutes ago | parent [-]

Well, yeah. Transformers used for translation look at each token semantically and find an equivalent for each word or phrase, more or less atomically. If you put "ass ass ass..." into Google Translate targeting, say, German, it would give you the equivalent phrase "Arsch Arsch Arsch..."

But large language models are complicated autocompletes: they try to produce output that follows the structure and grammar of the writing based on its total set of significations. When you give one pure repetition, it has no way of analyzing the words atomically; it must view them within some structure of internal referentiality. If the signs carry no real reference-relation ("ass ass ass"), the model is forced to interpret something essentially empty, which lays bare the structure of its own internal coherency. It's sort of like a Rorschach test.
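If you want to see the "autocomplete" behavior directly, here's a minimal sketch using GPT-2 via Hugging Face transformers (my choice of model for illustration; I have no idea what this site actually runs). It prints the next-token distribution after a run of pure repetition:

    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    prompt = "ass " * 50  # repetition with no semantic content
    inputs = tokenizer(prompt, return_tensors="pt")

    with torch.no_grad():
        logits = model(**inputs).logits

    # The model's "interpretation" is a distribution over next tokens,
    # shaped by the structure of the whole prompt, not a word-by-word mapping.
    next_token_probs = torch.softmax(logits[0, -1], dim=-1)
    top = torch.topk(next_token_probs, 5)
    for prob, tok_id in zip(top.values, top.indices):
        print(f"{tokenizer.decode(int(tok_id))!r}: {float(prob):.3f}")

Unsurprisingly, the most likely continuation of a long run of one token is usually that token again; a translation model, by contrast, would map each word across atomically.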

This is just my theory, anyway.

QuantumNomad_ 3 hours ago | parent | prev | next [-]

Slightly disappointed that translating in the opposite direction doesn’t mention ass at all, much less 5000 times.

bryanrasmussen 42 minutes ago | parent [-]

Well, since "ass ass ass..." isn't a grammatically infinite possible sentence (it's not even grammatically correct), it's surprising the model manages anything at all.

https://medium.com/luminasticity/grammatical-infinities-what...

Anyway, I'll also note that "police police police police" turns into:

Accountability. Integrity. Synergy.

In today’s fast-paced ecosystem, it’s not just about enforcement—it’s about strategic oversight.

I’m thrilled to share how we’re leveraging cross-functional governance to ensure every stakeholder is aligned. It’s about building a culture of compliance and driving impact through consistent monitoring.

Who else is prioritizing high-level security protocols this quarter? Let’s connect!

#Leadership #Governance #Compliance #SafetyFirst #Networking #StrategicOversight

Which is a very bad translation!

Unfortunately the API looks somewhat unreliable at the moment, so I'm unsure whether any "police" would come out when translating the other way.

DiscourseFan 6 minutes ago | parent [-]

>a grammatically infinite possible sentence

This is a poor understanding of set theory and an even worse one of LLMs. Notice this output here:

>Accountability. Integrity. Synergy.

is not really grammatical either. The "grammar" is the logic internal to the reference-relations of the given signs, but the "inner" of the text is always given by the supplement (the next-token prediction) that such a total coherency demands. That supplement also erases the coherency and puts it into question, since it in turn demands a supplement of its own. What is given is only the always-incomplete text itself, which remains open to its own re-signification, and thus to the possibility of a new grammar, of every possible prompt.
