bradley13 3 days ago

This applies to so many AIs. I don't want a bubbly sycophant. I don't want a fake personality or an anime avatar. I just want a helpful assistant.

I also don't get wanting to talk to an AI. Unless you are alone, that's going to be irritating for everyone else around.

uncircle 3 days ago | parent | next [-]

I want an AI modeled after short-tempered stereotypical Germans or Eastern Europeans, not copying the attitude of non-confrontational Californians that say “dude, that’s awesome!” a dozen times a day.

And I mean that unironically.

finaard 3 days ago | parent | next [-]

As a German not working in Germany - I often get the feedback that the initial contact with me is rather off-putting, but over time people start appreciating my directness.

j4coh 3 days ago | parent [-]

Bless your heart.

anal_reactor 3 days ago | parent | prev | next [-]

The problem is, performing social interaction theatre is way more important than actually using logic to solve issues. Look at how many corporate jobs are 10% engineering and 90% kissing people's asses in order to maintain social cohesion and hierarchy. Sure, you say you want "short-tempered stereotypical Germans or Eastern Europeans", but guess what: most people say some variation of that, and when they actually see such behavior, they get upset. So we continue with the theatre.

For reference, see how Linus Torvalds was criticized for trying to protect the world's most important open source project from weaponized stupidity at the cost of someone experiencing minor emotional damage.

uncircle 3 days ago | parent [-]

That is a fair assessment, but on the other hand, yes-men are not required to get things done, despite people liking them. You can achieve great things even if your team is made of Germans.

My tongue-in-cheek comment wonders whether actors with a modicum of personality might be better than being surrounded by over-enthusiastic bootlickers. In my experience, many projects would benefit from someone saying “no, that is silly.”

bluGill 3 days ago | parent | prev | next [-]

While you are not alone, all evidence points to the vast majority of people preferring "yes men" as their advisors. Often to their eventual harm.

threetonesun 3 days ago | parent [-]

One would think that if AI were as good at coding as they tell us it is, a style toggle would take all of five, ten minutes tops.

rob74 3 days ago | parent | prev | next [-]

Ok, then I can write an LLM too - because the guys you mention, if you asked them to write your code for you, would just tell you to get lost (or a more strongly phrased variation thereof).

Yizahi 3 days ago | parent | prev [-]

Not possible.

/s

scotty79 3 days ago | parent | prev | next [-]

Sure, but different people have different preferences. Some people mourn the replacement of GPT-4 with 5 because 5 has way less of a bubbly personality.

cubefox 3 days ago | parent | next [-]

There is evidence from Reddit that particularly women used GPT-4o as their AI "boyfriend". I think that's unhealthy behavior and it is probably net positive that GPT-5 doesn't do that anymore.

scotty79 3 days ago | parent | next [-]

Why is it unhealthy? If you just want a kind word that you don't have in your life, why should you bother another person if a machine can provide it?

cubefox 2 days ago | parent [-]

Because it's a mirage. People want to be loved, but GPT-4o doesn't love them. It only creates an optical illusion of love.

9rx 2 days ago | parent | next [-]

People want the feelings associated with love. They don't care how they get it.

The advantage of "real" love, health-wise, is that the other person acts as a moderator. When things start to get out of hand, they will back away. Alternatives, like drugs, tend to spiral out of control when an individual's self-control is the only limiting factor. On the surface, GPT seems more like the drug end of the spectrum, ready to love-bomb you until you can't take it anymore, but the above suggests that it will also back away, so perhaps its love is actually more like another person's than it may originally seem.

cubefox 2 days ago | parent [-]

> People want the feelings associated with love. They don't care how they get it.

Most people want to be loved, not just believe they are. They don't want to be unknowingly deceived. For the same reason, they don't want to be unknowingly cheated on. If someone told them their partner was a cheater, or an unconscious android, they wouldn't be mad at the person who gave them this information, but at their partner.

That's the classic argument against psychological hedonism. See https://en.wikipedia.org/wiki/Experience_machine

scotty79 2 days ago | parent | next [-]

> Most people want to be loved, not just believe they are.

Many people who are genuinely loved don't feel loved. So people really are more after the feeling than the fact.

9rx 2 days ago | parent | prev [-]

> For the same reason they don't want to be unknowingly cheated on.

That's the thing, though, there is nothing about being a cheater that equates to loss of love (or never having loved). In fact, it is telling that you shifted gears to the topic of deceit rather than love.

It is true that feelings of love are often lost when one has been cheated on. So, yes, it is a fair point that for many, those feelings of love aren't available if one does not also have trust. There is an association there, so your gear change is understood. I expect you are absolutely right that if those aforementioned women dating GPT-4o found out that it wasn't an AI bot, but actually some guy typing away at a keyboard, they would lose their feelings even if the guy on the other side did actually love them!

Look at how many people get creeped out when they find out that a person they are disinterested in loves them. Clearly, being loved isn't what most people seek. They want to feel the feelings associated with love. All your comment shows, surprising nobody, is that the feelings of love are not like a tap you can simply turn on (well, maybe in the case of drugs). The feelings require a special environment where everything has to be just right, and trust is often a necessary part of that environment. Introduce deceit, and away go the feelings.

scotty79 2 days ago | parent | prev [-]

If you get a massage from a massage machine, is it also a mirage? If you use a vibrator, is it also a mirage? Why does it suddenly become an unhealthy mirage if you need words to tickle yourself?

cubefox 2 days ago | parent [-]

A vibrator still works as intended if you believe it doesn't love you. GPT-4o stops working as intended if you believe it doesn't love you. The latter relies on an illusion, the former doesn't.

(More precisely, a vibrator still relies on an illusion in the evolutionary sense: it doesn't create offspring, so over time phenotypes who like vibrators get replaced by those who don't.)

scotty79 2 days ago | parent [-]

That's simply not true. Vibrators don't really work that well if you somehow suppress your fantasies during use. In the same way, GPT-4o works better if you briefly fantasize that it might love you when it says what it does. Almost all people who use it in this manner are fully aware of its limitations. While they phrase it as "I lost my love", their complaints are really of the kind "my toy broke". And they find similar mitigation strategies for the problem: finding another toy, giving each other tips on how to use what's available.

As for the evolutionary perspective, evolution is not that simple. Gay people typically have far fewer offspring than vibrator users, and somehow they are still around and plentiful.

Brains are a messy hodgepodge of various subsystems. Clever primates found a multitude of ways to mess with them to make life more bearable. So far the species continues regardless.

ivan_gammel 3 days ago | parent | prev [-]

GPT-5 still does that as they will soon discover.

cubefox 3 days ago | parent [-]

No. They complained about GPT-5 because it did not act like their boyfriend anymore.

catigula 3 days ago | parent | prev | next [-]

GPT-5 still has a terrible personality.

"Yeah -- some bullshit"

still feels like trash, as the presentation is of a friendly person rather than the unthinking machine it actually is. The false presentation of humanness is a huge problem.

ted_bunny 3 days ago | parent [-]

I feel strongly about this. LLMs should not try to write like humans. Computer voices should sound robotic. And when we have actual androids walking around, they should stay on the far side of the uncanny valley. People are already anthropomorphizing them too much.

Applejinx 2 days ago | parent [-]

It can't, though. It's language. We don't have a body of work constituting robots talking to each other in words. Hardly fair to ask LLMs not to write like humans when humans constructed everything they're built on.

ted_bunny 2 days ago | parent | next [-]

It's perfectly capable of writing in a succinct and emotionless style, which people refer to as robotic.

catigula 2 days ago | parent | prev [-]

These models are purposely made to sound more 'friendly' through RLHF

scotty79 2 days ago | parent [-]

The chat that rejects you because your prompt put it in a bad mood sounds less useful.

catigula 2 days ago | parent [-]

How about the emotionless servant chat?

WesolyKubeczek 3 days ago | parent | prev [-]

I, for one, say good riddance to it.

bn-l 3 days ago | parent [-]

But it doesn’t say ima good boy anymore :(

giancarlostoro 2 days ago | parent | prev | next [-]

I did, as a test. Grok has "workspaces" where you can add a pre-prompt, so I made a Kamina (from Gurren Lagann) "workspace" so I could ask it silly questions and get back hyped-up answers from "Kamina". It worked decently. My point is that some tools out there let you "pre-prompt" based on your context. I believe Perplexity has this as well, though they don't make it easy to find.

benrapscallion 2 days ago | parent [-]

Where is this setting in Perplexity?

giancarlostoro 2 days ago | parent [-]

They call it Spaces. When you create a "Space" (on desktop, at least), look in the right-hand corner for the three dots next to the Share button and click "Settings":

> Give instructions to Perplexity to customize how answers are focused and structured.

You type into that box, and now every Space you make can have a fully custom "pre-prompt", as I call it.

I only just started using Perplexity; they really need to rework their UI a little bit.
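For context, the "pre-prompt" these tools expose is essentially what chat-style LLM APIs call a system message: an instruction prepended ahead of every conversation turn. A minimal sketch of how such a request is typically assembled (the function name and persona text are made up for illustration; no actual API is called here):

```python
# Sketch: a "pre-prompt" (Grok workspace / Perplexity Space instructions)
# maps to the system message in chat-style LLM APIs.

def build_chat_request(pre_prompt: str, user_question: str) -> list[dict]:
    """Assemble the message list sent to a chat-completion endpoint.

    The system message carries the persona/instructions; every user
    turn in the same workspace inherits it automatically.
    """
    return [
        {"role": "system", "content": pre_prompt},
        {"role": "user", "content": user_question},
    ]

# Hypothetical persona text, in the spirit of the Kamina workspace above.
kamina_persona = (
    "You are Kamina from Gurren Lagann. Answer every question "
    "with over-the-top hype and total confidence."
)

messages = build_chat_request(kamina_persona, "Should I refactor this module?")
print(messages[0]["role"])  # system
```

The UI just saves that first message for you, so each new question in the Space is sent with the persona already attached.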

andrewstuart 3 days ago | parent | prev [-]

I want no personality at all.

It’s software. It should have no personality.

Imagine if Microsoft Word had a silly chirpy personality that kept asking you inane questions.

Oh, wait ….

gryn 3 days ago | parent [-]

Keep Clippy's name out of your mouth! He's a good boy. /s