Al-Khwarizmi 6 hours ago

My impression is that Gemini does it in a quite natural way. It answers your question, and then suggests related questions you might ask next, which I find useful.

But ChatGPT feels extremely baity. It doesn't fully answer your question, only about 80% of it, deliberately leaving the other 20% as bait. And when you ask the follow-up question, it answers with another incomplete fact, again holding something back as bait, and so on.

As an analogy, it's as if, when asked for the seasons of the year, Gemini said "spring, summer, autumn and winter; do you also want to know when each season starts and ends, or maybe their climate?" while ChatGPT said "The first three seasons are spring, summer and autumn. The fourth one is really interesting and many people don't know it; would you like me to tell you about it?" It's an exaggeration, of course, but on complex questions it feels to me exactly like that. And I find it so annoying that I'm thinking of canceling my subscription if it keeps behaving that way.

what 3 hours ago

It’s worse. It gives you all 4 seasons but suggests there’s a secret 5th season most people don’t know about.