9rx a day ago

The problem isn't so much the five seconds, it is the muscle memory. You become accustomed to blindly hitting "Yes" every time you've accidentally typed something into the text box, and then that time when you actually put a lot of effort into something... Boom. It's gone. I have been bitten before. Something like the parent described would be a huge improvement.

Granted, it seems the even better UX is to save what the user inputs and let them recover if they lost something important. That would also help for other things, like crashes, which have also burned me in the past. But tradeoffs, as always.
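The "save what the user inputs" idea could be sketched as a small debounced draft keeper. This is a hypothetical illustration, not any site's actual code: the `makeDraftKeeper` name and its API are made up, and a `Map` stands in for a real store (in a browser you would adapt it to `localStorage`'s `getItem`/`setItem`/`removeItem` methods).

```javascript
// Hypothetical sketch: debounce-save a draft so it can be recovered
// after an accidental tab close or a crash. A Map stands in for a
// persistent store so the sketch runs anywhere.
function makeDraftKeeper(store, key, delayMs = 500) {
  let timer = null;
  return {
    // Call on every input event; writes at most once per delayMs.
    onInput(text) {
      clearTimeout(timer);
      timer = setTimeout(() => store.set(key, text), delayMs);
    },
    // Call on page load to offer recovery of an unsent draft.
    recover() {
      return store.get(key) ?? null;
    },
    // Call after a successful submit so stale drafts don't linger.
    clear() {
      clearTimeout(timer);
      store.delete(key);
    },
  };
}
```

Wired to an input's `input` event and checked on load, this recovers text after both an accidental "Yes" and a crash, which is the tradeoff-free part; the tradeoffs are in retention and privacy of the stored drafts.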

fckgw a day ago | parent | next [-]

Which is fine! That's me making the explicit choice that yes, I want to close this box and yes, I want to lose this data. I don't need an AI evaluating how important it thinks I am and second guessing my judgement call.

I tell the computer what to do, not the other way around.

9rx a day ago | parent [-]

You do, however, need to be able to tell the computer that you want to opt in (or out, I suppose) of being able to use AI to evaluate how important it thinks your work is. If you don't have that option, it is, in fact, the computer telling you what to do. And why would you want the computer to tell you what to do?

addaon a day ago | parent | prev | next [-]

> You become accustomed to blindly hitting "Yes" every time you've accidentally typed something into the text box, and then that time when you actually put a lot of effort into something... Boom. It's gone.

Wouldn't you just hit undo? Yeah, it's a bit obnoxious that Chrome, for example, uses cmd-shift-T to undo in this case instead of the application-wide undo stack, but I feel like the focus for improving software resilience to user error should continue to be on increasing the power of the undo stack (as it has been for more than 30 years now), not on trying to optimize what gets put in the undo stack in the first place.

poopooracoocoo a day ago | parent | next [-]

Now y'all are just analysing the UX of YouTube and Chrome.

The problem is that by agreeing to close the tab, you're agreeing to discard the comment. There's currently no way to bring it back. There's no way to undo.

AI can't fix that. There is Microsoft's "snapshot" thing but it's really just a waste of storage space.

johnnyanmac a day ago | parent [-]

I mean, it can. But so can a task runner that periodically saves writing to a clipboard history. The value is questionable, but throwing an LLM at it does feel like overkill in terms of overhead.
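That non-LLM alternative is simple enough to sketch: a snapshotter that a timer (or any task runner) can drive, keeping a bounded history of what the user has typed. Everything here is hypothetical illustration; `makeSnapshotter`, its API, and the defaults are made up for the example.

```javascript
// Hypothetical sketch of the task-runner alternative: tick() takes a
// snapshot of the current text into a bounded history, so a discarded
// comment can be dug back out later. No LLM required.
function makeSnapshotter(getText, maxEntries = 50) {
  const history = [];
  return {
    history,
    // Take one snapshot, skipping empty input and exact duplicates
    // of the most recent entry.
    tick() {
      const text = getText();
      if (!text || text === history[history.length - 1]) return;
      history.push(text);
      if (history.length > maxEntries) history.shift();
    },
    // Convenience: drive tick() from a timer. Caller owns the id
    // and is responsible for clearInterval.
    start(intervalMs = 5000) {
      return setInterval(() => this.tick(), intervalMs);
    },
  };
}
```

The overhead is a timer and a few kilobytes of text, which is the point of the comparison: the recovery value is the same, without an LLM judging what is worth keeping.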

9rx a day ago | parent | prev [-]

> Wouldn't you just hit undo?

Because:

1. Undo is usually treated as an application-level concern, meaning that once the application has exited there is no undo function, as normally thought of, still available. The 'desktop environment' integration necessary for this isn't commonly found.

2. Even if the application is still running, it only helps if the browser has implemented it. You mention Chrome has it, which is good, but Chrome is pretty lousy about just about everything else, so... Pick your poison, I guess.

3. This was already mentioned as the better user experience anyway, albeit left open-ended for designers, so it is not exactly clear what you are trying to add. Did you randomly stop reading in the middle?

officeplant a day ago | parent | prev | next [-]

> You become accustomed to blindly hitting "Yes" every time you've accidentally typed something into the text box, and then that time when you actually put a lot of effort into something... Boom. It's gone.

I'm not sure we need even local AIs reading everything we do for what amounts to a skill issue.

9rx a day ago | parent [-]

You're quite right that those with skills have no need for computers, but for the rest of us there is no need for them to not have a good user experience.

pavel_lishin a day ago | parent | prev | next [-]

I have the exact opposite muscle memory.

th0ma5 a day ago | parent | prev [-]

I think this is covered in the Bainbridge automation paper https://en.wikipedia.org/wiki/Ironies_of_Automation ... When the user doesn't have the practiced context you described, expecting them to suddenly exercise that practiced context and do the right thing in a surprise moment is untenable.