causal 5 hours ago

> amongst smart people i know there's a surprisingly high correlation between those who continue to be unimpressed by AI and those who use a hobbled version of it.

I've noticed this too, and I think it's a good thing: much better to start with the simplest forms and understand AI from first principles than to buy the most complete package possible without understanding what's going on. The cranky ones on HN are loud, but many of the smart-but-careful ones go on to become the best power users.

randusername 5 hours ago | parent | next [-]

I think you have to get in early to understand the opportunities and limitations.

I feel lucky to have experienced early Facebook and Twitter. My friends and I learned to avoid the common mistakes when the stakes were low: oversharing, getting "hacked", falling for engagement-bait. And we saw the potential back when the goal was social networking, not making money. Our parents were late. Lambs to the slaughter by the time the technology got so popular, the algorithms got so good, and users were conditioned to accept all the ads and privacy invasiveness as table stakes.

I think AI is similar. Lower the stakes, then make mistakes faster than everyone else so you learn quickly.

bobson381 4 hours ago | parent | next [-]

So acquiring immunity to a lower-risk version of the service before it's ramped up? E.g. jumping on FB as a new user now is vastly different from doing so in 2014 - back then you went through the same noob patterns, but on a lower-octane version of the thing. Likewise, the risk of AI psychosis has probably gone up for new users, the way the risk of getting too high has gone up since we started optimizing weed for maximum THC?

mmahemoff 4 hours ago | parent | prev [-]

There's also a massive selection bias when the cohort is early adopters.

Another thing about early users is that they're also longer-term users (assuming they're still on the platform) and have seen the platform evolve, which gives them a richer understanding of how everything fits together and what role certain features are meant to serve.

aa-jv 5 hours ago | parent | prev [-]

(Disclaimer: systems software developer with 30+ years experience)

I was initially overly optimistic about AI and embraced it fully. I tried using it on multiple projects - and while the initial results were impressive, I quickly burned my fingers as I got it more and more integrated into my workflow. I tried all the things last year. This year, I'm being a lot more conservative about it.

Now .. I don't pay for it - I only use the bare bones versions that are available, and if I have to install something, I decline. Web-only ... for now.

I simply don't trust it enough, and I already have a disdain for remotely-operated software - so until it gets really, really reliable, predictable and .. just downright good .. I will continue to use it merely as an advanced search engine.

This might be myopic, but I've been burned too many times and my projects suffered as a result of over-zealous use of AI.

It sure is fun watching what other folks are daring to accomplish with it, though ..

AlienRobot 4 hours ago | parent [-]

This week Adobe decided, out of nowhere, to kill their 2D animation product (Animate, which is based on Flash) to focus on AI. I'm already seeing animators post that Adobe killed their entire career.

Although that feels a bit exaggerated, I feel it's not far from the truth. If there were, say, only 3 closed-source programs capable of professional animation in total, and they all decided to kill the product one day, it would actually kill the entire industry. Animators would have no software to create animation with. They would have to wait until someone built a replacement, which would take years to reach feature parity - and why would anyone build one when the existing vendors had already decided such a product wasn't worth making?

I feel this isn't much different with AI. It's a rush to make people depend on software that literally can't run on a personal computer. Adobe probably loves it because users can't pirate the AI. If people forget how to use image editing software and start depending entirely on AI to do the job, they will forever be slaves to the developers who can host and set up the AI in the cloud.

Imagine if people forgot how to format a document in Word and they depended on Copilot to do this.

Imagine if people forgot how to code.

puelocesar 2 hours ago | parent | next [-]

I think you've touched on exactly why this is being shoved down our throats, and why I'm very reluctant to use it.

This isn't about big productivity gains; the whole thing is about selling dependence on privately controlled, closed-source tools, concentrating even more power in the hands of a very few, morally questionable people.

koakuma-chan 3 hours ago | parent | prev [-]

Sounds like a good startup idea. Make software for animators and slap AI on it.