lukev 2 hours ago

This is being downvoted but it shouldn't be.

If the ultimate goal is having an LLM control a computer, round-tripping through a UX designed for bipedal bags of meat with weird jelly-filled optical sensors is wildly inefficient.

Just stay in the computer! You're already there! Vision-driven computer use is a dead end.
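To make the round-trip concrete, here is a minimal sketch (Python, with pyautogui assumed for the screen plumbing and a hypothetical vision_model callable standing in for whatever model does the looking) of what vision-driven computer use amounts to, next to the "stay in the computer" alternative:

    import pyautogui   # assumed dependency: screenshots and synthetic clicks
    import requests    # assumed dependency: the direct alternative

    def vision_agent_step(vision_model):
        # One iteration of the loop: rasterize the whole UI, ship the pixels to
        # a model, get back coordinates, and synthesize a click. vision_model is
        # a hypothetical callable (image -> (x, y, action)), not any vendor's API.
        screenshot = pyautogui.screenshot()
        x, y, action = vision_model(screenshot)
        if action == "click":
            pyautogui.click(x, y)

    def direct_api_step():
        # The alternative: skip the pixels entirely and ask the service for
        # structured data. The endpoint is purely illustrative.
        return requests.get("https://example.com/api/items", timeout=10).json()

Every step of the first loop re-encodes state the program already had as an image, only for the model to decode it again.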

zmmmmm 38 minutes ago

You could say that about natural language as well, but it seems like having computers learn to interface with natural language at scale is easier than teaching humans to interface using computer languages at scale. Even qualified people who work as software programmers produce such buggy piles of garbage that we need entire software methodologies and testing frameworks to deal with how bad it is. It won't surprise me if visual computer use follows a similar pattern: we are so bad at describing what we want the computer to do that it's easier if it just looks at the screen and figures it out.

ashirviskas an hour ago

Someone ping me in 5 years, I want to see if this aged like milk or wine

chasd00 2 hours ago

I replied as much to a sibling comment, but I think this is a way to wiggle out of robots.txt, user-agent string checks, and other traditional ways for sites to filter out bots.
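For anyone unfamiliar, those traditional filters all rely on the client announcing itself. A rough server-side sketch, assuming a Flask app and an illustrative (not exhaustive) list of bot signatures:

    from flask import Flask, abort, request

    app = Flask(__name__)

    # Illustrative substrings only; real sites maintain longer lists and combine
    # this with robots.txt, rate limits, and IP reputation.
    BOT_SIGNATURES = ("GPTBot", "CCBot", "python-requests")

    @app.before_request
    def reject_known_bots():
        ua = request.headers.get("User-Agent", "")
        if any(sig in ua for sig in BOT_SIGNATURES):
            abort(403)  # only bites if the client identifies itself honestly

    @app.route("/")
    def index():
        return "hello from the human-facing site"

An agent driving a real browser sends an ordinary browser User-Agent and never consults robots.txt itself, so none of these checks have anything to latch onto.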

lukev an hour ago

Right, but those things exist to prevent bots. Which this is.

So at this point we're talking about participating in the (very old) arms race between scrapers & content providers.

If enough people want agents, then services should (or will) provide agent-compatible APIs. The video round-trip remains stupid from a whole-system perspective.
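One plausible shape of "agent-compatible" (a sketch, not any particular vendor's format): the service publishes a machine-readable description of its actions in the spirit of JSON-Schema tool definitions, and the agent calls those instead of replaying clicks. The names and fields below are purely illustrative.

    # Hypothetical tool description a service might publish for agents,
    # in the style of JSON-Schema-based function/tool definitions.
    BOOK_TABLE_TOOL = {
        "name": "book_table",
        "description": "Reserve a table at the restaurant.",
        "parameters": {
            "type": "object",
            "properties": {
                "date": {"type": "string", "format": "date"},
                "party_size": {"type": "integer", "minimum": 1},
            },
            "required": ["date", "party_size"],
        },
    }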

mvdtnz 37 minutes ago

I mean, if they want to "wriggle out" of robots.txt they can just ignore it. It's entirely voluntary.
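Right: the only "enforcement" of robots.txt happens inside clients that bother to check it. A small illustration with Python's standard-library parser (the URL and user agent are placeholders):

    import urllib.request
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()  # a polite client fetches and parses the rules...

    url = "https://example.com/some/page"
    if rp.can_fetch("MyAgent/1.0", url):
        urllib.request.urlopen(url)
    # ...but nothing stops a client from calling urllib.request.urlopen(url)
    # without the check; the file is purely advisory.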