jwilber 3 hours ago
I agree with the sentiment that there are use cases for web scraping where an agent is preferable to a cron job, but I think your particular example can certainly be achieved with a cron job and a basic parser script. Just have Claude write it.
BeetleB 3 hours ago | parent
I didn't say it's not doable. I'm not even saying it's hard. But nothing beats telling Claw to do it for me while I'm in the middle of groceries. Put another way: if it can do it (reliably), why on Earth would I babysit Claude to write it?

The whole point is this: when AI coding became a thing, many folks rediscovered the joy of programming, because now they could use Claude to code up stuff they wouldn't otherwise have bothered with. The barrier to entry went down. OpenClaw is simply that taken to the next level.

And as an aside, let's just dispense with parsing altogether! If I were writing this as a script, I would simply fetch the text of the page and have the script send it to an LLM instead of parsing it. Why worry about parsing bugs in a one-off script?
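A minimal sketch of that fetch-then-LLM approach, assuming a Python one-off. The prompt wording, the `check` helper, and the price-extraction task are all hypothetical placeholders; the actual LLM call is left out because it depends entirely on which provider's client you use.

```python
import re
import urllib.request

def strip_tags(html: str) -> str:
    """Crudely reduce HTML to visible text -- good enough for a one-off,
    since the LLM (not a parser) does the real extraction."""
    html = re.sub(r"(?is)<(script|style)\b.*?</\1>", " ", html)
    text = re.sub(r"<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", text).strip()

def build_prompt(page_text: str) -> str:
    # Hypothetical task wording -- adjust to whatever the script checks.
    return ("Below is the text of a web page. Extract the item's current "
            "price and reply with just the number.\n\n" + page_text)

def check(url: str) -> str:
    """Fetch the page and build the prompt to hand to an LLM."""
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    prompt = build_prompt(strip_tags(html))
    # Hand `prompt` to whatever LLM client you use (OpenAI SDK,
    # Anthropic SDK, a local model, ...) and return its reply instead.
    return prompt
```

The point of the design is that `strip_tags` can be sloppy: any parsing bug just becomes slightly noisier input to the model, rather than a crash or a silently wrong selector.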