bredren 5 hours ago
Why not just write a skill and script that calls crawl4ai or similar and do this using Claude Code? You can store the page as markdown for future sessions, mash the data with other context, you name it. The web version of Claude is incredibly limited in both capability and workflow integration. It doesn't matter whether you're dealing with bids from arbor contractors or researching solutions to a DB problem.
Barbing 4 hours ago
I want this without killing the free open web. Maybe I run an old PC next to the scraper to manually visit the scraped pages without an adblocker, and periodically buy something I need from an ad (while a cohesive response is being generated in the meantime). Yeah, sounds dumb; I'm wishing for a middle ground that lets us be effective but also good netizens. Maybe that Cloudflare plan to charge the bots…
layer8 4 hours ago