parhamn 4 days ago
Do we know how big the "batch processing" market is? I know the major providers offer 50%+ discounts for off-peak processing. I assumed that was partly to correct this utilization problem, and on the surface it seems like it'd be useful for big-data shops where process-eventually is good enough, so it could be a relatively big market. Is it?
sdesol 4 days ago
I don't think you need to be big data to benefit. A major issue right now is that we want the coding process to be more "Agentic", but we don't have an easy way for LLMs to determine what to pull into context to solve a problem. This is the problem I am working on with my personal AI search assistant, which I talk about here: https://github.com/gitsense/chat/blob/main/packages/chat/wid...

Analyzers are the "Brains" for my search, but generating the analysis is both tedious and potentially costly. I'm working on the tedious part, and with batch processing you can probably analyze thousands of files for under 5 dollars with Gemini 2.5 Flash.

With batch pricing and the ability to continuously analyze tens of thousands of files, I can see companies wanting to make "Agentic" coding smarter, which should help with GPU utilization and drive down the cost of software development.
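Rough back-of-envelope on the "under 5 dollars" claim. The file count, per-file token counts, and prices below are my assumptions (roughly $0.30/M input and $2.50/M output tokens for Gemini 2.5 Flash, halved for the ~50% batch discount), not official figures:

    # Back-of-envelope: cost to batch-analyze a codebase with Gemini 2.5 Flash.
    # All numbers here are assumptions, not official pricing.
    files = 5000
    input_tokens_per_file = 2000    # file contents + analyzer prompt
    output_tokens_per_file = 500    # structured analysis output

    # Assumed list prices per token, with ~50% batch discount applied.
    input_price = 0.30 / 2 / 1e6
    output_price = 2.50 / 2 / 1e6

    cost = files * (input_tokens_per_file * input_price
                    + output_tokens_per_file * output_price)
    print(f"~${cost:.2f}")          # ~$4.62 for 5,000 files

Even at several times those token counts, a full re-analysis stays in the tens of dollars, which is what makes continuously re-analyzing a large repo look feasible.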