If it can't be solved by a script, then what's the problem with seeing if you can use LLMs?
I guess I just don't see your point. So a few purported applications aren't very sensible. So what? That's true of every breakthrough ever.