primitivesuave | 2 days ago:
IMO it is a smart decision to implement this as a self-hosted system and have the AI make PRs against the IaC configuration - for DevOps matters, human-in-the-loop is a high priority. I'm curious how well this would work if I'm using Pulumi or the AWS CDK (both are well known to LLMs). I consulted for an early-stage company that tried to do this during the GPT-3 era. Despite the founders' stellar reputation and impressive startup pedigree, it was exceedingly difficult to get customers to provide meaningful read access to their AWS infrastructure, let alone the ability to make changes.
nickpapciak | 2 days ago (reply):
LLMs are pretty awesome at Terraform, probably because there is just so much training data. They are also pretty good at the AWS CDK and Pulumi, though to a somewhat lesser extent; I think giving them access to documentation is what makes them the most accurate - without good documentation, the models start to hallucinate a bit. And yeah, we are noticing that it's difficult to convince people to give us access to their infrastructure. I hope that a BYOC (bring your own cloud) model will help with that.