mrweasel · 21 hours ago
> The secret is held by the metadata server that the CI instance has access to

But how does the metadata server know that the CI instance is allowed to access the secret? Especially when the CI/CD system is hosted at a third party, it needs to present some form of credentials. The CI system may also need permissions or credentials for a private repository of packages or artifacts needed in the build process.

For me, a CI/CD system needs two things: secret management and the ability to run Bash.
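For what it's worth, here's a minimal sketch of how the instance side of this works on GCP (assuming the code runs on a GCE VM with an attached service account): the instance fetches short-lived credentials from a link-local endpoint, and the platform attributes the request to the VM out-of-band, so the instance never presents a stored secret.

    import requests

    # The metadata server is only reachable from inside the VM; GCP
    # attributes the request to the instance by its network attachment,
    # so the caller sends no credential of its own.
    METADATA_URL = ("http://metadata.google.internal/computeMetadata/v1"
                    "/instance/service-accounts/default/token")

    resp = requests.get(METADATA_URL, headers={"Metadata-Flavor": "Google"})
    resp.raise_for_status()
    token = resp.json()  # {"access_token": ..., "expires_in": 3599, ...}
    print(f"got a token valid for {token['expires_in']}s")

That only works because the platform can identify the caller out-of-band; a CI box hosted at a third party has no such channel, which is exactly the problem.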
gcr · 20 hours ago
Yeah, I was confused about that bit too. AWS's and GCP's metadata servers know which instances were deployed, so they presumably have some way of verifying an instance's identity out-of-band, such as tagging requests with an internal job or machine identifier.

As for deploying from a trusted service without managing credentials, PyPI calls this "trusted publishing": https://docs.pypi.org/trusted-publishers/

From the docs:

1. Certain CI services (like GitHub Actions) are OIDC identity providers, meaning that they can issue short-lived credentials ("OIDC tokens") that a third party can strongly verify came from the CI service (as well as which user, repository, etc. actually executed);

2. Projects on PyPI can be configured to trust a particular configuration on a particular CI service, making that configuration an OIDC publisher for that project;

3. Release automation (like GitHub Actions) can submit an OIDC token to PyPI. The token will be matched against configurations trusted by different projects; if any projects trust the token's configuration, then PyPI will mint a short-lived API token for those projects and return it;

4. The short-lived API token behaves exactly like a normal project-scoped API token, except that it's only valid for 15 minutes from time of creation (enough time for the CI to use it to upload packages).

You have to add your GitHub repository as a "trusted publisher" to your PyPI packages. Honestly the whole workflow bothers me -- how can PyPI be sure it's talking to GitHub? What if an attacker could mess with PyPI's DNS? -- but it's how it's done.
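To make steps 1 and 3 concrete, here's roughly what a publishing job does under the hood -- a sketch of what pypa/gh-action-pypi-publish automates for you. The env vars are the ones GitHub Actions injects when the workflow has `id-token: write` permission; treat the mint-token URL and response shapes as illustrative:

    import os
    import requests

    # Step 1: ask the runner's OIDC provider for a short-lived signed
    # JWT scoped to the "pypi" audience.
    resp = requests.get(
        os.environ["ACTIONS_ID_TOKEN_REQUEST_URL"],
        params={"audience": "pypi"},
        headers={"Authorization":
                 f"bearer {os.environ['ACTIONS_ID_TOKEN_REQUEST_TOKEN']}"},
    )
    resp.raise_for_status()
    oidc_token = resp.json()["value"]

    # Step 3: trade the JWT for a ~15-minute, project-scoped API token.
    resp = requests.post("https://pypi.org/_/oidc/mint-token",
                         json={"token": oidc_token})
    resp.raise_for_status()
    api_token = resp.json()["token"]

    # api_token is then used as an ordinary password for `twine upload`
    # (username "__token__").

(The JWT's signature, verified against GitHub's published OIDC signing keys, is what's supposed to answer the "is this really GitHub" question -- though the key fetch itself still has to be trusted.)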
| ||||||||
hinkley · 17 hours ago
It would be good if it could also scan build output like code coverage and test results, but that’s about all it should do.

I keep meaning to write a partially federated CI tool that uses Prometheus for all of its telemetry data, but I never get around to it. I ended up carving out a couple of other things I’d like to be part of the process as a separate app, because I was still getting panopticon vibes and some data should just be private.
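Roughly, the Prometheus-native way to do that for short-lived CI jobs is to push final values to a Pushgateway rather than wait to be scraped. A minimal sketch with the prometheus_client library (the host, metric names, and values are made up; in practice they'd come from parsing coverage/JUnit output):

    from prometheus_client import CollectorRegistry, Gauge, push_to_gateway

    registry = CollectorRegistry()

    # Hypothetical metrics a CI run could export from its build output.
    coverage = Gauge("ci_line_coverage_ratio",
                     "Line coverage from the test run", registry=registry)
    tests_failed = Gauge("ci_tests_failed",
                         "Number of failed tests", registry=registry)

    coverage.set(0.87)    # e.g. parsed from coverage.xml
    tests_failed.set(0)   # e.g. parsed from junit.xml

    # CI jobs usually finish before Prometheus would scrape them, so
    # push the run's metrics to a Pushgateway, grouped by job name.
    push_to_gateway("pushgateway.internal:9091", job="ci-build",
                    registry=registry)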