oefrha | a day ago

1. Not all secrets can be rotated. E.g. I can't just "rotate" my home address, which I prefer to be private.

2. Even for rotatable secrets, "I don't think there is any potential further damage" rests on the assumption that the secret is 100% invalidated everywhere. What if there are obscure and/or neglected systems, possibly outside of your control, that still accept that secret? No system is bug-free. If I can take steps to minimize access to an invalidated secret, I will.

jofzar | a day ago

> 1. Not all secrets can be rotated. E.g. I can't just "rotate" my home address, which I prefer to be private.

Reporter can sell their current house and move to another home as a workaround.

Closing ticket as workaround provided.

matsemann | a day ago

Also avoids false positives in the future from automated scanners, bounty hunters, etc. if you clean up now.

whyever | 17 hours ago

Ok, so how would such a secret end up in a commit? E.g., I don't see why I would have my home address anywhere close to a code repository. Maybe if I used the wrong "secret" email address when authoring the commit?

If it's not possible to invalidate your compromised software secrets, I would argue that you have bigger and more urgent problems to fix. But fair enough: deleting them from GitHub might reduce the impact in such cases.

oefrha | 15 hours ago

That's just an example... To give a more concrete one: I have accidentally committed and pushed my own private data (e.g. from my private social feed) used in testing. That could include my address too, so the example was quite plausible to begin with.

chickenzzzzu | a day ago

Anyone who puts weight on digging through a project to see if it has ever leaked a secret is guilty of encouraging an antipattern: the guaranteed outcome is an organization petrified of shipping anything, in case someone interprets it as bad practice or a security risk.

mk89 | a day ago

You can see it that way; however, there are automated tools to scan for secrets. Even GitHub does it.

In my opinion, this educates developers to be more careful and slightly more security-oriented, rather than afraid of shipping code.

I would also like to point out that a leaked AWS secret can cost an organization hundreds of thousands of dollars, and AWS won't help you there. Depending on the secret/SaaS involved, it can literally break your company and put people out of work.

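The automated secret scanning mentioned above usually boils down to running provider-specific regexes over commit contents. A minimal sketch in Python; the rule names and pattern set here are illustrative assumptions, whereas real scanners (GitHub secret scanning, gitleaks, etc.) ship hundreds of curated, provider-verified rules:

```python
import re

# Hypothetical rule set for illustration only. The AKIA prefix for AWS
# access key IDs and the PEM private-key header are well-known formats.
PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "private_key_header": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def scan(text: str) -> list[tuple[str, str]]:
    """Return (rule_name, matched_text) pairs found in the given text."""
    hits = []
    for name, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((name, match))
    return hits

# AWS's documented example key, safe to use in demos.
print(scan("aws_key = 'AKIAIOSFODNN7EXAMPLE'"))
```

A pre-commit hook running something like this catches the common cases cheaply, which is exactly why it trains developers rather than paralyzing them.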
chickenzzzzu | a day ago

While I am not suggesting that people should go out and leak their secret keys or push a buffer overflow, the fastest way to learn that you have this problem is by pushing that code to the internet on a project that isn't important. The AWS secret key thing doesn't hold up here (you just really shouldn't do it), but how about an EC2 SSH key, or passwords in plaintext?

How did I learn about parameterized queries for SQL injection and XML escape vulnerabilities? By waking up to a Russian dude attacking my Java MySpace clone.

No amount of internal review, coding standards, etc. will catch all of these things. You can only hope that you build the muscle memory to catch most of them, and that muscle memory is forged through being punched in the face.

Lastly, any pompous corporate developer making 200k a year or more who claims they've never shipped a vuln and that they write perfect code the first time is just a liar.

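The parameterized-query lesson referenced above is easy to demonstrate. A minimal sketch using Python's stdlib `sqlite3` (the table and attacker payload are made up for illustration; the same binding idea applies to JDBC `PreparedStatement` in the Java case mentioned):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

attacker_input = "' OR '1'='1"  # classic injection payload

# Vulnerable: the input is spliced directly into the SQL string, so the
# payload rewrites the WHERE clause and matches every row.
unsafe = f"SELECT * FROM users WHERE name = '{attacker_input}'"
print(conn.execute(unsafe).fetchall())  # leaks all rows

# Parameterized: the driver binds the input as data, never as SQL, so
# the payload is just an odd-looking name that matches nothing.
safe = "SELECT * FROM users WHERE name = ?"
print(conn.execute(safe, (attacker_input,)).fetchall())  # no rows
```

The fix costs one character of syntax, which is why "got burned once, never again" builds the muscle memory so effectively.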
fisf | a day ago

> No amount of internal review, coding standards, etc. will catch all of these things. You can only hope that you build the muscle memory to catch most of them, and that muscle memory is forged through being punched in the face.

Everything you mentioned is security 101, widely known, and can be caught by standard tools. Shrugging that off as a learning experience does not hold much water in a professional context.

chickenzzzzu | a day ago

"In a professional context". Spare me. Don't act like every company on earth has a free, performant, 100% accurate, no-false-positive linter hooked up to their magical build pipeline. Have you seen the caliber of companies that have been affected by CVEs and password/PII leaks since just COVID? It's everyone.

The responsibility is on the programmer to learn and remember these things. Period, end of story. Just as smart pointers are a band-aid on a bigger problem with real consequences (memory fragmentation and cache misses), so too is a giga-linter that serves as permanent training wheels for so-called programmers.