| ▲ | rascul 7 hours ago |
| Obscurity can be fine but it's not security. I think of it like cover and concealment in the military. Security is cover. Something you can get behind so the bullets don't hit you. Obscurity is concealment. Harder to see, harder to find, so the enemy doesn't know where to shoot, but it's not stopping any bullets. Both have advantages and disadvantages and can complement each other depending on how they're used. |
|
| ▲ | red369 5 minutes ago | parent | next [-] |
| Well off-topic, but did you recently listen to Andy Stumpf on a podcast? Asking because of the Baader–Meinhof phenomenon :) |
|
| ▲ | mday27 6 hours ago | parent | prev | next [-] |
This is an especially good analogy, because facing a well-resourced adversary in cybersecurity is like finding out that the enemy brought artillery -- hopefully you weren't relying entirely on obscurity, because pretty soon there will be nowhere to hide.
| |
|
| ▲ | staticassertion 6 hours ago | parent | prev | next [-] |
I don't think that really works, because obscurity doesn't actually make you harder to see or find. I don't know, the analogy is more like standing out in the open and going "yeah but who would think to look here lol".
| |
| ▲ | willis936 6 hours ago | parent | next [-] | | I think you're mistaking "obscurity" for "lack of obscurity". If you have a vulnerability in a completely undocumented API, that is a vulnerability that is obscured. It's hiding in the woods, not standing in a field. To keep with the analogy: no one is going to stand in a field when people are shooting at you. So why does a small subset of vocal people online suggest that you just put on your bulletproof vest and claim that hiding in the woods, vest or no vest, is a bad idea? | |
| ▲ | arcfour an hour ago | parent | next [-] | | You know when people are shooting at you. You don't know when, or if, people are exploring the undocumented/obscure features of your system, or what they have learned about the things you were trying to hide. Therefore, the safest assumption is that an adversary has already figured out all of your obscurity, because they can always do so given sufficient time and interest, at which point the only thing between them and you is your security. That is why we design systems without obscurity and only care about security. | |
| ▲ | willis936 an hour ago | parent [-] | | I agree that it's a good principle, but it's taken too far when it's used to justify needlessly growing the attack surface. The principle is useful for justifying security hardening. It is not useful when used to increase the odds of being attacked.
| |
| ▲ | staticassertion 5 hours ago | parent | prev [-] | | This isn't about what's a good idea or a bad idea. Perhaps it's best to simply leave analogies behind; otherwise we'll just focus on the wrong thing. "Security through obscurity" merely means that your system is atypical. It's not hidden, it's not secret, it's not hard to find, it's not hard to examine, it's not less visible; there is nothing inherently different about the systems at all, other than that one is more common than the other. It's just less typical. | |
| ▲ | willis936 2 hours ago | parent | next [-] | | What you're describing is a thing that is not obscured. Don't refer to things as obscured if they are not obscured. When others talk about things that are obscured, they are talking about things that are obscured, not things that are not obscured. | | | |
| ▲ | dreambuffer an hour ago | parent | prev [-] | | I'm having a hard time understanding what you mean here. If something is obscured, by definition it is less visible. Being 'less typical' is a form of security because most attacks rely on some form of pattern recognition, and obscurity literally dissolves patterns into noise. | | |
| ▲ | staticassertion 44 minutes ago | parent [-] | | You're focusing too much on the term and not the meaning. The term comes from people choosing tools like Foxit or Opera and saying that those products are safer than their cohorts Adobe/Firefox because they are attacked less often. This notion was termed "security through obscurity", i.e.: "you use the less popular option, therefore that option is safer". It has nothing to do with "obscuring" in the sense of "hiding"; that's a linguistic quirk of a colloquial term. If you were actually taking action to reduce the ability to understand a system in a way that you could meaningfully defend, it would no longer be "security through obscurity". The argument has persisted because there are two different questions that sound the same (X is less typical than Y): 1. Is "X" safer than "Y"? 2. Is a user of "X" safer than a user of "Y"? When looking at (1) in isolation, you can say things like "X lacks security features, therefore Y is safer" and "X is less often used, therefore X is safer", etc. This is a question about the posture of the project itself, in isolation. (2) is about the context for users. The reality is that X, which is perhaps fundamentally less well built software, may actually have users who are attacked far less frequently. Both are likely to favor "rarity is a poor indicator of safety", since we generally reject mitigation approaches that rely on attackers behaving in specific ways, but what's important is that these are completely different questions, and neither has to do with being obscured but rather with being rare. None of this is about what is "obscured" or not. If something is obscured or obfuscated, that is a technique that can be evaluated separately on its own merits (i.e.: how hard is deobfuscation, how easy is it to adapt to deobfuscation, etc).
All of this is about whether you're evaluating (1) or (2) - and in the case of (1), which is what the criticism has always focused on, the answer is that "rarity" is not a mitigation. | | |
|
|
| |
| ▲ | singpolyma3 3 hours ago | parent | prev [-] | | The first rule of not being seen: do not stand up. | |
|
|
| ▲ | lucketone 5 hours ago | parent | prev | next [-] |
All modes of cybersecurity depend on some obscurity (e.g. a password). Ideally we want a viable plan B for when it's leaked or figured out (e.g. generate new passwords). (For convenience, let's label air-gapping as a kind of physical security.)
| |
| ▲ | pdpi 3 hours ago | parent | next [-] | | > All modes of cyber security depend on some obscurity (e.g. password) That's not what the expression means. "Security through obscurity" has a very specific meaning: that your system's security depends on your adversary not understanding how it works. E.g. understanding RSA is a few Wikipedia articles away, and that doesn't compromise its security, so RSA isn't security through obscurity. | |
| ▲ | sroussey 2 hours ago | parent | next [-] | | No, "security through obscurity" is a valid and useful layer. A lot of weight hangs on your word "depends", though: if it is the only layer, then you will eventually have, uh, problems. I've used it for a long, long time. Back in 1999 I'd require a knock on certain ports in a certain order to unlock the SSH port. And lots of weird stuff to stop forum spam. Each trick could work for weeks or months or even a year. | |
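The port-knocking setup described above can be sketched from the client side. This is a hedged illustration only: the knock sequence, hostname, and server behavior (a daemon such as knockd watching for inbound SYNs in the right order before opening the SSH port) are assumptions for the example, not a description of the commenter's actual configuration.

```python
import socket

# Hypothetical secret: three ports that must be "knocked" in order
# before the server-side daemon opens the SSH port for this client.
KNOCK_SEQUENCE = [7000, 8000, 9000]

def knock(host: str, ports: list, timeout: float = 0.5) -> None:
    """Attempt a TCP connection to each port in order. The connects are
    expected to fail (the ports stay closed); the server side only watches
    for the incoming SYNs arriving in the correct sequence."""
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            s.connect_ex((host, port))  # ignore the (expected) failure

# Usage: knock("myserver.example", KNOCK_SEQUENCE), then ssh in within
# whatever window the daemon leaves the port open.
```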
| ▲ | pdpi an hour ago | parent [-] | | Port knocking isn't security through obscurity. Knowing that you have a port knocking system in place doesn't tell me what specific sequence of knocks will open up the service I want to target. Even a two-knock sequence gives you a key with 32 bits of entropy, which makes it trivial to block attempts at brute-forcing the key. | |
| ▲ | ZoomZoomZoom 32 minutes ago | parent [-] | | I don't see how your argument makes sense. It's all just bits of entropy in the end, be it knowing a port to connect to or a character in your key. | | |
| ▲ | pdpi 10 minutes ago | parent [-] | | Yeah, absolutely. That was precisely my point: requiring a secret (be it a password or the private part of an asymmetric key) isn't security through obscurity, and finding the sequence of knocks is equivalent to finding a password of equivalent complexity.
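The entropy arithmetic in this exchange can be checked directly. A small sketch, assuming each knock selects a full 16-bit TCP port number and that password characters are drawn uniformly from the 94 printable ASCII characters (both simplifying assumptions):

```python
import math

PORT_SPACE = 65536  # a TCP port number is 16 bits

def knock_entropy_bits(knocks: int) -> float:
    """Entropy, in bits, of a secret sequence of `knocks` port numbers."""
    return knocks * math.log2(PORT_SPACE)

def password_entropy_bits(length: int, alphabet: int = 94) -> float:
    """Entropy of a random password over `alphabet` possible characters."""
    return length * math.log2(alphabet)

# Two knocks carry 32 bits, as claimed above -- roughly the same as a
# 5-character random printable-ASCII password.
print(knock_entropy_bits(2))               # 32.0
print(round(password_entropy_bits(5), 1))  # 32.8
```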
|
|
| |
| ▲ | strken 2 hours ago | parent | prev [-] | | Lucketone likely knows this and was pointing out that "obscurity" is a misleading word to use when talking about systems which all rely on obscurity, in the plain English sense of the word. | | |
| ▲ | pdpi an hour ago | parent [-] | | We're in a technical forum, discussing a term of art that refers to a very specific bad practice. Lucketone's argument essentially claims that the bad practice isn't actually a bad practice, by equivocating between the term of art and the plain-language definition.
|
| |
| ▲ | 0123456789ABCDE 4 hours ago | parent | prev [-] | | I don't know a lot about the subject, but the little I know tells me this is not the way to look at it. Your password (plain text) is secret because only you are supposed to have it. In the digital realm, sharing the contents of the password (plain text) is akin to making a copy of it, which is undesirable. Now, the algorithm that hashes the plain text for comparison with the stored hash, that can be known by anyone, and typically is. So password ≠ hashing algorithm. | |
| ▲ | lucketone 4 hours ago | parent [-] | | Yes. Password and hashing algorithm are distinct things. I fully agree with you. |
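The split described in this exchange (secret password, public hashing algorithm) can be sketched in a few lines. This is an illustration only: it uses salted SHA-256 for brevity, whereas a real system should use a dedicated password-hashing function such as bcrypt, scrypt, or Argon2.

```python
import hashlib
import os

# The algorithm below is completely public -- anyone can read it.
# The only secret in the system is the password itself.

def hash_password(password: str, salt: bytes) -> bytes:
    """Hash a password with a per-user random salt (illustrative only)."""
    return hashlib.sha256(salt + password.encode("utf-8")).digest()

def verify(password: str, salt: bytes, stored: bytes) -> bool:
    """Recompute the hash of a login attempt and compare to the stored value."""
    return hash_password(password, salt) == stored

salt = os.urandom(16)  # stored alongside the hash; not secret either
stored = hash_password("correct horse battery staple", salt)
print(verify("correct horse battery staple", salt, stored))  # True
print(verify("Tr0ub4dor&3", salt, stored))                   # False
```

Knowing the algorithm and even the salt gets an attacker nothing without the password, which is exactly why this is not "security through obscurity".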
|
|
|
| ▲ | walrus01 3 hours ago | parent | prev | next [-] |
Because I love how seriously the DoD takes newly invented terms, we have "The Integrated Survivability Onion": https://cogecog.com/the-threat-onion/ 1. Don't be seen. 2. Don't be acquired. 3. Don't be hit. 4. Don't be penetrated. 5. Don't be killed. It's actually not a bad mental-model training aid for teaching people who might find themselves in an active combat environment.
|
| ▲ | m463 2 hours ago | parent | prev | next [-] |
I kind of wonder if the analogy might also carry over to the age of AI. If you were hiding in cover during WW1, maybe you had a chance. But if you were hiding from the Terminator, who is "tireless, fearless, merciless", it might not last that long. The same might be said of exploits hiding from people... vs. AI.
|
| ▲ | Lammy 2 hours ago | parent | prev [-] |
> Obscurity can be fine but it's not security. All security is security through obscurity. When it gets obscure enough we call it “public key cryptography”. Guess the two 1024-bit primes behind my 2048-bit key and win a fabulous prize! (access to all of my data)