staticassertion 6 hours ago

I don't think that really works, because something obscure isn't actually any harder to see or find. I'm not sure what the right analogy is; it's like standing out in the open and going "yeah, but who would think to look here lol".

willis936 6 hours ago | parent | next [-]

I think you're mistaking "obscurity" for "lack of obscurity". If you have a vulnerability in a completely undocumented API interface, that is a vulnerability that is obscured. It's hiding in the woods, not standing in a field.

To keep with the analogy: no one is going to stand in a field while people are shooting at them. So why does a small subset of vocal people online suggest that you just put on your bulletproof vest, while claiming that hiding in the woods, regardless of the vest, is a bad idea?

arcfour an hour ago | parent | next [-]

You know when people are shooting at you. You don't know when, or if, people are exploring undocumented/obscure features of your system, or what they have learned about the things you were trying to hide.

Therefore, the safest assumption is that an adversary has already figured out all of your obscurity, because they always can given sufficient time and interest, at which point the only thing between them and you is your security.

That is why we design systems without obscurity and only care about security.

willis936 an hour ago | parent [-]

I agree that it's a good principle, but it's taken too far when it's used to justify needlessly growing your risk surface. The principle is useful for justifying security hardening. It is not useful when it's used to increase the odds of being attacked.

staticassertion 5 hours ago | parent | prev [-]

This isn't about what's a good or bad idea. Perhaps it's best to simply leave the analogies behind; otherwise we'll just focus on the wrong thing.

Security through obscurity merely means that your system is atypical. It's not hidden, it's not secret, it's not hard to find, it's not hard to examine, it's not less visible, etc - there is nothing inherently different about the systems at all other than that one is more common than the other. It's just less typical.

willis936 2 hours ago | parent | next [-]

What you're describing is a thing that is not obscured. Don't refer to things as obscured if they are not obscured. When others talk about things that are obscured, they are talking about things that are obscured, not things that are not obscured.

staticassertion 38 minutes ago | parent | next [-]

You can see my other comment on this. The word "obscure" is not very relevant to the phrase "security through obscurity".

dreambuffer an hour ago | parent | prev [-]

I'm having a hard time understanding what you mean here. If something is obscured, by definition it is less visible. Being 'less typical' is a form of security because most attacks rely on some form of pattern recognition, and obscurity literally dissolves patterns into noise.

staticassertion an hour ago | parent [-]

You're focusing too much on the term and not the meaning. The term comes from people choosing tools like Foxit or Opera and claiming those products are safer than their counterparts Adobe/Firefox because they are attacked less often.

This notion was termed "security through obscurity", i.e. "you use the less popular option, therefore that option is safer". It has nothing to do with "obscuring" in the sense of "hiding"; that's a linguistic quirk of a colloquial term. If you were actually taking action to reduce an attacker's ability to understand a system in a way you could meaningfully defend, it would no longer be "security through obscurity".

The argument has persisted because there are two different questions that sound the same (X is less typical than Y):

1. Is "X" safer than "Y"?

2. Is a user of "X" safer than a user of "Y"?

When looking at (1) in isolation, you can say things like "X lacks security features, therefore Y is safer" and "X is less often used, therefore X is safer", etc. This is a question about the posture of the project itself, in isolation.

(2) is about the context for users. The reality is that X, which perhaps is fundamentally less well built software, may actually have users who are attacked far less frequently.

Both are likely to land on "rarity is a poor indicator of safety", since we generally reject mitigation approaches that rely on attackers behaving in specific ways. But what's important is that these are completely different questions, and neither has anything to do with being obscured, only with being rare.

None of this is about what is "obscured" or not. If something is obscured or obfuscated, that is a technique that can be evaluated separately on its own merits (i.e. how hard is deobfuscation, how easy is it to adapt to deobfuscation, etc). All of this is about whether you're evaluating (1) or (2) - and in the case of (1), which is what the criticism has always focused on, the answer is that "rarity" is not a mitigation.
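
To make that concrete, here's a rough sketch of evaluating an obfuscation technique on its own merits. The endpoint string and the single-byte XOR scheme are made up for illustration, chosen only because they're trivially reversible:

    # Toy "obfuscation": single-byte XOR. It hides a string from casual
    # inspection, but deobfuscation is trivial: an attacker can brute-force
    # all 256 keys almost instantly.

    def xor_obfuscate(data: bytes, key: int) -> bytes:
        return bytes(b ^ key for b in data)

    def brute_force(blob: bytes) -> list[tuple[int, bytes]]:
        # Try every possible key, keep candidates that decode to printable ASCII.
        return [
            (key, xor_obfuscate(blob, key))
            for key in range(256)
            if all(32 <= b < 127 for b in xor_obfuscate(blob, key))
        ]

    secret = b"hidden-endpoint:/v2/admin/reset"  # hypothetical value
    blob = xor_obfuscate(secret, key=0x5A)

    print(blob)               # unreadable at a glance...
    print(brute_force(blob))  # ...but recovered almost instantly

A scheme like this gets judged by how expensive deobfuscation actually is, which is a separate question from whether a rarely used product gets attacked less often.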

dreambuffer 14 minutes ago | parent [-]

I understand it now, thanks

singpolyma3 3 hours ago | parent | prev [-]

The first rule of not being seen: to not stand up.

gerdesj 2 hours ago | parent [-]

Not stand out.