direwolf20 22 days ago:
> Use encryption even if you took the picture yourself.
I'd hope the punishment is more severe in that case!
|
simonh 21 days ago:
It's a tricky issue. In many countries it's not illegal, and quite common, for children to run around naked in public — on beaches in summer, for example — so millions of people possess holiday photos that are technically CSAM without even knowing it.
direwolf20 21 days ago:
Usually CSAM must be for sexual gratification. A medical anatomy textbook isn't CSAM.
woooooo 21 days ago:
And now you're in court strenuously arguing that you weren't sexually gratified by the photo of your kid in the tub. Obviously most people are sensible most of the time, but sometimes they are not.
chrisjj 21 days ago:
More than that: CSAM is evidence of abuse, hence the "A". And nudity is not required.
direwolf20 21 days ago:
CSAM has a meaning identical to "child porn" but doesn't make that meaning explicit. Drawn or generated depictions of child nudity can be considered CSAM in some jurisdictions.
chrisjj 21 days ago:
"CSAM isn’t pornography—it’s evidence of criminal exploitation of kids." That's from RAINN, the US's largest anti-sexual-violence organisation.
mschuster91 21 days ago:
Yep. Germany, for example, is very strict: even textual descriptions fall under that law.
|
mschuster91 21 days ago:
> I'd hope the punishment is more severe in that case!
I'm talking about kids taking photos of themselves, which has been an issue multiple times in the past.