mayhemducks 23 days ago
This is my attempt to answer your question about "what kind of algorithm can you implement to detect something dangerous". Disclaimer, though: I agree that the proposed regulation is way too broad and will have unintended consequences as written.

If you look at how Apple detects contraband imagery, they hash every image that gets uploaded into the Photos app. Those hashes are transmitted to servers that compare them to hashes of known contraband. A similar system could theoretically be used for STL files. So it isn't about detecting exact shapes, it's about preventing printing of STL files that are already known to be dangerous.

This would make it harder to illegally manufacture parts for weapons because it would make it much harder to share designs. If you didn't have the knowledge or skill to design a reliable FCU, you would have to find a design someone with that knowledge and skill created - which the printer could theoretically detect with a cryptographic signature. As the original author of the post pointed out, though, this could and would be bypassed by actual criminals. As with most things like this, it's probably impossible to prevent entirely, only to make it more difficult.
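To make the idea concrete, here is a minimal sketch of the exact-hash version of a blocklist check, in Python with the standard hashlib module. The blocklist digest below is hypothetical (it's just the SHA-256 of an empty file, standing in for a "known bad" design), and note that a plain cryptographic hash only catches byte-identical copies; a system like Apple's uses a perceptual hash precisely so that small changes don't evade it.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-dangerous STL files.
# (This entry is the digest of the empty file, used here as a placeholder.)
BLOCKLIST = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_blocked(stl_bytes: bytes) -> bool:
    """Exact-match check: flag the file only if its digest is on the blocklist."""
    return hashlib.sha256(stl_bytes).hexdigest() in BLOCKLIST

print(is_blocked(b""))            # matches the placeholder digest above
print(is_blocked(b"solid part"))  # any other byte sequence misses
```

The obvious weakness: flipping a single byte of the STL produces a completely different digest, which is why a real system would need something fuzzier than SHA-256.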
pfranz 23 days ago
> If you look at how Apple detects contraband imagery, they hash every image that gets uploaded into the photos app. Those hashes are transmitted to servers that compare them to hashes of known contraband.

You're spelling out a specific process in detail--which is the only reason I'm picking on details. Do you have anything documenting what you're describing? From what I remember, Apple's system was proposed but never shipped. They proposed hashing your photos locally and comparing them to a local database of known CSAM images. Only when there was a match would they transmit the photos for manual confirmation. This describes Apple's proposal [1].

I believe what did ship is an algorithm that detects novel nude imagery and gives some sort of warning to kids sending or receiving that data. None of that involves checks against Apple's servers. I do think other existing photo services will scan only the photos you've uploaded to their cloud.

I'm happy to make corrections. To my knowledge, what you're describing hasn't been done so far.

[1] https://www.hackerfactor.com/blog/index.php?/archives/929-On...
rblatz 22 days ago
Ok, that works for STL files, but printers don't print STL files; they print G-code. G-code is generated by slicers, and depending on your printer and settings the G-code will be different. Is this law obligating printer manufacturers to lock down their printers to slicers that can do the STL naughty check?
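The point above can be shown in a few lines: the same part sliced with different settings produces different G-code, so a hash of the STL tells the printer nothing about the toolpath it actually receives. These G-code snippets are hand-written stand-ins, not real slicer output.

```python
import hashlib

# Two toolpaths for the *same* geometry, "sliced" at different layer heights
# (0.20 mm vs 0.28 mm). Only the Z move differs.
gcode_a = "G1 Z0.20 F300\nG1 X10 Y0 E0.5\nG1 X10 Y10 E1.0\n"
gcode_b = "G1 Z0.28 F300\nG1 X10 Y0 E0.5\nG1 X10 Y10 E1.0\n"

h_a = hashlib.sha256(gcode_a.encode()).hexdigest()
h_b = hashlib.sha256(gcode_b.encode()).hexdigest()

# Same part, different settings -> different bytes -> different hash,
# so an STL-level blocklist can't be enforced at the G-code stage.
print(h_a == h_b)  # False
```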
rolph 23 days ago
What part of the dangerous part is the actually dangerous part? It's a framing trap to think you have to print or CNC the whole thing in one job. Split it up into many smaller jobs, each one not looking dangerous, re-zero and start the next section as if it's a new job, spiff it all up with a session of crank-and-curse finishing, and the blockade is meaningless.