- cross-posted to:
- technology
New Mexico is seeking an injunction to permanently block Snap from practices allegedly harming kids. That includes a halt on advertising Snapchat as “more private” or “less permanent” due to the alleged “core design problem” and “inherent danger” of Snap’s disappearing messages. The state’s complaint noted that the FBI has said that “Snapchat is the preferred app by criminals because its design features provide a false sense of security to the victim that their photos will disappear and not be screenshotted.”
So if someone generates a minor’s image and it’s not nude, is that not CSAM?
I’m genuinely asking; I always thought it was about sexualizing children, not about whether they are nude.
I don’t think so. People keep throwing that acronym around, but I suspect they didn’t read the article to find out that it was one normal picture of a high-school-aged girl.
I actually read it and then commented because, even though it’s a profile picture, the intent is for the viewer to sexualize the picture, and thereby a minor.
I do get that it’s a normal picture, but it made me think about the slippery slope here and where the line is.