hh
In principle, no.
hh 10/29 7:14:33 πŸ’•
I don't know what sort of people you follow or what searches you do, but I have never encountered a single instance of child porn on Nostr. If you find illegal stuff, call the cops. It's not your job to police people.
⬆
Yeah, fairly simple indeed. Scan for nudity, then scan the same image for the ages of the people in it. Then a human has to verify whether it's CSAM and report it if it is. Fairly effective given the simplicity, and good at catching AI-generated CSAM.
⬆
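A minimal sketch of the two-stage scan described above. The detector functions detect_nudity() and estimate_ages(), the 0.8 confidence cutoff, and the age cutoff are all assumptions standing in for real models; nothing here is the actual implementation being discussed, and the mandatory human-review step is kept deliberately explicit:

```python
# Sketch: flag nudity, estimate ages in flagged images, and queue
# likely CSAM for human review. detect_nudity() and estimate_ages()
# are hypothetical stand-ins for real classifiers (an NSFW detector
# and an age-estimation model); thresholds are assumed, not sourced.

from dataclasses import dataclass

NUDITY_THRESHOLD = 0.8   # assumed confidence cutoff, tune per model
MINOR_AGE_CUTOFF = 18    # assumed age boundary for escalation

@dataclass
class ScanResult:
    image_path: str
    nudity_score: float
    min_estimated_age: float

def detect_nudity(image_path: str) -> float:
    """Hypothetical: 0..1 confidence that the image contains nudity."""
    raise NotImplementedError

def estimate_ages(image_path: str) -> list[float]:
    """Hypothetical: estimated age of each person detected in the image."""
    raise NotImplementedError

def scan_upload(image_path: str) -> ScanResult | None:
    """Return a ScanResult for human review if the image may be CSAM."""
    nudity = detect_nudity(image_path)
    if nudity < NUDITY_THRESHOLD:
        return None                      # stage 1: not flagged as nudity
    ages = estimate_ages(image_path)
    if not ages or min(ages) >= MINOR_AGE_CUTOFF:
        return None                      # stage 2: no apparent minors
    # Never auto-report: a human must confirm before anything is filed.
    return ScanResult(image_path, nudity, min(ages))
```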
jb55 10/29 5:12:48 πŸ’•
nostr:npub137c5pd8gmhhe0njtsgwjgunc5xjr2vmzvglkgqs5sjeh972gqqxqjak37w was describing what they do, it's not perfect but apparently it works decently well, at least for public uploaders
⬆
2e9834d3bfa0ff8c5bd515f54eb09bd0d7ccdf27c7e02dfd09e218966ae546de
⬆
semisol 10/29 4:33:39 πŸ’•πŸΆ
it’s much worse than it appears nostr:note1lj29p7mct8a4tyg7m8cyh57cm7563l5c7nynzyevjdng8t0n77rs8r0p83
hh 9/19 22:40:25 πŸ’•
Isn't this sort of attack mitigated by massive use?
⬆
b98e16ed... 9/19 22:34:14 πŸ’•
nostr:nevent1qqsfsgu6070eq89qwet245eazc7s5uf2un3pp9ck5sje7g85ydmxq0sppemhxue69uhkummn9ekx7mp0qgsd4dkxqewy8xum47ctpu0ltgxxsfemeewpjkdyzk9ddfcg286s0dsrqsqqqqqp226fpz
⬆
⬆
fee4893bd74bd5765ac429d8ac5fa6cc1f4a222a9dd6448e63cb38dcf2285ca5