dmv.community is one of the many independent Mastodon servers you can use to participate in the fediverse.
A small regional Mastodon instance for those in the DC, Maryland, and Virginia areas. Local news, commentary, and conversation.

Server stats: 172 active users

#csam

5 posts · 5 participants · 3 posts today

Vast #pedophile network shut down in Europol’s largest #CSAM operation

#Europol has shut down one of the largest dark web pedophile networks in the world, prompting dozens of arrests worldwide, with warnings that more are to follow
#darkweb

arstechnica.com/tech-policy/20

Ars Technica · Vast pedophile network shut down in Europol’s largest CSAM operation · By Ashley Belanger

shocking allegations that #Passes, a social media platform run by Thiel Fellow #LucyGuo and funded by crypto bros, was basically #OnlyFans but (wink wink) for minors...

Financial backers included #JakePaul and #ParisHilton, as well as multiple bros who are attending the White House Crypto Summit on Friday (e.g., "crypto czar" #DavidSacks's #CraftVentures and Kyle Samani's #MulticoinCapital).

Xitter post by a founder who objected to her own company being acquired by such a deeply amoral organization: x.com/jasminericegirl/status/1

(Lucy Guo was also a Forbes 30 Under 30 person... lol)

#Europol arrests 25 users of online network accused of sharing #AI #CSAM

#Europe is cracking down on AI-generated #sex images of minors. So far, Europol has arrested 25 people in a large-scale ongoing probe called #OperationCumberland and confirmed that more arrests are expected in the coming weeks.

arstechnica.com/tech-policy/20

Ars Technica · 25 arrested so far for sharing AI sex images of minors in largest EU crackdown · By Ashley Belanger

25 arrested in major crackdown of AI-generated child abuse images network.

Danish authorities led the operation, which resulted in 25 arrests, 33 house searches, and 173 devices seized across 20 countries.

The operation is "one of the first cases" involving AI-generated images of child sexual abuse material, according to Europe's law enforcement agency Europol, which supported the action.

mediafaro.org/article/20250228

POLITICO · 25 arrested in major crackdown of AI-generated child abuse images network · By Pieter Haeck
#AI #ChildAbuse #CSAM

In early February, though, Aidan Harding wasn’t in court for his “activism” as a white supremacist, but for possession of CSAM (child sexual abuse material), all as part of his bid to be welcomed into “764,” a Nazi Satanist cult and offshoot of the Order of Nine Angles. The CSAM was found on his devices that were seized in the December raid. Harding was later arrested on firearms transfer/falsification charges on January 16, 2025, and, while under arrest, was remanded into federal custody because of the CSAM charges.

The judge allowed evidence showcasing Aidan’s Nazi and white supremacist ideologies to be admitted, because they are deeply relevant to his possession of CSAM. Members of 764 seek to manipulate, blackmail, and otherwise extort minors — often children of color and LGBTQ kids — into acts of self-harm and mutilation, animal abuse, bestiality, violence, sexual exploitation, and even suicide on camera or livestream — all of which they document in photographs and video. This “content” then serves to gain 764 members increasing clout within their cult.

Aidan Harding was quickly identified earlier this year after he and Noah Cron “demonstrated” on the Liberty Bridge, waving swastika flags, shouting epithets and slurs, and threatening drivers with mace. Their actions thus far — from waving flags to spray painting Patriot Front graffiti under a bridge in the south hills — seem relatively mild, but the evidence presented in court on February 12 of this year points to a pattern of escalation.

pghfashwatch.noblogs.org/post/

#CSAM #ChildAbuse #SexualAbuse #O9A #Pittsburgh #Grooming #764 #NaziSatanism #AidanHarding #NoahCron #NeoNazis #fcknzs #fascism #OrderOfNineAngles #PatriotFront

pghfashwatch.noblogs.org · Pittsburgh Nazi Aidan Harding Charged with Possession of CSAM, Linked to Satanic Hitler-Worship Cult 764 – pghfashwatch

'The EU’s chat control legislation is reportedly back on the table... Polish officials have now tabled a new proposal, which is open for feedback until 20 February... According to Mlex, the bill has now been significantly watered down. CSAM detection orders, for example, have been removed.'
#law #eu #privacy #surveillance #csam #encryption #security #cybersecurity secure.dialog-mail.com/v/14566

Continued thread

#DOGE Teen Ran Image-Sharing Site Linked to URLs Referencing Pedophilia & the KKK

The site launched by #EdwardCoristine in 2021 promised to protect the privacy of its users, stating, “All your images are encrypted. We do not log IP addresses, device agents or anything else.”
muskwatch.com/p/doge-teen-ran-

"URLs ... referenced the sale of child sexual abuse material, racial slurs, & rape. Among the links were “child-porn. store” and “kkk-is-cool. club,”"

The nightly consistency checker for my comics aggregator (comics.kamens.us/) notified me about a week ago that #DarrinBell's editorial cartoon and "Candorville" strip had disappeared from #ComicsKingdom.
I looked into it today to see if they had moved to another syndicate I could point the aggregator at, and found this: nbcnews.com/news/us-news/pulit
I think this may be the first time a syndicated strip has stopped being published because its author was arrested for #CSAM. 🤔
#comics

Replied in thread

@nullagent @jack Or it's this, scanning for #CSAM

actionnetwork.org/petitions/go

You should get rid of it if you have something to hide.

actionnetwork.org · Google: Scan Android Devices for CSAM

The trade in child sexual abuse material (CSAM) online is growing -- more than 65 million images were reported last year alone. Android is the most popular personal device OS that doesn’t find and report these heinous images. Google must commit to finding and reporting CSAM on Android.

Apple recently announced an important new step in the fight against child sexual abuse: they’ll scan all new iPhones for child sexual abuse material (CSAM) and report it. This change means abuse can be discovered much sooner -- not days or weeks after when images are uploaded or shared -- and could save child victims and protect their privacy. These scans are done entirely by machines that exclusively look for a “digital fingerprint” of abusive material and flag it as potentially illegal -- a solution that balances the welfare and privacy of kids with that of iPhone users.

Android phones, owned and operated by Google, don’t have the same device scanning in place. Users must upload photos to a service for abusive images to be detected -- allowing millions of images to be shared stealthily and victims to go undetected for longer.

Google, stop failing kids and start scanning for CSAM on Android devices

Good grief, now they're threatening NCMEC:

"This creates an impossible choice: either we lose the primary infrastructure for fighting online child exploitation, or we institutionalize discrimination that puts already-vulnerable children at even greater risk. Both outcomes achieve exactly the opposite of what any legitimate child protection effort should do."

techdirt.com/2025/02/07/magas-

Techdirt · MAGA’s Sickening Hypocrisy: From ‘Save The Children’ To ‘Defund The Org That Actually Saves Children’ · After years of screaming “save the children” while baselessly accusing others of exploiting kids, the Trump administration is now trying to destroy the actual infrastructure that saves …
#NCMEC #USPol #CSAM
Replied in thread

@stefan

On that page, they indicate that their biggest financial challenge is their "Content Classification Service (CCS)", a hash-and-match "solution" promoted mostly for #CSAM identification. They also mention reporting matched content to the authorities, if found:

»We integrate with your instance using webhooks, processing content you send us to hash the media and match it with known CSAM. Your media never leaves IFTAS. If we find pertinent matches we’ll provide human review, notify you via email and issue a takedown request. We perform any required reporting and law enforcement record-keeping so you don’t have to.«

In the course of the #ChatControl debate in the #EU, it has been widely discussed how such hash-and-match "solutions" can be extended to detect any kind of content, simply because those in control of the hash databases control what gets detected and reported.
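To make the mechanism concrete, here is a minimal sketch of the hash-and-match idea. It assumes a plain cryptographic-hash lookup against a local set; the names (KNOWN_HASHES, check_media) are illustrative, and real services such as IFTAS's CCS use perceptual hashing against closed, remotely held databases instead:

```python
import hashlib

# Hypothetical, illustrative hash set -- whoever controls this set controls
# what gets flagged. Real services keep such databases closed and remote.
KNOWN_HASHES = {
    "aab2c5...",  # placeholder digest of a known-bad file
}

def check_media(media_bytes: bytes) -> bool:
    """Hash an uploaded file and report whether it matches a known entry."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    return digest in KNOWN_HASHES

# An instance webhook would call check_media() for each new upload and, on a
# match, queue the post for human review, takedown, and reporting.
```

Exact hashing only catches byte-identical copies; perceptual hashing tolerates re-encoding and cropping, which is exactly why control over the underlying databases matters so much.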

IFTAS themselves say: »Future classification services will include hash and match options for non-consensual intimate images, terroristic and violent extremism content, spam, and more.«

In short, the Four Horsemen of the Infocalypse are riding again.

They also say »We cannot open the underlying hash matching databases«, which means: no local hash databases, hence remote hashing is required.

I'd think twice about whether I really want to pass all content of an instance to an external entity for hashing and potential reporting. We are talking about "China and dissident content" situations here, but in the context of the USA of 2025 and beyond – I'd certainly not want to donate to help implement this.

@iftas

Inside the bust that took down Pavel Durov—and upended Telegram

The Russian-born CEO styles himself as a free-speech crusader and a scourge of the surveillance state. Here’s the real story behind Pavel Durov’s arrest and what happened next.

mediafaro.org/article/20250204

WIRED · Inside the Bust That Took Down Pavel Durov—and Upended Telegram · By Darren Loucaides