rahaeli

@rahaeli.bsky.social

Cofounder @dreamwidth.org / disabled queer cat lady / running social media since before it was "social media" and Trust & Safety since the dawn of time / do not cite the deep magic to me, I wrote it / no, I'm allergic to that, too

35,438 Followers  |  379 Following  |  103,960 Posts  |  Joined: 05.05.2023

Posts by rahaeli (@rahaeli.bsky.social)

Two boxes of mango, one of churro, one cookies and cream. But they have not shown up yet :(

06.03.2026 07:15 — 👍 0    🔁 0    💬 0    📌 0

Yeah. And in the absence of any real information about what actually does help, it's impossible to figure out what to do about it. There's a reason why our policy about the topic runs like 5000 words when I write it out, sigh.

06.03.2026 07:14 — 👍 1    🔁 0    💬 0    📌 0

The impression I get is half the House wants stronger protections and half the House opposes it in its current form because it pre-empts state laws, but that's just an impression.

06.03.2026 06:23 — 👍 0    🔁 0    💬 1    📌 0

I'm never going to be able to afford avocado toast at this rate!

06.03.2026 06:18 — 👍 3    🔁 0    💬 0    📌 0

($6 over multiple boxes of mochi, I should specify.)

06.03.2026 06:17 — 👍 34    🔁 0    💬 1    📌 0

Some nights you look at your freezer, realize you are out of mochi, calculate that placing an order with the mochi taxi is only a $6 markup with tip over sending your wife to the grocery store tomorrow, and order some fucking mochi at 1AM

06.03.2026 06:16 — 👍 75    🔁 1    💬 6    📌 0

I'm glad the delightful dryer lint could help cheer her up!

06.03.2026 06:13 — 👍 1    🔁 0    💬 0    📌 0

But the categories aren't immutable and some people fall into both, and the likelihood is just a likelihood and not a hard and fast predictor. And it's difficult-to-impossible to distinguish them :/

06.03.2026 06:12 — 👍 8    🔁 0    💬 1    📌 0

--why it's "okay" for them to be around children as long as they pinky swear they won't actually do anything sexual with them. That's the group more likely to slippery-slope justify themselves into actually abusing a child one tiny piece at a time.

06.03.2026 06:12 — 👍 8    🔁 0    💬 1    📌 0

The little I do know about it, and again I've done a bunch of research to try to determine what the harm reduction principle is, says the first group are more likely to be the ones horrified by the attraction and the second is more likely to self-justify --

06.03.2026 06:12 — 👍 8    🔁 0    💬 1    📌 0

--and some people are attracted to what they see as a manifestation of their own idealized memories of childhood. (That's discounting people who are just opportunistic predators, of course: many actual child sex abusers are, and aren't attracted to children so much as the power differential.)

06.03.2026 06:12 — 👍 6    🔁 0    💬 1    📌 0

From what I know of the research, the current theory is that some people experience attraction to children because something (and we don't know what) froze their attraction "gaze", so to speak, as they grew up (whereas normal developmental progress = your gaze matures as you do)--

06.03.2026 06:12 — 👍 6    🔁 0    💬 1    📌 0

It's my tiny little labor of love that we stubbornly keep running for the 30,000 or so people who refuse to give up the old text-based internet, heh.

06.03.2026 05:55 — 👍 3    🔁 0    💬 0    📌 0

--before watching it in realtime (so you get a sense of what is going to happen but at a slight remove before you watch the whole thing in detail). I can't help with courts being stuck in the Stone Age, though ;)

06.03.2026 05:54 — 👍 0    🔁 0    💬 0    📌 0

For video: in addition to greyscale, mute the sound (or turn it as low as possible, if you need to be able to hear it), put it in as small a window as you can to make out the necessary detail instead of fullscreen, and scrub through at high speed using the scroll bar first--

06.03.2026 05:54 — 👍 0    🔁 0    💬 1    📌 0

I don't expect oral args to be of any real substance; it's mostly going to get decided on the briefings

06.03.2026 05:43 — 👍 1    🔁 0    💬 1    📌 0
GitHub - dreamwidth/dreamwidth: Dreamwidth's open source repository Dreamwidth's open source repository. Contribute to dreamwidth/dreamwidth development by creating an account on GitHub.

That's standard in open source software! We use GitHub issues: github.com/dreamwidth/d...

06.03.2026 00:48 — 👍 3    🔁 0    💬 0    📌 0

Sites definitely use a shitton of ML and image classification for that kind of stuff, yeah. It's not perfect but it reduces the field of what humans have to look at.

06.03.2026 00:44 — 👍 6    🔁 0    💬 1    📌 0

If you ever are so unfortunate as to have to do that again: set the computer monitor to grayscale or invert the color settings. It lets you make out what the thing is, but stops it from being so visceral.

06.03.2026 00:42 — 👍 5    🔁 0    💬 1    📌 0

Yep. It's awful! What we had before it was awful! There's no good way to handle any of it! Sigh.

06.03.2026 00:41 — 👍 3    🔁 0    💬 0    📌 0

(Because it would require spending more money. This is the sole issue on which I think we should *increase* law enforcement funding instead of restricting it.)

06.03.2026 00:40 — 👍 4    🔁 0    💬 0    📌 0

It's the kind of thing that those of us who work in the industry avoid talking about a lot, because it's so incredibly depressing, sigh. Senator Wyden has been trying to pass a law to help address some of the systemic problems for like the last four congressional sessions but it never goes anywhere.

06.03.2026 00:38 — 👍 4    🔁 0    💬 1    📌 0

A lot of that is because the reports were not actually about CSAM, including a huge amount of drawn content (because sites are incentivized to over-report). But not all of it; I will avoid getting into further detail to avoid distressing folks, but it means a LOT of stuff goes uninvestigated.

06.03.2026 00:25 — 👍 5    🔁 0    💬 1    📌 0

But just as an example for why the distinction is so important: in 2024, NCMEC, the clearinghouse providers are required by law to report CSAM to, was so inundated they could only forward 0.02% of reports to law enforcement in the first place.

06.03.2026 00:25 — 👍 5    🔁 1    💬 1    📌 0

That's for reporting to a website (and, importantly, pick the reporting options that do not route it into the reporting queue for actual CSAM, for reasons described here: bsky.app/profile/raha... ). For regular discussion, "sexual depictions of fictional children" is fine.

06.03.2026 00:16 — 👍 5    🔁 1    💬 1    📌 0

I usually go with an adapted version of the language in 1466A: "potentially-obscene, non-photorealistic depiction of the sexual abuse of a child/children"

06.03.2026 00:12 — 👍 5    🔁 0    💬 1    📌 0

There's a really good reason DW has extremely limited image hosting, and it's only partially because of the cost of storing and serving it.

06.03.2026 00:07 — 👍 32    🔁 0    💬 0    📌 0

It took until around 2023 for me to be able to look at a particular image and go "huh, I bet a normal person would find that pretty upsetting", and I was seriously startled to find myself thinking it, because I thought the desensitization was permanent! The damage it causes is *that bad*.

06.03.2026 00:06 — 👍 31    🔁 1    💬 3    📌 0

Like, it's very rarely a concern on DW but back on LJ I used to have to look at hosted images and watch embedded video of all kinds of violence up to and including murder, beheadings, torture, etc. I left that job in 2007.

06.03.2026 00:06 — 👍 13    🔁 0    💬 1    📌 0

Oh, unquestionably, image classification for moderation as a whole (and not just for CSAM; also things like violence, gore, murder videos, etc) is THE single greatest advance in the field of content moderation, like, ever. It saves so much trauma.

06.03.2026 00:06 — 👍 19    🔁 1    💬 1    📌 0