@kconrad.bsky.social
Prof. of English at University of Kansas. Tech and culture. AI ethics. Critical AI literacy. Pandora’s Bot on Substack. Library of Babel Group. Zine publisher. #aiethics #criticalailiteracy
It's a special kind of empty
11.02.2026 01:08 — 👍 0 🔁 0 💬 0 📌 0
Trailer itself aside, boy oh boy the marketing head’s response looks absolutely human written. Not extruded at all. Way to build confidence about the script!
screenrant.com/mandalorian-...
“No matter how Ring and other surveillance tech companies may downplay it, there’s no world in which finding lost dogs is the final end-use for this technology.”
09.02.2026 23:00 — 👍 1436 🔁 513 💬 32 📌 36
“The surgeon warned Acclarent “there were issues that needed to be resolved”... Despite that Acclarent “lowered its safety standards to rush the new technology to market,” & set “as a goal only 80% accuracy for some of this new technology before integrating it into the TruDi Navigation System.”
10.02.2026 02:38 — 👍 350 🔁 61 💬 4 📌 10
[Image: Library of Babel Group logo with hexagon pattern.]
Just a reminder to consider the Library of Babel Group. We have projects afoot and opportunities to network with like-minded folks. www.law.georgetown.edu/privacy-tech...
03.12.2025 18:56 — 👍 7 🔁 5 💬 0 📌 0
“In its discussions with government officials, Anthropic representatives raised concerns that its tools could be used to spy on Americans or assist weapons targeting without sufficient human oversight, some of the sources told Reuters.”
02.02.2026 17:28 — 👍 61 🔁 18 💬 2 📌 8
With news today about a surge in gas-fired power generation for data centers and AI, here's an occasional reminder that I maintain Against AI and Its Environmental Harms, an open library with journalism, peer-reviewed research, and other media for teaching: pad.riseup.net/p/Against_AI...
30.01.2026 00:20 — 👍 97 🔁 44 💬 0 📌 0
The college at which I'm employed, which has signed a contract with the AI firm that stole books from 131 colleagues & me, paid a student to write an op-ed for the student paper promoting AI, guided the writing of it, and did not disclose this to the paper. www.thedartmouth.com/article/2026...
29.01.2026 22:40 — 👍 2076 🔁 816 💬 65 📌 92
Let them eat slop
29.01.2026 07:38 — 👍 96 🔁 26 💬 4 📌 0
Pygmalion Displacement:
9) Social bonding: Do the users and/or creators of the AI develop interpersonal-like relationships with it?
10) Psychological service: Does the AI function to subserve and enhance the egos of its creators and/or users?
PDF: doi.org/10.31235/osf... @spookyachu.bsky.social
Hey so here's a little retrospective on Google's fall visit to my campus. open.substack.com/pub/kconrad/...
25.01.2026 01:19 — 👍 11 🔁 6 💬 1 📌 3
[Image: CNN headline "Elon Musk makes bold pitch for super-fast rocket travel" in front of an image of Lyle Lanley, monorail con man from The Simpsons.]
In related news,
25.01.2026 01:59 — 👍 0 🔁 0 💬 0 📌 1
Hey so here's a little retrospective on Google's fall visit to my campus. open.substack.com/pub/kconrad/...
25.01.2026 01:19 — 👍 11 🔁 6 💬 1 📌 3
Oh yay!
25.01.2026 01:17 — 👍 1 🔁 0 💬 0 📌 0
"AI didn't just increase its footprint in Washington in 2025. It ate tech lobbying whole." www.axios.com/2026/01/23/a...
24.01.2026 14:16 — 👍 42 🔁 27 💬 0 📌 0
What is wild to me is the defense, BY THE NEURIPS BOARD, that fabricated citations do not mean "the content of the papers themselves [is] necessarily invalidated"
It does. It very much does. What do you think citing other work is for? What do you think writing a paper is for? What do you *think*?
"Drawing on a 20.3-million-query audit of ChatGPT, we map systematic biases in the model's representations of countries, states, cities, and neighbourhoods. From these empirics, we argue that bias is not a correctable anomaly but an intrinsic feature of generative AI”.
ht: Dagmar Monett
Perhaps "Abject Victorian orphan" is the line between "seen a ghost" and "processing bad news"
16.01.2026 17:09 — 👍 1 🔁 0 💬 0 📌 0
Agreed! I think "jackass" could also be called "ghoul" - just to harmonize with the "seen a ghost" (which shades occasionally into "abject Victorian orphan" when he's being apologetic but I would say is still the same category).
16.01.2026 17:08 — 👍 1 🔁 0 💬 1 📌 0
Democratic accountability requires following power where it moves. Traditional mechanisms—public notice, comment, judicial review—were designed for formal rulemaking. We now need accountability frameworks that cut through the mirage of AI deregulation.
15.01.2026 19:34 — 👍 47 🔁 11 💬 1 📌 0
For a little midweek relief, I highly recommend Ryan George’s “The Absolute Lunatics of LinkedIn” for dramatic readings and analyses of such posts. E.g. youtu.be/D3MQQDYijqY?...
15.01.2026 16:10 — 👍 1 🔁 0 💬 0 📌 0
#ai companies have been steadily moving into schools across North America in the last year.
They are spending millions of dollars on advertising.
Parents often don't know the negative effects of AI, and educators are being told to use it by administrators blown away by slick presentations.
1/2
Oooh
12.01.2026 23:34 — 👍 2 🔁 0 💬 0 📌 0
WANT
08.01.2026 01:35 — 👍 1 🔁 0 💬 0 📌 0
I'll also add that when I read the intro, I imagine the Talking Heads "Once in a Lifetime" playing in the background 😁
07.01.2026 00:18 — 👍 2 🔁 0 💬 0 📌 0
I had to send an email to my kid's teacher. Thankfully they weren't having the kids interact with it directly, but I was still furious
06.01.2026 22:17 — 👍 114 🔁 34 💬 1 📌 0
I'm sorry I missed your earlier posts about your dad. Again, so sorry. And yes, you've got plenty else on your plate. :)
05.01.2026 16:24 — 👍 0 🔁 0 💬 0 📌 0
Seems like only a few years ago this type of thing was a scandal.
05.01.2026 13:44 — 👍 58 🔁 4 💬 3 📌 0
Ben, I am so sorry. Thank you for sharing this.
05.01.2026 14:04 — 👍 2 🔁 0 💬 1 📌 0