You don’t say.
02.09.2025 15:08 — 👍 692 🔁 197 💬 11 📌 17
@dativeworks.bsky.social
Digital Transformation | Strategy | Adaptive Skill Building @ Scale | Decaffeinated Irritated Optimist…social media is for giving opinions away like kittens.
Talked to 23 people who suddenly have to pay huge tariffs or otherwise cannot get chainmail from Pakistan, yarn from France, retro computers from Japan, metal music from the Netherlands, DVDs from Germany, cosplay supplies, sunscreen, etc.
www.404media.co/its-just-a-m...
An illustration of me, and the headline: "AI agents are coming for your privacy, warns Meredith Whittaker The Signal Foundation’s president worries they will also blunt competition and undermine cyber-security"
To put it bluntly, the path currently being taken towards agentic AI leads to an elimination of privacy and security at the application layer. It will not be possible for apps like Signal—the messaging app whose foundation I run—to continue to provide strong privacy guarantees, built on robust and openly validated encryption, if device-makers and OS developers insist on puncturing the metaphoric blood-brain barrier between apps and the OS. Feeding your sensitive Signal messages into an undifferentiated data slurry connected to cloud servers in service of their AI-agent aspirations is a dangerous abdication of responsibility.
Happily, it’s not too late. There is much that can still be done, particularly when it comes to protecting the sanctity of private data. What’s needed is a fundamental shift in how we approach the development and deployment of AI agents. First, privacy must be the default, and control must remain in the hands of application developers exercising agency on behalf of their users. Developers need the ability to designate applications as “sensitive” and mark them as off-limits to agents, at the OS level and otherwise. This cannot be a convoluted workaround buried in settings; it must be a straightforward, well-documented mechanism (similar to Global Privacy Control) that blocks an agent from accessing our data or taking actions within an app. Second, radical transparency must be the norm. Vague assurances and marketing-speak are no longer acceptable. OS vendors have an obligation to be clear and precise about their architecture and what data their AI agents are accessing, how it is being used and the measures in place to protect it.
📣 NEW -- In The Economist, discussing the privacy perils of AI agents and what AI companies and operating systems need to do--NOW--to protect Signal and much else!
www.economist.com/by-invitatio...
The AI Darwin Awards are here to catalog the damage that happens when humanity’s hubris meets AI’s incompetence. The simple website contains a list of the dumbest AI disasters from the past year and calls for readers to nominate more.
🔗 www.404media.co/ai-darwin-aw...
so the trouble with humanoid robots as the next great tech hype is just that they're shit, they don't work and they have no applications
spectrum.ieee.org/humanoid-rob...
"Consider the implications if ChatGPT started saying “I don’t know” to even 30% of queries ... Users accustomed to receiving confident answers to virtually any question would likely abandon such systems rapidly."
13.09.2025 06:50 — 👍 772 🔁 167 💬 18 📌 16
We’re not just losing the plot.
We’re living inside systems that rewrite it in real time.
When language is optimized, meaning becomes fragile.
Narrative isn’t just a story; it’s control.
#NarrativePower #Disinformation #SemanticCollapse #DataGovernance
christinehaskell.substack.com/p/narrative-...
This is messed up.
It is also often the story of automation—labor costs don’t go away, they’re shifted to someone whose labor isn’t compensated. See, e.g. Elish & @cariatida.bsky.social on grocery scanners.
Noting that our AI footprint today is likely the smallest it will ever be, @technologyreview.com offers a comprehensive — and sobering — analysis of how much energy the AI industry uses and where it's headed. www.technologyreview.com/2025/05/20/1...
27.05.2025 14:30 — 👍 18 🔁 14 💬 0 📌 1
This is a pretty big change to how the entire nonconsensual AI-generated content ecosystem works. I don't think this would have happened without our reporting or support from our subscribers www.404media.co/civitai-ban-...
27.05.2025 14:24 — 👍 136 🔁 39 💬 3 📌 1
SCOOP: ICE, HSI, and DHS are getting side-door access to the nationwide system of Flock license plate cameras by asking local police to perform lookups for them, new public records show.
ICE does *not* have a contract to use this surveillance tool itself.
www.404media.co/ice-taps-int...
Nick Clegg says asking artists for use permission would ‘kill’ the AI industry
26.05.2025 13:40 — 👍 4486 🔁 865 💬 3659 📌 8933
So true!
27.05.2025 14:53 — 👍 0 🔁 0 💬 0 📌 0
Handwritten quote on a white sheet with an illustrated face at the bottom and pink hearts around the text. The quote reads: "If you love a book, write a nice review. It gives the author encouragement for bad days when they want to take up scorpion petting." — Liana Brooks Illustration and lettering by Debbie Ridpath Ohi. A paintbrush lies across the page on a wooden surface.
If you love a book, write a nice review. It gives the author encouragement for bad days when they want to take up scorpion petting. - @lianabrooks.bsky.social
#WriterSky #BookSky #KidLit
We made the system.
Then gave it the wheel.
The worst part?
Nobody feels responsible anymore.
(New essay out. It’s about what Steinbeck saw before we did.) christinehaskell.substack.com/p/the-sustai... #SystemDesign
#AIGovernance #LeadershipReflection #EthicalAI #DesignedAccountability
Watch this video of @carolecadwalla.bsky.social’s new TED Talk. TED has been a cheerleader for the technologies that have destroyed democracy and our social fabric, and Carole does not hold back. She is so courageous … and speaks the truth.
www.ted.com/talks/carole...
Sentence of the year: “Critics once described Brexit as the greatest act of economic self-harm by a Western country in the post-World War II era. It may now be getting a run for its money across the Atlantic.” www.nytimes.com/2025/04/13/w...
13.04.2025 13:31 — 👍 278 🔁 68 💬 10 📌 2
China halts critical exports of certain rare earth minerals and magnets [NYT report]
We are all fucked.
Thank you, POTUS, for giving China an excuse to shut down the export of rare earth minerals and magnets, furthering your efforts to drive us all back to the Stone Age.
The government knows an enormous amount about you in the data systems DOGE has sought to access.
Just for starters:
www.nytimes.com/2025/04/09/u...
Courage is contagious. But how do we catch it?
For months, @amifieldsmeyer.bsky.social and I have been asking dissidents and activists from around the world how we can topple authoritarianism.
We assembled their lessons into a field guide to courage.
www.newyorker.com/news/the-wee...
"I grew up assuming democracy was inevitable"
Emmy Award-winning actor Bradley Whitford on the current political climate
“What did they have that eludes their lesser-known heirs today? The answer is that Kissinger & Brzezinski were immigrants. Newcomers often value America’s freedoms more than its native-born and are statistically far likelier to start companies, win Nobel Prizes & launch schools of thought.”
~Ed Luce
NEW: The doctor behind breakthrough Parkinson’s research was among the scientists purged from the National Institutes of Health, the US’s leading medical research agency. www.wired.com/story/doctor...
02.04.2025 00:13 — 👍 3501 🔁 1943 💬 138 📌 200
I love that you tried though
01.04.2025 12:56 — 👍 0 🔁 0 💬 1 📌 0
Schumer is a nudge.
01.04.2025 12:55 — 👍 0 🔁 0 💬 1 📌 0
This show was…sublime in its portrayal of Human Resources. It raises the persistent ethical question: if our memories make us who we are, are these innies new people, created as slaves whose sole purpose is to work? I mean, it reminded me of 90s tech culture.
28.03.2025 03:47 — 👍 2 🔁 0 💬 0 📌 0
Government transparency on AI use is not just a bureaucratic exercise—it is a fundamental component of maintaining public trust, writes Clara Langevin.
20.03.2025 15:06 — 👍 31 🔁 9 💬 3 📌 1
Some astonishing numbers in here:
-OpenAI loses $2 for every $1 it makes
-OpenAI projects annual losses of $14 billion by 2026
-To break even, OpenAI needs to increase revenue 25x in just 5 years
-33% of VC portfolios are committed to AI
-5 AI-heavy stocks account for 29% of the S&P 500's value
How many times are we gonna see this same story, but with the newspaper name changed
26.03.2025 19:12 — 👍 36 🔁 8 💬 2 📌 0
Can’t help but admire this administration’s commitment to transparency
26.03.2025 23:33 — 👍 187 🔁 36 💬 2 📌 2