Key Topics Include:
- Lifecycle Uses & LLM-Driven Generation
- Safety & Robustness
- Privacy, Security & Data Governance
- Fairness, Bias & Representation
- Explainability, Interpretability & Uncertainty
- Standards, Metrics & Tooling for Trustworthy Use
- Critical Perspectives on Synthetic Data
08.10.2025 13:19
Foundation models increasingly leverage synthetic data for training while simultaneously generating synthetic datasets for downstream applications.
This workshop centers on the responsible development and use of synthetic data with and for foundation models.
08.10.2025 13:19
Submission Deadline: October 20th, 2025 (AoE)
08.10.2025 13:19
Shaping Responsible Synthetic Data in the Era of Foundation Models
A Workshop at AAAI 2026
27th January 2026
Singapore Expo, Singapore
Submission deadline is approaching for the Responsible Synthetic Data (RSD) Workshop @ AAAI 2026.
The RSD workshop at AAAI 2026 (27th January, Singapore) focuses on responsible practices for synthetic data with/for foundation models.
Website: responsible-synthetic-data.github.io
08.10.2025 13:19
FORC 2026: Call for Papers
The 7th annual Symposium on Foundations of Responsible Computing (FORC) will be held on June 3-5, 2026 at Harvard University. Brief summary for those who are familiar with past editions (prior to 2…
The FORC 2026 call for papers is out! responsiblecomputing.org/forc-2026-ca... Two reviewing cycles with two deadlines: Nov 11 and Feb 17. If you haven't been, FORC is a great venue for theoretical work in "responsible AI" --- fairness, privacy, social choice, CS&Law, explainability, etc.
07.10.2025 12:02
Terms Watch
Track changes to Terms of Service across major platforms
(Caught this via Terms Watch - a tool I built that monitors ToS changes across major platforms. Daily digest at termswatch.io)
05.10.2025 10:40
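For the curious, here is a minimal sketch of how a monitor along these lines can work (illustrative only; the target URLs, file layout, and plain-text diffing below are placeholder choices, not the actual Terms Watch implementation): fetch each terms page, compare it with the last saved snapshot, and report a diff whenever it changes.

```python
# Hypothetical ToS-change monitor sketch: fetch a terms page, diff it against
# the previous snapshot on disk, and print any changes. A production tool
# would extract readable text from the HTML and send a digest instead of printing.
import difflib
import pathlib
import urllib.request

TERMS_PAGES = {  # placeholder targets, not Terms Watch's actual list
    "github": "https://docs.github.com/en/site-policy/github-terms/github-terms-of-service",
}
SNAPSHOT_DIR = pathlib.Path("tos_snapshots")
SNAPSHOT_DIR.mkdir(exist_ok=True)

def check_for_changes(name: str, url: str) -> None:
    # Fetch the current page and load the last stored snapshot (if any).
    current = urllib.request.urlopen(url, timeout=30).read().decode("utf-8", "replace")
    snapshot = SNAPSHOT_DIR / f"{name}.txt"
    previous = snapshot.read_text(encoding="utf-8") if snapshot.exists() else ""
    if current != previous:
        # Show what changed, then update the snapshot for the next run.
        diff = difflib.unified_diff(previous.splitlines(), current.splitlines(),
                                    fromfile="previous", tofile="current", lineterm="")
        print(f"[{name}] terms of service changed:")
        print("\n".join(diff))
        snapshot.write_text(current, encoding="utf-8")

for name, url in TERMS_PAGES.items():
    check_for_changes(name, url)
```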
This is a fascinating shift in platform power dynamics. Instead of unilateral AI scraping, we're seeing a kind of forced data reciprocity.
05.10.2025 10:40
Some open source devs may want LLMs to learn their code - it could help users get support.
But this reciprocity clause might have a chilling effect.
05.10.2025 10:40
Practical example: Google trains Gemini on GitHub code (everyone does - it's the world's largest code repo).
Under the new terms, GitHub could now access Google's public data - like YouTube videos - for their own AI training.
05.10.2025 10:40
GitHub (owned by Microsoft) just added an "Access Reciprocity" clause to their Terms of Service.
If you're a tech giant (700M+ monthly users) training AI on GitHub's code, GitHub now gets to use YOUR public data too.
Here's what that means:
05.10.2025 10:40
Working on #NLProc for social good.
Currently at LTI at CMU.
product @roost.tools. just another extremely earnest and Very Online idealist. proud public school alum with roots in the desert. full of personal opinions like the skeets below
trust & safety, birds, hockey
Robust Open Online Safety Tools (ROOST) is a new non-profit entity providing open source, accessible, high-quality, transparent safety tools for digital organizations of all kinds.
roost.tools
News from the Institut für Arbeitsmarkt- und Berufsforschung (IAB). Imprint/guidelines: https://iab.de/impressum/, contact by email: IAB.Social-Media@iab.de
Researcher in ML and Privacy.
PhD @UofT & @VectorInst. previously Research Intern @Google and @ServiceNowRSRCH
https://mhaghifam.github.io/mahdihaghifam/
Every future imagined by a tech company is worse than the previous iteration…or something like that.
Assistant professor of computer science at Technion; visiting scholar at @KempnerInst 2025-2026
https://belinkov.com/
We're the Electronic Frontier Foundation. We're a nonprofit that fights for your privacy and free speech online. Find all of EFF's social media accounts at eff.org/social.
author of Blood in the Machine, tech writer, luddite
newsletter: https://www.bloodinthemachine.com/
books: https://www.hachettebookgroup.com/contributor/brian-merchant/
kofi link: https://ko-fi.com/brianmerchant
Professor of Computational Cognitive Science | @AI_Radboud | @Iris@scholar.social on Mastodon | http://cognitionandintractability.com | she/they
Founder & PI @aial.ie. Assistant Professor of AI, School of Computer Science & Statistics, @tcddublin.bsky.social
AI accountability, AI audits & evaluation, critical data studies. Cognitive scientist by training. Ethiopian in Ireland. She/her
Signal is a nonprofit end-to-end encrypted communications app. Privacy isn't an optional mode, it's the way Signal works. Every message, every call, every time.
President of Signal, Chief Advisor to AI Now Institute
He teaches information science at Cornell. http://mimno.infosci.cornell.edu
Professor of Computer Science at Cambridge.
presidential fellow at City University of London & research affiliate at University of Cambridge | populism in power, media under attack, future of news | ex Haaretz, Gates-Cambridge, Molad | soon out: https://footnotepress.com/books/the-new-censorship/
Chasing digital badness. Senior Researcher at Citizen Lab, but words here are mine.
DE/EN. Potsdam / Berlin
coordination of @dsa40collaboratory.bsky.social, various research at @weizenbauminstitut.bsky.social
among other things: http://zusammenfuergleichstellung.de
dsa40collaboratory.eu | joint project by the @europeannewschool.bsky.social and the @weizenbauminstitut.bsky.social
Official account of Humboldt-Universität zu Berlin.
Imprint: https://www.hu-berlin.de/de/hu/impressum