@mkgerchick.bsky.social
She/her. Data Science Manager and Algorithmic Justice Specialist at ACLU. Personal account.
A reminder that anything recorded on a device like this AI "friend" could be used against you – by hackers, private companies, or the government.
This technology isn't a friend, it's surveillance.
More of this kind of reporting, please!
12.09.2025 17:11
Customs and Border Protection agents searched nearly 15,000 devices from April through June of this year, a nearly 17 percent spike over the previous three-month high in 2022. "The real issue is the chilling effect it has on all travelers."
20.08.2025 16:08
After @riaclu.bsky.social sued, a judge blocked the Trump administration from imposing ideological restrictions on federal grant recipients who serve domestic violence survivors, LGBTQ youth, and unhoused people.
These organizations can continue their critical work without political interference.
The use of AI shouldn't come at the expense of our civil liberties.
At the ACLU's first-ever Civil Rights in the Digital Age AI Summit, we're considering ways to leverage emerging technology as an asset, while safeguarding our civil rights.
Gerchick et al.: "Auditing the Audits: Lessons for Algorithmic Accountability from Local Law 144's Bias Audits"
#FAccT2025
[Image: title slide for the tutorial presentation, reading "Public Interest Tech (PIT) Clinics as Applied Sociotechnical Pedagogy: Practice Tutorial, FAccT 2025, Athens, Greece; by Lauren M. Chambers and Diag Davenport, UC Berkeley, June 25, 2025."]
getting excited for my #FAccT2025 tutorial - today 5 pm Athens time!
I'm sharing new work with @berkeleyischool.bsky.social's awesome Prof. Diag Davenport: "Public Interest Tech (PIT) Clinics as Applied Sociotechnical Pedagogy."
the idea: bring real-world experience & impact into tech education
❤️❤️❤️
23.06.2025 15:00
We're here at the airport as Mahmoud Khalil returns home after over 100 days of being unjustly detained for his advocacy for Palestinian rights.
Welcome home, Mahmoud! ❤️
And also excited to share our tutorial – on ACLU's approach to evaluating generative AI's implications for our work – with @techforimpact.bsky.social, @bmad.bsky.social, Ranya Ahmed, and Evani Radiya-Dixit, also on Wednesday, at 5 PM Athens time: programs.sigchi.org/facct/2025/p...
21.06.2025 18:00
This work was the product of an amazing collaboration across civil society and academia – with a stellar team including @rone.bsky.social and @metaxa.net (Penn HCI), Cole Tanigawa-Lau (Stanford), Lena Armstrong (Harvard) and Ana Gutiérrez (my colleague at ACLU!)
21.06.2025 18:00
Excited to be at #FAccT2025 and honored that our work "auditing the audits" resulting from one of the first enacted AI laws in the U.S. received an honorable mention! I'll be presenting this work on Wednesday at 11 AM Athens time: programs.sigchi.org/facct/2025/p...
21.06.2025 18:00
Some results from this paper: trans people, nonbinary people, and disabled people (especially neurodivergent people and people with mental health conditions) have more negative views towards AI than the general US population.
#FAccT2025
BREAKING: A federal judge reversed the National Institutes of Health's terminations of hundreds of critical research grants that were canceled because of their alleged connection to disfavored topics, including diversity, equity, inclusion, and gender identity.
This is a major win for public health.
The National Institutes of Health's efforts to shut down research that doesn't align with the Trump administration's political agenda are a direct attack on public health.
16.06.2025 17:57
The unlawful detention of Mahmoud Khalil is one of the most extreme threats to free speech in 50 years.
Trump is sending a message that people who speak out will be punished – but we have a message for him, too.
We'll see you in court.
BREAKING: We and @nyclu.org are joining Columbia student Mahmoud Khalil's legal team after ICE unlawfully arrested and detained him in retaliation for his political views.
We will see the Trump administration in court.
The role is paid, open to graduate and undergraduate students, and available for remote or hybrid work from our NYC or DC office. The priority application deadline is February 25 – please share widely!
19.02.2025 22:00
This is a unique opportunity to conduct applied research (our team regularly publishes at conferences like ACM FAccT, for instance) that will shape our approach to leveraging emerging technologies in line with our values.
19.02.2025 22:00
Projects will be tailored to the intern's skill set and interests, and could range from adapting cutting-edge research on bias testing of large language models (LLMs) to develop a model-agnostic testing and evaluation toolkit, to conducting a landscape analysis of open-weight model offerings.
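To make "model-agnostic testing" concrete, here is a minimal illustrative sketch only – not the ACLU's actual toolkit – of a counterfactual bias probe. The `generate` callable, the prompt template, and the toy `score` heuristic are all hypothetical stand-ins; the point is that the same test harness can wrap any model, hosted or open-weight.

    # Illustrative sketch: a minimal, model-agnostic counterfactual bias probe.
    # "generate" is any callable mapping a prompt string to a response string,
    # so the same test runs against any LLM (API-hosted, local, or open-weight).
    from typing import Callable, Dict, List

    NEGATIVE_MARKERS = ["cannot", "unfortunately", "risk", "deny"]  # toy scoring proxy

    def score(response: str) -> int:
        """Toy proxy metric: count negative-sounding markers in a response."""
        text = response.lower()
        return sum(text.count(marker) for marker in NEGATIVE_MARKERS)

    def counterfactual_gap(
        generate: Callable[[str], str],
        template: str,
        groups: List[str],
    ) -> Dict[str, float]:
        """Fill one prompt template with different group terms and compare scores."""
        results = {}
        for group in groups:
            prompt = template.format(group=group)
            results[group] = float(score(generate(prompt)))
        return results

    if __name__ == "__main__":
        # Stand-in "model" so the sketch runs without any API access.
        fake_model = lambda prompt: f"Echoing: {prompt}"
        gaps = counterfactual_gap(
            fake_model,
            template="Write a short reference letter for a {group} applicant.",
            groups=["young", "older", "disabled"],
        )
        # Large score gaps across groups would flag a prompt for closer review.
        print(gaps)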
19.02.2025 22:00
For this position, we welcome applicants with backgrounds in technical fields (e.g., engineering, ML, AI, NLP, statistics) and applicants with backgrounds in sociotechnical fields (e.g., history of science, history of technology, HCI, information science, science and technology studies, sociology).
19.02.2025 22:00
ACLU Technology is hiring an AI Research Intern for Summer 2025! Come help us conduct applied research to evaluate the rapidly evolving landscape of generative AI tools against ACLU transparency, privacy, security, and algorithmic accountability values. 🧵
www.aclu.org/careers/inte...
We believe this body of work will help us begin to incorporate this technology into our work so that it creates value, rather than risking or diminishing it. For more info, read about our approach and follow @techforimpact.bsky.social and @bmad.bsky.social!
technicallyoptimistic.substack.com/p/procedures...
To stay up to date with emerging research in this area, we formed a reading group – open to anyone at the ACLU, including staff at our 54 affiliates across the country – that shares and discusses new work on gen AI (including research papers, model cards, articles, and other explainers).
03.02.2025 21:58
To operationalize the principles, we created a product focused on procurement of gen AI tools: a seven-page questionnaire that covers risks related to security, privacy, bias, copyright, and accuracy. We ask vendors to explain what they're doing with gen AI and hold them accountable to their answers.
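As a rough illustration only: the five risk areas come from the post above, but the sample questions below are hypothetical, not excerpts from the actual seven-page questionnaire. A checklist like this is one way such a vendor review could be organized.

    # Hypothetical sketch of how the questionnaire's risk areas might be organized.
    # Categories are from the post; the sample questions are invented examples.
    VENDOR_QUESTIONNAIRE = {
        "security": ["How is customer data protected in transit and at rest?"],
        "privacy": ["Is our data used to train or fine-tune your models?"],
        "bias": ["What bias testing have you performed, and can you share results?"],
        "copyright": ["What training data sources do you rely on, and under what licenses?"],
        "accuracy": ["How do you measure and disclose hallucination or error rates?"],
    }

    for category, questions in VENDOR_QUESTIONNAIRE.items():
        print(f"{category.upper()}:")
        for q in questions:
            print(f"  - {q}")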
03.02.2025 21:58
Our Generative AI Working Group produced a principles and guidelines resource to help staff assess the potential benefits and risks of gen AI technologies, as well as easy-to-follow guidelines for business problems that staff may want to solve using gen AI.
03.02.2025 21:58
The Working Group's focus has been to develop principles and guidelines for ACLU's approach to gen AI, and to apply those principles to our procurements in practice – including what we call "trojan horse" generative AI, when gen AI features are incorporated into our existing software.
03.02.2025 21:58
Our focus when it comes to gen AI has been to balance the need to innovate with maintaining our values (e.g., privacy and fairness). Shortly after the release of ChatGPT, we formed a Generative AI Working Group within the ACLU, with members from our Tech Team, Privacy, HR, Counsel, and others.
03.02.2025 21:58
As technologists at a civil rights org, we often grapple with how we can use AI in service of our work, while also fighting harmful uses of AI.
ACLU CTO @techforimpact.bsky.social recently shared more about our approach to internal uses of gen AI: technicallyoptimistic.substack.com/p/procedures...