
Stephen Neville

@silentrecord.bsky.social

Métis | Communication & Culture PhD Candidate @yorkuniversity | Infoscape Lab | Institute for Research on Digital Literacies | Media, Sound, Surveillance | The Politics of Media Scarcity (2024) out now.

1,062 Followers  |  1,317 Following  |  77 Posts  |  Joined: 12.11.2024

Latest posts by silentrecord.bsky.social on Bluesky

Wow. I see it protects patients receiving therapy, but I wonder whether, and how, it applies to crisis line help seekers.

09.08.2025 18:43 — 👍 2    🔁 1    💬 0    📌 0

Thank you! Looking forward to reading it.

09.08.2025 18:40 — 👍 1    🔁 0    💬 0    📌 0

Awesome!

08.08.2025 16:54 — 👍 2    🔁 0    💬 0    📌 0

Thank you! I will stay tuned for that piece.

08.08.2025 16:14 — 👍 2    🔁 0    💬 1    📌 0

Thank you for bringing this to my attention! My intention was to use this piece as a conversation starter between AI enthusiasts and those of us who are skeptical and deeply concerned about its implementation in this context. I definitely should have provided this essential context.

08.08.2025 16:05 — 👍 1    🔁 0    💬 1    📌 0
Preview
When Help Isn’t Fully Human: The Problem of Generative AI in Crisis Support
Generative AI is becoming more ubiquitous across industries and services, including mental health and crisis counseling. Here, 2024 Data Fluencies Grantee Stephen Neville reflects on the proliferation...

If you're working or advocating in this space, I’d really like to hear what you think.

Read the full article here: just-tech.ssrc.org/articles/the...

07.08.2025 19:14 — 👍 2    🔁 1    💬 0    📌 0

We’ve learned (the hard way) what happens when "innovation" outruns ethical responsibility. On that basis, I’m calling for measured AI use and transparent policies, so that technological change doesn't come at the cost of trustworthy care.

07.08.2025 19:14 — 👍 3    🔁 3    💬 1    📌 0

What are the ethical problems in using de-identified data from vulnerable people to train commercial systems?

How do informal uses of tools like ChatGPT by crisis workers alter issues of trust and privacy?

07.08.2025 19:14 — 👍 2    🔁 1    💬 1    📌 0

I'm raising some urgent questions for frontline workers, orgs, and community advocates:
What role should AI play in striking a balance between the need for scalable crisis support and the ethical imperative to protect public trust, consent, and human connection?

07.08.2025 19:14 — 👍 1    🔁 1    💬 1    📌 0

People in crisis are reaching out for human support, and increasingly the response they get is shaped by generative AI.

I wrote about how gen AI is reshaping crisis support, and why that should concern all of us advocating for responsible and trustworthy crisis counselling and peer support.

07.08.2025 19:14 — 👍 5    🔁 1    💬 2    📌 1
Preview
White House proposes axing 988 suicide hotline services for LGBTQ youth
The specialized suicide-prevention counseling service for LGBTQ youth and young adults received more than 1.3 million contacts since it started in 2022, advocates said.

This is abhorrent. LGBTQ+ youth are disproportionately in need of crisis counselling support.

www.nbcnews.com/nbc-out/out-...

11.06.2025 01:43 — 👍 0    🔁 0    💬 0    📌 0

people always dismiss the data privacy concerns—like hmm don’t collect unnecessary data on sensitive populations, because it can be used against them

and now the city of San Antonio is handing over data on every person that came through our migrant center because they need FEMA reimbursements

27.03.2025 15:40 — 👍 93    🔁 23    💬 3    📌 1

Absolutely devastating news.

23.03.2025 02:27 — 👍 2    🔁 1    💬 0    📌 0

The Rolling Stoners

22.03.2025 13:04 — 👍 1    🔁 0    💬 0    📌 0

“The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn.” -Alvin Toffler

I.e. The illiterate of the 21st century will be those using generative AI when they should be doing some thinking for themselves.

20.03.2025 15:34 — 👍 10    🔁 3    💬 0    📌 0
Preview
Everything you say to your Echo will be sent to Amazon starting on March 28
Amazon is killing a privacy feature to bolster Alexa+, the new subscription assistant.

Get rid of your Amazon Echo
arstechnica.com/gadgets/2025...

15.03.2025 01:50 — 👍 100    🔁 71    💬 12    📌 25

I've been thinking about how to apply this to undergrad education in particular and am hoping for people's insights.

Strangely, I feel like community agreements are actually a great way of starting discussion.

12.02.2025 01:34 — 👍 0    🔁 0    💬 0    📌 0

Yes, exactly! I've been part of classes and summer school programs that integrate these collaborative community agreement documents. I've found it's a great way to break the ice with folks to illustrate some shared values and to demonstrate that discussions will be held in a safe environment.

12.02.2025 01:34 — 👍 0    🔁 0    💬 1    📌 0

I'm continuing to try to raise awareness of the exploitation of vulnerable people who use and answer #CrisisLines.
To have this work amplified by someone so dedicated to protecting the vulnerable and so expert in the subject matter feels like such an honor. Grateful! We can end exploitation #CrisisLineWorkers #organize

11.02.2025 03:03 — 👍 1    🔁 1    💬 0    📌 0

We owe a lot to citizen activism efforts like this in demanding higher ethical standards for academic research in the nonprofit tech sector.

07.02.2025 21:35 — 👍 0    🔁 0    💬 0    📌 0

Lack of Safeguards for Vulnerable Populations: The research involved minors and individuals in crisis, yet the paper did not meaningfully address ethical concerns specific to these groups, nor the regulatory requirements for research on children.

07.02.2025 21:35 — 👍 0    🔁 0    💬 1    📌 0

Transparency Issues: Several authors of the original research on CTL’s data-sharing project had undisclosed ties to CTL, including advisory roles. The paper also made no mention of Loris.ai; however, the conflicts of interest section of the paper has since been updated.

07.02.2025 21:35 — 👍 0    🔁 0    💬 1    📌 0

Commercialization of Crisis Data: Despite previous assurances from CTL that conversation data would never be used for commercial purposes, CTL later licensed it to Loris.ai, a for-profit company that used the data to create AI customer service models.

07.02.2025 21:35 — 👍 0    🔁 0    💬 1    📌 0

Here are some major takeaways from Reierson's commentary:

Informed Consent Concerns: CTL claimed that texters consented to having their data used via its terms of service, but consent for research requires something more explicit and meaningful, especially when vulnerable populations are involved.

07.02.2025 21:35 — 👍 0    🔁 0    💬 1    📌 0
Preview
Protecting User Privacy and Rights in Academic Data-Sharing Partnerships: Principles From a Pilot Program at Crisis Text Line
Data sharing between technology companies and academic health researchers has multiple health care, scientific, social, and business benefits. Many companies remain wary about such sharing because of ...

The commentary responds to an earlier publication in JMIR that detailed CTL’s data-sharing partnership with academic researchers. In light of the issues presented, it is ironic that the original article is entitled “Protecting User Privacy and Rights…”

www.jmir.org/2019/1/e11507/

07.02.2025 21:35 — 👍 0    🔁 0    💬 1    📌 0
Preview
Commentary on “Protecting User Privacy and Rights in Academic Data-Sharing Partnerships: Principles From a Pilot Program at Crisis Text Line”

This article is a very worthwhile read and shows the ongoing activism efforts of a former CTL volunteer (@holdspacefree.bsky.social) in demanding accountability from CTL and the academic researchers behind its data enclave pilot project.
www.jmir.org/2024/1/e42144

07.02.2025 21:35 — 👍 0    🔁 0    💬 1    📌 0

It seems like a distant memory now, but in 2022 Politico broke the news of the data-sharing partnership between Crisis Text Line and Loris.ai, a for-profit subsidiary that used crisis conversation data to create AI models for customer service support. The story is ongoing.

www.politico.com/news/2022/01...

07.02.2025 21:35 — 👍 0    🔁 0    💬 1    📌 1
Angelo Badalamenti explains how he wrote Laura Palmer's Theme (YouTube video)

Badalamenti on how he and Lynch wrote the Twin Peaks theme.

16.01.2025 18:34 — 👍 8    🔁 4    💬 1    📌 0

I'm wondering if folks would share their approach to community agreements in the post-secondary classroom?

Do you start with a blank template or provide some anchoring principles/values to add to, revise, etc.?

16.01.2025 15:54 — 👍 0    🔁 0    💬 1    📌 1
A big stack of books related to Métis culture and history.

In the weeds of writing today with a great big stack of books on Métis culture and history.

15.01.2025 20:00 — 👍 2    🔁 0    💬 0    📌 0
