I hope this primer proves useful to you and within your communities. Please consider sharing it to help raise awareness in connection with Data Privacy Day.
28.01.2026 15:24 — 👍 0 🔁 0 💬 0 📌 0
@silentrecord.bsky.social
Métis | Communication & Culture PhD Candidate @yorkuniversity | Infoscape Lab | Institute for Research on Digital Literacies | Media, Sound, Surveillance. The Politics of Media Scarcity (2024) out now.
The updated primer features an expanded discussion of gaps in privacy protections, as well as new recommendations aimed at engaging civil society organizations in policy-related action.
28.01.2026 15:24 — 👍 0 🔁 0 💬 1 📌 0
Some of you may have participated in an earlier event in October as part of Media Literacy Week, organized by @mediasmarts.bsky.social, where you shared valuable feedback with the research team.
28.01.2026 15:24 — 👍 0 🔁 0 💬 1 📌 0
drive.google.com/file/d/1NVPJ...
This resource offers an overview of a rapidly evolving area of technology and highlights key privacy considerations and rights protections in the Canadian context.
Today marks Data Privacy Day, an international initiative dedicated to promoting awareness and best practices around privacy protection. In recognition of this occasion, I'm pleased to share an updated version of the digital literacy primer, “Understanding Machine Listening and Voice Data Privacy”.
28.01.2026 15:24 — 👍 0 🔁 0 💬 1 📌 1
For those interested in this issue, my team and I have prepared a public-facing primer on voice data privacy, which explores these risks and the current gaps in Canadian privacy law.
drive.google.com/file/d/1NVPJ...
At a minimum, a meaningful step forward would be clear and enforceable limits on how voice data is collected, used, and disclosed, including prohibitions on using police call data to train commercial AI systems beyond narrowly defined purposes.
16.01.2026 18:55 — 👍 0 🔁 0 💬 1 📌 0
In the wrong contexts, this data could be used to profile people based on their credibility, race, or other variables. Given this sensitivity, commercial uses of voice data likely require explicit consent under PIPEDA.
16.01.2026 18:55 — 👍 2 🔁 0 💬 1 📌 0
Voice data is biometric information: it is tied to our bodies and is individually identifying. This raises the risk of re-identification considerably.
Voice data also reveals information about emotional state, health conditions, mental health indicators, language background, and race.
Data derived from 811 or 911 calls would be highly valuable to insurance companies and other parties, for example, and could be used to profile individuals or communities.
4) Voice data is highly sensitive biometric data
Yet aggregate and anonymized data still carry real risks of re-identification.
In Canada, under PIPEDA, commercial uses of identifiable personal data require meaningful consent because disclosure to third parties creates downstream risks.
This would position an automated decision-making system at the very front end of emergency response.
3) Serious consent problems
Hyper’s terms and conditions and privacy policy allow broad secondary uses of caller data, including training AI models, developing new products, and creating anonymized datasets.
Data monitoring systems introduced for narrow purposes are routinely expanded over time. This well-documented process is known as function creep.
A plausible future scenario is the use of Hyper as a triage system for 911 calls.
This could lead to under-reporting of non-emergency incidents, and potentially even emergency situations, due to privacy and surveillance fears.
2) Function creep from non-emergency to emergency calls
Toronto Police have stated that Hyper will only be used for 811 calls. For now.
That reluctance is likely to be far stronger when contacting the police.
For people concerned about data surveillance, particularly members of racialized communities, the idea that a police call is being analyzed and stored by AI may discourage contact altogether.
1) Risk of damaged trust and under-reporting
The core question is not only whether the system works as advertised, but whether people want to use it at all. Many people already avoid automated customer service systems out of frustration.
In the meantime, I shared several privacy and surveillance concerns in an interview with Toronto Today: www.torontotoday.ca/local/crime-...
Here are four points that deserve closer attention:
16.01.2026 18:55 — 👍 1 🔁 0 💬 1 📌 0
Hyper is already being used by police departments across Canada and the United States.
Imagine interacting with Alexa or Siri, except this time it is the police on the other end of the line.
A privacy impact assessment hasn’t been released.
If you live in North America, the next time you call the police, your call may be answered by AI.
This week, the Toronto Police Service announced it will begin using a voice AI system developed by Hyper for 811 non-emergency calls.
🗓️ Wednesday, Oct 29 | 12–1pm EDT
💻 Zoom registration: yorku.zoom.us/meeting/regi...
Join us for a conversation about how smart speakers, voice assistants, and other listening apps shape our privacy. We're also looking for feedback: how can this primer better address the privacy questions that matter most to you and your community?
#MediaLiteracyWeek
48 hours until our event, Understanding Machine Listening and Voice Data Privacy, as part of 2025 Media Literacy Week!
We’re excited to share a new resource on digital literacy and machine listening. Take a look at the draft primer: drive.google.com/file/d/15lJw...
@mediasmarts.bsky.social
The event is co-hosted by the Infoscape Research Lab, the Institute for Research on Digital Literacies, and @mediasmarts.bsky.social.
@alexborko.bsky.social @alisonharvey.aoir.social.ap.brid.gy
Event: Understanding Machine Listening and Voice Data Privacy
📅 Wednesday, October 29, 12-1pm EDT
💻 On Zoom. Register here: yorku.zoom.us/meeting/regi...
We'll be sharing a new digital literacy resource, followed by an interactive session exploring “machine listening” and why it matters for your privacy. This event is part of ongoing knowledge mobilization supported by the Office of the Privacy Commissioner of Canada.
Details below:
Ever wonder if Alexa or Siri hear more than they should?
Come to an online interactive session covering this question and more on Oct. 29th, as part of Media Literacy Week (Oct. 27-31).
Wow. I see it protects patients receiving therapy, but I wonder whether it applies to crisis-line help seekers.
09.08.2025 18:43 — 👍 2 🔁 1 💬 0 📌 0
Thank you! Looking forward to reading it.
09.08.2025 18:40 — 👍 1 🔁 0 💬 0 📌 0
Awesome!
08.08.2025 16:54 — 👍 2 🔁 0 💬 0 📌 0
Thank you! I will stay tuned for that piece.
08.08.2025 16:14 — 👍 2 🔁 0 💬 1 📌 0