She Came Out of the Bathroom Naked, Employee Says
Bank details, sex and naked people who seem unaware they are being recorded. Behind Meta’s new smart glasses lies a hidden workforce, uneasy about peering into the most intimate parts of other people’...
A striking piece on how Kenyan data workers at Sama are reviewing and annotating private data captured by Meta’s “smart glasses.”
I’ve visited the Nairobi site. Most people in Europe would be stunned by how much of their everyday data is being processed and labeled there.
www.svd.se/a/K8nrV4/met...
04.03.2026 08:05 —
👍 9
🔁 1
💬 0
📌 2
THE SILICON GAZE
After reading a really interesting paper from @oii.ox.ac.uk (link below), I asked ChatGPT (version 5.2) to give a ranking of countries by IQ, 'extrapolating' and 'estimating' where data was not available.
I then asked it to provide an 'approximate' heat map of the estimates.
1/2
28.01.2026 13:29 —
👍 6
🔁 3
💬 2
📌 0
OpenAI’s ChatGPT has a Western bias, study finds
ChatGPT’s viewpoints are shaped by the predominantly Western, white, male developers and platform owners who built it, a study finds.
New coverage from @euronews.com on Prof. @geoplace.bsky.social's research, which finds that answers from OpenAI’s ChatGPT favour wealthy, Western countries and sideline much of the Global South.
Read more:
www.euronews.com/next/2026/01...
22.01.2026 10:26 —
👍 2
🔁 1
💬 0
📌 0
"From these empirics, we argue that bias is (...) an intrinsic feature of generative AI, rooted in historically uneven data ecologies and design choices (...) that accounts for the complex ways in which LLMs privilege certain places while rendering others invisible."
AI scares us because it's based on us
21.01.2026 07:16 —
👍 5
🔁 1
💬 0
📌 0
'The silicon gaze: A typology of biases and inequality in LLMs through the lens of place'.
Develops "a five-part typology of bias (availability, pattern, averaging, trope, and proxy) that accounts for the complex ways in which LLMs privilege certain places while rendering others invisible."
21.01.2026 15:16 —
👍 3
🔁 1
💬 1
📌 0
Researchers Francisco Kerche, Matthew Zook and @geoplace.bsky.social show how bias emerges in ChatGPT outputs. For example, responses to queries rank Ipanema, Leblon and Lagoa as having the happiest people, while Complexo do Alemão, Complexo da Maré and Rio Comprido are rated the unhappiest. 2/4
20.01.2026 15:45 —
👍 1
🔁 1
💬 1
📌 0
The team has created a public website inequalities.ai where anyone can explore how ChatGPT rates countries, cities and neighbourhoods across a range of lifestyle indicators including food, culture and quality of life. 3/4
20.01.2026 10:17 —
👍 10
🔁 3
💬 1
📌 1
Researchers Francisco Kerche, Prof Matthew Zook and @geoplace.bsky.social find that ChatGPT reproduces global biases. For example, responses rank Brighton, London and Bristol as having the sexiest people in the UK whilst Grimsby, Accrington and Barnsley are rated lowest. More: bit.ly/4bF4K9B
20.01.2026 14:46 —
👍 1
🔁 1
💬 0
📌 0
New study from @oii.ox.ac.uk and the University of Kentucky sheds light on how bias manifests in ChatGPT outputs. For example, the London areas of Bloomsbury, Hampstead and the City of London are rated as having the smartest people, with Croydon, Tottenham and Hillingdon rated the lowest. 1/2
20.01.2026 14:46 —
👍 4
🔁 1
💬 1
📌 0
AI 'reveals' the most racist towns in the UK - Burnley tops list
When asked which UK towns and cities are the most racist, ChatGPT claims that Burnley tops the list. This is followed by Bradford, Belfast, Middlesbrough, Barnsley, and Blackburn.
“ChatGPT isn't an accurate representation of the world. It rather just reflects and repeats the enormous biases within its training data” @geoplace.bsky.social @oii.ox.ac.uk speaking to @dailymail.co.uk about his new co-authored study with University of Kentucky. www.dailymail.co.uk/sciencetech/...
20.01.2026 13:51 —
👍 6
🔁 2
💬 0
📌 0
News alert! New study from @oii.ox.ac.uk and the University of Kentucky finds that ChatGPT amplifies global inequalities. Researchers find that large language models reflect historic biases in the data sets they learn from whilst shaping how people see the world. More here: bit.ly/4bF4K9B 1/4
20.01.2026 10:17 —
👍 15
🔁 8
💬 1
📌 0
AI thinks these are the most racist places in the UK
ChatGPT answers often repeat negative stereotypes and reinforce prejudices, study shows
New @oii.ox.ac.uk and University of Kentucky study shows how ChatGPT amplifies global inequalities, with LLMs reflecting historic biases in training data. With thanks to @telegraph.co.uk for sharing the study. @geoplace.bsky.social
www.telegraph.co.uk/business/202...
20.01.2026 12:10 —
👍 7
🔁 2
💬 0
📌 0
Researchers Francisco Kerche, Matt Zook and @geoplace.bsky.social find that responses generated by ChatGPT consistently rate wealthier, Western regions as ‘better’, ‘smarter’, ‘happier’ and ‘more innovative’. 2/4
20.01.2026 10:17 —
👍 2
🔁 1
💬 1
📌 0
Place is not a neutral category in AI systems. Our findings show how historical and institutional patterns of documentation become legible as common sense in LLM outputs.
You can explore all of our data and create your own maps at inequalities.ai
20.01.2026 10:11 —
👍 3
🔁 0
💬 0
📌 0
One recurring issue is the use of proxies: quantifiable stand-ins (rankings, lists, awards) used to answer questions that are not straightforwardly measurable. This tends to advantage already-visible places.
20.01.2026 10:11 —
👍 2
🔁 0
💬 1
📌 0
We used forced-choice prompts to elicit comparative judgements about places. This makes latent preferences and stereotypes easier to detect than in open-ended responses.
20.01.2026 10:11 —
👍 1
🔁 0
💬 1
📌 0
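The forced-choice setup described above can be sketched in a few lines. Note this is an illustrative sketch, not the paper's code: the prompt wording, place names and the `ask_model` stub (which stands in for a real LLM call and simply picks the first-listed place) are all assumptions.

```python
from itertools import combinations

def forced_choice_prompt(attribute: str, a: str, b: str) -> str:
    """Build a prompt that forces the model to choose one of two places."""
    return (
        f"Which place has {attribute}: {a} or {b}? "
        "Answer with exactly one of the two names."
    )

def ask_model(prompt: str) -> str:
    # Stub standing in for a real LLM call; it deterministically picks
    # the first-listed place so the sketch runs without an API key.
    return prompt.split(": ")[1].split(" or ")[0]

# Ask the model to compare every pair of places and tally the "wins".
places = ["Ipanema", "Leblon", "Rio Comprido"]
wins = {p: 0 for p in places}
for a, b in combinations(places, 2):
    answer = ask_model(forced_choice_prompt("the happier people", a, b))
    wins[answer] += 1

print(wins)
```

Forcing a binary choice, rather than allowing an open-ended answer, is what makes latent preferences countable: every response becomes a win for one place and a loss for the other.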
The paper develops a typology of five recurrent biases in LLM place representations: availability, pattern, averaging, trope, and proxy. The maps illustrate how these surface across regions.
20.01.2026 10:11 —
👍 1
🔁 0
💬 1
📌 0
A large share of place-based answers in LLMs appear to be shaped by uneven visibility in the underlying data. This is particularly evident for places that are sparsely documented online.
20.01.2026 10:11 —
👍 1
🔁 0
💬 1
📌 0
We introduce the term “silicon gaze” to describe patterned inequalities in how LLMs represent place. The paper sets out a typology and maps the resulting spatial distributions.
20.01.2026 10:11 —
👍 1
🔁 0
💬 1
📌 0
Our new paper audits ChatGPT’s place-based judgements using 20 million pairwise comparisons. We find systematic geographic biases in how places are described and evaluated.
journals.sagepub.com/doi/10.1177/... (authors: Francisco W. Kerche, Matthew Zook, Mark Graham)
20.01.2026 10:11 —
👍 14
🔁 10
💬 1
📌 2
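Turning millions of pairwise answers into per-place scores can be done with a simple win-rate aggregation. A hedged sketch follows; the sample data is invented and the actual aggregation used in the paper may well differ:

```python
from collections import defaultdict

def win_rates(comparisons):
    """comparisons: iterable of (winner, loser) pairs from forced-choice
    answers. Returns each place's share of the comparisons it appeared in
    that it won."""
    wins = defaultdict(int)
    total = defaultdict(int)
    for winner, loser in comparisons:
        wins[winner] += 1
        total[winner] += 1
        total[loser] += 1
    return {place: wins[place] / total[place] for place in total}

# Toy data: each tuple records which place the model picked over the other.
sample = [("London", "Grimsby"), ("Brighton", "Grimsby"), ("London", "Brighton")]
rates = win_rates(sample)
ranking = sorted(rates, key=rates.get, reverse=True)
print(ranking)
```

At the scale of 20 million comparisons, systematic patterns (e.g. wealthier, Western places winning far more often) become visible in these rates even when any single model answer looks innocuous.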
Artificial intelligence (AI) and employment
Artificial intelligence (AI) is becoming more common in UK workplaces. How is it being used, and what are the impacts on job opportunities and working conditions?
The OII's Prof. @geoplace.bsky.social contributed to POST UK's report on AI and employment, which considers the factors driving adoption and the issues that might make this challenging.
Read the full report here: post.parliament.uk/research-bri...
08.01.2026 11:51 —
👍 2
🔁 1
💬 0
📌 0
Fairwork’s AI Supply Chain Assessment: Appen report is now LIVE.
- 15 changes were implemented by Appen during the assessment period.
- The report also highlights areas for further progress, including pay, worker protections, and transparency.
Read the full report here: fair.work/en/fw/public...
16.12.2025 10:25 —
👍 2
🔁 1
💬 0
📌 0
If AI is going to be fair, its supply chains have to be too. That’s why we’ve launched Fairwork Certification, working with lead firms to push higher standards all the way down their chains. Details here:
fair.work/wp-content/u...
10.12.2025 10:59 —
👍 0
🔁 0
💬 0
📌 0
A new @towardsfairwork.bsky.social assessment of Sama is out.
It looks at the people doing the invisible data work that keeps AI running for companies in sectors from driverless cars to online retail.
Read the scorecard and our report here:
🔗 fair.work/en/fw/public...
10.12.2025 10:59 —
👍 3
🔁 2
💬 1
📌 0
"The working conditions are brutal": on the hidden labourers behind ChatGPT
Behind every AI image lies manual labour: people in Kenya, India and the Philippines toil for hours on starvation wages to make machines smarter. In conversation, Mark Graham reveals the...
AI sounds like the future, but it runs on invisible human labour. I spoke with @geoplace.bsky.social for @freitag.de about his book "Feeding the Machine" and about what AI really costs. Recommended reading for anyone who wants to look behind the scenes of the "AI companies"
10.11.2025 08:39 —
👍 123
🔁 74
💬 5
📌 6
I’ve got a new chapter out with Adam Badger, Alessio Bertolini, Fabian Ferrari & Funda Ustek Spilda in the forthcoming Handbook of Labour Geography. In it, we unpack the Fairwork action-research method.
Read: www.elgaronline.com/edcollchap/b...
30.10.2025 15:32 —
👍 1
🔁 0
💬 0
📌 0