Centre for Technomoral Futures


@technomoralfutures.bsky.social

Part of the University of Edinburgh’s Futures Institute, our work aims to unify technical and moral knowledge to serve the goals of sustainable, just & ethical innovation. Find us here: https://www.technomoralfutures.uk/

1,862 Followers 118 Following 158 Posts Joined Aug 2024
1 day ago
Celebrating Five Years of Just and Ethical Innovation (YouTube video by Centre for Technomoral Futures)

The Centre for Technomoral Futures is about bringing together the technical, moral, and social expertise that's needed to secure the future of human flourishing.

Watch to learn about our first 5 years, and let us know where you think the next few years might take us! 👇
youtube.com/shorts/GUU70--ftkg

1 day ago
BRAID UK - Stakeholder Forum

Call for EOIs: Join @braiduk.bsky.social and Ada's Stakeholder Forum to help shape the direction of Responsible AI in the UK.

Forum members will inform BRAID's activities and priority areas, and explore challenges and opportunities for Responsible AI.

braiduk.org/braid-stakeh...

📅18 March 2026

2 days ago
ERC PhD studentship: The Ethics and Philosophy of Science of Machine Learning. This is a four-year ERC-funded PhD studentship (starting in September 2026).

The PhD will research how concepts in ethics and political philosophy might offer new perspectives in the philosophy of science on idealization, representation, and model use.

Find out more & apply by 16 March 👇
edin.ac/46cHXPd

@edfuturesinstitute.bsky.social @schoolofppls.bsky.social

2 days ago
Headshot of Dr Sullivan. She has blue eyes, and a short brown bob haircut with a fringe. Text on image reads: Dr Emily Sullivan, Co-Director of the CTMF & Senior Lecturer in the Philosophy of Technology. Text reads: ERC PhD studentship: The Ethics and Philosophy of Science of Machine Learning. Applications are open for this funded PhD Studentship, to begin in September 2026. Application Deadline: 16 March 2026, More info: edin.ac/46cHXPd

Applications close on Monday (16 March) for this fully-funded PhD studentship in the Ethics of AI and Philosophy of Science!

Supervised by our CTMF Co-Director Dr Emily Sullivan, this studentship is part of the @erc.europa.eu project TOY: Machine learning in Science and Society: A dangerous toy?

2 days ago

Dr Suzanne Black is joining us to give a paper from her upcoming book https://lsupress.org/9780807186428/between-novel-and-network/ on March 25th. Join us for "Preserving digital cultural heritage: Archive of Our Own and issues of structural bias?" Register here: https://edin.ac/3MgQsSA #EdCDCS

2 days ago
Trash In, Trash Out? Synthetic Data and the Politics of Degenerative AI. This talk discusses how phenomena such as model collapse help us make sense of the emergent risks of synthetic data.

✨ How can model collapse help us make sense of the emergent risks of synthetic data? And how might it perpetuate notions of real data as 'gold standard'? @bnjacobsen.bsky.social will discuss this in his talk on March 18th - sign up ⬇️

www.eventbrite.com/e/trash-in-t...

3 days ago

📸 photos 1 & 2 by @chrisdonia.bsky.social!

3 days ago

@edfuturesinstitute.bsky.social @shannonvallor.bsky.social @ginahelfrich.bsky.social @ftollon.bsky.social

3 days ago
A full audience in a lecture theatre, listening to Professor Vallor's Inaugural Lecture. Photo by Chris Scott. Professor Shannon Vallor is standing behind a lectern in front of a full audience, mid-presentation. Photo by Chris Scott. Members of the CTMF community presenting their research posters and networking. Shannon Vallor introducing Meenakshi Mani's presentation at the CTMF PhD Showcase. Meenakshi's slides read: The trifecta of colonial "isms": AI, Big Tech & Indian education.

📢 We've published our latest Annual Report! 📢

In 2024-2025, we celebrated 5 years of the Centre, 4 of our PhD Fellows submitted their theses, we were joined by @zeerak.bsky.social and 4 new PhD Fellows, and our public events reached nearly 1000 attendees!

Read about our year 👇
edin.ac/3N0WLKm

1 week ago

During this event, @amoorelouise.bsky.social and @sophiagoodfriend.bsky.social will each reflect on recent scholarship examining different facets of algorithmic life, and on how they met the challenges of researching algorithmic lives (and deaths).

1 week ago

Algorithmic technologies, including AI, are transforming our lives – for better or for worse. How can social scientists adequately investigate their transformative effects? What does this mean for our political and social theoretical concepts and understandings of human society?

1 week ago
Researching Algorithmic Life – A Conversation on Method and Substance, with Professor Louise Amoore FBA and Dr Sophia Goodfriend

Last few tickets remaining for 'Researching Algorithmic Life – A Conversation on Method and Substance' with @amoorelouise.bsky.social and @sophiagoodfriend.bsky.social!

🗓️ Wednesday 18 March 14.00 – 16.00
📍 Old College, Edinburgh
🎟️ edin.ac/3Njvgf1

@leverhulmecal.bsky.social

1 week ago

This PhD studentship will look at how concepts in ethics and political philosophy might offer new perspectives in the philosophy of science on idealization, representation, and model use.

Find out more & apply by 16 March 👇
edin.ac/46cHXPd

2 weeks ago
Background shows someone using a digital tablet for stock market analysis.

Text reads:
Scams, bubbles and the financial technologies of data societies

The big shed economy
Hyperscale data centre investment 

(or what happens when the world’s dullest industry gets very exciting)

Financial Wellness and the fight against the online scam economy

Controversies in the Data Society 2026 Seminar Series

Scams, bubbles and the financial technologies of data societies
06 March 2026 / 3:10pm - 5:30pm
With Professor Liz McFall and Dr Lana Swartz

Find out more and register your place:
edin.ac/4af78BQ

1 week ago
Text reads:
Recent developments and hype around technology grounded in natural language – in particular, large language models – have generated considerable controversy about whether they can be ‘tamed’ sufficiently by engineering to suppress the way they seem to encode, and seem likely to perpetuate, social and economic inequalities when used in the wild.

This week’s speakers are experts in natural language technology, and will explore why this may be impossible, and why we instead need to develop social processes to manage these challenges.

Can we tame language technologies? How to think about Large Language Models in the wild
13 March 2026
3.10pm - 5.30pm

Book your space:
edin.ac/4cvZwNS

2 weeks ago

Our Critical Data Studies group at @edfuturesinstitute.bsky.social has a few talks this semester:

March 27th: Jonathan Gray: Public Data Cultures

April 1st: Amelia Acker: Archiving Machines: A Material History of Data Storage

April 29th: Sophie Bishop: Influencer Creep

See eventbrites below

2 weeks ago

This event was run in collaboration with @braiduk.bsky.social & @edfuturesinstitute.bsky.social

📸 Photos by @chrisdonia.bsky.social

2 weeks ago
Technomoral Conversations: What’s the Story with AI? AI Narratives and Counter-Narratives — Centre for Technomoral Futures. During this fireside chat, we will hear insights from experts across academia and industry on the dominant narratives surrounding AI, and what alternative stories can and are being told about AI and i...

In case you missed it, the recording of 'Technomoral Conversations: What's the Story with AI? AI Narratives and Counter-Narratives' with @abeba.bsky.social @johnthornhill.bsky.social @amoorelouise.bsky.social & @alex-taylor.bsky.social is available now!

Watch the event recording 👇
edin.ac/4q8TXsZ

2 weeks ago
John Thornhill, Abeba Birhane and Steph Wright seated on stage during our Technomoral Conversations event. Abeba is speaking.

"AI systems still perpetuate societal norms and stereotypes and misogyny and racism and harms. But because the 'AI as innovation' or 'AI as societal advancement' narrative has taken over, the other side of the debate or the other side of the argument rarely gets any attention." @abeba.bsky.social

3 weeks ago

Are you making waves in data research, or do you know someone who is?
Submit a nomination for the CDCS Prizes https://www.cdcs.ed.ac.uk/cdcs-prizes #EdCDCS

3 weeks ago
Scotcast - AI Future. Martin Geissler hears how AI can be a force for good, but only if people and politicians curtail the power of Big Tech companies.

Was on the telly tonight - it’s available on BBC iPlayer. Almost fell off the couch hearing the intro describe me as a ‘cautious AI optimist’ - it’s a telling misframing.

If you reject the narrative that we must passively accept the worst things about AI, you must believe AI is a force for good!

3 weeks ago
Scotcast - AI philosopher: Why we're not doomed - BBC Sounds. A call to arms from an ethics expert to make big tech serve people, not profit.

Nice that our Uni news update included this today. Looking forward to listening:

AI philosopher: Why we're not doomed w/ @shannonvallor.bsky.social
www.bbc.co.uk/sounds/play/...

3 weeks ago
ERC PhD studentship: The Ethics and Philosophy of Science of Machine Learning. This is a four-year ERC-funded PhD studentship (starting in September 2026).

The PhD will research how concepts in ethics and political philosophy might offer new perspectives in the philosophy of science on idealization, representation, and model use.

Find out more & apply 👇
edin.ac/46cHXPd

@shannonvallor.bsky.social @edfuturesinstitute.bsky.social

3 weeks ago
Text reads: ERC PhD studentship: The Ethics and Philosophy of Science of Machine Learning. Applications are open for this funded PhD Studentship, to begin in September 2026. Application Deadline: 16 March 2026, More info: edin.ac/46cHXPd Headshot of Dr Sullivan. She has blue eyes, and a short brown bob haircut with a fringe. Text on image reads: Dr Emily Sullivan, Co-Director of the CTMF & Senior Lecturer in the Philosophy of Technology.

@schoolofppls.bsky.social are now accepting applications for a fully funded PhD studentship in the Ethics of AI and Philosophy of Science, supervised by CTMF Co-Director Dr Emily Sullivan!

The PhD is part of the @erc.europa.eu project TOY: Machine learning in Science and Society: A dangerous toy?

1 month ago

We have a last minute addition to our panel! @elmosmoe.bsky.social, co-founder and managing director of @our-ai-collective.bsky.social, will be joining us this evening!

Unfortunately, Louise Amoore will no longer be able to join us. We look forward to welcoming her to Edinburgh for a future event!

1 month ago
Technomoral Conversation: What’s the Story with AI? Exploring AI Narratives. Hear insights on the dominant narratives around AI and what alternative stories can and are being told about AI and its place in our futures.

Register for our Technomoral Conversation at the link below 👇

🗓️ Wednesday 11 February 18.00-19.30
📍 Edinburgh Futures Institute & online
🎟️ edin.ac/3MZEm0a

1 month ago

During this free public event, our speakers will discuss the dominant narratives surrounding AI and what alternative stories can and are being told about AI and its place in our futures.

Run in collaboration with @braiduk.bsky.social & @edfuturesinstitute.bsky.social!

1 month ago
Blue to green gradient graphic with headshots of the speakers for the Technomoral Conversations event, and text reading: "Technomoral Conversations: What's the Story with AI? Exploring AI Narratives. Join us on 11 February at 6pm in Edinburgh & online, where we will hear from Alex Taylor (University of Edinburgh), Abeba Birhane (Trinity College Dublin), Louise Amoore (Durham University) and John Thornhill (Financial Times)." There are logos for EFI, CTMF and BRAID, the co-organisers of the event.

Last chance to register to attend our Technomoral Conversations event, 'What's the Story with AI? Exploring AI Narratives and Counter-Narratives'!

Tomorrow night we'll hear from @abeba.bsky.social @johnthornhill.bsky.social @amoorelouise.bsky.social and @alex-taylor.bsky.social!

🎟️ edin.ac/3MZEm0a

1 month ago
Text reads: 
Rethinking sustainable AI
Labour and environmental perspectives from the Global South
Beyond the label: repairing data work, rethinking AI
Despite recent advances in AI’s computational capabilities, data work (the human labour required for training, fine-tuning, and evaluating AI systems) remains indispensable to AI production. Yet data work is constituted as a routine and repetitive activity, with little scope for applying expertise and skill, and is often conducted under unfair conditions of work.

Sustainability kaleidoscope to shield predatory corporations fostering AI green economy 
Like any mainstream concept and buzzword, sustainability can be used to cover a myriad of meanings, some of them tailored to obscure environmentally damaging activities sponsored by private corporations and companies worldwide: a phenomenon broadly known as greenwashing.

CONTROVERSIES in the DATA SOCIETY
Seminar Series

Rethinking sustainable AI
Labour and environmental perspectives from the Global South
Friday, Feb 13
3:10pm - 5:30pm

Srravya Chandhiramowuli and Dr Beatrice Bonami

Booking:
edin.ac/4tejumk

1 month ago
ERC PhD studentship: The Ethics and Philosophy of Science of Machine Learning. This is a four-year ERC-funded PhD studentship (starting in September 2026).

Good news - we have a new ERC-funded 4-year PhD Studentship at the University of Edinburgh, based in Philosophy and our Centre @technomoralfutures.bsky.social, supervised by Dr Emily Sullivan; the project applies philosophy of science to assessing ML's epistemic & social value. Apply by 16 March!
