@maddipow.bsky.social
Associate Professor. Keen bean. Writer. Pedagogy, psychology, reflexivity, open science, iced lattes. ABSENT MINDS coming May 2026. She/her
"the primary framing of open science as the pursuit of reproducibility and objectivity risks promoting positivism creep in the social sciences and humanities"
05.12.2025 16:19
New preprint with Thomas Graves and @annayahprosser.bsky.social: "Getting Creeped Out? Open Science, Qualitative Methods, and the Dangers of Positivism Creep".
osf.io/preprints/so...
It has been a while since @flavioazevedo.bsky.social asked me to take over the Reversals project at @forrt.bsky.social...
It has since evolved beyond my wildest dreams, mostly thanks to @aufdroeseler.bsky.social & @lukaswallrich.bsky.social.
We are very proud of the Replication Hub & Database ♥️
"We publish imagined research abstracts as works of fiction firstly because writing for enjoyment is a good thing to encourage...writing imaginatively is a good way to reshape our relationship with writing into something creative and enjoyable"
02.12.2025 15:59
The head of open research at Taylor & Francis was supportive when we had a chat and agreed to create some guidelines for living SRs in the first instance, so happy to connect you up if that's useful
01.12.2025 21:09
I'd love to hear about how you get on with piloting this!
01.12.2025 20:12
Systematic reviews often guide policy and theory, but they can quickly become outdated. Almost 10% of systematic reviews are out of date even *before they are published*.
Living systematic reviews (LSRs) continuously integrate new evidence and therefore offer a solution to this problem.
2/8
I am so happy to see this. After the metascience conference in July I spoke with a few publishers to try and propose something similar to this (registered reports meets "living" systematic review) but the idea never really got off the ground. This paper is so important and I LOVE it
01.12.2025 20:11
P.S. you can preorder it here: www.waterstones.com/book/absent-...
26.11.2025 18:15
I was proofreading my book on the train and the man next to me audibly complained when I moved the page before he'd finished reading over my shoulder. Fair play
26.11.2025 18:14
Wow!
26.11.2025 09:29
HYBRID EVENT: 'Introduction to Quantitative Bias' with Rachel Hughes on Monday 1st December at 10am-12pm UK time.
For those near Leeds, bring your laptop and enjoy this as an in-person session!
Sign up before 9am on Thursday 27th via: www.eventbrite.co.uk/e/introducti...
Science is a collective effort, but @simine.com is a singular force. She is an exemplary model for all of us to follow in her commitment and action to improving science, on every dimension.
She is so deserving of the award. The only uncertainty is whether the award deserves her!
This paper provides guidance and tools for conducting open and reproducible systematic reviews in psychology. It emphasizes the importance of systematic reviews for evidence-based decision-making and the growing adoption of open science practices. Open science enhances transparency and reproducibility and minimizes bias in systematic reviews through the sharing of data, materials, and code. It also fosters collaboration and enables the involvement of non-academic stakeholders. The paper is designed for beginners, offering accessible guidance for navigating the many standards and resources that may not obviously align with specific areas of psychology. It covers systematic review conduct standards, pre-registration, registered reports, reporting standards, and open data, materials, and code. The paper concludes with a glimpse of recent innovations such as Community Augmented Meta-Analysis and independent reproducibility checks.
There is no reason why systematic reviews can't be open. The data used for synthesis is *already* open, and there are many excellent open-source tools that make it easy to share analysis scripts.
Here's a nice guide to performing open systematic reviews: doi.org/10.1525/coll...
This is such wonderful news. Simine has done so much to advocate for robust, fair, transparent practices throughout psychology and beyond.
24.11.2025 12:41
Patton, C. (2024). Replicability and the humanities: the problem with universal measures of research quality. Research Evaluation, 34. https://doi.org/10.1093/reseval/rvaf052
A new article by Chloe Patton in #ResearchEvaluation shows how debates about #OpenScience often slip into absurdity, like demanding #replication from the #Humanities. You can't replicate history, culture, or interpretation the way you replicate a physics experiment: doi.org/10.1093/rese...
23.11.2025 08:43
"Prohibited words and concepts: Equity, diversity, & inclusion; anti-racism; Bias; Critical race theory; implicit bias; oppression; intersectionality; prohibited discriminatory practices; racial privilege; promoting stereotypes based on personal identity characteristics."
This week, I withdrew from a speaking engagement at a public university because they sent me a list of prohibited "words & concepts." I will not humor this censorship. It does a disservice to the stories I'm discussing & the audience, who deserve unfettered access to information & conversation.
20.11.2025 22:20
Love this ⬇️
18.11.2025 06:46
'Me and AI'…
We want to hear your views on AI's place in Psychology, and your own journey in Psychology.
www.bps.org.uk/psychologist...
POV: you are a young woman celebrating a recent academic success
17.11.2025 19:20
📣 Digital Research Community!
The new UK Adolescent Health Study will follow 100k young people (8–18 yrs) for 10+ years. Please share what digital technology measures you think it should include.
Please complete this survey (by 24th November 2025 @ 9AM): cambridge.eu.qualtrics.com/jfe/form/SV_...
Transparent and comprehensive statistical reporting is critical for ensuring the credibility, reproducibility, and interpretability of psychological research. This paper offers a structured set of guidelines for reporting statistical analyses in quantitative psychology, emphasizing clarity at both the planning and results stages. Drawing on established recommendations and emerging best practices, we outline key decisions related to hypothesis formulation, sample size justification, preregistration, outlier and missing data handling, statistical model specification, and the interpretation of inferential outcomes. We address considerations across frequentist and Bayesian frameworks and fixed as well as sequential research designs, including guidance on effect size reporting, equivalence testing, and the appropriate treatment of null results. To facilitate implementation of these recommendations, we provide the Transparent Statistical Reporting in Psychology (TSRP) Checklist that researchers can use to systematically evaluate and improve their statistical reporting practices (https://osf.io/t2zpq/). In addition, we provide a curated list of freely available tools, packages, and functions that researchers can use to implement transparent reporting practices in their own analyses to bridge the gap between theory and practice. To illustrate the practical application of these principles, we provide a side-by-side comparison of insufficient versus best-practice reporting using a hypothetical cognitive psychology study. By adopting transparent reporting standards, researchers can improve the robustness of individual studies and facilitate cumulative scientific progress through more reliable meta-analyses and research syntheses.
Our paper on improving statistical reporting in psychology is now online.
As part of this paper, we also created the Transparent Statistical Reporting in Psychology (TSRP) checklist, which researchers can use to improve their statistical reporting practices.
www.nature.com/articles/s44...
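Not from the paper itself, but as a rough sketch of the kind of reporting the TSRP guidance points toward, here is a minimal Python example. The data are simulated and the two-group design, sample sizes, and variable names are illustrative assumptions, not taken from the paper or the checklist:

# A minimal sketch of transparent reporting for a hypothetical two-group study:
# report descriptives, the test statistic, degrees of freedom, the p-value,
# an effect size, and a confidence interval rather than a bare "p < .05".
# The data below are simulated for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2025)  # fixed seed so the example is reproducible
group_a = rng.normal(loc=102.0, scale=15.0, size=40)  # hypothetical condition A scores
group_b = rng.normal(loc=95.0, scale=15.0, size=40)   # hypothetical condition B scores

# Descriptive statistics for each group
m_a, sd_a, n_a = group_a.mean(), group_a.std(ddof=1), group_a.size
m_b, sd_b, n_b = group_b.mean(), group_b.std(ddof=1), group_b.size

# Independent-samples t-test (two-sided, equal variances assumed)
t_stat, p_val = stats.ttest_ind(group_a, group_b)
df = n_a + n_b - 2

# Cohen's d using the pooled standard deviation
sd_pooled = np.sqrt(((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2) / df)
cohens_d = (m_a - m_b) / sd_pooled

# 95% confidence interval for the mean difference
mean_diff = m_a - m_b
se_diff = sd_pooled * np.sqrt(1 / n_a + 1 / n_b)
t_crit = stats.t.ppf(0.975, df)
ci_low, ci_high = mean_diff - t_crit * se_diff, mean_diff + t_crit * se_diff

print(f"Group A: M = {m_a:.2f}, SD = {sd_a:.2f}, n = {n_a}")
print(f"Group B: M = {m_b:.2f}, SD = {sd_b:.2f}, n = {n_b}")
print(f"t({df}) = {t_stat:.2f}, p = {p_val:.3f}, d = {cohens_d:.2f}, "
      f"95% CI for the difference [{ci_low:.2f}, {ci_high:.2f}]")

The point of the sketch is simply that the reported result names the test, its degrees of freedom, an effect size, and an interval, which is the style of reporting the checklist is meant to prompt.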
Implies the only people interested in the obituaries are psychologists. Very short-sighted to keep historically important psychologists fenced off from the outside world
13.11.2025 12:50
"…This means that one of the indispensable inputs of generative AI is the very output that open science works hard to generate and perfect: open data, source code, scientific articles and educational resources, all of which are provided for free and are often funded by tax-payer monies"
14.11.2025 16:48
Open up access to the past…
Dr @maddipow.bsky.social (University of Leeds) on the importance of obituaries.
www.bps.org.uk/psychologist...
Oh that's a whole other thing but equally as important!
13.11.2025 14:19
I'm sorry that you can't see how publishing a commemoration of someone's life is fundamentally a different thing to publishing a scientific article. I don't know how else to explain this
13.11.2025 14:19
Shout out to @psychmag.bsky.social for being speedy, open access, and helping to share things I care about quickly and freely:
www.bps.org.uk/psychologist...
Because obituaries are celebrations of people's lives, not just a record of a scientific contribution. I get that the publishing model is what it is, but this feels different.
Do you think it's right that journals should profit from celebrating the lives of people who have passed?
Response from the APA:
"The obituaries are a highly valued part of the American Psychologist. We offer access to the journal as part of our APA membership. If you are interested in membership, please let us know"