The FORRT Library of Reproduction and Replication Attempts (FLoRA) will be the basis of upcoming tools and projects that help replications and reproductions become a natural and difficult-to-ignore part of research. It is already implemented in our FLoRA Annotator: forrt.org/annotator/
06.03.2026 07:51 · likes: 3 · reposts: 1 · replies: 1 · quotes: 0
Where have all the comments gone?
For decades, the American Economic Review regularly published formal comments: papers that replicate, reassess, or challenge earlier AER articles.
In our latest blog post, we show: they've nearly disappeared.
26.02.2026 13:01 · likes: 17 · reposts: 10 · replies: 1 · quotes: 0
It's out!
22.02.2026 01:35 · likes: 214 · reposts: 29 · replies: 7 · quotes: 1
Our updated strategic plan!
The major change (improvement?!) is focus. COS does many things. My insistence that they all fit together in my head was apparently insufficient.
Now, we are trying to improve clarity, accountability, and effectiveness by aligning activities on a true north objective.
19.02.2026 15:39 · likes: 20 · reposts: 2 · replies: 1 · quotes: 0
I have not seen this mentioned here, but this is a great example of how the research world has dramatically changed underneath us; if you're not keeping up, you're going to be left behind quickly:
yiqingxu.org/papers/2026_...
19.02.2026 17:03 · likes: 15 · reposts: 4 · replies: 4 · quotes: 1
Sunsetting TOPFactor.org: What's Changing and Why
COS shares our plans to sunset TopFactor.org, what prompted the decision, and how the research community can keep advancing open and transparent policymaking.
Since 2020, TOP Factor has helped researchers understand how journals support open practices. On March 16, 2026, COS will be sunsetting the tool.
Read more about what prompted the decision, what we've learned, and the future of open and transparent policymaking: www.cos.io/blog/sunsett...
18.02.2026 19:14 · likes: 8 · reposts: 6 · replies: 0 · quotes: 0
Check out our preprint: "What Pilot Studies Can (and Cannot) Do for Validity in Psychological Research"
Great job @yashvin.bsky.social and @mbneff.bsky.social for leading!
doi.org/10.31234/osf...
16.02.2026 10:38 · likes: 16 · reposts: 8 · replies: 0 · quotes: 0
[Image: first slide of a presentation with the OSC logo, social media handles, and the presenter's name. The title reads "From the replicability crisis to credible science" and shows three badges: 1. Preregistered: 100% p-hacking free; 2. Open data: Here, check our numbers; 3. Open materials: Here's how you can replicate our results.]
Taught Bavarian Center for Cancer Research students about:
Preregistration → more reliable research
Reproducible workflows
FAIR data management → higher-quality, reusable data
Yes, FAIR & sensitive medical data are compatible
Slides osf.io/p9sev/files/...
Tutorials lmu-osc.github.io/training/sel...
17.02.2026 19:26 · likes: 18 · reposts: 5 · replies: 0 · quotes: 1
New paper, on a worrying trend in meta-science: the practice of anonymising datasets on, e.g., published articles. We argue that this is at odds with norms established in research synthesis, explore arguments for anonymisation, provide counterpoints, and demonstrate implications and epistemic costs.
13.02.2026 16:50 · likes: 98 · reposts: 52 · replies: 6 · quotes: 7
vazul: An R Package for Analysis Blinding: https://osf.io/mp54s
12.02.2026 22:39 · likes: 3 · reposts: 2 · replies: 0 · quotes: 0
Early draft of my ebook for the course:
ianhussey.quarto.pub/reproducible...
05.02.2026 12:40 · likes: 28 · reposts: 8 · replies: 1 · quotes: 0
Reporting Practices, Open Science Practices, and Trustworthiness of Simulation Studies in Psychology: A Questionnaire Study: https://osf.io/jn9sy
05.02.2026 15:08 · likes: 4 · reposts: 2 · replies: 0 · quotes: 0
Now with more Rat dck!
05.02.2026 16:19 · likes: 2 · reposts: 0 · replies: 1 · quotes: 0
[Image: diagram showing four phases of methodological research (Theory, Exploration, Systematic Comparison, Evidence Synthesis) with an arrow indicating that preregistration usefulness increases from early to late phases. Each phase lists its aim, elements, outcome, and an example from factor retention research.]
Does it make sense to preregister simulation studies?
This question has sparked a lot of debate.
▶️ We* work through the why, when, and how
▶️ We discuss different phases of methodological research to clarify where preregistration might (or might not) add value
Preprint: doi.org/10.31234/osf...
04.02.2026 10:40 · likes: 37 · reposts: 13 · replies: 1 · quotes: 0
This headline number has generated a lot of attention, but does not account for the classifier's accuracy. @jamiecummins.bsky.social and I wrote a short commentary showing that, assuming a paper mill base rate of 10%, 30% of the flagged papers are false positives. At a base rate of 5%, 50% are FPs.
03.02.2026 13:44 · likes: 17 · reposts: 5 · replies: 1 · quotes: 0
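The base-rate arithmetic in the commentary above can be reproduced with a short sketch. The sensitivity and specificity values used here (95% each) are illustrative assumptions chosen only because they roughly match the quoted figures; they are not the classifier's published accuracy:

```python
def flagged_false_positive_rate(base_rate, sensitivity, specificity):
    """Share of flagged papers that are false positives (1 - precision).

    base_rate   : prevalence of paper-mill papers in the corpus
    sensitivity : P(flagged | paper-mill paper)
    specificity : P(not flagged | legitimate paper)
    """
    true_pos = base_rate * sensitivity
    false_pos = (1 - base_rate) * (1 - specificity)
    return false_pos / (true_pos + false_pos)

# With assumed 95% sensitivity and 95% specificity:
for base_rate in (0.10, 0.05):
    fdr = flagged_false_positive_rate(base_rate, 0.95, 0.95)
    print(f"base rate {base_rate:.0%}: {fdr:.0%} of flagged papers are false positives")
```

The point generalizes: as the base rate falls, false positives dominate the flags even for an accurate classifier, which is why a headline count of flagged papers overstates the problem.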
A framework for assessing the trustworthiness of scientific research findings | PNAS
Vigorous debate has erupted over the trustworthiness of scientific research findings in a number of domains. The question “what makes research find...
Our new paper, with colleagues from the Strategic Council of the National Academies, offers an integrative framework of the several components that contribute to making research findings trustworthy, including ethics, methodology, transparency, inclusion, assessment, etc.
www.pnas.org/doi/10.1073/...
03.02.2026 19:27 · likes: 38 · reposts: 17 · replies: 1 · quotes: 3
Mega-journal Heliyon retracts hundreds of papers after internal audit
Heliyon has published fewer papers and ramped up its retractions since a major indexing service put the journal on hold and the publisher launched an audit of all papers published in the journal since its launch in 2016.
03.02.2026 17:39 · likes: 17 · reposts: 7 · replies: 1 · quotes: 2
AStA Advances in Statistical Analysis
AStA Advances in Statistical Analysis is a quarterly journal that publishes original contributions on statistical methodology, applications, and review ...
Advances in Statistical Analysis has a call for papers on the role of multiverse analysis in statistical modelling and applications: link.springer.com/journal/1018...
Deadline is May 1st, so still plenty of time to put something together!
03.02.2026 17:42 · likes: 28 · reposts: 12 · replies: 2 · quotes: 1
Interesting! AI-assisted assessment of responsible research practices. www.biorxiv.org/content/10.6...
#metascience #meta-science #meta-research
03.02.2026 18:55 · likes: 1 · reposts: 0 · replies: 0 · quotes: 0
Likelihood Ratio Test for Publication Bias β a proof of concept - MetaROR
Publication bias poses a serious challenge to clarity and precision in scientific research & meta-analyses. This article by Paweł Lenartowicz proposes a way to deal with this: the Likelihood Ratio Test for Publication Bias.
Read the editorial assessment, peer reviews, and full article on MetaROR now
03.02.2026 16:49 · likes: 8 · reposts: 3 · replies: 0 · quotes: 0
Wiley: "We're supporting responsible research assessment practices" onlinelibrary.wiley.com/journal/1520...
Also Wiley: "Prove that your article is a good fit for this journal ⭐⭐⭐⭐⭐ by citing at least two of our articles in your manuscript before we will even consider reviewing it" 🤡
30.01.2026 11:29 · likes: 65 · reposts: 31 · replies: 9 · quotes: 8
Preregistration Works: Increased Reporting Quality, Internal Validity, and Protocol Adherence in Animal Studies: https://osf.io/ruw7p
30.01.2026 22:34 · likes: 1 · reposts: 1 · replies: 0 · quotes: 0
📅 Mark your calendars for #SIPS2027!
The 2027 SIPS conference, organized in collaboration with the Association for Interdisciplinary Meta-Research and Open Science @aimosinc.bsky.social, will be held in November at the University of Melbourne, Australia.
We are looking forward to seeing you there!
27.01.2026 15:21 · likes: 38 · reposts: 24 · replies: 0 · quotes: 2
The Iowa Gambling Task is an extreme example of Jingle Fallacy and schmeasurement.
In 100 articles we found 244 different ways of scoring it; 177 were never reused. Correlations between them range from -.99 to .99.
At the same time, we show that meta-analyses combine these results as if they're equivalent.
25.01.2026 12:01 · likes: 140 · reposts: 54 · replies: 5 · quotes: 4
RegCheck
RegCheck is an AI tool to compare preregistrations with papers instantly.
Comparing registrations to published papers is essential to research integrity, and almost no one does it routinely because it's slow, messy, and time-consuming.
RegCheck was built to help make this process easier.
Today, we launch RegCheck V2.
🧵
regcheck.app
22.01.2026 11:05 · likes: 174 · reposts: 90 · replies: 8 · quotes: 6
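The registration-to-paper comparison that RegCheck automates can be illustrated with a toy sketch. This is not RegCheck's actual method (the post does not describe it); the example texts and the `difflib`-based word-overlap score are illustrative assumptions only:

```python
import difflib

def overlap(prereg_text, paper_text):
    # Crude token-level similarity between two passages, in [0, 1].
    # Real prereg-paper checking needs semantic matching, not just word overlap.
    a = prereg_text.lower().split()
    b = paper_text.lower().split()
    return difflib.SequenceMatcher(None, a, b).ratio()

# Hypothetical preregistration statement vs. corresponding methods sentence:
prereg = "We will recruit 200 participants and run a mixed ANOVA."
paper = "We recruited 180 participants and ran a mixed ANOVA."
print(f"similarity: {overlap(prereg, paper):.2f}")
```

Even this toy version shows why the task is hard to do by hand at scale: tense changes and small deviations ("200" vs "180") must be found and judged sentence by sentence across whole documents.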