@avehtari.bsky.social
Academy Professor in computational Bayesian modeling at Aalto University, Finland. Bayesian Data Analysis 3rd ed, Regression and Other Stories, and Active Statistics co-author. #mcmc_stan and #arviz developer. Web page https://users.aalto.fi/~ave/

The streets were already too well cleared and sprinkled with gravel yesterday after lunch, so I had to walk back home, but today the views are even prettier with sunshine
13.02.2026 10:46
[Photo of snowy scenery]

Finally enough snow that I could ski from home to campus
12.02.2026 11:43

See also arxiv.org/abs/2003.04026
12.02.2026 09:29

arXiv: "Optimal information deletion and Bayes' theorem" by Montcho, Rue
In this same journal, Arnold Zellner published a seminal paper on Bayes' theorem as an optimal information processing rule. This result led to the variational formulation of Bayes' theorem, which is the central idea in generalized variational inference. Almost 40 years later, we revisit these ideas, but from the perspective of information deletion. We investigate rules which update a posterior distribution into an antedata distribution when a portion of data is removed. In such context, a rule which does not destroy or create information is called the optimal information deletion rule and we prove that it coincides with the traditional use of Bayes' theorem.

I reported this, and they have now fixed it. They told me they had tested the survey form with several people before publishing it, but I guess no one else was trying to find the middle of the scale
11.02.2026 10:34
[Image: response scale with options 0 - Not at all likely, 1, 2, 4, 5, 6, 7, 8, 9, 10 - Extremely likely]

I felt confused, and it took me a few seconds to realize why
11.02.2026 09:41

We are looking for computer science lecturers for full-time, permanent positions at Aalto University!
Areas:
✅ Bioinformatics, biostatistics, and health data science
✅ Computer science: programming, algorithms, and theory
✅ Software engineering
Apply by 16.3.2026. More information and instructions below.

Posterior-SBC now also with peer-review stamp in Statistics and Computing doi.org/10.1007/s112... (update your bib files)
09.02.2026 17:00

Compositional data (proportions that sum to 1) behave in ways standard models aren't built for
I walk through why Dirichlet regression is often the right tool & what extra insight it gives, using a real example of eyetracking data
#Dirichlet #r #brms #guide #eyetracking
open.substack.com/pub/mzlotean...
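Not the linked walkthrough, just a minimal sketch of what Dirichlet regression looks like in brms; the data frame, the aoi1-aoi3 proportions, and the cond predictor below are made-up illustrations, not the post's eyetracking data.

```r
library(brms)

# Toy eyetracking-style data: per-trial proportions of looking time to
# three areas of interest; each row of the response matrix sums to 1
df <- data.frame(cond = factor(c("a", "a", "b", "b")))
df$y <- as.matrix(data.frame(
  aoi1 = c(0.20, 0.30, 0.50, 0.40),
  aoi2 = c(0.50, 0.40, 0.30, 0.40),
  aoi3 = c(0.30, 0.30, 0.20, 0.20)
))

# Dirichlet regression models all proportions jointly on the simplex,
# rather than fitting separate, incoherent models per proportion
fit <- brm(y ~ cond, data = df, family = dirichlet())
summary(fit)
```
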
We are hiring lecturers who know Finnish: www.aalto.fi/fi/avoimet-t...
09.02.2026 09:25

The building blocks exist, but for a ready-made example of how to do it, it's best to ask the authors of that paper
08.02.2026 21:01

The computation for the second part can be made faster using importance sampling: arxiv.org/abs/2505.10510
08.02.2026 09:50

This is known as the cut posterior. Stan developers have discussed the possibility of adding support for cut in Stan, but it hasn't been prioritised, as it can often be achieved by running inference for the first part and then running the second part in parallel with many datasets (as in multiple imputation)
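A rough sketch of that two-stage recipe, not code from the thread: the Stan files, the shared parameter phi, the parameter theta, and the data lists data1/data2 are all placeholders, and second_module.stan is assumed to declare phi as data.

```r
library(cmdstanr)
library(posterior)
library(parallel)

# Stage 1: fit the first module on its own
mod1 <- cmdstan_model("first_module.stan")     # placeholder model file
fit1 <- mod1$sample(data = data1)              # data1: data list for module 1
phi  <- fit1$draws("phi", format = "draws_matrix")

# Stage 2: plug a subset of phi draws into the second module as fixed data,
# fitting it once per draw in parallel (multiple-imputation style)
mod2 <- cmdstan_model("second_module.stan")    # placeholder; takes phi as data
idx  <- round(seq(1, nrow(phi), length.out = 50))
fits2 <- mclapply(idx, function(i) {
  mod2$sample(data = c(data2, list(phi = as.numeric(phi[i, ]))),
              chains = 1, refresh = 0)
}, mc.cores = 4)

# Pool the second-stage draws to form the cut posterior for theta
theta_cut <- do.call(
  bind_draws,
  c(lapply(fits2, function(f) f$draws("theta", format = "draws_matrix")),
    list(along = "draw"))
)
```
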
08.02.2026 09:50

Ever wondered how to use a supra-Bayesian approach in species distribution models (SDMs)? Well, we present a framework for it and exemplify it with real data on fish reproduction area estimation.
nsojournals.onlinelibrary.wiley.com/doi/10.1002/...
1/3

In CmdStanR, you can just pass the Pathfinder fit object to the init argument of sample(); there is no need to convert it
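A minimal sketch of that, assuming a compiled model and a data list; the bernoulli.stan file and the toy data are illustrative, not from the thread.

```r
library(cmdstanr)

mod <- cmdstan_model("bernoulli.stan")    # illustrative model file
stan_data <- list(N = 10, y = c(0, 1, 0, 0, 0, 0, 0, 0, 0, 1))

# Pathfinder gives a cheap approximation of the posterior
pf <- mod$pathfinder(data = stan_data, seed = 1)

# Pass the Pathfinder fit straight to init: each chain starts from draws
# of that approximation, with no manual conversion step
fit <- mod$sample(data = stan_data, init = pf, seed = 1, chains = 4)
```
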
06.02.2026 08:16

First release of stanflow! v0.1.0 was a few days later than I'd have liked, but it's up now.
Stanflow is a metapackage, à la tidyverse, for a Stan Bayesian workflow. See the README for more details and features!
Feedback/issues/PRs are always appreciated.
#rstats #bayes
Nikolas Siccha's blog post "Divergent transitions in Hilbert Space Gaussian process posteriors and how to avoid them" shows some cool results he started working on when at Aalto www.generable.com/post/hsgp-re...
05.02.2026 17:09

With more than 14k lines of code!
05.02.2026 09:39

The publisher estimates the Bayesian Workflow book will ship in June www.routledge.com/Bayesian-Wor...
05.02.2026 09:12

I'm watching this year's lectures to see how @rmcelreath.bsky.social, a co-author of the soon-to-be-published Bayesian Workflow book, teaches workflow. So far looking good!
04.02.2026 19:21

[Banner: StanCon 2026 - International Conference on Bayesian Inference and Probabilistic Programming, 17-21 August 2026, Uppsala, Sweden]
Three weeks left to submit a contributed talk abstract to StanCon 2026! You can also submit a poster abstract early, if you need to make early travel plans. There will be travel and accommodation support for students, too!
More information about submitting at www.stancon2026.org/abstracts/
Watching both of @rmcelreath.bsky.social's A and B lectures feels like watching a movie that jumps between two different time periods
04.02.2026 19:15 β π 24 π 1 π¬ 1 π 0I also want to hear about your favorite recent (<5 years π ) Stan and Stan-adjacent developments that changed your bayesian framework (thinking about models, validating models, fitting models and checking their fit and how they break)!
04.02.2026 13:35

As I'm revising my course materials, I keep stumbling upon cool @mc-stan.org developments.
Current favorites:
1. Your model has funnels and you've exhausted reparametrization ideas: metric = "dense_e" makes your HMC learn the covariance between parameters. Sloooow, but effective! (see the sketch below)
1/
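For concreteness, a minimal sketch of switching to a dense metric in CmdStanR; the model file and data list are placeholders rather than anything from the thread.

```r
library(cmdstanr)

mod <- cmdstan_model("my_funnel_model.stan")   # placeholder model file
fit <- mod$sample(
  data = stan_data,       # placeholder data list
  metric = "dense_e",     # default is "diag_e"; a dense metric lets warmup
                          # learn the full covariance between parameters
  chains = 4
)
```
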
New pre-pre-print:
03.02.2026 19:29

This time I try to explain group-level confounding and some ways to deal with it. Lecture B04 of Statistical Rethinking 2026: fixed effects, Mundlak machines, latent Mundlak machines, intro to social network analysis and the social relations model. Full lecture list: github.com/rmcelreath/s...
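Not from the lecture, but as a reminder of what the classic Mundlak device looks like in practice: a minimal brms sketch on simulated data, where group-level confounding is induced by letting x depend on the group effect, and the group mean of x is then added as a predictor.

```r
library(brms)
library(dplyr)

set.seed(1)
# Simulate group-level confounding: groups with larger effects u also
# tend to have larger x
g <- rep(1:20, each = 10)
u <- rnorm(20)
x <- rnorm(200, mean = u[g])
y <- rnorm(200, mean = 1 + 0.5 * x + 2 * u[g])

dat <- data.frame(y = y, x = x, g = factor(g)) |>
  group_by(g) |>
  mutate(x_bar = mean(x)) |>   # Mundlak device: group mean of x
  ungroup()

# Varying intercepts plus the group mean of x as a group-level predictor
fit <- brm(y ~ x + x_bar + (1 | g), data = dat)
summary(fit)
```
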
30.01.2026 13:27

Wanna learn about cognitive modeling and love pizza? I am revising my cognitive modeling course notes and could use some feedback (or you might just enjoy learning about cog modeling in R and Stan): fusaroli.github.io/AdvancedCogn...
(only chapters 1-3 are in place, the others will come)
StanCon 2026 is this August 17-21 in Uppsala, Sweden 🇸🇪
www.stancon2026.org
⏰ Abstracts for contributed talks are due Feb 25
⏰ Abstracts for posters are due May 27
And just to be clear: Yes, StanCon is my favorite conference to attend!! Can't wait for this one!
If you want to know how to compute marginal means from a mixed-effects model in brms using rvars, then have I got a supplemental materials document for you: pubs.asha.org/doi/10.1044/...
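Not the linked supplement, just a minimal sketch of the general idea using tidybayes and posterior rvars, with the stock lme4::sleepstudy data standing in for the paper's data.

```r
library(brms)
library(posterior)
library(tidybayes)

# Any mixed-effects fit will do; sleepstudy is just a convenient example
fit <- brm(Reaction ~ Days + (Days | Subject), data = lme4::sleepstudy)

# Population-level marginal means over a reference grid, returned as rvars:
# each row carries the full posterior distribution of that mean
grid <- data.frame(Days = 0:9)
marginal_means <- grid |>
  add_epred_rvars(fit, re_formula = NA)

marginal_means
```
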
29.01.2026 22:10

Another absolute banger from @natehaines.bsky.social: generative hierarchical Bayesian modeling* is the best way to investigate individual differences!
*This isn't "just" making use of Bayesian estimation, but one of the best examples I've seen of a full Bayesian embrace of uncertainty.