@lacerbi.bsky.social wrote a very nice summary post of our paper here if anyone missed it:
bsky.app/profile/lace...
I can give some more behind-the-scenes information. 🧵
1/ Introducing ACE (Amortized Conditioning Engine)! Our new AISTATS 2025 paper presents a transformer framework that unifies tasks from image completion to BayesOpt & simulator-based inference under *one* probabilistic conditioning approach. It's Bayes all the way down!
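The unifying idea is that every task supplies a set of observed variables (the context, possibly including latent quantities of interest) and asks for predictive distributions over the unobserved ones (the targets). A toy sketch of that shared interface, with hypothetical names and a crude stand-in predictor rather than the paper's actual transformer:

```python
# Toy illustration of "everything is probabilistic conditioning" (hypothetical API,
# not the paper's code): each task builds a context set and a target set, and asks
# for a predictive distribution over the targets.
import numpy as np

def predict(context_x, context_y, target_x):
    """Stand-in for an amortized conditioner: returns a Gaussian predictive mean/std
    per target via a distance-weighted average of the context. An ACE-style model
    would instead run a transformer that attends over the context set."""
    context_x = np.atleast_2d(context_x)
    context_y = np.asarray(context_y, dtype=float).ravel()
    target_x = np.atleast_2d(target_x)
    means, stds = [], []
    for xt in target_x:
        w = np.exp(-np.sum((context_x - xt) ** 2, axis=1))
        w /= w.sum()
        mu = float(w @ context_y)
        var = float(w @ (context_y - mu) ** 2) + 1e-3
        means.append(mu)
        stds.append(np.sqrt(var))
    return np.array(means), np.array(stds)

# Image completion: context = observed pixel (coordinate, value) pairs; targets = missing pixels.
# Bayesian optimization: context = evaluated points; targets = candidate points
#   (and, in ACE, latents such as the location or value of the optimum).
# Simulator-based inference: context = simulated (parameter, data) pairs; targets = parameters.
mu, sd = predict(context_x=[[0.0], [1.0]], context_y=[0.2, 0.8], target_x=[[0.5]])
print(mu, sd)
```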
Interested in amortization + experimental design + decision making? #NeurIPS2024
Come by our poster, starting soon (11 AM to 2 PM, East Hall)!
NeurIPS link: neurips.cc/virtual/2024...
Paper: openreview.net/forum?id=zBG...
with @huangdaolang.bsky.social, Yujia Guo, and @samikaski.bsky.social
I am at #NeurIPS2024 and looking for postdocs. Feel free to reach out if you want to discuss!
We also have other open positions: fcai.fi/winter-calls.... My slightly outdated home page is kaski-lab.com
1/ Hi all, I am at #NeurIPS2024 and I will be hiring a postdoc in probabilistic machine learning starting asap.
Research interests: amortized, approximate & simulator-based inference, Bayesian optimization, and AI4science.
Get in touch for a chat, or come to our posters today at 11 AM or on Friday at 11 AM!
Great list! Can I join?
8/ Join us at our poster session at #NeurIPS2024. Unfortunately this year I can't attend in person, but @lacerbi.bsky.social will present our work. We are excited to discuss and explore future directions in Bayesian experimental design and amortization!
7/ Experiments show that TNDP significantly outperforms traditional methods across various tasks, including targeted active learning, hyperparameter optimization, and retrosynthesis planning.
6/ Our Transformer Neural Decision Process (TNDP) unifies experimental design and decision-making in a single framework, allowing instant design proposals while maintaining high decision quality.
5/ We introduce Decision Utility Gain (DUG) to guide experimental design with a direct focus on optimizing decision-making tasks, moving beyond traditional information-theoretic objectives. DUG measures the improvement in the maximum expected utility from observing a new experimental design.
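Spelling that definition out as a formula (notation assumed here for illustration, not quoted from the paper): write $h_t$ for the experiments observed so far, $\xi$ for a candidate design with predicted outcome $y$, and $u(\theta, a)$ for the utility of decision $a$ when the parameters are $\theta$. Then:

```latex
% Sketch of Decision Utility Gain under the assumed notation above:
% the expected gain in maximum expected utility from running design \xi,
% relative to making the best decision with the current data alone.
\mathrm{DUG}(\xi) =
\mathbb{E}_{p(y \mid \xi, h_t)}\!\Big[
  \max_{a} \, \mathbb{E}_{p(\theta \mid h_t \cup \{(\xi, y)\})}\big[u(\theta, a)\big]
\Big]
- \max_{a} \, \mathbb{E}_{p(\theta \mid h_t)}\big[u(\theta, a)\big]
```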
4/ In our work, we present a new amortized BED framework that optimizes experiments directly for downstream decision-making.
3/ But what if our goal goes beyond parameter inference? In many real-world tasks like medical diagnosis, we care more about making the right decisions than learning model parameters.
2/ Bayesian Experimental Design (BED) is a powerful framework for optimizing experiments to reduce uncertainty about unknown system parameters. Recent amortized BED methods use pre-trained neural networks to propose designs instantly.
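For reference, the standard information-theoretic objective that amortized BED methods typically target is the expected information gain (EIG) of a design, which in the usual notation (assumed here) is the expected reduction in posterior entropy of the parameters:

```latex
% Expected information gain of a design \xi: expected entropy reduction in \theta
% after observing the outcome y of the experiment (equivalently, the mutual
% information between \theta and y given \xi).
\mathrm{EIG}(\xi) =
\mathbb{E}_{p(y \mid \xi)}\big[ \mathrm{H}[p(\theta)] - \mathrm{H}[p(\theta \mid y, \xi)] \big]
```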
Optimizing decision utility in Bayesian experimental design is key to improving downstream decision-making.
Excited to share our #NeurIPS2024 paper on Amortized Decision-Aware Bayesian Experimental Design: arxiv.org/abs/2411.02064
@lacerbi.bsky.social @samikaski.bsky.social
Details below.
1/ Excuse me, can I interest you in eliciting your beliefs as flexible probability distributions? No worries, we only need pairwise comparisons or rankings, no personal details.
Led by **Petrus Mikkola**, joint work with **Arto Klami**; to be presented soon at @neuripsconf.bsky.social #NeurIPS2024