@johanneszenn.bsky.social — PhD Student in Machine Learning @unituebingen.bsky.social, @ml4science.bsky.social, @tuebingen-ai.bsky.social, IMPRS-IS; previously intern @vectorinstitute.ai; jzenn.github.io

More details can be found at fridaytalks.github.io!
05.08.2025 07:07 — 👍 0 🔁 0 💬 0 📌 0

A new recording of our FridayTalks@Tübingen series is online!
Effortless, Simulation-Efficient Bayesian Inference using Tabular Foundation Models
by
@vetterj.bsky.social & Manuel Gloeckler from @mackelab.bsky.social
Watch here: youtube.com/watch?v=Wx2p...
📣 [Openings] I'm now an Assistant Prof @westernu.ca CS dept. Funded PhD & MSc positions available! Topics: large probabilistic models, decision-making under uncertainty, and apps in AI4Science. More on agustinus.kristia.de/openings/
04.07.2025 22:55 — 👍 16 🔁 5 💬 1 📌 1

@unituebingen.bsky.social Tübingen has won 6 out of 9! Not too bad, is it? 👍 Congratulations to all winning teams!
22.05.2025 17:00 — 👍 4 🔁 1 💬 0 📌 0

The members of the Cluster of Excellence "Machine Learning: New Perspectives for Science" raise their glasses and celebrate securing another funding period.
We're super happy: Our Cluster of Excellence will continue to receive funding from the German Research Foundation @dfg.de ! Here’s to 7 more years of exciting research at the intersection of #machinelearning and science! Find out more: uni-tuebingen.de/en/research/... #ExcellenceStrategy
22.05.2025 16:23 — 👍 74 🔁 20 💬 4 📌 5

The image shows a fountain in front of the Neue Aula, with the caption: Six Clusters of Excellence will be funded in Tübingen.
The University of Tübingen receives six Clusters of Excellence (#Exzellenzcluster), which will be funded for seven years starting 01.01.2026 as part of the #Exzellenzstrategie! Among them are three clusters that are already established and will receive renewed funding. uni-tuebingen.de/universitaet... #ExStra #Forschung #UniTübingen
22.05.2025 17:13 — 👍 70 🔁 29 💬 5 📌 4

Great idea! Are there any plans to do this for other subfields as well?
21.02.2025 12:54 — 👍 4 🔁 0 💬 1 📌 0

Unfortunately, voting at a German embassy is not an option.
21.02.2025 12:47 — 👍 0 🔁 0 💬 0 📌 0

I'm extremely disappointed with the organization of the upcoming German election. Voting is a fundamental right, yet my ballot letter arrived too late in Toronto; now even the fastest delivery won't get it there by Monday, after the election. Unacceptable.
20.02.2025 14:46 — 👍 3 🔁 0 💬 2 📌 0

More generally, we believe that the variational expectation maximization algorithm can be a useful tool for many physico-chemical prediction problems, since it balances structure-based and representation-learning-based predictions by weighing their respective uncertainties.
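In the simplest Gaussian special case, balancing two predictors by their respective uncertainties reduces to precision (inverse-variance) weighting. A minimal sketch of that idea; the function name and the Gaussian assumption are illustrative and not taken from the paper:

```python
import numpy as np

def precision_weighted_combination(mu_struct, var_struct, mu_repr, var_repr):
    """Combine a structure-based prediction with a representation-learning
    prediction, each weighted by its precision (inverse variance).
    Returns the combined mean and its variance."""
    p1, p2 = 1.0 / var_struct, 1.0 / var_repr
    mean = (p1 * mu_struct + p2 * mu_repr) / (p1 + p2)
    var = 1.0 / (p1 + p2)
    return mean, var
```

With equal variances the result is the plain average; as one predictor becomes much more certain, the combination shifts toward it.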
31.01.2025 14:45 — 👍 0 🔁 0 💬 0 📌 0

We extend a probabilistic matrix factorization method (Jirasek et al., 2020) by learning priors from the chemical structure of the mixture components, utilizing graph neural networks and the variational expectation maximization algorithm. This significantly improves predictive accuracy over the SOTA.
31.01.2025 14:45 — 👍 0 🔁 0 💬 1 📌 1

Predicting the physico-chemical properties of pure substances and mixtures is a central task in thermodynamics. We propose a method for combining molecular descriptors with representation learning for the prediction of activity coefficients of binary liquid mixtures at infinite dilution.
I am happy to share that we (finally!) took the time to publish parts of my Master's thesis as a paper (pubs.rsc.org/en/content/a...). Huge thanks to Dominik Gond, Fabian Jirasek and Robert Bamler. Check it out if you are interested in applications of machine learning to chemical engineering!
31.01.2025 14:45 — 👍 5 🔁 0 💬 1 📌 0

Can we build neural networks whose structure and computational abilities match a real brain? We are not quite there yet, but recent work by @lappalainenjk.bsky.social et al. shows a strategy for getting closer to this goal. Read more on our blog: www.machinelearningforscience.de/en/improving...
20.12.2024 11:32 — 👍 26 🔁 11 💬 1 📌 2

How to find all fixed points in piece-wise linear recurrent neural networks (RNNs)?
A short thread 🧵
In RNNs with N units and ReLU(x-b) activations, the phase space is partitioned into 2^N regions by hyperplanes at x=b. 1/7
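The region enumeration described in the thread can be sketched in code: within each of the 2^N ReLU activation patterns the map is linear, so each region yields at most one candidate fixed point, which is kept only if it actually lies inside its own region. The dynamics x ← W @ relu(x - b) and all names below are my assumptions for illustration, not the authors' code:

```python
import itertools
import numpy as np

def relu_rnn_fixed_points(W, b, tol=1e-9):
    """Enumerate fixed points of the map x <- W @ relu(x - b)
    by solving the linear system in each of the 2^N activation regions."""
    N = len(b)
    fixed_points = []
    for mask in itertools.product([0, 1], repeat=N):
        # In this region relu(x - b) = D @ (x - b) with D a 0/1 diagonal.
        D = np.diag(mask).astype(float)
        A = np.eye(N) - W @ D
        if abs(np.linalg.det(A)) < tol:
            continue  # degenerate region: no unique candidate
        # Solve (I - W D) x = -W D b for the candidate fixed point.
        x = np.linalg.solve(A, -W @ D @ b)
        # Keep it only if its activation pattern matches the region.
        active = x - b > 0
        if all(bool(a) == bool(m) for a, m in zip(active, mask)):
            fixed_points.append(x)
    return fixed_points
```

The brute-force loop is exponential in N, so this sketch is only practical for small networks; the thread's point is that the per-region problem is just linear algebra.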
@jkapoor.bsky.social and @auschulz.bsky.social introduce LDNS, a diffusion-based latent variable model to generate diverse neural spiking data flexibly conditioned on external variables Poster #4010 (East; Wed 11 Dec 11PT) ➡️ openreview.net/forum?id=ZX6... 3/4
09.12.2024 19:28 — 👍 11 🔁 2 💬 1 📌 0

@vetterj.bsky.social and I are excited to present our work at #NeurIPS2024! We present Sourcerer: a maximum-entropy, sample-based solution to source distribution estimation.
Paper: openreview.net/forum?id=0cg...
Code: github.com/mackelab/sou...
(1/8)
I am super hyped and happy with our recent paper on a ✨VampPrior 2.0✨: Hierarchical VAE with a diffusion-based VampPrior! 🦇 We got SOTA VAE results on CIFAR-10! Kudos to Anna Kuzina because this TMLR paper is the last chapter in her PhD thesis 🤩
📃 tinyurl.com/22rvzc4f
💻 github.com/AKuzina/dvp_...