
Shubhendu Trivedi

@shubhendu.bsky.social

Interests on bsky: ML research, applied math, and general mathematical and engineering miscellany. Also: Uncertainty, symmetry in ML, reliable deployment; applications in LLMs, computational chemistry/physics, and healthcare. https://shubhendu-trivedi.org

917 Followers  |  251 Following  |  4,312 Posts  |  Joined: 30.09.2023

Latest posts by shubhendu.bsky.social on Bluesky

Everyone Should Learn Optimal Transport, Part 2
In the previous blog post, we saw that optimal transport gives us calculus on the space of probability distributions. In this post, we will continue the core message, but we will also see that Wassers...

Wasserstein geometry = quotient geometry of permutation invariance.

In this blog, I explain why this is the natural language for exchangeable particles — and why mean-field neural network training shows up as a W2 gradient flow.

mufan-li.github.io/OT2/

14.02.2026 21:36 — 👍 4    🔁 1    💬 0    📌 0

Interestingly never heard of it before today.

14.02.2026 19:24 — 👍 1    🔁 0    💬 0    📌 0

Standard sort of pol sci conference. Ivan Krastev is good and very insightful and tends to invite all sorts of people. But most of the people assembled aren't all that good IMO, surprising given he's the curator. If he was looking for a fascist speaker, there could be better options.

14.02.2026 19:23 — 👍 0    🔁 0    💬 0    📌 0

I was just pulling your leg!

11.02.2026 14:57 — 👍 1    🔁 0    💬 0    📌 0
600 Years in Kerala | The Gujarati Community of Calicut | Documentary | afterImages

"Today is ours". Mr. Padamsi on the Gujaratis of Calicut.
www.youtube.com/watch?v=UcFi...

10.02.2026 23:46 — 👍 2    🔁 1    💬 0    📌 0

You fought with me 13 years ago for taking a crack at our common crackpot. :P

11.02.2026 01:30 — 👍 1    🔁 0    💬 1    📌 0
Post image

I asked ChatGPT a variation and it tries to act too smart

11.02.2026 00:36 — 👍 0    🔁 0    💬 0    📌 0

Where is Claude code for packing up a few thousand books when you need it?

11.02.2026 00:35 — 👍 0    🔁 0    💬 1    📌 0

I was curious if Obsidian was an actual Armenian surname. Turns out not originally, but after some corruptions and misspellings of some mysterious original name, it exists as an ultra rare surname.

11.02.2026 00:25 — 👍 0    🔁 0    💬 0    📌 0

Oh wow! Had no idea.

11.02.2026 00:21 — 👍 1    🔁 0    💬 0    📌 0

I can't even imagine what 61% would have looked like there.

11.02.2026 00:19 — 👍 1    🔁 0    💬 0    📌 0

My most successful %-wise fluke has a record here. This % gain was on an 8% stock decline. bsky.app/profile/shub...

11.02.2026 00:16 — 👍 1    🔁 0    💬 1    📌 0

15-20x is a conservative estimate btw. Since LinkedIn no longer has a stock, I don't know what its implied volatility numbers were. I also assume it had a thinly traded options chain, so more slippage. 61% on something as high-volume as NVDA or TSLA would even mean 100x.

11.02.2026 00:11 — 👍 1    🔁 0    💬 1    📌 0

Also see how Jim Simons dealt with him. Also Taleb (whatever his flaws, he was skilled at options). Simons differed from these math people because he didn't need to be impressed by money, was a math heavyweight, but also a pretty tough guy (less room for politesse).

10.02.2026 23:38 — 👍 2    🔁 0    💬 1    📌 0

There is a record of 800 shares bought for his charity when the LinkedIn technicals looked awful. Within 15 days there was news of the acquisition, and that closed with a 61% gain in no time. Now imagine if he had bought options in personal accounts for 1 MM. It could easily have been 15-20x (based on the 61% stock gain).

10.02.2026 23:36 — 👍 1    🔁 0    💬 1    📌 0

I am actually not a fan of the Newman stuff. They just kept writing the same thing again and again. But betweenness centrality could be a measure, yes. But you could keep it simpler and use something like Adamic-Adar.

10.02.2026 21:52 — 👍 1    🔁 0    💬 0    📌 0
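[Editorial aside: the Adamic-Adar index mentioned in the post above scores a node pair by its common neighbors, down-weighting high-degree ones by the inverse log of their degree. A minimal sketch under that definition; the graph and node names are invented for illustration.]

```python
from math import log

def adamic_adar(adj, u, v):
    """Sum 1/log(deg(w)) over common neighbors w of u and v.

    adj maps each node to the set of its neighbors (undirected graph).
    Neighbors of degree <= 1 are skipped to avoid division by log(1) = 0.
    """
    common = adj[u] & adj[v]
    return sum(1.0 / log(len(adj[w])) for w in common if len(adj[w]) > 1)

# Toy undirected graph as adjacency sets (hypothetical data)
adj = {
    "a": {"b", "c"},
    "b": {"a", "c", "d"},
    "c": {"a", "b", "d"},
    "d": {"b", "c"},
}

# a and d share neighbors b and c, each of degree 3
score = adamic_adar(adj, "a", "d")
```

Compared to betweenness centrality, this only needs local neighborhood lookups, which is why it is the "simpler" option for edge scoring on large sparse graphs.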

For sparse graphs, we can still do reasonable stuff. We don't want wrong edges to dominate. Could have a first filtering step where you in fact sparsify further based on mutual similarities and local structure (keep locally meaningful edges). Then collapse nodes between heavy edges in an iterated manner.

10.02.2026 19:07 — 👍 1    🔁 0    💬 1    📌 0

^if I understood you correctly. You don't seem to mean looking at a "reduced graph" where subgraphs are reduced to nodes and the reduced graph is clustered instead and then the clustering is projected up to the full graph.

10.02.2026 17:59 — 👍 0    🔁 0    💬 1    📌 0

Yes, it makes sense. Sounds more like personalized PageRank and/or seeded clustering. There used to be work using Lanczos / Arnoldi for seeded spectral clustering.

10.02.2026 17:57 — 👍 1    🔁 0    💬 1    📌 0
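[Editorial aside: personalized PageRank, as referenced above, is ordinary PageRank with the teleport step restarting at a chosen seed set rather than uniformly. A hand-rolled power-iteration sketch; graph, seed set, and parameter values are all illustrative assumptions.]

```python
def personalized_pagerank(adj, seeds, alpha=0.85, iters=100):
    """Power iteration with restarts concentrated on `seeds`.

    adj maps each node to the set of its out-neighbors.
    Returns a dict of node -> rank mass (sums to 1).
    """
    nodes = list(adj)
    restart = {n: (1.0 / len(seeds) if n in seeds else 0.0) for n in nodes}
    rank = dict(restart)
    for _ in range(iters):
        nxt = {n: (1 - alpha) * restart[n] for n in nodes}
        for u in nodes:
            out = adj[u]
            if out:
                share = alpha * rank[u] / len(out)
                for v in out:
                    nxt[v] += share
            else:
                # Dangling node: send its mass back to the restart set
                for v in nodes:
                    nxt[v] += alpha * rank[u] * restart[v]
        rank = nxt
    return rank

# Toy path graph a - b - c, seeded at a (hypothetical data)
adj = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}
pr = personalized_pagerank(adj, seeds={"a"})
```

Thresholding the resulting rank vector is one standard way to extract a local cluster around the seeds without touching the whole graph, which matches the "seeded clustering" reading.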

<authors>, <title>, obv

10.02.2026 16:54 — 👍 0    🔁 0    💬 0    📌 0

Since bibliographies were/are in the conversation recently, a random tip on bib health: if you cite an arXiv paper with copy-paste from some default channels, which renders in references as <title>, <year>, <arxiv url>, it will very rarely get picked up by Google Scholar.

10.02.2026 16:50 — 👍 5    🔁 0    💬 1    📌 0
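[Editorial aside: a sketch of what "authors, title" in the entry looks like in practice, as a BibTeX record with explicit author and title fields rather than a bare title-plus-URL. Every field value here is invented for illustration, not a real reference.]

```bibtex
@misc{doe2024example,
  author        = {Doe, Jane and Roe, Richard},
  title         = {An Example Title},
  year          = {2024},
  eprint        = {2401.00000},
  archivePrefix = {arXiv},
  primaryClass  = {stat.ML}
}
```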

What do you mean by locally invert?

10.02.2026 16:48 — 👍 0    🔁 0    💬 1    📌 0

Not at all. The people who were good at it stayed far away and thought it was mafia money. He was more of a wheeler-dealer PE corporate raider guy. Also, options are easy (with outsized returns due to the leverage) if you have insider information (check his trade in LinkedIn before the acquisition by MS).

10.02.2026 16:47 — 👍 2    🔁 0    💬 1    📌 0

Prof. P. P. Divakaran (1936-2025) passed away last year. An appreciation by those who knew him.
bhavana.org.in/puthan-puray...

09.02.2026 00:02 — 👍 3    🔁 2    💬 0    📌 0

I like that it's something that people enjoy. :)

09.02.2026 02:21 — 👍 0    🔁 0    💬 0    📌 0

I will share a list. Vovk doesn't write in an accessible way IMO. It's part of the reason why the overhype/popularity of the framework came from papers on it by others (Larry Wasserman and Jing Lei at first, then Tibshirani and Barber, and then Candès and Jordan).

09.02.2026 02:19 — 👍 1    🔁 0    💬 0    📌 0

Usually learn of the Super Bowl (on some years) because I might go to some of my usual hangs, and they are filled with loud people (a bit out of character for said places). I want to say that I hate this shit, but that's too mean. In some ways, it's "better" than the spectacle around cricket matches.

09.02.2026 01:39 — 👍 1    🔁 0    💬 1    📌 0

No shame in trying to copy too e.g. x.com/_onionesque/...

08.02.2026 02:43 — 👍 0    🔁 0    💬 0    📌 0

For example, anyone who knows me and talks to me about conformal prediction is always recommended the same six papers. None of them has Vovk as an author.

08.02.2026 02:21 — 👍 0    🔁 0    💬 2    📌 0

Every once in a while I will come across an arXiv submission with a particular author, and I think: OK, I will get to learn about this area, even if I don't understand their result/paper. Rina Barber is one such author.

08.02.2026 02:20 — 👍 6    🔁 1    💬 1    📌 0
