
Piyush Mishra

@peeyoushh.bsky.social

PhD student, percussionist
piyushmishra12.github.io

32 Followers  |  78 Following  |  11 Posts  |  Joined: 07.12.2024

Latest posts by peeyoushh.bsky.social on Bluesky

Congratulations Alice 🎉

11.10.2025 15:53 — 👍 0    🔁 0    💬 0    📌 0
To Understand AI, Watch How It Evolves | Quanta Magazine Naomi Saphra thinks that most research into language models focuses too much on the finished product. She’s mining the history of their training for insights into why these systems work the way they…

“Interpretability research should be interpretable.”

26.09.2025 20:04 — 👍 17    🔁 7    💬 0    📌 0

Scaffolded hiPSC liver organoids recapitulating bile duct tubulogenesis and periportal architecture. https://www.biorxiv.org/content/10.1101/2025.08.28.672864v1

02.09.2025 12:30 — 👍 1    🔁 1    💬 0    📌 0

Very excited by this paper by @peeyoushh.bsky.social in my lab. We demonstrated the increased robustness of LLMs/transformers for tracking at scale, but also their pitfalls: when the combinatorics gets simpler (and not just the modeling!), stats work best. Tons of potential for hybrid approaches :).

17.01.2025 16:08 — 👍 6    🔁 3    💬 0    📌 0
Video | The "Gare des Étoiles", a café-restaurant run by young people with Down syndrome in Niolon. Formerly abandoned, the Niolon station on the Côte Bleue has been transformed into a café-restaurant and guesthouse by the T'Cap21 association. Le Train Inc Caf

🚂 "Un lieu de travail ordinaire pour des gens extraordinaires". Des jeunes en situation de handicap mental se forment aux métiers de l’hospitalité à la "Gare des Étoiles" de Niolon.

26.12.2024 08:46 — 👍 10    🔁 4    💬 0    📌 0
AI has an environmental problem. Here’s what the world can do about that. The sprawling data centres that house AI servers churn out toxic electronic waste and are voracious consumers of electricity, which in most places is still produced from fossil fuels.

"A request made through ChatGPT, an AI-based virtual assistant, consumes 10 times the electricity of a Google Search"

This isn't a game. This is accelerated destruction via huge usage of power, water, resources, etc. Stop using it - especially for messing about!!

www.unep.org/news-and-sto...

28.12.2024 08:02 — 👍 3004    🔁 1779    💬 83    📌 106
Post image (ALT: a woman stands in front of a large sign that says hannah montana)

So the Bayesian approach is great for the actual smoothing, but transformers are remarkable for pruning the hypothesis-set. Can we hybridise to use the best of both worlds? Stay tuned :)

23.12.2024 16:08 — 👍 0    🔁 0    💬 0    📌 0
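A minimal sketch of what that hybrid could look like (my illustration, not the method from the paper; the function names and parameters are assumptions, and a toy distance heuristic stands in for the transformer): a learned scorer prunes the n! assignment hypotheses per frame, and exact Bayesian scoring runs only on the survivors.

```python
import numpy as np
from itertools import permutations

def bayesian_over_pruned(prev, dets, scorer, k=3, sigma=0.5):
    """Score only the top-k assignment hypotheses kept by `scorer`.

    prev: (n, 2) previous positions; dets: (n, 2) current detections.
    """
    n = len(prev)
    hyps = list(permutations(range(n)))        # all n! detection-to-track assignments
    scores = np.array([scorer(prev, dets[list(h)]) for h in hyps])
    keep = np.argsort(scores)[-k:]             # pruning step (the transformer's job)
    # Exact Gaussian log-likelihood, but only over the pruned hypothesis set.
    loglik = np.array([-np.sum((dets[list(hyps[i])] - prev) ** 2) / (2 * sigma ** 2)
                       for i in keep])
    post = np.exp(loglik - loglik.max())
    post /= post.sum()                         # Bayesian posterior over the survivors
    return [hyps[i] for i in keep], post

# Toy stand-in for the learned scorer: prefer small total displacement.
heuristic = lambda prev, assigned: -np.linalg.norm(assigned - prev, axis=1).sum()

prev = np.array([[0.0, 0.0], [1.0, 1.0]])
dets = np.array([[0.1, 0.0], [1.0, 0.9]])
print(bayesian_over_pruned(prev, dets, heuristic))
```

In the hybrid idea, the transformer would supply the pruning scores, so the Bayesian smoother only ever sees k hypotheses instead of n!.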
Post image

We thus see the emergence of two regimes: one with a lower number of hypotheses (where the Bayesian approach is unmatched) and another with a higher number of hypotheses (where transformers take the lead).

23.12.2024 16:08 — 👍 0    🔁 0    💬 1    📌 0
Post image

While the transformer is heavier at lower lookback, the compute of the Bayesian method increases super-exponentially as the lookback grows! This is a perfect illustration of the combinatorial challenge of tracking, and of how transformers could help resolve it.

23.12.2024 16:08 — 👍 0    🔁 0    💬 1    📌 0
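To put a toy number on that growth, here is one simplified counting argument (a simplifying assumption of mine: no gating, no missed detections; the paper's exact hypothesis model may differ):

```python
from math import factorial

# With n particles and no pruning, each frame in the lookback window can be
# linked in up to n! ways, so a window of L frames compounds to (n!)**L
# candidate association hypotheses.
def num_hypotheses(n_particles: int, lookback: int) -> int:
    return factorial(n_particles) ** lookback

for L in (1, 2, 4, 8):
    print(f"lookback {L}: {num_hypotheses(5, L):.3e} hypotheses")
# 5 particles: 120 hypotheses at lookback 1, ~4.3e16 at lookback 8.
```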
Post image

Not only is the transformer suboptimal, it remains suboptimal even when the Bayesian method is optimal (hint: AI alignment problem). Increasing the amount of data starts decreasing the accuracy!

23.12.2024 16:08 — 👍 2    🔁 0    💬 1    📌 0
Post image

But what if we had a world where this was possible (i.e., short sequences of 8 time steps, hence fewer hypotheses)? No matter how much we train the transformer, it never matches the optimal performance!

23.12.2024 16:08 — 👍 0    🔁 0    💬 1    📌 0

This suggests that increasing the past information available from the sequence improves robustness for both strategies. So if the Bayesian approach could access all the past information of the sequence, it should be optimal! But doing so is intractable in realistic scenarios!

23.12.2024 16:08 — 👍 0    🔁 0    💬 1    📌 0
Post image

Transformers are robust when dealing with large amounts of information. On increasing the noise (for 2 particles undergoing Brownian motion for 150 time steps), we see the accuracy breakpoint pushed further out in all cases. Increasing the sequence lookback pushes it out even further!

23.12.2024 16:08 — 👍 0    🔁 0    💬 1    📌 0
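The kind of synthetic scene described here is easy to reproduce; a minimal sketch with assumed parameter values (not the paper's exact benchmark):

```python
import numpy as np

rng = np.random.default_rng(0)
n_particles, n_steps = 2, 150       # two particles, 150 time steps
diffusion, noise_sigma = 1.0, 0.5   # assumed motion and detection noise levels

# True trajectories: 2-D Brownian motion = cumulative Gaussian displacements.
steps = rng.normal(scale=np.sqrt(diffusion), size=(n_steps, n_particles, 2))
tracks = np.cumsum(steps, axis=0)

# Noisy detections: all the tracker ever observes; raising noise_sigma is
# how the robustness-to-noise experiment gets harder.
detections = tracks + rng.normal(scale=noise_sigma, size=tracks.shape)
```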

The Bayesian multiple hypothesis tracking approach is the theoretically optimal solution, but it can only handle a certain number of hypotheses before it becomes intractable. We look at where the switch happens and what we can do about it.

23.12.2024 16:08 — 👍 0    🔁 0    💬 1    📌 0
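To see why the exact approach hits a wall, a toy enumeration (an illustrative model, not the paper's implementation): every frame multiplies each surviving hypothesis by all possible detection-to-track assignments.

```python
from itertools import permutations

def enumerate_hypotheses(n_particles: int, n_frames: int):
    # Each hypothesis is a tuple of per-frame assignments; with no pruning
    # the hypothesis set multiplies by n! at every frame.
    hyps = [()]
    for _ in range(n_frames):
        hyps = [h + (p,) for h in hyps for p in permutations(range(n_particles))]
    return hyps

for t in range(1, 6):
    print(f"{t} frames -> {len(enumerate_hypotheses(3, t))} hypotheses")
# 3 particles: 6, 36, 216, 1296, 7776; exact MHT has to score every one.
```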

We know that transformers work well. But should we just replace all our previous techniques with transformers and call it a day? (Spoiler: no)

23.12.2024 16:08 — 👍 0    🔁 0    💬 1    📌 0
Comparative study of transformer robustness for multiple particle tracking without clutter The tracking of multiple particles in lengthy image sequences is challenged by the stochastic nature of displacements, particle detection errors, and the combinatorial explosion of all possible traje...

This year, we (Philippe Roudot and I) published my first PhD paper on the usability of transformers for tracking problems at EUSIPCO! Find the open access version here and/or continue reading the thread :)
hal.science/hal-04619330/

23.12.2024 16:08 — 👍 0    🔁 0    💬 1    📌 1
