
Ben Grimmer

@profgrimmer.bsky.social

Assistant Professor @JohnsHopkinsAMS. Works in mathematical optimization. Mostly here to share pretty maths/3D prints, sometimes sharing my research.

518 Followers  |  247 Following  |  77 Posts  |  Joined: 20.09.2023

Latest posts by profgrimmer.bsky.social on Bluesky


Post image

The pattern continues. We can fractally build an N=31 self-dual pattern from two N=15 patterns or four N=7 patterns, carefully sewn together (pun intended).

I'll stop sewing after I finish N=63 :)
Stay tuned for an upcoming paper where this has unexpected algorithmic/engineering value

12.02.2026 18:05 — 👍 2    🔁 0    💬 1    📌 0
Post image

The partition {1,2}{3} above is "self-dual". To see this, note that I numbered the 1', 2', 3' nodes counterclockwise. We can use this self-dual partition of size N=3 to build a self-dual partition of size N=7 recursively.

Physically, self-dual == the "dream-catcher" mirrors blue and green
3/4

12.02.2026 18:05 — 👍 0    🔁 0    💬 1    📌 0
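A small aside on why all these sizes (3, 7, 15, 31, 63) come out odd: a classical identity of Kreweras forces it. This annotation is mine, using the dual K(P) defined in the next post down; it is not a claim from the upcoming paper.

```latex
\[
  |P| + |K(P)| = N + 1
  \quad\Longrightarrow\quad
  \text{self-dual } (|P| = |K(P)|) \;\Rightarrow\; N = 2|P| - 1 \text{ is odd.}
\]
% Check on blue {1,2}{3}: |P| = 2 and its dual {1'}{2',3'} also has 2 blocks,
% so 2 + 2 = 3 + 1 as claimed.
```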
Post image

A noncrossing partition of {1, ..., N} requires that if you connect grouped numbers around a circle by string, no strings cross. Blue is {1,2}{3}.
The "Kreweras" dual adds numbers 1', ..., N', one between each consecutive pair, and takes the maximal noncrossing partition of them that can be drawn without crossing the original. Green is dual to blue.
2/4

12.02.2026 18:05 — 👍 0    🔁 0    💬 1    📌 0
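For anyone who wants to play along at home, here is a minimal brute-force sketch of both definitions above (my own illustrative Python, not code from the upcoming paper; the function names are mine):

```python
from itertools import combinations

def partitions(elements):
    """Yield every set partition of a list (fine for small N)."""
    if not elements:
        yield []
        return
    first, rest = elements[0], elements[1:]
    for smaller in partitions(rest):
        for i, block in enumerate(smaller):
            yield smaller[:i] + [block + [first]] + smaller[i + 1:]
        yield [[first]] + smaller

def noncrossing(blocks):
    """Noncrossing iff no a < b < c < d has a, c in one block and b, d in another."""
    for B1, B2 in combinations(blocks, 2):
        for a, c in combinations(sorted(B1), 2):
            if any(a < b < c for b in B2) and any(d < a or d > c for d in B2):
                return False
    return True

def kreweras(P, n):
    """Brute-force Kreweras dual: put i at position 2i-1 and i' at position 2i
    around the circle, then return the coarsest partition Q of the primes whose
    strings can be drawn alongside P's without any crossings."""
    P_pos = [[2 * i - 1 for i in block] for block in P]
    best = None
    for Q in partitions(list(range(1, n + 1))):
        Q_pos = [[2 * i for i in block] for block in Q]
        if noncrossing(P_pos + Q_pos) and (best is None or len(Q) < len(best)):
            best = Q
    return best

# Blue {1,2}{3} from the post: its dual is {1'}{2',3'} (up to block ordering),
# which matches {1,2}{3} after renumbering the primes counterclockwise.
print(kreweras([[1, 2], [3]], 3))
```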
5 circles with yarn between them representing non-crossing partitions and their duals

Lately, non-crossing partitions, which have a lovely duality structure, have shown up out of nowhere in my research. This inspired some good art and fractals :)

Wanted to share the fun here (just sharing the pretty art for now, the research story will come in due time)
1/4

12.02.2026 18:05 — 👍 3    🔁 1    💬 1    📌 0
Preview
Optimization Letters: covers all aspects of optimization, including theory, algorithms, computational studies, and applications. This journal provides an ...

Happy to announce that my work "On optimal universal first-order methods for minimizing heterogeneous sums" just received the Optimization Letters Best Paper Prize.

link.springer.com/journal/1159...

This work is part of a larger trend of fighting the brittleness of classic smooth/nonsmooth models.

02.02.2026 15:42 — 👍 8    🔁 0    💬 0    📌 1
Post image

Sunday morning spent setting up my office in the new @hopkinsdsai.bsky.social building. I gained a good deal more wall space, so I have the freedom to grow my collections again

11.01.2026 17:42 — 👍 7    🔁 0    💬 0    📌 0
Post image

Join us in advancing data science and AI research! The Johns Hopkins Data Science and AI Institute Postdoctoral Fellowship Program is now accepting applications for the 2026–2027 academic year. Apply now! Deadline: Jan 23, 2026. Details and apply: apply.interfolio.com/179059

19.12.2025 13:29 — 👍 11    🔁 9    💬 0    📌 5
Preview
An Elementary Proof of the Near Optimality of LogSumExp Smoothing: We consider the design of smoothings of the (coordinate-wise) max function in $\mathbb{R}^d$ in the infinity norm. The LogSumExp function $f(x)=\ln(\sum^d_i\exp(x_i))$ provides a classical smoothing, ...

A link for those interested in reading 🤓
arxiv.org/abs/2512.10825
(4/4)

13.12.2025 04:35 — 👍 3    🔁 0    💬 0    📌 0

The "bad" news: Despite being *nearly* optimal, we show for fixed small dimensions that strictly better smoothings exist, approximating the max function more closely and attaining our lower bound. So LogSumExp is only nearly, not exactly, minimax optimal. (3/4)

13.12.2025 04:35 — 👍 3    🔁 0    💬 1    📌 0

LogSumExp is within 20% of a lower bound we derive on how good *any* similar smoothing can be. The proof just combines inequalities for smooth convex functions, no heavy machinery needed.

The good news: We aren't leaving much on the table by choosing LogSumExp. (2/4)

13.12.2025 04:35 — 👍 0    🔁 0    💬 1    📌 0
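A quick numeric sanity check of the classical facts in the background here (this sketch is mine and does not reproduce our new lower bound): the scaled smoothing f_mu(x) = mu ln(sum_i exp(x_i/mu)) always sits between max(x) and max(x) + mu ln(d).

```python
import numpy as np

def logsumexp_smoothing(x, mu):
    """f_mu(x) = mu * ln(sum_i exp(x_i / mu)), a smooth upper bound on max(x)."""
    z = x / mu
    z_max = z.max()  # shift before exponentiating for numerical stability
    return mu * (z_max + np.log(np.exp(z - z_max).sum()))

rng = np.random.default_rng(0)
d, mu = 10, 0.1
for _ in range(5):
    x = rng.normal(size=d)
    gap = logsumexp_smoothing(x, mu) - x.max()
    # classical guarantee: 0 <= gap <= mu * ln(d), tight at constant vectors
    assert 0 <= gap <= mu * np.log(d) + 1e-12
    print(f"max = {x.max():+.4f}   gap = {gap:.4f}   mu*ln(d) = {mu * np.log(d):.4f}")
```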

My student Thabo Samakhoana and I have been obsessed with smoothings lately. The softmax/logSumExp smoothing seems to be the standard everywhere in ML and optimization.
So, in what sense is this choice "optimal"?

We found some "elementary" answers, both good and bad news (1/4)

13.12.2025 04:35 — 👍 4    🔁 0    💬 1    📌 0

For those interested in reading 🤓
arxiv.org/pdf/2511.14915

20.11.2025 03:45 — 👍 1    🔁 0    💬 0    📌 0
Post image

This polynomial characterization opens a lot of new directions in algorithm design. As a 3D printing enthusiast, I was quick to want to visualize the set of optimal methods.

Below is the region (living in 6 dimensions) of optimal 3-step methods that happens to sit nicely in 3D. 4/

20.11.2025 03:45 — 👍 3    🔁 0    💬 1    📌 0
Post image

Our new work provides a complete description of all minimax-optimal methods. We give a set of polynomial equalities that every optimal method must satisfy ("H invariants") and, similarly, a necessary set of polynomial inequalities ("H certificates").

Together these are "if and only if"!! 3/

20.11.2025 03:45 — 👍 2    🔁 0    💬 1    📌 0
Post image

This is a classic type of problem; fixed points are a broad modelling tool, capturing, for example, gradient descent

In terms of algorithm design (my interest): In recent years the community pinned down an optimal method (Halpern) but showed that infinitely many others exist 2/

20.11.2025 03:45 — 👍 1    🔁 0    💬 1    📌 0
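For the curious, a minimal sketch (mine, illustrative only) of the Halpern iteration mentioned above, x_{k+1} = beta_k x0 + (1 - beta_k) T(x_k) with anchoring weights beta_k = 1/(k+2), run on a toy gradient-step operator:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(20, 20))
A = M.T @ M                              # positive semidefinite Hessian
L = np.linalg.eigvalsh(A).max()          # smoothness constant of f(x) = x'Ax/2
T = lambda x: x - (1.0 / L) * (A @ x)    # gradient step: a nonexpansive operator

x0 = rng.normal(size=20)
x = x0.copy()
for k in range(1000):
    beta = 1.0 / (k + 2)                 # Halpern's anchoring weights
    x = beta * x0 + (1 - beta) * T(x)    # pull each iterate back toward x0

print("fixed-point residual ||T(x) - x|| =", np.linalg.norm(T(x) - x))
```

The residual decays at the optimal O(1/k) rate; our paper characterizes every method matching it.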
Post image

A new paper out with TaeHo Yoon and Ernest Ryu:
We looked at the design of optimal fixed-point algorithms.
That is, seeking to approximately solve T(y)=y using as few evaluations of the operator T() as possible. Maximally efficient methods are "minimax optimal" 1/

20.11.2025 03:45 — 👍 6    🔁 0    💬 1    📌 0

It's all performance estimation under the hood :)
That tool does wonders for conceptual framing

19.11.2025 13:38 — 👍 1    🔁 0    💬 0    📌 0

Some links for those interested 🤓
Smooth convex: arxiv.org/abs/2412.06731
Adaptive smooth convex: arxiv.org/abs/2510.21617
Nonsmooth convex: arxiv.org/abs/2511.13639

18.11.2025 14:58 — 👍 4    🔁 0    💬 1    📌 0
Post image

We have done a wide range of numerics in smooth, convex settings where our resulting subgame perfect gradient methods (SPGM) compete with state-of-the-art L-BFGS and beat existing adaptive gradient methods in both iteration count and real time.

I am excited about the future here :)
4/

18.11.2025 14:58 — 👍 2    🔁 0    💬 1    📌 0
Post image

In a series of works, the newest showing up on arXiv TODAY, we show that this strengthened standard is surprisingly attainable!

Today we proved a method of Drori and Teboulle (2014) is a subgame perfect subgradient method and designed a new subgame perfect proximal method 3/

18.11.2025 14:58 — 👍 4    🔁 0    💬 1    📌 0
Post image

Rather than asking a method to do the best on the worst-case problem, we should ask that, as it sees first-order information, our algorithm updates to do the best against the worst problem **with those gradients**
This demands a dynamic form of optimality, called subgame perfection. 2/

18.11.2025 14:58 — 👍 5    🔁 0    💬 1    📌 0
Post image

Lately, I have been obsessed with developing theoretically based optimization algorithms that actually attain the best practical performance.
Alas, the classic model of minimax optimal methods is overly conservative; it overfits by tuning to its worst case.
We found a path forward 1/

18.11.2025 14:58 — 👍 15    🔁 4    💬 2    📌 0
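(None of the code below is SPGM; see the arXiv links upthread for the real thing. It is just a hypothetical numpy toy for the "overly conservative" point above: a step size tuned to the worst case can be noticeably outpaced on easy instances by a method that adapts to the gradients it actually sees.)

```python
import numpy as np

rng = np.random.default_rng(2)
d = 50
diag = np.linspace(1.0, 100.0, d)         # diagonal quadratic, L = 100
f = lambda x: 0.5 * np.sum(diag * x**2)
grad = lambda x: diag * x
L = diag.max()

x0 = rng.normal(size=d)
x_fix, x_ls = x0.copy(), x0.copy()
for _ in range(100):
    x_fix = x_fix - (1.0 / L) * grad(x_fix)   # worst-case-safe fixed step
    g = grad(x_ls)
    t = (g @ g) / (g @ (diag * g))            # exact line search (quadratics)
    x_ls = x_ls - t * g

print(f"fixed 1/L step: f = {f(x_fix):.3e}")
print(f"line search:    f = {f(x_ls):.3e}")   # typically noticeably smaller
```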
Post image

Enjoyed being part of the Brin Mathematical Research Center's summer school on Scientific Machine Learning last week. Many very good talks and always nice to visit UMD!

14.08.2025 20:28 — 👍 2    🔁 0    💬 0    📌 0
Post image

You'll have to read the paper if you want the maths defining these extremal smoothings for any sublinear function and convex cone. I now have a whole family of optimal smoothing Russian nesting dolls living in my office.

Enjoy: arxiv.org/abs/2508.06681

12.08.2025 14:38 — 👍 1    🔁 0    💬 0    📌 0
Post image

If instead you wanted the optimal outer smoothings (i.e., sets containing K), there is a similar spectrum of optimal smoothings: everything between the minimal and maximal sets shown below.

12.08.2025 14:38 — 👍 0    🔁 0    💬 1    📌 0
Post image

If we restrict to inner smoothings (i.e., subsets of K), it turns out there are infinitely many sets attaining the optimal level of smoothness. Our theory identifies a minimal and a maximal such smoothing, shown below (the nesting dolls from before).

12.08.2025 14:38 — 👍 0    🔁 0    💬 1    📌 0
Post image

To do something more nontrivial, consider the exponential cone K={(x,y,z) | z >= y exp(x/y)}, which is foundational to geometric programming. The question: what is the smoothest set differing from this cone by at most distance one anywhere?
My 3D print of this cone is below :)

12.08.2025 14:38 — 👍 1    🔁 0    💬 1    📌 0
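To make the cone concrete, a tiny hypothetical membership test (the function name is mine); note the closure as y -> 0+ adds the ray {x <= 0, y = 0, z >= 0}:

```python
import math

def in_exp_cone(x, y, z, tol=1e-12):
    """Membership in K = closure{(x, y, z) : y > 0, z >= y * exp(x / y)}."""
    if y > 0:
        return z >= y * math.exp(x / y) - tol
    # boundary of the closure: the ray {x <= 0, y = 0, z >= 0}
    return abs(y) <= tol and x <= tol and z >= -tol

print(in_exp_cone(0.0, 1.0, 1.0))   # e^0 = 1: on the boundary -> True
print(in_exp_cone(1.0, 1.0, 2.0))   # 2 < e: outside -> False
```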
Post image Post image

For example, you could invent many smoothings of the two-norm (five given below). In this case, the Moreau envelope gives the optimal outer smoothing. If you wanted the best smoothing of the second-order cone (the epigraph of the two-norm), a different smoothing is optimal.

12.08.2025 14:38 — 👍 1    🔁 0    💬 1    📌 0
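For concreteness, a small sketch (mine, with assumed names) of that Moreau envelope of the two-norm: its closed form is the familiar Huber-type function, quadratic near zero and ||x|| - mu/2 far away, checked against the defining minimization env(x) = min_u ||u|| + ||x - u||^2/(2 mu) by a 1-D scan (the minimizer lies on the segment [0, x]):

```python
import numpy as np

def huber_env(x, mu):
    """Moreau envelope of ||.||_2: quadratic near zero, ||x|| - mu/2 far away."""
    r = np.linalg.norm(x)
    return r**2 / (2 * mu) if r <= mu else r - mu / 2

mu, x = 0.5, np.array([0.6, -0.4])
r = np.linalg.norm(x)
ts = np.linspace(0.0, 1.0, 100001)          # scan candidate minimizers u = t * x
vals = ts * r + (1 - ts) ** 2 * r**2 / (2 * mu)
print(huber_env(x, mu), vals.min())         # the two values should agree closely
```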
Post image

📢 Excited to share a new paper with PhD student Thabo Samakhoana. Nonsmooth optimization often uses smoothings: nearby smooth functions or sets, often chosen in an ad hoc fashion.

We do away with the ad hoc, characterizing optimal smoothings of convex cones and sublinear functions

12.08.2025 14:38 — 👍 3    🔁 0    💬 1    📌 0

Oh, that’s so satisfying! I stopped at the 4-norm ball thinking I had the solution as it fits the hole like a pot lid (has a perfect circle as an intersection).

05.08.2025 19:36 — 👍 2    🔁 1    💬 0    📌 0
