@mattmucklm.bsky.social
Research Engineer, Meta Fundamental AI Research (FAIR). ML for compression, computer vision, medicine. https://mmuckley.github.io/
Very strong results on SSv2 and action anticipation, plus zero-shot robotics planning! And we also attached an LLM to the vision encoder and got strong numbers on PerceptionTest!
Check out the blog post (with link to paper and GitHub) above!
Very excited to share V-JEPA 2! I've been working on the encoder pretraining pipeline and data curation for this model the last few months, and am excited for it to finally be out!
ai.meta.com/blog/v-jepa-...
If you'd like to try it yourself, the code has been added to our GitHub repository!
07.01.2025 14:46

Qinco2 builds on Qinco with several optimizations, including beam search (which increases accuracy at some extra compute) and pre-selection (which decreases compute). On balance, this leads to a more efficient method for similarity search.
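As a toy illustration of the beam-search and pre-selection ideas in the post above, here is a sketch in numpy. All sizes (DIM, K, M, BEAM, SHORTLIST) are made up, and plain per-step codebooks stand in for Qinco2's neural codebooks; this is not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only: DIM-d vectors, K codewords, M quantization steps.
DIM, K, M = 8, 16, 4
BEAM, SHORTLIST = 3, 8  # beam width and pre-selection size (made up here)

# Plain per-step codebooks stand in for Qinco2's neural codebooks.
codebooks = rng.normal(size=(M, K, DIM))

def encode_beam(x):
    """Residual quantization with beam search and codeword pre-selection."""
    beam = [([], np.zeros(DIM))]  # hypotheses: (codes so far, reconstruction)
    for m in range(M):
        candidates = []
        for codes, recon in beam:
            residual = x - recon
            # Pre-selection: cheap shortlist of codewords before full scoring.
            dists = ((codebooks[m] - residual) ** 2).sum(axis=1)
            for idx in np.argsort(dists)[:SHORTLIST]:
                candidates.append((codes + [int(idx)], recon + codebooks[m][idx]))
        # Keep the BEAM hypotheses with the lowest reconstruction error.
        candidates.sort(key=lambda c: float(((x - c[1]) ** 2).sum()))
        beam = candidates[:BEAM]
    return beam[0]

x = rng.normal(size=DIM)
codes_b, recon_b = encode_beam(x)
```

With BEAM = 1 and SHORTLIST = K this reduces to plain greedy residual quantization; larger beams trade compute for accuracy, and smaller shortlists trade a little accuracy for less compute.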
07.01.2025 14:46

The Qinco2 architecture builds on our previous Qinco work, which uses a neural network to implicitly parametrize codebooks for residual quantization. At each quantization step, a neural network is used in conjunction with the current vector to predict the next update.
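A minimal sketch of the residual-quantization idea described above, with a step-conditioned ("implicit") codebook. A fixed random linear map stands in for the learned network, and all dimensions are invented for illustration; this is not the Qinco model itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only (not the paper's): DIM-d vectors, K codewords, M steps.
DIM, K, M = 8, 16, 4

# One base codebook per residual-quantization step.
codebooks = rng.normal(size=(M, K, DIM))

# Stand-in for the neural network: Qinco conditions a learned model on the
# current reconstruction to produce an adapted codebook. Here a fixed random
# linear map plays that role, purely for illustration.
W = rng.normal(scale=0.1, size=(M, DIM, DIM))

def adapt_codebook(step, recon):
    """Toy implicit codebook: base codewords shifted by a function of recon."""
    shift = recon @ W[step]         # (DIM,)
    return codebooks[step] + shift  # broadcast over the K codewords

def encode(x):
    """Greedy residual quantization with step-conditioned codebooks."""
    recon = np.zeros(DIM)
    codes = []
    for m in range(M):
        cb = adapt_codebook(m, recon)
        residual = x - recon
        idx = int(np.argmin(((cb - residual) ** 2).sum(axis=1)))
        codes.append(idx)
        recon = recon + cb[idx]
    return codes, recon

def decode(codes):
    """Replay the adapted codebooks from the stored codes."""
    recon = np.zeros(DIM)
    for m, idx in enumerate(codes):
        recon = recon + adapt_codebook(m, recon)[idx]
    return recon

x = rng.normal(size=DIM)
codes, recon = encode(x)
```

Because the codebook at each step is a deterministic function of the reconstruction so far, the decoder can regenerate the exact same codewords from the integer codes alone.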
07.01.2025 14:46

We just published "Qinco2: Vector Compression and Search with Improved Implicit Neural Codebooks" on arXiv, work led by our talented intern, Theophane Vallaeys.
Qinco2 achieves as much as a 40-60% error reduction for vector compression, as well as better performance for approximate similarity search.
Yes exactly.
Depending on how much you mutate it, keeping such libraries can also be very useful for reproducibility (which I didn't mention above).
But this can be difficult for other people who are trying to do something that doesn't fit in your framework (which happens often in research). There's always one or two things that simply don't fit. As a result, I'm finding myself writing more hacky/prototyping code these days.
13.12.2024 15:26

In my PhD much of my code was hacky, and I think this set me back quite a bit. At some point I overcorrected towards building complex frameworks for my work, which let me try a lot of things (so long as I stayed within my own framework). This is more or less what you see in NeuralCompression.
13.12.2024 15:26

One thing I've found in research is the constant tension between "prototype" code and "engineered" code.
Prototyped code is often a bit hacky, but gets the job done. But if you ever need to extend it, it can be quite a pain.
Engineered code usually has some overarching design philosophy that makes it easier to extend, so long as you stay within that philosophy.
Is there a particular reason this is considered an anti-pattern? I'm actually curious.
04.12.2024 17:12

For MRI folks: we just rolled out a new release of torchkbnufft, the first in a couple of years.
The changes are for working with newer package versions. Things now work on NumPy 2.0, and a few deprecations are fixed. Other than that, it's the same as before :). Get it with
`pip install torchkbnufft`
I actually think there was quite a bit of spam here a month or two ago and it's already gotten better. Not sure if that's due to an effort on the part of the site admins or just basic engagement numbers shifting what gets into a feed.
24.11.2024 21:53

Good thoughts, some of which I've learned from trial and error over the years.
The advice about centering things on technical points is also useful for academic publishing and the review process. It really helps defuse what tends to be an adversarial relationship with reviewers (or authors).
I am sure we have reached only a small fraction of New York's ML community on bsky. Please repost this if you think there may be interested people close to you in the social graph.
22.11.2024 14:14

Please make machine learners who still are children.
20.11.2024 13:56
19.11.2024 19:52

I'm here!
19.11.2024 17:33

Kinda has the y2k gaming energy (but without the CRT monitor)
19.11.2024 13:27

HELLO Hello hello hellooo...
18.11.2024 16:01