Self-supervised perception for tactile skin covered dexterous hands
We present Sparsh-skin, a pre-trained encoder for magnetic skin sensors distributed across the fingertips, phalanges, and palm of a dexterous robot hand. Magnetic tactile skins offer a flexible form f...
This was an amazing collaboration at @aiatmeta.bsky.social & @cmurobotics.bsky.social: @carohiguera.bsky.social, @mukadammh, @francois_hogan, and many others
Code coming soon! Check out our paper: arxiv.org/abs/2505.11420 &
Website: akashsharma02.github.io/sparsh-skin-... for more details
6/6
27.05.2025 14:48
We also show improvements on many tactile perception tasks, such as force estimation, pose estimation, and full-hand joystick state estimation.
5/6
27.05.2025 14:47
With this, we see a 75% improvement in real-world tactile plug insertion over end-to-end training with vision and touch:
4/6
27.05.2025 14:46
We pretrain Sparsh-skin with 4 hours of unlabeled data via self-distillation, and make several changes to get highly performant representations:
1. Decorrelate signals by tokenizing 1 s windows of tactile data.
2. Condition the encoder on the robot hand configuration by providing sensor positions as input.
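The two steps above can be sketched roughly as follows. This is a minimal illustration with assumed shapes (T samples per 1 s window, S taxels with 3-axis magnetic readings) and an assumed token length; the actual values and architecture are in the paper:

```python
import numpy as np

def tokenize_window(signals, positions, token_len=10):
    """Split a 1 s window of skin readings into per-sensor tokens,
    conditioned on sensor position.

    signals:   (T, S, 3) array - T timesteps, S taxels, 3-axis reading.
    positions: (S, 3) array of taxel positions on the hand.
    token_len, shapes are illustrative assumptions, not the paper's values.
    """
    T, S, C = signals.shape
    n_tokens = T // token_len
    # Chunk the window in time, one token per (sensor, time-chunk),
    # so tokens decorrelate the signals across the 1 s window.
    chunks = signals[: n_tokens * token_len].reshape(n_tokens, token_len, S, C)
    # Reorder to sensor-major tokens: (S * n_tokens, token_len * C).
    tokens = chunks.transpose(2, 0, 1, 3).reshape(S * n_tokens, token_len * C)
    # Append each token's sensor position so the encoder sees
    # the hand configuration as input.
    pos = np.repeat(positions, n_tokens, axis=0)
    return np.concatenate([tokens, pos], axis=1)
```

For example, a 100-sample window over 4 taxels with `token_len=10` yields 40 tokens of dimension 33 (30 signal values plus the 3-D sensor position).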
3/6
27.05.2025 14:45
Sparsh-skin is an approach for pretraining encoders for magnetic skin sensors on a dexterous robot hand.
It improves performance on tactile tasks by over 56% compared to end-to-end methods and by over 41% compared to prior work.
It is trained via self-supervision for the Xela sensor, so no labeled data is needed!
2/6
27.05.2025 14:44
Robots with human-like hands need touch to reach the goal of general manipulation. However, today's approaches either ignore tactile sensing or use a task-specific architecture for each tactile task.
Can one model improve many tactile tasks?
Introducing Sparsh-skin: tinyurl.com/y935wz5c
1/6
27.05.2025 14:44
I might sound salty, but I never got how 'outstanding reviewers' are chosen. Until then, part of the 'mediocre reviewers' gang it is!
12.05.2025 23:15
Sparsh: Self-supervised touch representations for vision-based tactile sensing
Check out my work to know more:
1. Sparsh: tactile representations for vision-based sensors sparsh-ssl.github.io
2. [Releasing soon] Sparsh-skin: tactile representations for full-hand magnetic skins
3. [Coming soon] Representations for multimodal touch, fusing tactile images, audio, motion, and pressure
11.05.2025 13:31
We took a matter-of-fact approach for a robotics conference, and it backfired too.
11.05.2025 12:53
I asked "on the other platform" what were the most important improvements to the original 2017 transformer.
That was quite popular and here is a synthesis of the responses:
28.04.2025 06:47
Heads up! The deadline for two #CVPR2025 Autonomous Grand Challenge tracks is May 10th, 2025:
1. NAVSIM v2 Challenge: huggingface.co/spaces/AGC20...
2. World Model Challenge by 1X: huggingface.co/spaces/1x-te...
28.04.2025 09:41
I love situations like this: in the pre-deep era (and following classical learning theory), people would have stopped training the white model at the red arrow, as the validation error increases. But no: the model first seems to learn unwanted shortcuts (overfitting wildly), then finds a way out.
27.03.2025 15:30
First page of the Instagram post
A new #CosmicDistanceLadder post on why lunar and solar eclipses tend to come in pairs (for instance, the solar eclipse next week is paired with the lunar eclipse from last week). www.instagram.com/p/DHkS3EcA40L
24.03.2025 04:42
A photo of Lerrel looking happy.
What would you love to know about #robot learning and decision making?
Later this season, I'll be chatting to Prof. Lerrel Pinto (@lerrelpinto.com) from NYU about using machine learning to train robots to adapt to new environments.
Send me your questions for Lerrel: robottalk.org/ask-a-question/
18.03.2025 10:11
We are looking for a student researcher to work on video understanding plus 3D, in Google DeepMind London. DM/Email me or pass it to someone if you feel it may be a good fit!
05.03.2025 20:43
The first measles death in the US in a decade -- the tragic, preventable death of a child whose parents chose not to protect them with vaccination -- should spark an immediate nation-wide campaign to ensure all children are protected against preventable diseases. Anything less is unconscionable.
26.02.2025 19:36
Congrats Eric, really cool stuff!
26.02.2025 02:07
Gearing up for our workshop on 4D Vision at @CVPR this June! Check out our line up of speakers and submit your work by Mar 28. Spread the word!
12.02.2025 13:35
Gumbel distribution - Wikipedia
At first I thought Poisson or log-normal, but after a bit of searching, maybe the Gumbel distribution: en.m.wikipedia.org/wiki/Gumbel_...
09.02.2025 14:38
Last night I found out that the NSF math postdoctoral fellowship I applied for is being deleted because it does not comply with Trump's executive orders on DEI in the federal government. I'm going to answer some FAQs and share some thoughts about this ordeal in this thread 1/n
08.02.2025 18:42
Now, see how life changes when you swap Control and Caps Lock!
08.02.2025 04:56
DexGen
Seeing some of the early results from DexterityGen was definitely a wow moment for me!
It doesn't take a lot to realize all the new opportunities a strong teleop system like this enables!
X thread: x.com/zhaohengyin/...
Link: zhaohengyin.github.io/dexteritygen/
08.02.2025 03:02
Not one VC would ever fund a startup to do the kind of hardcore optimization work that DeepSeek did.
Every VC firm should be asking themselves why.
28.01.2025 05:00
Are those gulab jamuns?
25.01.2025 03:21
just warms my heart to see how they're citing my stuff --
"some people have done some thing [7]"
"most work is inadequate [8]"
"unlike prior work [7,8,9], we don't suck"
7. Bigham
8. Bigham
9. Bigham
Them increasing my h-index is a joke on them!
17.01.2025 20:32
A new dawn, a golden era of boot licking before us. Unheralded, unimaginable forms of boot licking to be discovered
07.01.2025 14:20
It's kinda wild how much of ML is tradition. Not always in a bad way, just that there's so damn much that you're forced to rely on others' recommendations for models, hyperparameters, training sets, loss metrics, architectures, and quirky practices.
18.12.2024 17:27