A Boston Dynamics robot dog stands on a walkway through grass at the base of large cherry trees full of blooms
Close-up of the chest and head of a Boston Dynamics robot dog framed by trees full of cherry blossoms
Three smiling researchers dressed in casual clothing, one pulling a wagon, walk behind a Boston Dynamics robot dog with campus buildings and trees in the background
A researcher holding a handheld controller follows a Boston Dynamics robot dog down a building ramp while another researcher standing a few meters away holding a laptop looks on
If you visited the @uwcherryblossom.bsky.social, did you "spot" an unusual visitor among the blooms? Researchers in the @uofwa.bsky.social #UWAllen #robotics group recently took advantage of some nice weather to take our Boston Dynamics robot dog for a stroll around campus. #AI 1/5
02.05.2025 23:28
Thank you!
Yeah, we also went through a lot of the papers that tried to do long-range perception for the LAGR project.
Really cool to take inspiration from work that's almost 20 years old but still very relevant :)
18.04.2025 18:33
This project was a fun effort with Matt Schmittle, Nathan Hatch, Rosario Scalise, @mateoguaman.bsky.social, Sidharth Talia, @khimya.bsky.social, @siddhss5.bsky.social and Byron Boots.
🧵6/6
18.04.2025 17:56
This work is a collaboration between the Personal Robotics Lab (@siddhss5.bsky.social) and Robot Learning Lab at the University of Washington @uwrobotics.bsky.social @uwcse.bsky.social
🧵5/6
18.04.2025 17:56
🤖 Real-world tested: LRN cuts down interventions on Spot and a large tracked vehicle.
Plug & play: Works with nearly any local stack that accepts goal waypoints.
Auto-labeled: Trained from raw FPV videos, using CoTracker to trace camera paths (see the labeling sketch below).
🧵4/6
18.04.2025 17:56
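The auto-labeling step above can be illustrated with a minimal, hypothetical sketch. It assumes the camera's future path has already been recovered in image coordinates for each frame (the thread attributes that tracking to CoTracker; the tracking call itself is not shown) and simply rasterizes the trace into a binary "eventually traversed" label that could supervise frontier scoring without manual annotation. The function name and parameters are illustrative, not the LRN code.

```python
import numpy as np

def traversed_label(path_uv: np.ndarray, image_hw: tuple, radius: int = 10) -> np.ndarray:
    """Hypothetical auto-label for one FPV frame.

    path_uv: (N, 2) pixel coordinates tracing where the camera travelled later
    in the video (assumed to come from a point tracker such as CoTracker).
    Returns a boolean mask marking pixels near that trace as traversed, usable
    as a positive label for training a frontier model.
    """
    h, w = image_hw
    ys, xs = np.mgrid[0:h, 0:w]
    label = np.zeros((h, w), dtype=bool)
    for u, v in path_uv:
        # Mark a small disc around each traced camera position.
        label |= (xs - u) ** 2 + (ys - v) ** 2 <= radius ** 2
    return label

# Example: a camera path drifting toward the upper-right of a 640x480 frame.
path = np.stack([np.linspace(320, 600, 50), np.linspace(470, 240, 50)], axis=1)
mask = traversed_label(path, (480, 640))
```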
Key insight: Robots can reason further by learning to identify distant affordable frontiers as intermediate goals.
How it works: LRN uses a pre-trained SAM2 backbone + a small head to find frontiers in images. Given a goal, it selects the highest-scoring frontier to navigate to (see the sketch below).
🧵3/6
18.04.2025 17:56
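A minimal sketch of that frontier-scoring idea, under stated assumptions: the frozen convolutional `backbone` below is only a stand-in for the pre-trained SAM2 image encoder, the candidate `frontier_masks` are assumed to be given, and the goal-conditioned head plus argmax selection follow the thread's description rather than the released LRN code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FrontierScorer(nn.Module):
    """Hypothetical sketch: score candidate frontier regions in an image."""

    def __init__(self, feat_dim: int = 256):
        super().__init__()
        # Stand-in for a frozen, pre-trained image encoder (e.g. SAM2 backbone).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, feat_dim, kernel_size=8, stride=8), nn.ReLU()
        )
        for p in self.backbone.parameters():
            p.requires_grad = False
        # Small trainable head: (pooled region feature, goal direction) -> score.
        self.head = nn.Sequential(
            nn.Linear(feat_dim + 2, 128), nn.ReLU(), nn.Linear(128, 1)
        )

    def forward(self, image: torch.Tensor, frontier_masks: torch.Tensor,
                goal_dir: torch.Tensor) -> torch.Tensor:
        """image: (3,H,W); frontier_masks: (K,H,W) binary; goal_dir: (2,) unit vector."""
        feats = self.backbone(image.unsqueeze(0))                    # (1,C,h,w)
        masks = F.interpolate(frontier_masks.unsqueeze(1).float(),
                              size=feats.shape[-2:])                 # (K,1,h,w)
        # Masked average pooling of backbone features per candidate frontier.
        pooled = (feats * masks).sum(dim=(-2, -1)) / \
                 masks.sum(dim=(-2, -1)).clamp(min=1e-6)             # (K,C)
        goal = goal_dir.expand(pooled.shape[0], -1)                  # (K,2)
        return self.head(torch.cat([pooled, goal], dim=-1)).squeeze(-1)  # (K,)

# Given a goal direction, navigate toward the highest-scoring frontier.
scorer = FrontierScorer()
image = torch.rand(3, 480, 640)
frontier_masks = torch.rand(5, 480, 640) > 0.5
goal_dir = torch.tensor([1.0, 0.0])
scores = scorer(image, frontier_masks, goal_dir)
best_frontier = scores.argmax().item()
```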
Problem: Robots navigating with no prior maps, relying only on local sensors, have a limited mapping range (due to sparse/noisy depth), which causes myopic decisions.
🧵2/6
18.04.2025 17:56
Long Range Navigator (LRN) 🧭: an approach to extend planning horizons for off-road navigation with no prior maps. Using vision, LRN makes longer-range decisions by spotting navigation frontiers far beyond the range of metric maps.
personalrobotics.github.io/lrn/
🧵1/6
18.04.2025 17:56
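To make the "plug & play" claim from earlier in the thread concrete, here is a hypothetical integration loop under assumed interfaces: `propose_frontier` stands in for LRN's frontier-to-waypoint query and `LocalStack` for whatever local planner accepts goal waypoints; none of these names come from the LRN release.

```python
from typing import Callable, Protocol, Tuple

Waypoint = Tuple[float, float]  # (x, y) in the robot's odometry frame

class LocalStack(Protocol):
    """Stand-in interface for any local planner that accepts goal waypoints."""
    def set_goal(self, wp: Waypoint) -> None: ...
    def step(self) -> None: ...                      # advance local planning/control
    def distance_to(self, wp: Waypoint) -> float: ...

def run_lrn_loop(propose_frontier: Callable[[Waypoint], Waypoint],
                 local: LocalStack,
                 goal: Waypoint,
                 requery_dist: float = 5.0,
                 goal_tol: float = 1.0) -> None:
    """Hypothetical outer loop: LRN picks a distant frontier as an intermediate
    waypoint beyond the local metric map; the local stack only ever has to
    reach that nearer goal, then LRN is queried again."""
    while local.distance_to(goal) > goal_tol:
        waypoint = propose_frontier(goal)            # long-range decision from vision
        local.set_goal(waypoint)
        while local.distance_to(waypoint) > requery_dist:
            local.step()                             # short-range planning as usual
```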
Excited to attend the talk!
11.01.2025 23:01
Sweet!! 🤩
26.12.2024 13:47
Happy holidays from UW Robotics!
24.12.2024 18:15
PhD student at UC Berkeley studying RL and AI safety.
https://cassidylaidlaw.com
PhD at UW iSchool. Research in NLP, Responsible AI, AI Ethics.
She/her
navreeetkaur.github.io
Assistant Prof, CS@Mines
Aerial and multiple robots; autonomous filming, mapping, perception
Ursula Le Guin stan. Call me on my ansible.
He/him pronouns
Opinions my own.
https://www.micahcorah.com/
PhD student at NYU | Building human-like agents | https://www.daphne-cornelisse.com/
Senior Research Fellow @ ucl.ac.uk/gatsby & sainsburywellcome.org
{learning, representations, structure} in 🧠🤖
my work: eringrant.github.io
not active: sigmoid.social/@eringrant @eringrant@sigmoid.social, twitter.com/ermgrant @ermgrant
I toot about open source robots and robotics.
[bridged from https://fosstodon.org/@locoscaron on the fediverse by https://fed.brid.gy/ ]
AI professor at Caltech. General Chair ICLR 2025.
http://www.yisongyue.com
PhD student at MIT working on deep learning (representation learning, generative models, synthetic data, alignment).
ssundaram21.github.io
https://robotics.cs.washington.edu/
Professor for Visual Computing & Artificial Intelligence @TU Munich
Co-Founder @synthesiaIO
Co-Founder @SpAItialAI
https://niessnerlab.org/publications.html
Canadian in Taiwan. Emerging-tech writer and analyst with a flagship newsletter, A.I. Supremacy, reaching 115k readers.
Also watching Semis, China, robotics, Quantum, BigTech, open-source AI and Gen AI tools.
https://www.ai-supremacy.com/archive
PhD student @ ETH Zürich | all aspects of NLP but mostly evaluation and MT | go vegan | https://vilda.net
Robotics R&D at Intrinsic. PhD from USC RESL.
gautamsalhotra.com
Building generally intelligent robots that *just work* everywhere, out of the box, at NYU CILVR.
Previously at MIT and visiting researcher at Meta AI.
https://mahis.life/
PhD@UW, Student Researcher@Meta.
Physics, Visualization and AI PhD @ Harvard | Embedding visualization and LLM interpretability | Love pretty visuals, math, physics and pets | Currently into manifolds
Wanna meet and chat? Book a meeting here: https://zcal.co/shivam-raval
PhD Student @ UW CSE
NLP | BData | Mental Health