Making an app was uncharted territory for us as a research group, and would not have been possible without my wonderful collaborators: Zeyu (Michael) Bian, @venkyp.bsky.social, @haritheja.bsky.social, @eneserciyes.bsky.social, @notmahi.bsky.social and @lerrelpinto.com.
26.02.2025 16:49
AnySense is built to empower researchers, engineers, and developers with better tools for sensor-based AI. Our code is fully open-source and can be found on
GitHub: github.com/NYU-robot-le...
Website: anysense.app
26.02.2025 16:49
Why does this matter? Here, we use AnySense to scale data and train visuo-tactile policies using Robot Utility Models (robotutilitymodels.com) for a whiteboard erasing task. With AnySense-enabled live streaming, you can just plug your iPhone into your robot and seamlessly deploy your policies!
26.02.2025 16:49
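To give a rough sense of what consuming such a live stream might look like on the robot side, here is a minimal Python sketch. The wire format below (a length-prefixed JPEG frame followed by seven pose floats over TCP), the port number, and the packet layout are illustrative assumptions, not AnySense's actual streaming protocol.

```python
# Minimal sketch of a robot-side receiver for a phone camera stream.
# ASSUMPTION: the framing (4-byte length prefix + JPEG frame + 7 float32 pose)
# and the port are hypothetical, for illustration only.
import socket
import struct

import cv2
import numpy as np

HOST, PORT = "0.0.0.0", 8485  # hypothetical listening address


def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the socket (TCP has no message framing)."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed")
        buf += chunk
    return buf


def main() -> None:
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind((HOST, PORT))
    server.listen(1)
    conn, _addr = server.accept()
    try:
        while True:
            # 4-byte big-endian length prefix, then a JPEG-encoded RGB frame.
            (jpeg_len,) = struct.unpack(">I", recv_exact(conn, 4))
            frame = cv2.imdecode(
                np.frombuffer(recv_exact(conn, jpeg_len), np.uint8),
                cv2.IMREAD_COLOR,
            )
            # 7 float32s: camera position (x, y, z) + orientation quaternion.
            pose = struct.unpack(">7f", recv_exact(conn, 28))
            if frame is not None:
                print(frame.shape, pose)  # hand the observation to your policy here
    finally:
        conn.close()
        server.close()


if __name__ == "__main__":
    main()
```

The explicit length prefix is there because TCP is a byte stream with no built-in message boundaries.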
Need to connect external sensors? No problem! AnySense supports data streaming over Bluetooth at the press of a button! Here's a visualization of data collected by connecting the AnySkin (any-skin.github.io) tactile sensor with AnySense via Bluetooth.
26.02.2025 16:49
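For readers curious about the plumbing this hides, here is a small sketch of subscribing to a BLE sensor from a laptop using the Python bleak library. The advertised device name, the characteristic UUID, and the int16 payload layout are placeholders rather than AnySkin's documented interface; AnySense takes care of this pairing on-device.

```python
# Sketch: subscribe to notifications from a BLE tactile sensor with bleak.
# ASSUMPTIONS: device name, characteristic UUID, and payload layout are
# placeholders, not AnySkin's documented interface.
import asyncio
import struct

from bleak import BleakClient, BleakScanner

DEVICE_NAME = "AnySkin"                                  # hypothetical advertised name
DATA_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"  # hypothetical characteristic


def on_data(_sender, data: bytearray) -> None:
    # ASSUMPTION: each packet is a run of little-endian int16 magnetometer readings.
    values = struct.unpack(f"<{len(data) // 2}h", data)
    print(values)


async def main() -> None:
    # Scan for nearby BLE devices and pick the one with the expected name.
    devices = await BleakScanner.discover(timeout=10.0)
    device = next((d for d in devices if d.name == DEVICE_NAME), None)
    if device is None:
        raise RuntimeError(f"no BLE device named {DEVICE_NAME!r} found")
    async with BleakClient(device) as client:
        await client.start_notify(DATA_CHAR_UUID, on_data)  # subscribe to sensor packets
        await asyncio.sleep(10.0)                            # stream for ten seconds
        await client.stop_notify(DATA_CHAR_UUID)


if __name__ == "__main__":
    asyncio.run(main())
```

BLE notifications push packets to the callback as they arrive, which is the usual pattern for low-rate tactile or IMU streams.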
Try AnySense right now: apps.apple.com/us/app/anyse...
Even if you don't have external sensors, you can start using AnySense immediately to record and stream:
• RGB + Depth + Pose data
• Audio from the iPhone mic or custom contact microphones
• Seamless Bluetooth integration for external sensors
26.02.2025 16:49
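A recurring chore with recordings like these is aligning streams captured at different rates (say, 30 Hz RGB frames against roughly 100 Hz pose samples). Below is a toy nearest-timestamp alignment sketch in NumPy; the synthetic timestamps are stand-ins, and the snippet is not tied to AnySense's actual export format.

```python
# Toy sketch: match each RGB frame to the pose sample nearest in time.
# ASSUMPTION: the timestamps are synthetic stand-ins, not an AnySense export.
import numpy as np


def nearest_indices(reference_ts: np.ndarray, query_ts: np.ndarray) -> np.ndarray:
    """For each query timestamp, return the index of the closest reference timestamp.

    `reference_ts` must be sorted in ascending order.
    """
    idx = np.searchsorted(reference_ts, query_ts)       # insertion points
    idx = np.clip(idx, 1, len(reference_ts) - 1)
    left = reference_ts[idx - 1]
    right = reference_ts[idx]
    idx -= (query_ts - left) < (right - query_ts)       # step back if left is closer
    return idx


# Example: 30 Hz RGB frames vs. jittered ~100 Hz pose samples over one second.
rgb_ts = np.linspace(0.0, 1.0, 30, endpoint=False)
pose_ts = np.sort(np.linspace(0.0, 1.0, 100, endpoint=False)
                  + np.random.uniform(-2e-3, 2e-3, 100))
matched = nearest_indices(pose_ts, rgb_ts)
print(matched[:5])  # indices of the pose samples paired with the first 5 frames
```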
Ever struggled with multi-sensor data from cameras, depth sensors, and other custom sensors? Meet AnySense: an iPhone app for effortless data acquisition and streaming. Working with multimodal sensor data will never be a chore again!
26.02.2025 16:49
P3-PO is a great example of how simple human priors can enable significantly better generalization in robot policies.
10.12.2024 20:48
BAKU is fully open source and surprisingly effective. We found it easily adaptable to a host of visuo-tactile tasks in visuoskin.github.io
10.12.2024 18:23
Robot utility models are not just among the first learned models that work zero-shot on a mobile manipulator, but also provide a nuanced discussion on what works and what doesn't in data-driven robot learning.
09.12.2024 16:54
I'll be presenting AnySkin at the Stanford Center for Design Research today at 2pm! Stop by for a chat and try the sensor out!
More info: any-skin.github.io
25.11.2024 18:15
I just joined bluesky, and would love to connect with folks interested in embodied AI and robotics. I am a postdoctoral researcher at NYU working at the intersection of sensing, machine learning and robotics. Hit me up if you'd like to chat!
More about my research: raunaqbhirangi.github.io
22.11.2024 17:50
Teaching Faculty @ Princeton University | CMU, MIT alum | reinforcement learning, AI ethics, equity and justice, baking | ADHD
Associate Professor in EECS at MIT. Neural nets, generative models, representation learning, computer vision, robotics, cog sci, AI.
https://web.mit.edu/phillipi/
Artist, Prof. of Engineering @UCBerkeley, Chief Scientist, @AmbiRobotics & @JacobiRobotics. Interested in robots, rockets, redwoods, rebels.
Researcher trying to shape AI towards positive outcomes. ML & Ethics +birds. Generally trying to do the right thing. TIME 100 | TED speaker | Senate testimony provider | Navigating public life as a recluse.
Former: Google, Microsoft; Current: Hugging Face
Postdoc @ Princeton AI Lab
Natural and Artificial Minds
Prev: PhD @ Brown, MIT FutureTech
Website: https://annatsv.github.io/
Stanford Linguistics and Computer Science. Director, Stanford AI Lab. Founder of @stanfordnlp.bsky.social . #NLP https://nlp.stanford.edu/~manning/
Scientist at Meta NYC | http://bamos.github.io
Professor, Programmer in NYC.
Cornell, Hugging Face 🤗
CS PhD Student @ NYU w/ Profs Saining Xie & Rob Fergus
Intern @ Ai2 | Prev: CMU, BlackRock, Vanderbilt
https://ellisbrown.github.io
Professor at NYU; Chief AI Scientist at Meta.
Researcher in AI, Machine Learning, Robotics, etc.
ACM Turing Award Laureate.
http://yann.lecun.com
PhD student @stanfordnlp.bsky.social. Robotics Intern at the Toyota Research Institute. I like language, robots, and people.
On the academic job market!
Associate Professor at UMD CS. YouTube: https://youtube.com/@jbhuang0604
Interested in how computers can learn and see.
Intelligent Autonomous Systems Group @TUDarmstadt working on Robot Learning, the intersection of robotics and machine learning.
Led by Prof. Jan Peters
https://www.ias.informatik.tu-darmstadt.de
#RobotLearning Professor (#MachineLearning #Robotics) at @ias-tudarmstadt.bsky.social of
@tuda.bsky.social @dfki.bsky.social @hessianai.bsky.social
Musician, math lover, cook, dancer, 🏳️‍🌈, and an ass prof of Computer Science at New York University
PhD student @ University of Maryland | Looking to make robots work in the real world!
Excited about generalizing AI | PhD student @NYU