
R. James Cotton

@peabody124.bsky.social

Physiatrist, neuroscientist, gait and movement, brain and robotics enthusiast. Assistant Professor at Northwestern and Shirley Ryan AbilityLab

1,057 Followers  |  949 Following  |  64 Posts  |  Joined: 01.08.2023

Latest posts by peabody124.bsky.social on Bluesky

Video thumbnail

More demos and code available at intelligentsensingandrehabilitation.github.io/MonocularBio...

JD did a great job creating a Gradio demo, so try it out and let us know what you think!

And here is a video of JD going on a celebratory run now that the preprint is out :)

14.07.2025 06:07 — 👍 3    🔁 1    💬 0    📌 0
Clinical Validity of Smartphone-Based Gait Deviation Index. A) Hip and knee flexion angles of clinical and control groups. B) GDI separates groups at risk of falls determined by the Berg Balance Scale. C) GDI correlates with 10 Meter Walk Test performance (r = 0.82). D) GDI of LLPUs and KOA participants is significantly lower than that of control populations. Further, GDI of transfemoral amputees is significantly lower than GDI of transtibial amputees. E) GDI collected in clinical settings correlates (r = 0.47) with the mJOA, a clinically used ordinal questionnaire.

Excitingly, in addition to producing accurate kinematics, we can measure the gait deviation index from these videos. We find it is quite sensitive across a number of different clinical populations, and even more responsive after neurosurgical interventions than the standard clinical outcomes (mJOA).

14.07.2025 06:07 — 👍 0    🔁 0    💬 1    📌 0
Quality Measures of Single Camera Fitting. A) Kinematic traces from smartphone video (red/blue) compared to ground truth (gray dashed) during walking. B) Joint angle errors across populations for select lower limb angles. n denotes the number of unique individuals in each cohort and v denotes the number of total videos for that cohort. C) Select joint angle errors with respect to camera view angle show that sagittal plane angles have the lowest error with sagittal camera views, and frontal angles have the lowest error with frontal views. D) Pelvis translation (RTE) extracted from handheld smartphone video compared to ground truth during functional gait assessments.

Central to this was extending our end-to-end differentiable biomechanics approach to fitting both 2D and 3D keypoints measured from images. It can also account for smartphone rotation measured by our Portable Biomechanics Platform, which makes this easy to integrate into clinical workflows.

14.07.2025 06:07 — 👍 0    🔁 0    💬 1    📌 0
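The fitting loop described in this thread can be sketched in miniature. Everything below is a hypothetical toy stand-in, not the actual pipeline: a 2-link planar leg with made-up segment lengths, and central finite differences in place of the autodiff a real differentiable-biomechanics stack provides. Pose and scale parameters pass through forward kinematics, are compared against observed keypoints, and gradient descent refines both.

```python
import numpy as np

def forward_kinematics(params):
    """Toy 2-link planar leg: hip/knee angles plus one body-scale factor
    give 2D positions of hip, knee, and ankle (segment lengths made up)."""
    hip_angle, knee_angle, scale = params
    thigh, shank = 0.45 * scale, 0.40 * scale
    hip = np.array([0.0, 0.0])
    knee = hip + thigh * np.array([np.sin(hip_angle), -np.cos(hip_angle)])
    ankle = knee + shank * np.array([np.sin(hip_angle + knee_angle),
                                     -np.cos(hip_angle + knee_angle)])
    return np.concatenate([hip, knee, ankle])

def keypoint_loss(params, observed):
    """Squared distance between the posed model's joints and the
    keypoints extracted from video."""
    return np.sum((forward_kinematics(params) - observed) ** 2)

def fit(observed, steps=2000, lr=0.05, eps=1e-5):
    """Iteratively refine pose and scale; central finite differences
    stand in for backpropagation here."""
    params = np.array([0.1, 0.1, 1.0])  # naive initial pose and scale
    for _ in range(steps):
        grad = np.zeros_like(params)
        for i in range(len(params)):
            step = np.zeros_like(params)
            step[i] = eps
            grad[i] = (keypoint_loss(params + step, observed)
                       - keypoint_loss(params - step, observed)) / (2 * eps)
        params -= lr * grad
    return params

# synthetic "video keypoints" from a known pose, then recover it
true_params = np.array([0.4, 0.7, 1.05])
observed = forward_kinematics(true_params)
fitted = fit(observed)
```

In the actual work the model is a full biomechanical model over whole trajectories, and the loss includes 2D reprojection and optional sensor terms, but the optimization pattern is the same.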
Methods Overview. We introduce a method for biomechanically grounded movement analysis in clinical settings using a handheld smartphone. A) Researchers held a smartphone (optionally with gimbal) while following a participant walking. Our system has no specific requirements regarding viewing angle, distance to subject, or therapist assistance. B) Recorded smartphone video and optional wearable sensor data are stored in the cloud and processed using PosePipe, an open-source package implementing computer vision models for person tracking and keypoint detection. C) To reconstruct movement, we represent movement as a function that outputs joint angles, which, combined with body scaling parameters and evaluated through forward kinematics, generate a posed biomechanical model in 3D space. This untrained model is compared to video-extracted joint locations and optionally smartphone sensor data to compute a loss. This loss guides backpropagation to iteratively refine both the kinematic trajectory and body scale. D) Initially, the representation lacks knowledge of the person's movements and scale (e.g., height, limb proportions), but after optimization, it typically tracks joint locations within 15 mm in 3D and 5 pixels in 2D.

We developed a novel approach to fitting biomechanics from smartphone video that produces kinematic reconstructions within a few degrees and has been validated across a wide range of activities and clinical backgrounds.

14.07.2025 06:07 — 👍 1    🔁 0    💬 1    📌 0
Preview
Portable Biomechanics Laboratory: Clinically Accessible Movement Analysis from a Handheld Smartphone The way a person moves is a direct reflection of their neurological and musculoskeletal health, yet it remains one of the most underutilized vital signs in clinical practice. Although clinicians visua...

Super proud of this work led by JD Peiffer (@abilitylab.bsky.social and @tgsatnu.bsky.social)
preprint: arxiv.org/abs/2507.08268
project/code: intelligentsensingandrehabilitation.github.io/MonocularBio...

Open source code that lets you get state-of-the-art biomechanics from a smartphone

14.07.2025 06:07 — 👍 6    🔁 1    💬 1    📌 0
Post image

04.06.2025 23:01 — 👍 0    🔁 0    💬 0    📌 0

Enjoyed presenting on "The Good, The Bad, and the Ugly: AI for SCI Clinicians" with @ryansolinskymd.bsky.social and @josezariffa.bsky.social. Great enthusiasm from the crowd on the topic, a lively discussion, and nice follow-up from the precourse.
bsky.app/profile/ryan...

04.06.2025 23:01 — 👍 1    🔁 0    💬 1    📌 0
Post image

AI integration in spinal cord injury medicine precourse at the ASIA2025 meeting. Led by @peabody124.bsky.social and Dr. Sarah Brueningk. Learning lessons from other successful examples in cancer, Alzheimer's disease, and cardiology.

01.06.2025 16:38 — 👍 4    🔁 1    💬 0    📌 1

However, there is more work to do in actually validating these against EMG recordings (we have these in many of our trials from our wearable sensor platform), and I suspect there will be lots of work to really tune it up.

Still, finding clinically sensible patterns is a promising first step.

27.05.2025 22:12 — 👍 1    🔁 0    💬 0    📌 0

The imitation learning policy is trained to replicate all the kinematics from 30+ hours of markerless motion data by driving a muscle-driven model, with some regularization on muscle activation. Through training, it learns muscle activation patterns that reproduce the kinematics.

27.05.2025 22:12 — 👍 0    🔁 0    💬 1    📌 0
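The training signal described above (replicate the kinematics, regularize muscle activation) can be sketched as a per-timestep cost. The weights and shapes here are hypothetical, and this is only the objective, not the policy optimization itself, which in the actual work runs inside a physics simulator.

```python
import numpy as np

def imitation_cost(sim_angles, ref_angles, activations, w_track=1.0, w_act=0.01):
    """Per-timestep cost: track the reference kinematics while lightly
    penalizing squared muscle activation, so training settles on efficient
    activation patterns that still reproduce the motion. Weights are
    illustrative, not from the paper."""
    tracking_error = float(np.mean((np.asarray(sim_angles) - np.asarray(ref_angles)) ** 2))
    effort = float(np.mean(np.asarray(activations) ** 2))
    return w_track * tracking_error + w_act * effort
```

During training, this cost (negated as a reward) would be accumulated over simulator rollouts; the activation term is what pushes the policy toward plausible muscle timing rather than arbitrary co-contraction.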

Looking forward, we hope to combine things like this.

E.g., using BiomechGPT to interpret a user's request about movement and then running simulations in the physics simulator via imitation learning.

Either way, really starting to see promise for foundation models in biomechanics

Stay tuned :)

27.05.2025 21:36 — 👍 4    🔁 0    💬 1    📌 0

Particularly exciting was evidence of positive transfer learning as we increased the set of tasks it is trained on.

Of course, it also makes plenty of mistakes (the person in that video is using a crutch!). Lots of work to do, and we are just starting to explore the opportunities from this approach.

27.05.2025 21:36 — 👍 2    🔁 0    💬 1    📌 0

We were super excited to see how well this performed across a range of tasks, even with fairly sparse annotation.

It's doing a great job at things like activity classification, which can be rather challenging for impaired movements, and more subtle things like inferring likely diagnoses.

27.05.2025 21:36 — 👍 1    🔁 0    💬 1    📌 0
Video thumbnail

The next paper is BiomechGPT arxiv.org/abs/2505.18465 with @antihebbiann.bsky.social and Ruize Yang, which trains a language model to be fluent in tokenized movement sequences. This draws inspiration from MotionGPT but focuses on benchmarking performance on clinically meaningful tasks.

27.05.2025 21:36 — 👍 7    🔁 1    💬 1    📌 1
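"Tokenized movement sequences" means discretizing continuous pose trajectories into a vocabulary a language model can consume alongside text. As a rough illustration (a simple k-means quantizer standing in for whatever learned tokenizer the model actually uses; all names and sizes here are made up):

```python
import numpy as np

def learn_codebook(pose_frames, n_tokens=8, iters=25, seed=0):
    """K-means stand-in for a learned quantizer that turns continuous
    pose vectors into a discrete movement vocabulary."""
    rng = np.random.default_rng(seed)
    codebook = pose_frames[rng.choice(len(pose_frames), n_tokens, replace=False)].copy()
    for _ in range(iters):
        # assign each frame to its nearest codebook entry, then update centroids
        d = ((pose_frames[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        assign = d.argmin(axis=1)
        for k in range(n_tokens):
            members = pose_frames[assign == k]
            if len(members):
                codebook[k] = members.mean(axis=0)
    return codebook

def tokenize(pose_frames, codebook):
    """Map each pose frame to the index of its nearest codebook entry,
    yielding a discrete token sequence."""
    d = ((pose_frames[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1)

# toy "motion": 200 frames of 6-D joint angles
rng = np.random.default_rng(1)
frames = rng.normal(size=(200, 6))
codebook = learn_codebook(frames, n_tokens=8)
tokens = tokenize(frames, codebook)
```

Once movement is a token sequence, clinical tasks (activity classification, diagnosis inference) become sequence-to-sequence problems for the language model.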

Shoutout to recent related work from @trackingskills.bsky.social group arxiv.org/abs/2503.14637

Great to see growing enthusiasm in this space
bsky.app/profile/trac...

And shoutout to MyoSuite for pushing the neuromuscular modeling in MuJoCo

27.05.2025 21:36 — 👍 3    🔁 0    💬 1    📌 0
Video thumbnail

Here is another example. It also captures some imperfections, like little foot slips, that we want to improve.

27.05.2025 21:36 — 👍 1    🔁 0    💬 1    📌 0
Video thumbnail

Since then, we've tuned it up to handle anthropometric and muscle scaling. Still lots of work to do in further tuning this, as there are many things we aren't yet scaling, such as mass and inertia, and in optimizing w.r.t. the EMG data we have from our wearable sensors.

27.05.2025 21:36 — 👍 3    🔁 1    💬 1    📌 0

The first is KinTwin arxiv.org/abs/2505.13436, which trains torque-driven and muscle-driven policies to replicate movements of intact and impaired gait. It detects clinically meaningful features like propulsion asymmetries and muscle timing.

Teaser from a few months back: bsky.app/profile/peab...

27.05.2025 21:36 — 👍 3    🔁 1    💬 1    📌 0

Over the last few years we have been developing methods for markerless motion capture of biomechanics and getting them into the clinics, such as at @abilitylab.bsky.social.

We are now developing foundation models from these large datasets and testing what this enables. Two recent preprints:

27.05.2025 21:36 — 👍 24    🔁 5    💬 3    📌 1
SoNMIR Program – Rehabweek 2025 – Chicago

It was also the initial meeting of Julius Dewald and Bob Sainburg's Society for Neuromechanics in Rehabilitation (SoNMiR), and it was great to present on biomechanics in rehabilitation and arxiv.org/abs/2411.03919. Very exciting to see this society bringing this community together.

27.05.2025 13:15 — 👍 2    🔁 0    💬 0    📌 0
Post image Post image

@jdpeiffer.bsky.social and Tim Unger also won second prize for the ICORR talks for "Differentiable Biomechanics for Markerless Motion Capture in Upper Limb Stroke Rehabilitation: A Comparison with Optical Motion Capture" arxiv.org/abs/2411.14992.

27.05.2025 13:15 — 👍 1    🔁 0    💬 1    📌 0
Post image Post image Post image

With Kyle Embry and the @abilitylab.bsky.social C-STAR team we also organized a half-day workshop "From Motion to Meaning: AI-Enabled Biomechanics for Rehabilitation" showcasing work from Georgios Pavlakos, Eni Halilaj, Vikash Kumar, Chris Awai, and Pouyan Firouzabadi.

27.05.2025 13:15 — 👍 0    🔁 0    💬 1    📌 0
Post image

With Dailyn Despradel, Derek Kamper, @marcslutzky.bsky.social, and @dougweberlab.bsky.social we organized a workshop on EMG biofeedback. Very much enjoyed the engaged discussion on how to disseminate these technologies into the real world.

27.05.2025 13:15 — 👍 1    🔁 0    💬 1    📌 0

Finally recovered from RehabWeek in Chicago, which was fantastic. It was a very successful week for the Intelligent Sensing and Rehabilitation lab at @abilitylab.bsky.social.

(p.s. sorry below for anyone on BlueSky who I couldn't find/failed to tag)

27.05.2025 13:15 — 👍 1    🔁 0    💬 1    📌 0
Screenshot from a chat interface for biomechanics

Looking forward to presenting on what we can do with large-scale biomechanics data in rehabilitation in the SoNMiR #RehabWeek 2025 session this afternoon! @abilitylab.bsky.social

15.05.2025 17:39 — 👍 2    🔁 0    💬 0    📌 0
JD Peiffer presenting with a methods slide describing end-to-end differentiable biomechanics.

Tim Unger presenting with a slide showing arm kinematics

@jdpeiffer.bsky.social and Tim Unger gave a great talk in the ICORR best student paper session on their work using markerless motion capture to track arm kinematics of people with stroke. arxiv.org/abs/2411.14992

14.05.2025 23:34 — 👍 3    🔁 0    💬 0    📌 0
American Spinal Injury Association: The premier North American organization in the field of Spinal Cord Injury Care, Education, and Research

Yeah, that was a great session from the American Spinal Injury Association! If you liked that session, check out the conference! :) asia-spinalinjury.org

14.05.2025 23:25 — 👍 2    🔁 2    💬 0    📌 0

Really enjoyed the session and engagement!

24.04.2025 01:31 — 👍 6    🔁 1    💬 2    📌 0
Dr. Cotton presenting onstage at #ASNR2025

Continuing our #ASNR2025 symposium on #ArtificialIntelligence and computational modeling in #neurorehabilitation, @peabody124.bsky.social discussed how his lab is quantifying gait-based biomarkers during functional mobility tasks in diverse clinical settings.

#AI #neuroscience #rehabilitation

23.04.2025 19:01 — 👍 3    🔁 1    💬 0    📌 0

Very much looking forward to our speakers: @maryamshanechi.bsky.social, @hugospiers.bsky.social, Marom Bikson, @jenpitt.bsky.social, Mary Czerwinski, @peabody124.bsky.social, @dpferris.bsky.social, @sladouce.bsky.social, Matthew Rizzo,
@c-rothkopf.bsky.social, @doriswang.bsky.social, Ying Choon Wu

31.03.2025 15:37 — 👍 8    🔁 5    💬 1    📌 0
