02.07.2025 15:18
@alfcnz.bsky.social
Musician, math lover, cook, dancer, 🏳️‍🌈, and an asst. prof of Computer Science at New York University
02.07.2025 15:18
To compute the movement of the state x(t), we need to temporally integrate its velocity field ẋ(t).
The control signal, the steering angle, stays at 0, then jumps to 0.05π, then ramps linearly to −0.20π. The vehicle moves along circular arcs.
Finally, a sweep over the initial velocity is performed.
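A minimal sketch of the temporal integration described above: explicit Euler on a kinematic bicycle model. The model itself, the speed v, the wheelbase L, and the schedule timings are my assumptions, not the book's; only the steering values (0, 0.05π, a linear ramp to −0.20π) come from the post.

```python
import numpy as np

def f(x, u, v=1.0, L=1.0):
    # Velocity field x_dot = f(x, u) of an assumed kinematic bicycle
    # model; state x = (p_x, p_y, theta), control u = steering angle.
    px, py, theta = x
    return np.array([v * np.cos(theta),
                     v * np.sin(theta),
                     v / L * np.tan(u)])

def integrate(x0, u_of_t, T=10.0, dt=0.01):
    # Explicit-Euler temporal integration of the state x(t).
    x, traj = np.array(x0, dtype=float), []
    for k in range(round(T / dt)):
        traj.append(x.copy())
        x = x + dt * f(x, u_of_t(k * dt))
    return np.array(traj)

def u(t):
    # Hypothetical steering schedule mirroring the post: 0, then
    # 0.05*pi, then a linear ramp down to -0.20*pi.
    if t < 2.0:
        return 0.0
    if t < 5.0:
        return 0.05 * np.pi
    return 0.05 * np.pi - (t - 5.0) / 5.0 * 0.25 * np.pi

traj = integrate([0.0, 0.0, 0.0], u)
```

While u is held constant the heading rate is constant, so the trajectory traced is a circular arc; the linear ramp then sweeps through arcs of changing radius.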
No, it's just a non-native speaker making silly mistakes.
Thanks for catching that!
Currently writing chapter 10, «Planning and control».
Physical constraints on the evolution of the state (e.g. pure rotation of the wheels) are encoded through the velocity of the state ẋ(t) = dx(t)/dt, a function of the state x(t) and the control u(t).
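A hedged sketch of what such an encoding can look like, assuming a planar wheeled vehicle with state x = (pₓ, p_y, θ), i.e. position and heading (the particular state layout is my assumption): pure rotation of the wheels forbids lateral slip, and the admissible motions collapse into a single velocity field.

```latex
\dot{x}(t) = f\big(x(t), u(t)\big),
\qquad
x = \begin{pmatrix} p_x \\ p_y \\ \theta \end{pmatrix},
\qquad
\underbrace{\dot{p}_x \sin\theta - \dot{p}_y \cos\theta = 0}_{\text{pure rotation of the wheels: no lateral slip}}
```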
🥳🥳🥳
10.06.2025 20:36
Oh! The undergrad feedback came in! 🥹🥹🥹
10.06.2025 20:03
Releasing the Energy-Book from its first appendix chapter, where I explain how I create my figures. 🎨
Feel free to report errors via the issue tracker, contribute to the exercises, and show me what you can draw in the discussions section. 🥳
github.com/Atcold/Energ...
On a summer Friday night,
the first chapter sees the light.
🥹🥹🥹
Yeah, it took me 20 days to get back. 🥹🥹🥹
I swear I respond to instant messages as they come through! 🥲🥲🥲
Anyhow, one more successful semester completed. 🥳🥳🥳
This is the first semester I'm teaching this course.
I think I want to wait until version 2 (coming out this fall) before deciding to push the entire course online.
Think of this lesson as a preview of what's coming next.
I'll be using it to advertise my course to incoming students.
You've been asking what I've been up to and how the book was coming along… well, since this new course is under construction, all my energy has been diverted to this project.
It's been exhausting 🥵 but rewarding. It forced me to cover the history and the basics of my field.
In this lecture from my new undergrad course, we review linear multiclass classification, leverage backprop and gradient descent to learn a linearly separable feature vector for the input, and observe the training dynamics in a 2D embedding space.
youtu.be/saskQ-EjCLQ
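A hedged numpy sketch of the pipeline the lecture covers: linear multiclass (softmax) classification trained by gradient descent. The toy dataset, the learning rate, and the iteration count are my stand-ins, not the lecture's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable 2-D data with three classes (a hypothetical
# stand-in for the course's dataset).
n, C = 100, 3
X = np.concatenate([rng.normal(m, 0.3, size=(n, 2))
                    for m in ([0, 3], [3, 0], [-3, -3])])
y = np.repeat(np.arange(C), n)

W = np.zeros((2, C))   # one weight vector per class
b = np.zeros(C)

def softmax(s):
    e = np.exp(s - s.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

lr = 0.1
for _ in range(200):
    P = softmax(X @ W + b)                        # class probabilities
    G = P.copy()
    G[np.arange(len(y)), y] -= 1                  # dLoss/dscores for cross-entropy
    W -= lr * X.T @ G / len(y)                    # gradient-descent step
    b -= lr * G.mean(axis=0)

acc = (softmax(X @ W + b).argmax(axis=1) == y).mean()
```

For a linear model, backprop reduces to the classic (P − one-hot) gradient of the cross-entropy with respect to the scores.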
This is different from the video I made 5 years ago, where the input-output linear interpolation of an already trained network shows what a neural net does to its input. Namely, it follows a piecewise-linear mapping defined by the hidden layer.
08.04.2025 04:19
Training of a 2 → 100 → 2 → 5 fully connected ReLU neural net via cross-entropy minimisation.
• it starts by outputting small embeddings
• around epoch 300, it learns an identity function
• it takes 1700 more epochs to unwind the data manifold
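A hedged numpy sketch of such a training run, using the 2 → 100 → 2 → 5 architecture from the post. The toy five-class data, the ReLU being applied only after the first hidden layer (the 2-D bottleneck kept linear), the initialisation, and the hyper-parameters are all my assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Five hypothetical 2-D Gaussian blobs standing in for the real data manifold.
n, C = 50, 5
angles = 2 * np.pi * np.arange(C) / C
means = 3 * np.stack([np.cos(angles), np.sin(angles)], axis=1)
X = np.concatenate([rng.normal(m, 0.3, size=(n, 2)) for m in means])
y = np.repeat(np.arange(C), n)

# 2 -> 100 -> 2 -> 5 fully connected net, ReLU on the hidden layer.
W1 = rng.normal(0, 0.4, (2, 100)); b1 = np.zeros(100)
W2 = rng.normal(0, 0.4, (100, 2)); b2 = np.zeros(2)
W3 = rng.normal(0, 0.4, (2, 5));   b3 = np.zeros(5)

def softmax(s):
    e = np.exp(s - s.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

lr = 0.1
for epoch in range(2000):
    H = np.maximum(X @ W1 + b1, 0)        # ReLU hidden layer
    E = H @ W2 + b2                        # 2-D embedding space
    P = softmax(E @ W3 + b3)               # class probabilities
    G3 = (P - np.eye(C)[y]) / len(y)       # dLoss/dlogits (cross-entropy)
    GE = G3 @ W3.T                         # backprop into the embedding
    G1 = (GE @ W2.T) * (H > 0)             # backprop through the ReLU
    W3 -= lr * E.T @ G3; b3 -= lr * G3.sum(axis=0)
    W2 -= lr * H.T @ GE; b2 -= lr * GE.sum(axis=0)
    W1 -= lr * X.T @ G1; b1 -= lr * G1.sum(axis=0)

acc = (P.argmax(axis=1) == y).mean()
```

Plotting E over the epochs is what lets one watch the embedding dynamics described above.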
Did you enjoy Alfredo Canziani's lecture as much as we did?! If so, check out his website to find out more about his educational offerings: atcold.github.io
You can also find really cool material on Alfredo's YouTube channel! @alfcnz.bsky.social
📣 Just a few days before the start of Khipu 2025, we are pleased to announce that both the main-hall activities and Friday's closing ceremony will be live-streamed on this channel: khipu.ai/live/. We look forward to seeing you!
07.03.2025 11:30
I *really* had a blast giving this improvised lecture, on a topic requested on the spot, without any sleep! 🤪
The audience seemed to enjoy the show.
To find out more about my educational offerings, check out my website! atcold.github.io
Follow here and subscribe on YouTube!
In today's episode, we review the concepts of the loss ℒ(𝒘, 𝒟), the per-sample loss L(𝒘, x, y), and the binary cross-entropy cost ℓ(y, ŷ) = y softplus(−s) + (1 − y) softplus(s), where ŷ = σ(s) and s = 𝒘ᵀ𝒛(x) is the score.
Then we minimise the loss by choosing convenient values for our weight vector 𝒘.
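The softplus form of the binary cross-entropy can be checked numerically against the textbook form −[y log ŷ + (1 − y) log(1 − ŷ)] with ŷ = σ(s); a small sketch (the grid of scores is arbitrary):

```python
import numpy as np

def softplus(s):
    # Numerically stable log(1 + exp(s)).
    return np.logaddexp(0.0, s)

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

s = np.linspace(-8.0, 8.0, 161)   # arbitrary grid of scores
for y in (0.0, 1.0):
    lhs = y * softplus(-s) + (1 - y) * softplus(s)
    yhat = sigmoid(s)
    rhs = -(y * np.log(yhat) + (1 - y) * np.log(1 - yhat))
    assert np.allclose(lhs, rhs)   # the two forms agree
```

The softplus form is preferable in practice because it never evaluates log of a quantity near zero.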
@nyucourant.bsky.social
Yay! 🥳🥳🥳
01.02.2025 01:28
30.01.2025 20:26
Tue morning: *prepares slides*
Tue class: *improv blackboard lecture*
Outcome: unexpectedly great lecture.
Thu morning: *prep handwritten notes*
Thu class: *executes blackboard lecture*
Students: 🤩🤩🤩🤩🤩🤩🤩🤩🤩
@nyucourant.bsky.social @nyudatascience.bsky.social
🥹🥹🥹
26.01.2025 05:01
I had linear algebra, calculus, and machine learning. I just removed the latter.
26.01.2025 02:55
🥹🥹🥹
26.01.2025 02:54
What's going on?
25.01.2025 23:53
25.01.2025 19:39
🥰🥰🥰
24.01.2025 05:14
I think the new undergrad course is going well. At least we're having fun!
23.01.2025 21:19
I am! 🥲🥲🥲
23.01.2025 05:17