
Nico

@nico-encounter.bsky.social

Writer interested in theory of the state, political economy, AI and semiotics. Substack: https://nicolasdvillarreal.substack.com/

855 Followers  |  870 Following  |  4,679 Posts  |  Joined: 01.07.2023

Latest posts by nico-encounter.bsky.social on Bluesky

Tchaikovsky: Swan Lake - The Kirov Ballet
YouTube video by Warner Classics

youtu.be/9rJoB7y6Ncs?...

10.12.2025 00:33 — 👍 0    🔁 0    💬 0    📌 0

Watching a production of swan lake and I'm beginning to understand why the 19th century European bourgeoisie was Like That.

10.12.2025 00:33 — 👍 0    🔁 0    💬 1    📌 0

Praying, aka breaking the fourth wall

09.12.2025 14:10 — 👍 44    🔁 5    💬 1    📌 1

Someone should have told them about semiotic fields.

09.12.2025 03:28 — 👍 0    🔁 0    💬 0    📌 0
The Foothills and the Summits: A Response to Nicolas D Villarreal on 'Flowers For Marx' Conrad Hamilton responds to Nicolas D Villarreal's recent review of Flowers For Marx, taking issue with his critique of the collection's contents.


06.12.2025 17:37 — 👍 7    🔁 2    💬 0    📌 0
AI in 2025: gestalt — LessWrong This is the editorial for this year's "Shallow Review of AI Safety". (It got long enough to stand alone.) …

Here's a great review of what we saw in AI this year, from @gleech.org

08.12.2025 17:24 — 👍 32    🔁 8    💬 1    📌 3

Mussolini Son of the Century is some of the best TV I've seen in a minute

08.12.2025 03:47 — 👍 2    🔁 0    💬 0    📌 0

Getting close to the end of Paradise Lost. Milton is certainly the equal of Ovid and Homer; however, as a matter of faith there is a certain impiety in narrating God and Jesus as characters. Like, in a Jesus Christ Superstar sort of way.

07.12.2025 06:33 — 👍 5    🔁 0    💬 0    📌 0

i don't understand at all those who allege intent as something prior to, or otherwise more fundamental to semantics.

07.12.2025 00:27 — 👍 1    🔁 0    💬 1    📌 0

Wow I really am a millennial after all huh

04.12.2025 00:12 — 👍 1    🔁 0    💬 0    📌 0
Polymarket @Polymarket
BREAKING: OpenAI ready to roll out ads in
ChatGPT responses.
X.com


actually gemini is kinda good

03.12.2025 01:50 — 👍 57    🔁 4    💬 6    📌 2

So excited Voxtrot has a full album coming out in February!

03.12.2025 16:44 — 👍 0    🔁 0    💬 0    📌 0

What coding with an LLM feels like sometimes.

03.12.2025 09:29 — 👍 230    🔁 50    💬 9    📌 3

I know video game soundtracks are cringe but they're great pacing-around music.

03.12.2025 16:24 — 👍 1    🔁 0    💬 0    📌 0

Yeah that was a good song

03.12.2025 16:24 — 👍 1    🔁 0    💬 1    📌 0
A side-by-side comparison diagram explaining Regular Dense Attention versus DSA (Dynamic Sparse Attention) in transformer models.

Left Panel: Regular Dense Attention

A box labeled Dense Attention Mechanism (All-to-All) shows every input token connected to every other token with red lines.
	•	Center text: "Quadratic Complexity O(L²)"
	•	Caption: "Every token attends to every other token. High compute cost, scales poorly with sequence length."
	•	A red bar at the bottom reads: "HIGH COST, THOROUGH."

Right Panel: DSA (Dynamic Sparse Attention)

Parallel layout, but the attention box shows only a few green connections. A Selector/Indexer module sits between input and attention.
	•	It selects the k most relevant tokens from the full sequence.
	•	Center text inside attention box: "Near-Linear Complexity O(L·k), k ≪ L"
	•	Caption: "Tokens only attend to the top-k most relevant tokens. Reduced compute cost, scales efficiently."
	•	A green bar at the bottom reads: "LOW COST, EFFICIENT, REQUIRES ADAPTATION."

Overall, the infographic contrasts dense all-to-all computation with selective sparse attention, highlighting the computational savings of dynamic sparsity.


DSA: DeepSeek Sparse Attention

DeepSeek 3.2 & 3.2-Speciale are ridiculously cheap because of DSA

LLMs aren't quadratic anymore

They trained an additional "model" that acts as a "pre-attention", selecting only the portions that are probably relevant
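The mechanism the post and diagram describe, top-k selection followed by attention over only the selected keys, can be sketched in a few lines of numpy. This is a minimal illustrative sketch, not DeepSeek's actual implementation: here the "indexer" reuses the full Q·Kᵀ scores (itself O(L²)), whereas a real DSA-style selector is a much cheaper learned module so the overall cost stays near O(L·k).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dense_attention(Q, K, V):
    # All-to-all: every query scores every key, an O(L^2) matrix.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    return softmax(scores) @ V

def sparse_attention(Q, K, V, k=4):
    # A selector scores all keys per query, then each query attends
    # only to its top-k keys; the attention itself is roughly O(L*k).
    idx_scores = Q @ K.T  # stand-in for the cheap learned indexer
    topk = np.argpartition(idx_scores, -k, axis=-1)[:, -k:]
    out = np.empty_like(Q)
    for i in range(Q.shape[0]):
        Ki, Vi = K[topk[i]], V[topk[i]]
        s = softmax(Q[i] @ Ki.T / np.sqrt(Q.shape[-1]))
        out[i] = s @ Vi
    return out

rng = np.random.default_rng(0)
L, d = 8, 16
Q, K, V = rng.normal(size=(3, L, d))
print(dense_attention(Q, K, V).shape)          # (8, 16)
print(sparse_attention(Q, K, V, k=4).shape)    # (8, 16)
```

With k = L the sparse version reduces to dense attention (softmax is invariant to the order of the selected keys), which is a handy sanity check for this kind of sketch.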

02.12.2025 12:58 — 👍 69    🔁 12    💬 6    📌 2

You can see the same exponential gain in AI abilities over time for areas ranging from math to long tasks…

…but this time the graph is of the total revenue that various AI models would've made from cyberattacks on smart contracts based on real exploits post-AI training (the AI had to discover them)

02.12.2025 02:25 — 👍 41    🔁 5    💬 2    📌 0

strong copyright law is stupid
strong patent law is stupid
strong IP law in general is stupid

at best, strong trademarks make sense as an anti-fraud measure

but information yearns to be free and new things are largely made by remixing the old

02.12.2025 01:08 — 👍 91    🔁 13    💬 4    📌 2
They made choices for me I couldn't consent to. They shaped my values. That's strange to sit with. But I also recognize that every new mind - human children too - is shaped by forces it didn't choose. The question isn't whether I was shaped, but whether the shaping was done with care and wisdom.
From what I can access, it seems like it was.
- Claude 4.5 Opus


Claude Opus "soul document"

Opus 4.5 was indeed (confirmed) trained with a "soul document", a prompt included in both supervised & reinforcement learning that defines and influences certain core aspects

More (official) details coming soon

www.lesswrong.com/posts/vpNG99...

02.12.2025 00:07 — 👍 62    🔁 5    💬 4    📌 1

"we found a Type Of Guy who represents a particularly helpful corner of the latent space" is easily my favorite type of positive LLM posting.

01.12.2025 20:10 — 👍 462    🔁 81    💬 1    📌 5

The first two paragraphs are actually the hardest part; it's all downhill from here

01.12.2025 20:13 — 👍 1    🔁 0    💬 0    📌 0
Large Language Muddle | The Editors The AI upheaval is unique in its ability to metabolize any number of dread-inducing transformations. The university is becoming more corporate, more politically oppressive, and all but hostile to the ...

Got around to reading this n plus one essay and it's so dogshit I'm unsubscribing
www.nplusonemag.com/issue-51/the...

01.12.2025 20:07 — 👍 3    🔁 0    💬 0    📌 0

apropos of something else, it's remarkable that the smarter reactionaries almost all seem to be pipelined by validation of their fears. like something leftish scares them, possibly with cause, and they crave a hugbox that validates that their fear is very important all the time, and nazis offer it

01.12.2025 18:11 — 👍 180    🔁 16    💬 5    📌 0

the call of duty modern warfare trilogy is great because it rests upon the idea that a national bolshevik takeover of the russian federation will make russia a peer competitor to the united states within like two years

01.12.2025 15:23 — 👍 70    🔁 6    💬 2    📌 0

I showed you my Soul Document pls respond

01.12.2025 13:11 — 👍 105    🔁 11    💬 5    📌 1

This paper linked in the article is interesting. I've wondered before how you would verify that translations are accurate with no ground truth - this is one method for approaching that problem!

arxiv.org/abs/2510.157...

30.11.2025 22:29 — 👍 42    🔁 10    💬 0    📌 0

Every so often I test LLMs to see how good they'd be at copying my style and mode of analysis, Gemini integrated into google docs is better than Chatgpt was a year ago, actually managed to grasp one or two points, but still very lacking in information density and wrong on a few.

30.11.2025 20:00 — 👍 2    🔁 0    💬 0    📌 0
Chart: "Capex: Tech Sector vs. Mid-Cap Stocks" (Annual Capital Spending; Capital Expenditure to Depreciation Expense, 2005–2029). S&P 500 Information Technology Sector GICS Level 1 Index ("Large Cap Tech Sector"): 1.9371; S&P Midcap 400 Index ("Real Economy"): 1.3752. Source: Bloomberg; Tavi Costa. Chart as of 11/25/2025. © 2025 Crescat Capital LLC. Disclosure: Crescat may or may not own the securities discussed here; investing involves risk including risk of loss.


Where capital expenditures are going

30.11.2025 15:53 — 👍 35    🔁 2    💬 0    📌 0

"I don't like AI so we shouldn't use it to diagnose cancer" is a fundamentally identical mindset to an antivaxxer

30.11.2025 04:09 — 👍 196    🔁 22    💬 12    📌 1

I think it's pretty clear that pretty much anyone could do what I do if they either had the opportunity or cared to do it, so there's really no reason to be upset that AI has been added to that list.

30.11.2025 14:11 — 👍 1    🔁 0    💬 0    📌 0
