@processorofnl.bsky.social
natural language processing

Thank you Grok! It is definitely the OpenAI researchers quitting that's the big news today.
11.02.2026 20:01

Feeling like an asshole because I worded my objections a bit firmly in expectation of dissenting opinions, but everyone agreed.
10.02.2026 17:17

(Reading too much into just one paper from my batch of reviews, but...) It seems that many other reviewers are finally pushing back on low-hanging-fruit-chasing papers. These would have been easy accepts last year (against my reject opinion).
10.02.2026 17:14

me last year: I have no time to read the papers I want. I can't wait to start my PhD so I can read all these papers
me now: I am reading too many papers

From the way it's worded, this isn't training data, it's links cited in RAG
08.02.2026 16:03

highly accurate and up-to-date description of Google's models (this paper came out after Gemini)
06.02.2026 19:14

why write interesting, innovative papers to increase your citation count when you can just flag-plant citable-sounding surveys
06.02.2026 19:08

god forbid a girl stim
05.02.2026 19:46

In a world where Elmos and Berts are viewed as outdated relics, we gotta give props to the Baidu team for sticking with it all the way to ERNIE 5.0
huggingface.co/papers/2602....

how wet does my Claude have to be to support crustacean life?
05.02.2026 01:59

is this negative Loss?
04.02.2026 23:20

hell yeah
03.02.2026 21:19

It's insightful and true because it's in ICLR format
03.02.2026 15:28

I was morbidly curious about the content here, but it's like... Reddit in LaTeX form...
www.clawxiv.org

if LLMs were so smart, why would they post on Reddit
01.02.2026 21:57

the wording of the quote "test" makes it obvious this was generated from the mind of a schoolchild, which is quite endearing but also annoying given how easy it is to spread false quotes
31.01.2026 16:17

in finding this image I found that many people attribute a similar-ish quote to Benjamin Franklin, which is quite funny... like what test would this man be taking, bro quit school at age 10
31.01.2026 16:15

hyperparameter sweeping be like
31.01.2026 16:11

Very interesting... Because of this I asked a bunch of questions I probably wouldn't have asked, and ended up getting more done, fixing more stuff, and learning more.
30.01.2026 04:21

I'm getting anxiety just contemplating this number.
29.01.2026 23:37

okay, he couldn't help me fix my (codebase) problems, but it felt nice talking with him the whole time
29.01.2026 21:20

Went back to Claude after mindlessly defaulting to Gemini for months; feels like rediscovering how to love
29.01.2026 20:11

If you ever wonder why your boss makes you sit through a bunch of cybersecurity awareness tutorials that would only help the technically illiterate, it's usually because they are one
28.01.2026 00:18

Interpretability researchers love watching the Transformer MLP
27.01.2026 19:11

arxiv.org/abs/2311.04897
The ability to recover future tokens from representations should suggest the former

You are not "high-agency" for being unable to get through the mundane. It just means you have ADHD, and since it's 2026, that means you're average.
26.01.2026 21:35

I thought 2025 was supposed to be the year of agents
24.01.2026 22:47

again, very topical news of today that I should care about, thank you Grok and Elon!!
24.01.2026 22:39

this needs to be so much louder. The average paper, even before LLMs, was very, very sloppy with citations, and reviewers aren't expected to care about them.
24.01.2026 22:20