@fpl9000.bsky.social
Retired software engineer. AI enthusiast. Deadhead. I implemented Bash's regex operator (=~). My Signal user ID is franl.99.
I remember end-of-year (near the holidays) laziness. Not sure if recent models still suffer from that.
02.11.2025 19:48

You're too honest. You should have blamed it on the AI.
02.11.2025 18:09

Listen to the Grateful Dead from 1978-10-18 at Winterland Arena in San Francisco, CA - online NOW! via @tunein #NowPlaying >>> http://tun.in/sexiP <<< #DeadHeads #GratefulDeadMusic #GratefulDead #GratefulDeadFamily
02.11.2025 16:53

Large reasoning models almost certainly can think
The article argues against the notion that large reasoning models (LRMs) cannot think, countering Apple's claim that they merely perform pattern-matching. It refutes the idea that LRMs fail because they struggle…
Telegram AI Digest
#ai #apple #news
This is a good question about my behavior. Let me think about what happened:

• The user asked about the difference between connectors and extensions in Claude Desktop
• I initially said I didn't have detailed information and referred them to documentation
• Then they pointed out I could search the web for that information
• I did search and found comprehensive information

Why did I initially not search? Let me look at my instructions. From the search instructions, I should search when:

• Information is beyond my knowledge cutoff
• May have changed since the knowledge cutoff
• The topic is rapidly changing
• The query requires real-time data

The instructions also say:

• "Claude answers from its own extensive knowledge first for stable information. For time-sensitive topics or when users explicitly need current information, search immediately."
• And: "For stable information that changes yearly or less frequently, answer directly from knowledge without searching unless it is likely that information has changed since the knowledge cutoff, in which case search immediately."

The instructions about "connectors and extensions" are product-specific features that:

• Were introduced relatively recently (the search results show they were announced in 2025, and my knowledge cutoff is January 2025)
• Are specific product features that I genuinely didn't have detailed information about
• Could have changed or evolved since my knowledge cutoff

Looking at the decision tree: "IF info about the query is stable (rarely changes and Claude knows the answer well) → never search, answer directly without using tools ELSE IF there are terms/entities in the query that Claude does not know about → single search immediately"

This falls under "terms/entities in the query that Claude does not know about" - I didn't have specific knowledge about the distinction between these two features. So according to my instructions, I SHOULD have searched immediately because:

• These are p…
In my case, I had "thinking" enabled. I asked it what happened, and this was its thinking process when explaining itself.
02.11.2025 15:50

The other day, Claude answered a question by telling me the info I wanted was available at a particular Web site. I had to point out it could just read that site and give me the info. It was like "Oh, right. I'll do that."
02.11.2025 15:33

NotebookLM: an amazing tool for learning.
I'll be repeating my NotebookLM webinar on Nov. 7, 2pm Eastern. Join us for inspiration on creative use cases! (& recent new features) www.library20.com/notebooklm
01.11.2025 14:51

I've spent the past several days studying the properties of Qwen 3 4B Instruct 2507's unembedding matrix. This learned matrix maps vector representations in hidden space to tokens in the model's vocabulary. Turns out there are a few ways to analyze a point cloud in high-dimensional space. 🧵
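The thread doesn't include code, but the core operation it describes is easy to sketch. A hypothetical numpy example (toy dimensions, far smaller than the real model's hidden size and vocabulary; all names are mine, not from the thread) of mapping hidden states through an unembedding matrix and treating its columns as a point cloud:

```python
import numpy as np

# Hypothetical sketch: an unembedding matrix maps hidden-space vectors
# to per-token scores. Dimensions here are toy-sized for illustration.
rng = np.random.default_rng(0)
d_model, vocab = 64, 1000
W_U = rng.standard_normal((d_model, vocab))  # one column per vocab token

# Mapping a hidden state to token logits is a single matrix multiply.
h = rng.standard_normal(d_model)
logits = h @ W_U                   # shape (vocab,): one score per token
top_token = int(np.argmax(logits))

# Treating W_U's columns as a point cloud, a few standard analyses:
norms = np.linalg.norm(W_U, axis=0)               # per-token vector lengths
centroid = W_U.mean(axis=1)                       # mean direction of the cloud
sing_vals = np.linalg.svd(W_U, compute_uv=False)  # spectrum / effective rank
```

The singular-value spectrum is one of the "few ways to analyze a point cloud" the thread alludes to: a fast decay would indicate the token directions concentrate in a lower-dimensional subspace.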
01.11.2025 19:07

Prompt: In modern LLMs, even when the temperature is set to zero, can non-deterministic output still result from parallelized math operations happening in slightly different orderings due to timing differences between GPUs?

Claude: Yes, absolutely. Even with temperature set to zero, you can still get non-deterministic outputs from modern LLMs due to low-level GPU operations, and this is a subtle but important issue in deep learning systems. The fundamental issue is that floating-point arithmetic is not associative. That is:

(a + b) + c ≠ a + (b + c)

This happens because of rounding errors at each operation.
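Claude's claim is easy to verify in plain Python — no GPU needed. A minimal sketch of the non-associativity and of how summation order alone changes a result:

```python
# Minimal demonstration: floating-point addition is not associative, so the
# order in which parallel partial sums are combined can change the result
# even with identical inputs.
a, b, c = 1e16, -1e16, 1.0

left = (a + b) + c   # inner sum cancels exactly; then + 1.0 gives 1.0
right = a + (b + c)  # 1.0 is absorbed into -1e16 by rounding; result 0.0

print(left, right)   # 1.0 0.0

# At scale: summing the same values in a different order shifts the rounding.
tiny_then_big = sum([1e-9] * 1_000_000 + [1e8])  # small terms accumulate first
big_then_tiny = sum([1e8] + [1e-9] * 1_000_000)  # each 1e-9 is absorbed by 1e8
print(tiny_then_big == big_then_tiny)            # False: order changed the sum
```

The second pair is the shape of the GPU problem: a reduction across threads combines partial sums in a timing-dependent order, so two runs of the same deterministic (temperature-0) decode can diverge by one ulp, pick a different token, and then produce entirely different continuations.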
And what about the non-determinism of all those parallelized floating-point operations?
01.11.2025 14:26

I assume this is only when the temperature is 0, right?
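For context on the temperature question: temperature divides the logits before softmax, and temperature 0 is conventionally special-cased as plain argmax ("greedy") decoding. An illustrative numpy sketch (the function name is mine, not any particular library's API):

```python
import numpy as np

# Temperature rescales logits before softmax. Low T sharpens the
# distribution; T == 0 is treated as argmax rather than dividing by zero.
def softmax_with_temperature(logits: np.ndarray, T: float) -> np.ndarray:
    scaled = logits / T
    scaled = scaled - scaled.max()   # subtract max for numerical stability
    p = np.exp(scaled)
    return p / p.sum()

logits = np.array([2.0, 1.0, 0.5])

p_warm = softmax_with_temperature(logits, 1.0)  # mass spread over all tokens
p_cold = softmax_with_temperature(logits, 0.1)  # nearly all mass on token 0
greedy = int(np.argmax(logits))                 # what "temperature 0" does
```

This is why the reply's assumption is reasonable but incomplete: at temperature 0 there is no sampling randomness left, yet ties or near-ties in the logits can still flip under the floating-point reordering discussed above.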
"[...] we prove mathematically that transformer language models mapping discrete input sequences to their corresponding sequence of continuous representations are injective and therefore lossless[...]."
Last chance to turn it off.
On Monday, November 3rd, Microsoft will start using your LinkedIn data for AI training. And remember, you're opted in by default.
To toggle it off 👉 Account > Settings & Privacy > Data privacy > Data for Generative AI Improvement.
Google researcher shows life "emerges from code" - Blaise Agüera y Arcas.
Blaise Agüera y Arcas explores some mind-bending ideas about what intelligence and life really are, and why they might be more similar than we think (filmed at the ALIFE conference, 2025 - https://2025.alife.org/). Life and intelligence are both fundamentally computational (he says). From the very beginning, living things have been running programs. Your DNA? It's literally a computer program, and the ribosomes in your cells are tiny universal computers building you according to those instructions.
Fascinating interview with Blaise Agüera y Arcas on the Machine Learning Street Talk podcast.
31.10.2025 13:43

A stock candlestick chart for Meta Platforms, Inc. (NASDAQ) showing intraday price action. The top reads: "Meta Platforms, Inc. · 10 · NASDAQ," with values: open 752.85, high 755.01, low 689.04, close 689.80, down 62.15 points (-8.27%). The chart background is black, with red and green candles indicating minute-by-minute trading activity. Prices move mostly sideways through the day, then spike briefly upward near 15:45 before a massive red candlestick plunges sharply at 16:00, representing a steep sell-off. The y-axis ranges from 685 to 760 USD, and the x-axis shows times from 10:00 to 16:00. The final candle closes at 689.80 USD.
Meta stock drops 10% after Q3 earnings call due to a $15.9B expense for hiring 4 AI researchers
29.10.2025 22:15

Language models can correctly answer questions about their previous intentions.
www.anthropic.com/research/int...
These two paragraphs from an Anthropic study on AI introspection are worth a second to read.
I think it is fair to say that both conclusions are quite... controversial, but the paper makes an interesting attempt to back up these assertions with experiments. transformer-circuits.pub/2025/introsp...
TL;DR: The PSF has made the decision to put our community and our shared diversity, equity, and inclusion values ahead of seeking $1.5M in new revenue. Please read and share. pyfound.blogspot.com/2025/10/NSF-...
🧵
📣 THREAD: It's surprising to me that so many people were surprised to learn that Signal runs partly on AWS (something we can do because we use encryption to make sure no one but you - not AWS, not Signal, not anyone - can access your comms).
It's also concerning. 1/
the numbers for "short AGI timelines" are suspiciously close to the maximum amount of time a VC is willing to wait for a liquidity event. just saying
26.10.2025 12:08

Muon is a (relatively) new optimizer that powered large-scale training of recent foundation models, e.g., Kimi K2 and GLM 4.5.
Interested in learning how it works?
Check out the video here: youtu.be/bO5nvE289ec
Claude Code Creator: We Didn't Mean to Build It, But It's Changed Everything
share.google/7eVUZ6vEfcjV...
Happy Amazon Prime Day! Amazon collects mountains of data about how you use the service, but there is a setting you can change to make it harder for the company to use that data to sell you more things. #OptOutOctober www.eff.org/deeplinks/2...
07.10.2025 16:39

Heh. Try being a Grateful Dead fan. Recently, they've been releasing 50th-anniversary albums.
23.10.2025 13:56

I recorded a ten-minute video showing my vibe-coding process for building a tool for sharing formatted terminal sessions via copy and paste, using the new Claude Code for web - now available on YouTube here: www.youtube.com/watch?v=GQvM...
More notes on my blog: simonwillison.net/2025/Oct/23/...
Agentic AI's Hidden Data Trail - and How to Shrink It spectrum.ieee.org/agentic-ai-s...
22.10.2025 14:32

Or perhaps Iain Banks' Culture series, which I'm enjoying a lot. Unlike "Dune", each book is only loosely connected to the others, so you don't have to read them in order.
22.10.2025 13:27

I got the same impression from his interview with Karpathy.
22.10.2025 13:23

How long until they just merge into one giant company? I used to think CHOAM was the least believable element of "Dune", but now I'm not sure.
22.10.2025 13:15

I gave a talk last night about "Living dangerously with Claude", on the joys and perils of --dangerously-skip-permissions and how critical it is that we run coding agents in a sandbox so that we can unlock their full potential: simonwillison.net/2025/Oct/22/...
22.10.2025 12:36

Videos from the Natural Philosophy Symposium are coming online at last! We start with the opening plenary by David Chalmers @davidchalmers.bsky.social: Can There Be a Mathematical Theory of Consciousness? Commentary by Ryan Smith.
www.youtube.com/watch?v=Zsve...
Claude Desktop just went GA, and it's aiming to live in your dock, not your browser.
Key features:
Mac + Windows downloads live now. Voice via Caps Lock. One-click window sharing. Screenshot capture.