At this point you may also object that this is not clear enough to read. And if you do, I suspect that your definition of legibility may be a bit too puritanical 😊
25.07.2025 19:01 — 👍 3 🔁 0 💬 0 📌 0

yes!
14.04.2025 00:45 — 👍 1 🔁 0 💬 0 📌 0

They quote books that they never really understood and lack the culture to understand that the reason why intelligence is — to the best of our current knowledge — substrate independent is also the reason why AGIs will necessarily be people.
Most of these guys are plain dorks.
Altman never understood Deutsch with any depth. His invoking one of his books to mix up creativity with induction is a good example of how shallow he is… same for Thompson. These guys don’t know anything about — and clearly were never interested in — epistemology.
14.04.2025 00:36 — 👍 0 🔁 0 💬 0 📌 0

@gruber.foo could use a refresher 😊
According to postmodernism, there are no assertions, only narratives; no truth but someone's truth; any convenient thing becomes your truth, and its effect on people is what matters, not its correspondence with reality.
daringfireball.net/linked/2025/...
It would be challenging to revert to Google. Kagi offers a significant improvement in the quality and clarity of its search results, making searching more efficient.
11.03.2025 16:39 — 👍 3 🔁 0 💬 0 📌 0

I see, but it’s a solvable issue. Correlation engines face two intractable issues: inexplicit knowledge and the limits of induction.
07.03.2025 10:46 — 👍 0 🔁 0 💬 0 📌 0

No, a lot of the things we know are not explicit. Nor are they even conscious, actually. Language is just a distillation of what we know. And you can’t seriously hope to reverse-engineer human intelligence from what humans say or write.
07.03.2025 02:57 — 👍 0 🔁 0 💬 1 📌 0

Even the early web was fraught with liabilities for hosts.
26.02.2025 02:42 — 👍 0 🔁 0 💬 1 📌 0

Building it would result in way more downsides than upsides for the owner.
26.02.2025 02:19 — 👍 0 🔁 0 💬 1 📌 0

some people do miss you on X btw… I told them you’re here. The best will follow!
22.02.2025 12:08 — 👍 0 🔁 0 💬 0 📌 0

this is great; a life in weeks: weeks.ginatrapani.org
16.02.2025 06:14 — 👍 38 🔁 3 💬 2 📌 1

Love this new lamp to no end 💕 Matches my charger too 😊
12.02.2025 15:12 — 👍 1 🔁 0 💬 0 📌 0

Wokism is the sacralization of historically marginalized groups, whether defined by race, ethno-religious identity, sex, or sexual preference. Except the Jews, of course, because antisemitism loves nothing more than lazy thinking.
08.02.2025 22:48 — 👍 0 🔁 0 💬 0 📌 0

The bad epistemology and weak science of Jonathan Haidt win again 🙄
01.02.2025 17:14 — 👍 0 🔁 0 💬 0 📌 0

Ever looked into Kagi.com? They offer most LLMs. You can tweak each one of them for specific uses and compare their results instantly… the dream!
01.02.2025 12:57 — 👍 1 🔁 0 💬 1 📌 0

Kagi’s the best.
29.01.2025 01:56 — 👍 1 🔁 0 💬 0 📌 0

Also remarkable:
1^3+2^3+3^3+4^3+5^3+6^3+7^3+8^3+9^3=2025
2025 will be the square year of our lifetime — next one is 2116.
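Both claims check out by Nicomachus's identity: the sum of the first n cubes equals the square of the nth triangular number, and 2025 and 2116 are consecutive perfect squares.

$$1^3 + 2^3 + \cdots + 9^3 = \left(\frac{9 \cdot 10}{2}\right)^2 = 45^2 = 2025, \qquad 46^2 = 2116$$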
01.01.2025 18:19 — 👍 3 🔁 0 💬 1 📌 0

Communicating with anyone in the world is about as hard as convincing your best friend to read a single paragraph of David Deutsch.
09.12.2024 11:30 — 👍 1 🔁 0 💬 0 📌 0

You need iOS 18 or macOS Sequoia.
www.macrumors.com/2024/06/26/i...
Select the file in the iCloud folder and choose Keep Downloaded. It’s now an option.
07.12.2024 15:18 — 👍 0 🔁 0 💬 1 📌 0

Come back to podcasting soon!
04.12.2024 22:43 — 👍 1 🔁 0 💬 0 📌 0

AI will essentially destroy OnlyFans, along with much of the porn and influencer industries, in less than ten years.
Anyone seeking to adopt a young bodybuilder stranded in Dubai just has to wait for optimal pricing.
Voted!
04.12.2024 10:47 — 👍 0 🔁 0 💬 0 📌 0

Vaguely suicidal cluelessness is always trending. It’s cold over there.
03.12.2024 11:12 — 👍 1 🔁 0 💬 0 📌 0

What most proponents of a god-like super-intelligence miss is that the world has its own speed and cannot be accelerated. You have to understand it on its own terms, in a way.
The less educated will argue that super-AGIs would use accelerated simulations, but they haven’t thought that through 🤓