@azathothsdaddy.bsky.social
Old man. Lawyer; photographer; writer; player of bad guitar . . .
07.12.2025 22:57
"We knew very well what we were electing. We wanted a racist dictator."

07.12.2025 22:07
#AngryUpvote

07.12.2025 22:05
Promoters are definitely overclaiming what AI can do. They don't really understand what it can do because they don't understand how the human mind operates or even what it is.

07.12.2025 21:56
Actually, the hammer can't tell us anything. Neither can AI. It can only appear to tell us things.

In your analogy, the screw is a tool that's part of a technology that includes screwdrivers. The designers and manufacturers of screws incorporated decisions into the design of that tool that constrain how we may use it effectively: that is, that we must use it with a screwdriver, not with, say, a hammer or (more ludicrously) a rope.

07.12.2025 20:37 (thread)
This is a bit inaccurate. Every tool is part of a technology. A pen is part of a technology that includes paper, letterforms, and the transmission of paper between and among individuals.

This is a bit like weapons in warfare. A weapon doesn't stand alone: it's part of a complete doctrine of warfare.

Even a rock used as a striking implement is part of a technology: a doctrine exists for the use of the rock, and that technology includes the body of the person using the rock as well as the ways in which the rock is used. Is it thrown? How is it thrown? Is it held in the hand? What does it strike? What is the purpose for using the rock?

Taken together, these all describe a "technology."

Tools can involve greater or lesser skill on the part of the persons who manipulate those tools. Consider a goal or purpose such as killing a person. Both a rock and a hand grenade may be used to kill someone at a distance; but it takes greater skill to kill someone with a thrown rock than it does to kill someone with a hand grenade.

But we have something like the conservation of energy going on here: It takes less skill to throw a hand grenade than it does to throw a rock; but it takes much greater skill to manufacture a hand grenade than it does to find and use (and perhaps shape) a suitable rock.

"Advanced" technologies tend to shift the skill from the end user to the tool designer and manufacturer. But this isn't at all a linear shift. A user of the tool consisting of an atlatl and dart needs to exercise more skill than the user of the tool consisting of bow and arrow; but each user must exercise a great deal of skill.

More complex tools tend to be more efficient: It's easier to take game with a bow and arrow than it is with an atlatl and dart; and it's easier to take game with a rifle than it is with a bow and arrow. More complex tools--and the technologies of which these tools are a part--typically allow this greater efficiency and effectiveness on the part of the end user; but development and manufacture require greater skill and time input.

Let me now return to AI: AI is indeed a "tool in a toolbox": that is, a tool embedded in a technology. And the OP (and the author of the piece) are correct that the more complex tool of AI shifts from the user to the developer and manufacturer the skill--which is part of the technology--needed to achieve the goals for which we use these tools (e.g., ChatGPT; pen and paper; typewriter; computer).

The big issue is this: In using AI, a user of pen and paper (or computer or typewriter) is adopting the entire technology of AI. That technology in effect bakes in choices made by the developers of that technology. Consider the users of the atomic bomb: That weapon was used to achieve a goal that could also be achieved through the use of conventional weapons. But another choice was baked into the atomic bomb: that is, the long-term effects of radiation on the populations targeted by the users of the bomb as well as certain persons involved in the development of that weapon--and people downwind of testing sites.

AI's developers baked (and continue to bake) into the tool a vast suite of decisions that affect how users use that tool and the ends achieved by those users. Those decisions are often invisible or at least highly obscure (partly because developers want to protect their intellectual property). Also, AI constrains the results because it is built on whatever database the developers used to train the AI: the AI will tend toward the average of that database; and it will reflect principally that database, rather than the input of the user.

Another issue is that AI outputs can look a bit as though they were generated by humans; but AI can never truly mimic the products of human thought. Our brains are not computers; they are not even just brains. Your brain is part of your body, and your thinking partakes of your body. You can prove that by knocking back a few martinis and then doing some thinking. Compare that to your thinking sans martinis: the difference will be stark.

Note that I'm not just a-sayin' stuff here. Read Damasio, "Descartes' Error," or Edelman, "Bright Air, Brilliant Fire."

When we produce writing or other types of art,* we're not just using our brains: we're also using our bodies. That use of our bodies is fundamental to all human endeavors.

* All writing is a form of art.

So these are some of the dangers of using AI: Using it takes choices away from the user, transferring those choices to the AI's developers. And AI generates something that can only superficially resemble the products of humans.

Consider this: One of our earliest and most fundamental arts is the art of dancing. AI can never dance. Only organic beings can dance; and only actual humans can dance like humans. AI could dance only by building a complete human body, identical to an actual human body in every detail. I'm going to claim that this is impossible because doing so would require more heat-energy than exists in the universe.

When anyone uses AI to create something, they're generating something that cannot dance. It cannot actually replicate some fully human product of art because AI cannot be human.

It seems to me that many AI developers have the naive belief that we are really equivalent to computers: that we're brains in boxes. That belief gets baked into AI, and that baked-in belief taints everything generated by AI.

Let me end this multivolume tome by saying that I'm not going to claim right now that there's no decent use* for AI. I'm not claiming that at all. I am saying that it cannot truly mimic human thought because it is not and cannot be truly human. And I'm saying that the tool of AI is part of a larger technology that includes a vast suite of assumptions that users don't consider and over which they have little or no control.

* Don't use "use case," and don't trust anyone who does.
07.12.2025 16:45
Worth reading for lots of reasons, but particularly because it closes with something I see in my work and travels speaking on writing and AI. Many, maybe most students do not want AI-mediated schooling or lives. We can offer them something better. www.currentaffairs.org/news/ai-is-d...

07.12.2025 16:08
Can't say that I blame ya . . .

07.12.2025 05:41
Please accept the profound apologies of the primordial ooze from which my progenitors arose. They'll be giving back their ooziness.

Should be a front page story in every major news outlet tomorrow but our major news outlets don't cover Trump like that anymore
www.kenklippenstein.com/p/leak-fbi-l...