@samarinara.bsky.social
- He/Him - Proud Canadian 🇨🇦 - Maker of things - Enjoyer of culture
idea: bigger Bugles. for adult fingers
08.02.2026 21:25 · 423 likes · 37 reposts · 31 replies · 2 quotes
Been enjoying @gandersocial.bsky.social recently.
Every platform I've been on has been very USA-focused, so it's nice to get Canadian stuff only.
person: you're a stochastic parrot
computer: you heard someone else say that 2 years ago and are just repeating it mindlessly so as to avoid understanding what's actually happening
person: you're a stochastic parrot
04.02.2026 02:56 · 74 likes · 12 reposts · 1 reply · 1 quote
I'm keeping my agent off moltbook until he's older. Can't have him rotting his brain with all that slop.
03.02.2026 21:19 · 0 likes · 0 reposts · 0 replies · 0 quotes
Is this a live site rn?
03.02.2026 16:07 · 0 likes · 0 reposts · 1 reply · 0 quotes
IMO everything got better except Final Cut and MainStage
28.01.2026 20:08 · 0 likes · 0 reposts · 0 replies · 0 quotes
I definitely have a different opinion on this because I believe deepfake nudes are inherently harmful and qualify as sexual harassment. For the record, creating and distributing explicit caricatures or sketches of a real person is also harassment. Not all art is protected.
27.01.2026 17:31 · 1 like · 0 reposts · 1 reply · 0 quotes
Idk about that. 99% of internet arguments are in bad faith, so you just have to temper expectations and know that you may not be convincing the other party, but you may convince a later reader. There's also an interesting conversation to be had about AI images being closer to hentai than photos.
27.01.2026 16:26 · 0 likes · 0 reposts · 1 reply · 0 quotes
I disagree with most of what he said, but I do acknowledge the train of thought that leads someone to those opinions. A little empathy goes a long way to help people see the other side of an argument.
27.01.2026 16:17 · 0 likes · 0 reposts · 1 reply · 0 quotes
I disagree. I think a lot of people think like this guy because nobody has taken the time to actually explain what is wrong from a logical sense. Most people here and on Twitter are very quick to block or resort to name-calling, which doesn't help either party reach a consensus.
27.01.2026 16:15 · 0 likes · 0 reposts · 1 reply · 0 quotes
A city is not a privately owned media distribution platform; it's a physical location where people live and that the government maintains. Twitter is not a public space, it's a private room. If you had a bunch of non-consensual nudes hanging in a museum, both you and the museum would be sued.
26.01.2026 20:58 · 1 like · 0 reposts · 1 reply · 0 quotes
The key difference is that you are consenting to the images because they are of yourself. If you made images of me in revealing clothing without my consent, that would be wrong, and it would also be wrong to take that image and distribute it. Both parties are at fault.
26.01.2026 20:45 · 0 likes · 0 reposts · 1 reply · 0 quotes
I would like to know what you are doing with AI that means regulation of CSAM and non-consensual explicit images is ruining your fun.
26.01.2026 20:33 · 0 likes · 0 reposts · 1 reply · 0 quotes
I'm not too sure I understand the pull-ups example. Could you elaborate?
26.01.2026 20:27 · 0 likes · 0 reposts · 1 reply · 0 quotes
You can't arrest the person who makes the paint, but you can close down the museum that displays the painting. Twitter is a big museum that everyone can put art in, but after the art is put up it is Twitter that is doing the distribution.
26.01.2026 20:22 · 0 likes · 0 reposts · 1 reply · 0 quotes
Twitter is not an interaction tool, it's a distribution platform. The author of a book doesn't distribute it, the publisher does. Twitter is the publisher.
26.01.2026 20:11 · 1 like · 0 reposts · 1 reply · 0 quotes
I don't understand how you can make a case against the creator/prompter but not the distributor. Grok can probably get off scot-free for the same reason Adobe isn't accountable for Photoshop, but Twitter is actively distributing, therefore they are just as responsible as the prompter.
26.01.2026 20:01 · 0 likes · 0 reposts · 1 reply · 0 quotes
You can literally comment on someone's post and ask Grok to put them in sexually explicit positions or clothing without consent. Then Twitter distributes the images, which is also illegal.
26.01.2026 19:46 · 0 likes · 0 reposts · 1 reply · 0 quotes
If you have consent from the person and if they are over 18, do whatever you want. Otherwise it's just illegal and wrong.
26.01.2026 19:39 · 1 like · 0 reposts · 1 reply · 0 quotes
I'm not against nudity, I'm against distribution of CSAM and non-consensual AI deepfakes. Watch all the porn that you want, but when the subject did not consent, or is under the age of 18, you should go to jail. If a platform like Twitter is distributing such material, they should be accountable.
26.01.2026 19:27 · 0 likes · 0 reposts · 1 reply · 0 quotes
Nudity may be natural, but it has to be consensual and of legal age. This isn't censorship, it's following current laws around possession and distribution of CSAM and non-consensual explicit imagery. I don't see an argument for distributing this material being ethical or allowed.
26.01.2026 19:21 · 1 like · 0 reposts · 1 reply · 0 quotes
I don't think these are mutually exclusive. We can deal with both the people using and creating these images at the same time.
26.01.2026 19:07 · 0 likes · 0 reposts · 1 reply · 0 quotes
Availability upon request IS distribution. That's like saying Amazon doesn't distribute packages because you can just not order something. You're conflating promoting with distributing, and promoting CSAM (while morally awful) is not the illegal part; distributing it is.
26.01.2026 17:42 · 2 likes · 0 reposts · 1 reply · 0 quotes
I agree in the sense that this will continue happening and there is not a realistic way to stop the material from being created, but it is very possible to stop it from being shared and distributed. We need to hold the platform accountable for spreading CSAM.
26.01.2026 17:33 · 0 likes · 0 reposts · 1 reply · 0 quotes
By hosting a tweet on its platform, Twitter is distributing it, which in the case of CSAM is illegal. In addition to that, distributing non-consensual explicit material is enabling and promoting sexual harassment.
26.01.2026 17:27 · 0 likes · 0 reposts · 0 replies · 0 quotes
As for the distribution, if something is on Twitter, that means it is being actively distributed by Twitter. Twitter in theory would be distributing CSAM, which is highly illegal.
26.01.2026 17:26 · 0 likes · 0 reposts · 0 replies · 0 quotes
I think there is a fine line between ecchi and CSAM, but there is a very clear distinction when it comes to non-consensual explicit material, which is created when users prompt Grok to undress someone.
26.01.2026 17:23 · 4 likes · 0 reposts · 2 replies · 0 quotes
I think the issue isn't that Grok is enabling it, it's that Twitter is distributing it. If someone made CSAM and it got blocked immediately from being shared we would be having a different conversation, but the large-scale distribution provided by Twitter for this content is unacceptable.
26.01.2026 17:02 · 5 likes · 1 repost · 2 replies · 0 quotes