@taek.org.bsky.social
Working on solar and carbon credits. I do LLMs and diffusion models as a hobby. Founder of Sia, Skynet, and Obelisk.
Do you have a team member who is dedicated to nothing but docs?
Also, you should fine-tune an LLM on all of your engineering docs, discussions, etc., so users can converse with it and learn faster
Trick for launching new inflationary tokens: take a small percentage of the token supply and put it into a smart contract that trades the supply on a dex, adding more liquidity as the price rises.
This mitigates the 'tons of demand but no supply' problem without favoring any particular early party.
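A minimal sketch of that mechanism, written in Python rather than a real contract language; the constant-product pool, tranche size, and price thresholds are assumptions for illustration, not details from the post:

```python
# Unaudited sketch: a reserve escrows a small slice of supply and releases it
# into a constant-product pool only as the price climbs, so surging demand is
# met with fresh supply instead of a thin book. All parameters are hypothetical.

def price(pool_token: float, pool_quote: float) -> float:
    return pool_quote / pool_token

def buy(pool_token: float, pool_quote: float, quote_in: float):
    """Constant-product swap: spend quote currency, receive tokens."""
    k = pool_token * pool_quote
    pool_quote += quote_in
    pool_token = k / pool_quote
    return pool_token, pool_quote

def release_reserve(pool_token, pool_quote, reserve, trigger,
                    tranche=1_000.0, step=1.5):
    """Each time the price crosses the next trigger, push one tranche of the
    escrowed supply into the pool and raise the trigger geometrically."""
    while reserve > 0 and price(pool_token, pool_quote) >= trigger:
        released = min(tranche, reserve)
        reserve -= released
        pool_token += released      # single-sided add: deepens the supply side
        trigger *= step
    return pool_token, pool_quote, reserve, trigger

# Toy run: buys push the price up, the reserve answers with more supply.
pool_token, pool_quote, reserve, trigger = 10_000.0, 10_000.0, 5_000.0, 1.5
for _ in range(5):
    pool_token, pool_quote = buy(pool_token, pool_quote, 4_000.0)
    pool_token, pool_quote, reserve, trigger = release_reserve(
        pool_token, pool_quote, reserve, trigger)
    print(f"price={price(pool_token, pool_quote):.2f} reserve_left={reserve:.0f}")
```

The reserve only releases tokens after organic buying has already moved the price, so no pre-sale or insider allocation is needed to seed depth.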
a local filter probably does pretty well if it just downloads the video title, person who posted it, and maybe a description. no need to scan the entire video
15.04.2023 19:07
On the other hand, at 400 bytes per post we're talking like 1 MB to download 2500 messages. Even if you download every post by people you follow, and also by people followed by people you follow, you'll still get hours of data per second of downloading.
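A quick back-of-envelope check of those numbers; the bandwidth and reading speed used here are assumptions, not measurements:

```python
# Rough arithmetic behind "hours of data per second of downloading".
BYTES_PER_POST = 400
POSTS = 2_500
BANDWIDTH = 10 * 1024**2        # assume ~10 MB/s connection
SECONDS_TO_READ_A_POST = 5      # assume ~5 s to skim each post

payload_mb = BYTES_PER_POST * POSTS / 1024**2      # ~0.95 MB for 2500 posts
download_s = BYTES_PER_POST * POSTS / BANDWIDTH    # ~0.1 s to fetch them
reading_h = POSTS * SECONDS_TO_READ_A_POST / 3600  # ~3.5 h to actually read them

print(f"{payload_mb:.2f} MB, {download_s:.2f} s to download, ~{reading_h:.1f} h to read")
```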
15.04.2023 18:46
If the labeling is client side, you can have 'secret likes' that update a user's filter model, but don't broadcast anywhere else that the content is preferred by them.
I am sure there are lots of horny lurkers that want more but don't want their preferences publicly visible, for example
How hard would it be to make all of those filters client-side? Then people can pop in their own; you can get a whole hacker culture around custom filters and really be the platform that can service every niche.
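A minimal sketch of what the client-side labeling and 'secret likes' from the posts above could look like; the file path, post format, and bag-of-words scoring are hypothetical, not anything the AT Protocol actually specifies. Preferences stay on the local disk and are never posted anywhere:

```python
# Purely client-side filter with private preference updates ('secret likes').
import json
import re
from collections import Counter
from pathlib import Path

PREFS_PATH = Path("~/.config/feedfilter/secret_likes.json").expanduser()

def load_prefs() -> Counter:
    if PREFS_PATH.exists():
        return Counter(json.loads(PREFS_PATH.read_text()))
    return Counter()

def secret_like(post_text: str) -> None:
    """Update the local preference model without broadcasting anything."""
    prefs = load_prefs()
    prefs.update(re.findall(r"[a-z']+", post_text.lower()))
    PREFS_PATH.parent.mkdir(parents=True, exist_ok=True)
    PREFS_PATH.write_text(json.dumps(prefs))

def score(post_text: str) -> float:
    """Crude bag-of-words affinity score; higher means more likely to show."""
    prefs = load_prefs()
    words = re.findall(r"[a-z']+", post_text.lower())
    return sum(prefs[w] for w in words) / (len(words) or 1)

def filter_feed(posts: list[str], threshold: float = 1.0) -> list[str]:
    """Keep only posts the local model currently scores above the threshold."""
    return [p for p in posts if score(p) >= threshold]
```

Because the model and its updates never leave the client, swapping in a different scoring function (or a small neural net) is a purely local change, which is where the custom-filter hacker culture could come from.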
15.04.2023 06:14
@nomieturtles.bsky.social don't let me down
15.04.2023 05:42
bsky isn't the only allowed hosting platform though, right? Some other service could opt for different content removal rules?
14.04.2023 16:09
The big question for everyone is censorship resistance and audience retention. And the test is when all the hornyposters show up and start pushing the limits. When they get banned from bsky, will they still be able to use AT protocol?
14.04.2023 14:53
Who is missing?
14.04.2023 13:54
No weights available yet; based on the dev chatter I think that's expected in 2-4 weeks.
14.04.2023 07:12
Yeah, earlier today I think. You can use it at https://dreamstudio.ai
It's particularly adept at photographs of humans, psychedelic art, objects like tents and rocket ships, and landscapes with mountains.
Devs are saying the next release should have a much broader range of strengths.
Got to try Stable Diffusion XL tonight. It's definitely early and unpolished, but so was Stable Diffusion 1.5 (the main model everyone uses today). It showed a ton of promise. I'm confident that within a month of releasing the weights, SDXL will be the new king of image generation.
14.04.2023 06:40
got them!
14.04.2023 06:37
Hey! Great to be here. I got a group of AI/LLM friends who want to join if you have invites to spare
14.04.2023 03:37
well, eventually I'm hoping to rig everything up to an LLM but that's probably several months away.
14.04.2023 03:36
bulk APIs that let me download massive numbers of tweets / posts so that my own local toolchain can sort through them and push the interesting ones to me
13.04.2023 18:18
Lots of low-hanging bugs; just keep grinding it out, you are doing great.
13.04.2023 18:17
Could I bother you for about a dozen invites?
13.04.2023 18:12
Hence the term 'shoggoth'.
Actually though I think that's a really solvable problem. LLMs are smart enough to parse out the valuable parts of a tweet and they can add annotations to what they read to make sure the majority of the training data is aligned with their target personality.
It's only a matter of time before the LLM is the middle-man between you and pretty much every untrusted service on the Internet.
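A minimal sketch of that kind of annotation pass over scraped posts; `ask_llm`, the prompt, and the KEEP/DROP format are stand-ins for whatever chat-completion client and model you actually run, not any real vendor API:

```python
# Hypothetical curation step: have an LLM keep/annotate posts before they are
# added to a training set, so the data skews toward the target personality.
from dataclasses import dataclass

@dataclass
class AnnotatedPost:
    text: str
    keep: bool
    note: str

ANNOTATION_PROMPT = (
    "You are curating training data for an assistant with a helpful, honest "
    "personality. For the post below, answer KEEP or DROP on the first line, "
    "then one sentence explaining what is valuable or objectionable.\n\n"
    "Post: {post}"
)

def ask_llm(prompt: str) -> str:
    """Stand-in for a real chat-completion call; plug in your own client."""
    raise NotImplementedError

def annotate(posts: list[str]) -> list[AnnotatedPost]:
    out = []
    for post in posts:
        reply = ask_llm(ANNOTATION_PROMPT.format(post=post))
        verdict, _, note = reply.partition("\n")
        out.append(AnnotatedPost(post, verdict.strip().upper() == "KEEP", note.strip()))
    return out
```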
12.04.2023 19:07
Bluesky can outsource a lot of the compute power that Twitter needs to serve tweets by having an API that defaults to sending you every tweet from your friends and friends-of-friends (minus images), and letting a local AI algorithm select what gets displayed.
12.04.2023 14:26
I think the right way to approach this is to make the API friendly to LLMs and other AI models that can filter your feed on your behalf. Let GPT/LLaMA know what type of stuff you like and don't like, then have it worry about only presenting good posts.
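A minimal sketch of that "tell the model what you like, let it pick" idea; the preference strings and the `ask_llm` callable are hypothetical placeholders, and there is no assumption here about which bulk endpoint the posts come from:

```python
# Preference-driven feed selection: send the model your likes/dislikes plus a
# numbered batch of posts, and keep only the ones it picks.
LIKES = "deep technical threads, decentralized storage, diffusion models"
DISLIKES = "engagement bait, outrage politics"

FILTER_PROMPT = (
    "I like: {likes}\n"
    "I dislike: {dislikes}\n"
    "Here are numbered posts. Reply with the numbers worth showing me, "
    "comma-separated.\n\n{posts}"
)

def pick_posts(posts: list[str], ask_llm) -> list[str]:
    """ask_llm is any callable that takes a prompt string and returns text."""
    numbered = "\n".join(f"{i}. {p}" for i, p in enumerate(posts))
    reply = ask_llm(FILTER_PROMPT.format(likes=LIKES, dislikes=DISLIKES,
                                         posts=numbered))
    keep = {int(tok) for tok in reply.replace(",", " ").split() if tok.isdigit()}
    return [p for i, p in enumerate(posts) if i in keep]
```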
12.04.2023 14:24
Everything you post is going to be used to train the next generation of LLMs
12.04.2023 14:23
What's the long-term plan for funding bluesky?
12.04.2023 08:36
It's a great day to take flight.
(Stable Diffusion, MixProV4, 4 minutes)
took me about 5 minutes, not bad!
12.04.2023 05:41
How do I switch my domain?
12.04.2023 05:20