Best part: You can start free, upgrade to Pro ($40/mo) when you need advanced features.
No DevOps. No infrastructure headaches.
Try it: squad.ai
#AIAgents #NoCode #Squad #Automation
Squad just made building AI agents ridiculously easy 🔥
Here's what you get:
✅ Drag-and-drop canvas (no code required)
✅ Pre-built agents for common tasks
✅ Custom tool creation with code editor (Pro), sketch below
✅ Runs on decentralized Chutes infrastructure
✅ Deploy in minutes, not weeks
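Purely for illustration, here is a rough sketch of what a custom tool boils down to under the hood: a plain Python function plus a small schema the agent can read. This is a hypothetical example, not Squad's actual tool API; the function name, fields, and data are made up.

# Hypothetical sketch only: this is NOT Squad's actual tool API.
# A custom tool generally reduces to a function the agent can call,
# plus a schema describing its name, purpose, and parameters.
def get_order_status(order_id: str) -> dict:
    """Look up an order in a stubbed internal system."""
    fake_db = {"A-1001": "shipped", "A-1002": "processing"}
    return {"order_id": order_id, "status": fake_db.get(order_id, "not found")}

ORDER_STATUS_TOOL = {
    "name": "get_order_status",
    "description": "Return the shipping status for a given order ID.",
    "parameters": {
        "type": "object",
        "properties": {"order_id": {"type": "string"}},
        "required": ["order_id"],
    },
}

print(get_order_status("A-1001"))  # {'order_id': 'A-1001', 'status': 'shipped'}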
The question: how fast will enterprises realize they have a better option?
Open source won infrastructure. It's winning AI too.
It's about:
• Cost efficiency (85% cheaper than AWS)
• No single point of failure
• Vendor optionality
• Access to a large and constantly evolving selection of models vs being locked into one
Decentralized inference is already faster + cheaper.
Hot take: In 5 years, running all your AI on one centralized provider will feel as risky as hosting everything on a single server.
The shift to decentralized AI isn't about ideology.
Chutes infrastructure update
Currently running:
• Thousands of H200 & A6000 GPUs
• Processing billions of tokens daily
• Leading open source inference provider on OpenRouter (example request below)
• 100% decentralized on Bittensor Subnet 64
All with ~85% cost savings vs AWS.
Open source AI is scaling on Chutes 💪
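For the curious, here is a minimal Python sketch of routing a request to an open source model and asking OpenRouter to prefer Chutes as the provider. It assumes OpenRouter's OpenAI-compatible chat completions endpoint and its provider-preference field; the provider slug and model name are illustrative, not confirmed values.

import os
import requests

# Minimal sketch: send a chat request through OpenRouter and ask it to
# prefer the Chutes provider. The "Chutes" slug, the model name, and the
# provider-preference fields are assumptions, not confirmed values.
resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "deepseek/deepseek-chat",  # illustrative open source model
        "messages": [{"role": "user", "content": "Hello from decentralized GPUs"}],
        "provider": {"order": ["Chutes"], "allow_fallbacks": False},
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])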
If you've been sleeping on open source AI infrastructure, maybe it's time to wake up.
Turns out the future of AI might not be owned by 3 companies.
It might be decentralized.
And it might already be happening.
chutes.ai
This is one of those moments where you realize you've been building something bigger than you thought.
We're not just "a decentralized AI platform."
We might be building the Linux of AI inference.
And we're just getting started.
Here's what really gets us:
The AI inference market is projected to hit $255 BILLION by 2030.
We're leading in open source inference.
And according to some quadrant analysis floating around, we're massively undervalued compared to competitors.
The companies we're apparently ahead of have:
- 100x our funding
- 10x our team size
- Every tech blog writing about them daily
We have:
- 60+ open source models
- Decentralized infrastructure
- Apparently no chill
Here's the kicker:
We're #1 even though OpenRouter RED-FLAGS us by default because we don't have TEE implemented yet (coming very soon, btw).
Imagine what happens when we remove that red flag.
Chutes is apparently the #1 Open Source inference provider on OpenRouter*
Not top 5. Not top 3.
NUMBER ONE.
40+ BILLION tokens per day.
And we literally just found out. Yesterday.
*P.S. Based on publicly available information on OpenRouter only.
so... we just found out something absolutely insane 🧵
we weren't even looking for this data.
one of our engineers was going through OpenRouter's publicly available charts at 3am (as you do) and discovered something that made us all stop and stare at our screens.
Hunyuan-3 Image Generation Live Now on Chutes
chutes.ai/app/chute/0c...
A stunning new-gen image model, available as part of your Pro, Plus or Base subscription, or PAYG through our flex tier.
Send us your image magic below ⬇️ (Attached image taken directly from the API)
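If you would rather call the model from code, the general shape is a POST with your API key and a prompt. The endpoint path, request fields, and response handling below are placeholders, not Chutes' documented image API; check the chute page linked above for the real invocation details.

import os
import requests

# Hypothetical sketch: the URL, request fields, and response handling are
# placeholders, not Chutes' documented image API. See the chute page above
# for the actual invocation details.
resp = requests.post(
    "https://example-hunyuan-image.chutes.ai/generate",  # placeholder endpoint
    headers={"Authorization": f"Bearer {os.environ['CHUTES_API_KEY']}"},  # placeholder env var
    json={"prompt": "a paper airplane gliding over a neon city at dusk"},
    timeout=120,
)
resp.raise_for_status()
with open("hunyuan_image.png", "wb") as f:
    f.write(resp.content)  # assumes the endpoint returns raw image bytes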
We've just enabled more payment options for our customers
- Apple & Google pay in more countries
- Buy Now, Pay Later providers such as Klarna
- More European, Asian & Latin American providers such as Pix, Naver Pay & more
- Stablecoin payments
+ lots more coming soon.
We've also launched a few new places to stay connected with updates, support, and community discussions:
• Reddit: r/ChutesAI - reddit.com/r/chutesAI/
• X: x.com/chutes_ai
The team has been working hard behind the scenes to scale up and push Chutes forward, and this marks a big step in that direction.
Say hello to @0xVeight, @0xsirouk, @0xAlgowary + more in our Discord, here and in some new channels listed below
Chutes Team Expansion Update
We're excited to share that @0xTuDudes has officially joined chutes.ai to bring extra firepower across development, sales, marketing, and support.
20 TRILLION TOKENS PROCESSED ON CHUTES 💪
After 10 months we've reached a massive milestone: thousands of production applications and millions of users, all powered by decentralized inference and compute on Bittensor SN64.