The Qwen3.5 small model series is now in Jan.
Qwen3.5-9B supports image input, reasoning, and tool calls, while still fitting comfortably on everyday hardware in 6 GB of RAM.
Download Jan: jan.ai
Introducing Jan-Code-4B
A compact coding model tuned for practical day-to-day tasks. Generation, refactors, debugging, tests: all runnable locally in Jan.
Download Jan: www.jan.ai
Model: huggingface.co/collections...
Claude Code works great. But your setup can be smarter.
Jan makes it easier to connect Claude Code to any model, all through one setup.
Download Jan v0.7.7 at jan.ai
No Claude plan?
You can still use Claude Code.
Connect it to Jan and run local models on your own machine. That means no subscription and no token caps.
Download Jan v0.7.7 at jan.ai
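One common way to wire a CLI like Claude Code to an alternative endpoint is through environment variables. A sketch, assuming Jan's local server is running on its default port; the URL, port, and model name below are assumptions to check against your own Jan server settings, not values confirmed by this post:

```shell
# Point Claude Code at a local endpoint instead of the paid API.
# All three values below are assumptions; check your Jan server settings.
export ANTHROPIC_BASE_URL="http://localhost:1337"
export ANTHROPIC_MODEL="jan-v3-4b"
claude   # launch Claude Code as usual
```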
Qwen3.5-35B-A3B is now in Jan
It surpasses previous Qwen3 models more than 6× its size.
Get the latest Jan at jan.ai
Want to reduce your Claude Code bill?
Claude Code uses smaller models like Haiku for simple tasks.
Instead of sending every small request to a paid API, run those tasks locally with Jan and keep heavier work in the cloud.
One setup. Lower API usage. More control.
Get the latest version at jan.ai
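The local-versus-cloud split can be sketched as a toy routing rule. The local port, cloud URL, and length threshold below are illustrative assumptions for the sketch, not Jan's actual routing mechanism:

```python
def pick_endpoint(prompt: str, threshold: int = 400) -> str:
    """Route short prompts to a local model, longer ones to the cloud.

    The threshold and both URLs are placeholder assumptions.
    """
    if len(prompt) <= threshold:
        return "http://localhost:1337/v1"  # assumed local Jan server address
    return "https://api.anthropic.com"     # paid cloud API


print(pick_endpoint("Rename this variable"))                  # short task, stays local
print(pick_endpoint("Refactor this module: " + "x" * 1000))   # long task, goes to cloud
```

In practice the routing signal would be the task type rather than raw prompt length, but the shape is the same: one dispatch point deciding which endpoint receives each request.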
Update your Jan or download the latest version at www.jan.ai/
11.02.2026 09:15
Jan Desktop v0.7.7 is live
This update brings native MLX support, a broader UX and UI refresh across the app, and better support for developer workflows.
You can now upload files in Projects, use the local API server with both local and remote models, and work with CLIs like Claude Code.
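Local API servers like the one described above are typically used as OpenAI-compatible endpoints. A minimal sketch of building such a chat request; the base URL `http://localhost:1337/v1` and the model name are assumptions, not values confirmed by this post:

```python
import json
import urllib.request


def build_chat_request(base_url, model, prompt):
    """Construct an OpenAI-style chat completion POST request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


# Assumed defaults; check your Jan server settings for the real values.
req = build_chat_request("http://localhost:1337/v1", "jan-v3-4b", "Hello!")
print(req.full_url)  # http://localhost:1337/v1/chat/completions
```

Because the endpoint shape is OpenAI-compatible, the same request works whether the model behind it is local or remote, which is what lets one CLI setup cover both.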
Introducing Jan v3, with updates to Jan Desktop v0.7.6
Jan v3 is the first model in our v3 series, a 4B base built for fine-tuning and fast local use, with stronger math and coding. It's available in Jan Desktop and at chat.jan.ai/
Models:
- huggingface.co/Menlo/Jan-v...
- huggingface.co/Menlo/Jan-v...
Wait, you all downloaded Jan-v2-VL-med-gguf 225k times last month?
06.01.2026 08:24
You can share your chat.jan.ai conversations with your friends.
06.01.2026 03:31
Jan-v2-VL-Max-Instruct is out on chat.jan.ai
Our newest 30B vision-language model, extending the Jan-v2-VL family. It is our experiment in bringing interleaved reasoning to an Instruct model.
It handles long tasks well and stays on track when things get complicated.
chat.jan.ai/
23.12.2025 13:08
Introducing chat.jan.ai, a brand new way to use Jan
You can now search the web, do deep research, and let Jan use your browser.
Powered by Jan-v2-VL-Max, a 30B model that beats Gemini 2.5 Pro and DeepSeek R1 on execution benchmarks.
Try it: chat.jan.ai/
Jan Browser MCP is now available
Jan now has its own Chromium extension that makes browser use simpler and more stable. You can install it from the Chrome Web Store and connect it from within Jan. The video above shows the quick steps.
Search for Jan Browser MCP on the Chrome Web Store.
Jan now supports Flatpak: flathub.org/en/apps/ai....
08.12.2025 12:58
Jan v0.7.5 is live. We found a model-import issue on Windows and fixed it. Update your Jan; the bug should be gone now.
08.12.2025 10:16
You can now attach files in Jan.
In v0.7.4, you can add a file into the chat and ask anything about it.
Update your Jan or download the latest.
What was the first AI model you ever ran locally?
25.11.2025 03:59
Jan takes care of your to-dos.
24.11.2025 07:47
You can create, update, complete, or delete your Todoist tasks in Jan. So your AI can actually help you get things done.
24.11.2025 07:24
You can run Jan-v2-VL on MLX now. Huge thanks to the amazing MLX community for making this happen: huggingface.co/mlx-communi...
21.11.2025 06:50
Jan-v2-VL-high reaches 49 steps on the Long-Horizon Execution benchmark.
Qwen3-VL-8B-Thinking reaches 5, Qwen2.5-VL-7B-Instruct and Gemma-3-12B reach 2, and Llama-3.1-Nemotron-8B and GLM-4.1-V-9B-Thinking reach 1.
Models: huggingface.co/collections...
3 variants are available:
- Jan-v2-VL-low (efficiency-oriented)
- Jan-v2-VL-med (balanced)
- Jan-v2-VL-high (deeper reasoning and longer execution)
To use it, update your Jan App and download Jan-v2-VL from the Model Hub. Activate Browser MCP servers for agentic use cases.
Introducing Jan-v2-VL, a multimodal agent built for long-horizon tasks.
Jan-v2-VL executes 49 steps without failure, while the base model stops at 5 and other similar-scale VLMs stop between 1 and 2.
Models: huggingface.co/collections...
Credit to the Qwen team for Qwen3-VL-8B-Thinking!
12.11.2025 07:18
What is your open-source AI stack for coding?
10.11.2025 07:00
You can run Kimi-K2-Thinking, the strongest agentic model, in Jan through Hugging Face. Add it to your Hugging Face Inference models to use it.
07.11.2025 06:50
We're looking for someone who can take full ownership of Jan's growth
menlo.bamboohr.com/careers/109
What's your go-to open-source model now?
04.11.2025 09:00