The easiest way to use local and online AI models

Chat with any AI model in a single click. No prior model setup experience needed.

Meta · OpenAI · Claude · Gemini · Mistral · LLaVA · Qwen

"I just discovered Msty and I am in love. Completely. It’s simple and beautiful and there’s a dark mode and you don’t need to be super technical to get it to work. It’s magical!" - Alexe

Offline-First, Online-Ready

Msty is designed to function seamlessly offline, ensuring reliability and privacy. For added flexibility, it also supports popular online model vendors, giving you the best of both worlds.


Parallel Multiverse Chats

Revolutionize your research with split chats. Compare and contrast multiple AI models' responses in real-time, streamlining your workflow and uncovering new insights.
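Under the hood, a side-by-side comparison boils down to sending the same prompt to several models and lining up the answers. Here is a minimal sketch of that pattern, assuming a local Ollama server on its default port; the model tags are examples, and the request shape follows Ollama's documented /api/chat endpoint. This illustrates the general technique, not Msty's internals:

```python
# Minimal sketch: send the same prompt to several local models and compare
# the answers. Assumes an Ollama server on its default port; the model tags
# below are examples -- substitute whatever you have pulled.
import json
import urllib.request
from concurrent.futures import ThreadPoolExecutor

OLLAMA_URL = "http://localhost:11434/api/chat"
MODELS = ["llama3", "mistral"]  # example model tags
PROMPT = "Explain retrieval-augmented generation in two sentences."

def ask(model: str) -> tuple[str, str]:
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": PROMPT}],
        "stream": False,  # ask for one complete response instead of a stream
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return model, json.load(resp)["message"]["content"]

# Query the models concurrently so the comparison arrives all at once.
with ThreadPoolExecutor() as pool:
    for model, answer in pool.map(ask, MODELS):
        print(f"--- {model} ---\n{answer}\n")
```

Running the requests concurrently is what makes the comparison feel instant; split chats package that experience up in the UI.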


Craft Your Conversations

Msty puts you in the driver's seat. Take your conversations wherever you want, and stop whenever you're satisfied.


"...Its branching capabilities are more advanced than so many other tools. It's pretty awesome." - Matt Williams, Founding member of Ollama

Summon Real-Time Data

Stay ahead of the curve with built-in web search. Ask Msty to pull live data into your conversations for unparalleled relevance and accuracy.


RAG Done Right

Msty's Knowledge Stack goes beyond a simple document collection. Leverage multiple data sources to build a comprehensive information stack.
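For readers curious what a stack like this builds on: every RAG pipeline retrieves the documents most relevant to a question and prepends them to the prompt. Below is a toy sketch of that retrieval step, using bag-of-words cosine similarity in place of the learned embeddings and vector stores a production stack would use; nothing here reflects Msty's actual implementation.

```python
# Toy sketch of the retrieval step at the heart of any RAG pipeline:
# score documents against a question, keep the best matches, and prepend
# them to the prompt. Production stacks use learned embeddings and a
# vector store; bag-of-words cosine similarity keeps this dependency-free.
import math
from collections import Counter

DOCS = [  # stand-ins for documents in a knowledge stack
    "Msty runs language models locally for privacy.",
    "Split chats let you compare several models at once.",
    "The knowledge stack indexes documents for retrieval.",
]

def vectorize(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 2) -> list[str]:
    q = vectorize(question)
    return sorted(DOCS, key=lambda d: cosine(q, vectorize(d)), reverse=True)[:k]

question = "How does the knowledge stack find relevant documents?"
context = "\n".join(retrieve(question))
# The augmented prompt a RAG system would hand to the model:
print(f"Answer using this context:\n{context}\n\nQuestion: {question}")
```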


Unified Access to Models

Use any model from Hugging Face, Ollama, and OpenRouter. Choose the best model for your needs and seamlessly integrate it into your conversations.
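Unified access is possible because these providers converge on compatible APIs: OpenRouter and a local Ollama server both speak the OpenAI-style chat-completions format, so a single code path can target either. A minimal sketch, with a placeholder API key and example model names:

```python
# Minimal sketch of "unified access": OpenRouter and a local Ollama server
# both expose OpenAI-compatible chat-completions endpoints, so one function
# can target either backend by swapping the base URL. The API key and model
# names are placeholders -- substitute your own.
import json
import urllib.request

def chat(base_url: str, model: str, prompt: str, api_key: str = "") -> str:
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    headers = {"Content-Type": "application/json"}
    if api_key:  # local Ollama needs no key; hosted providers do
        headers["Authorization"] = f"Bearer {api_key}"
    req = urllib.request.Request(f"{base_url}/chat/completions",
                                 data=payload, headers=headers)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Same call shape, two very different backends:
print(chat("http://localhost:11434/v1", "llama3", "Hello!"))  # local Ollama
# print(chat("https://openrouter.ai/api/v1", "mistralai/mistral-7b-instruct",
#            "Hello!", api_key="sk-or-..."))                  # hosted OpenRouter
```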


Prompt Paradise

Access a ready-made library of prompts to guide the AI model, refine responses, and fulfill your needs. You can also add your own prompts to the library.


And many more features...

Ollama Integration

Use your existing models with Msty

Ultimate Privacy

No personal information ever leaves your machine

Offline Mode

Use Msty when you're off the grid

Organization

Organize your chats into folders

Model Settings

Tweak temperature, context length, system prompt, and more (see the sketch after this list)

Attachments

Attach images and documents to your chat
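For the technically curious, the "Model Settings" knobs above map directly onto parameters a local model server exposes. A minimal sketch using Ollama's documented /api/chat options; the model tag is an example, and this shows the underlying server API rather than Msty's own settings UI:

```python
# Minimal sketch: temperature, context length, and system prompt expressed
# through Ollama's /api/chat options. The model tag is an example.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3",  # example model tag
    "messages": [
        {"role": "system", "content": "You are a terse assistant."},  # system prompt
        {"role": "user", "content": "Summarize RAG in one line."},
    ],
    "options": {
        "temperature": 0.2,  # lower = more deterministic output
        "num_ctx": 4096,     # context window length in tokens
    },
    "stream": False,
}).encode()
req = urllib.request.Request("http://localhost:11434/api/chat",
                             data=payload,
                             headers={"Content-Type": "application/json"})
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["message"]["content"])
```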

See what people are saying

Yes, these are real comments from real users. Join our Discord to say hi to them.

This is fantastic! I love how you've lowered the barrier to entry for us to get into working with AI.

@admiralnines
The best Interface I've found yet to work with local models, so thank you for releasing it!

Francesco (@francesco_36500)
...arguably the most user-friendly local LLM experience available today. Msty fits the bill so well that we had to write a HOWTO post around how we got it installed on an M1 MacBook - and in that same post had to essentially disavow our previous suggestion to use ███████

Peter Thomas
I just discovered Msty and I am in love. Completely. It’s simple and beautiful and there’s a dark mode and you don’t need to be super technical to get it to work. It’s magical!

Alexe (@alexemartel)
Msty has answered most of my demands for a simple local app for comparing frontier & local models and refining prompts with a great interface. Accessible to anyone who can learn to get an API key

Dominik Lukes (@techczech)
This is the client I would recommend to most users who don't want (or can't) mess around with the command prompt or Docker. So, it's already a big step forward. 👍

@niikv
... of all Ollama GUI's this is definitely the best so far. Its branching capabilities are more advanced than so many other tools. It's pretty awesome.

Matt Williams (@Technovangelist)
Founding member of the Ollama team
One of the simplest ways I've found to get started with running a local LLM on a laptop (Mac or Windows). I've been using this for the past several days, and am really impressed. Whether you're interested in starting in open source local models, concerned about your data and privacy, or looking for a simple way to experiment as a developer, Msty is a great place to start.

Mason James (@masonjames)
I have tried MANY LLM (some paid ones) UI and I wonder why no one managed to build such beautiful, simple, and efficient before you ... 🙂 keep the good work!

Olivier H. (@notan_ai)
Fantastic app - and in such a short timeframe ❤️ I have been using ███████ up until now, and (now) I prefer Msty.

@BeerPowered
I really like the app so far, so thank you. I also have ███████ installed for some other stuff, the terminal pops up with warnings, I start feeling stressed, but I like that your app is much more opaque. No terminal outputs on your screen. Very normie friendly

Kaya Orsan (@kayaorsan)
I found your app from the Ollama repo, btw, I'm a fan, it's super clean keep up the good work I'll plug it around my circles!

Brent (@bizzy oO)
I self-host, so having a local assistant has been on my list of to-dos for a while. I've used a number of others in this space and they all required many dependencies or technical know-how. I was looking for something that my spouse could also download and easily use. MSTY checks all the boxes for us. Easy setup (now available in Linux flavor!), local storage (security/privacy amirite?), model variety (who doesn't like model variety?), simple clean interface. The only drawback would be your computer and the requirements needed to run some of the larger more powerful models. 5/7 would download again

@/steve
Chiming in to say it looks really good and professional! It’s definitely “enterprise level” in a good way, I was initially searching for a “pricing” tab lol.

@user_7832
My brother isn’t a coder at all and he’s gotten very into ChatGPT over the last year or so, but a few weeks ago he actually cancelled his subscription in favour of Msty. You’re making a good product dude 🙏

@7ormen7
just came here to say i just discovered msty from the official ollama github and that ya'll are doing incredible work 🙏

@soyhenry


Ready to get started?

We are available on Discord if you have any questions or need help.