The simplest way to use local and online LLMs

Interact with popular AI models with just a click of a button

Meta · OpenAI · Mistral · Gemma · Qwen · DuckDB · Gemini

Use offline models like Llama 3, Mixtral, Qwen, LLaVA, and Gemma, or online models like GPT-4o, Gemini, Mistral, Perplexity, Groq, Claude, and more in a unified, easy-to-use interface.

Focus on your productivity and not on the setup.

"I have tried MANY LLM UI and I wonder why no one managed to build such beautiful, simple, and efficient before you ... 🙂" - Olivier H. (@notan_ai)

Fast Onboarding

We have sweated over the details and deliberately removed AI jargon to make it as easy as possible to get started with AI.

  • Install-Click-Chat
  • Get started in 3 clicks
  • No prior LLM setup experience needed, and no jargon you have to decode first

"This is fantastic! I love how you've lowered the barrier to entry for us to get into working with AI." - @admiralnines

Split Chats

Get advice from multiple assistants, compare, and pick the best.

  • Sync inputs across multiple models
  • Compare model responses side-by-side
  • Offline vs Online, Free vs Paid, Mixtral vs GPT-4, Google's vs Microsoft's; pick your models, choose the best response
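
Under the hood, a side-by-side comparison comes down to sending the same prompt to several models and collecting each answer. Here is a minimal sketch of that idea against a local Ollama server; the model names and the localhost endpoint are assumptions for illustration, not Msty internals:

    # Send one prompt to several local models and print the answers side by side.
    # Assumes an Ollama server on localhost:11434 with these models pulled.
    import json
    import urllib.request

    PROMPT = "Explain retrieval-augmented generation in one sentence."
    MODELS = ["llama3", "mistral"]  # any models you have pulled locally

    for model in MODELS:
        body = json.dumps({"model": model, "prompt": PROMPT, "stream": False}).encode()
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            answer = json.loads(resp.read())["response"]
        print(f"--- {model} ---\n{answer}\n")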

RAG done right.

Msty's Knowledge Stack is more than just a collection of documents. You can use multiple data sources to compose a stack.
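
At its core, retrieval-augmented generation is a simple loop: embed your documents, find the chunks most similar to a question, and hand those chunks to the model as context. The toy sketch below shows that loop; the embedding model, the generation model, and the endpoints are assumptions for illustration, and Msty's actual Knowledge Stack pipeline is more involved:

    # Toy RAG loop: embed chunks, retrieve the closest one, answer with context.
    # Assumes an Ollama server with nomic-embed-text and llama3 pulled.
    import json
    import math
    import urllib.request

    def ollama(path, payload):
        req = urllib.request.Request(
            f"http://localhost:11434{path}",
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())

    def embed(text):
        return ollama("/api/embeddings",
                      {"model": "nomic-embed-text", "prompt": text})["embedding"]

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm

    chunks = ["Msty runs models locally.", "The Scioto River flows through Ohio."]
    vectors = [embed(c) for c in chunks]  # index the "stack" once

    question = "Where does Msty run models?"
    qv = embed(question)
    best = max(range(len(chunks)), key=lambda i: cosine(qv, vectors[i]))  # retrieve

    answer = ollama("/api/generate", {
        "model": "llama3",
        "prompt": f"Context: {chunks[best]}\n\nQuestion: {question}",
        "stream": False,
    })["response"]
    print(answer)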

Refinements

You can regenerate a model's response and get better results. You can even refine your own prompt before sending it to the model. Keep the AI on track by editing in-place or removing hallucinating branches.

  • Create multiple conversation branches and remove hallucinating branches
  • Edit in-place to direct the conversation
  • Regenerate AI response or refine your own prompt before sending

Quick Prompts

We have a library of prompts to help you get started. You can also create your own. They are easy to summon and use when you need them.

  • 230+ curated prompts
  • Add your own prompts
  • Bookmark your favorite prompts for even quicker access

Sticky Prompt

Use the sticky prompt feature to quickly complete repetitive tasks.

  • No need to type the same prompt again and again and again
  • Maintain context throughout the conversation

Mix-and-Match Models

Download a variety of models depending on your requirements. Mix-and-match models in a single conversation to get the best results.

  • A wide selection of models to choose from
  • Chat with models offline
  • No API keys needed for local models
  • Use multiple models in a single conversation

And many more features...

"this is definitely the best Ollama GUI so far..." - Matt Williams, founding member of Ollama

Ollama Integration

Use your existing models with Msty
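
If you already run Ollama, the models you have pulled are visible through its local HTTP API, which is how a GUI sitting on top of it can reuse them. A quick way to see what is available locally (the endpoint is Ollama's standard port; this sketch illustrates the API, not how Msty queries it internally):

    # List the models already pulled into a local Ollama install.
    import json
    import urllib.request

    with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
        for m in json.loads(resp.read())["models"]:
            print(m["name"])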

Ultimate Privacy

No personal information ever leaves your machine

Offline Mode

Use Msty when you're off the grid

Organization

Organize your chats into folders

Model Settings

Tweak temperatures, context length, system prompt, etc.
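
These knobs correspond to standard inference parameters. As a hedged illustration, here is how temperature, context length, and a system prompt are expressed in a raw Ollama chat request; the model name is an assumption, and this shows the underlying API rather than Msty's own settings code:

    # The usual meaning of these settings, spelled out as an Ollama chat request.
    import json
    import urllib.request

    body = json.dumps({
        "model": "llama3",
        "messages": [
            {"role": "system", "content": "You are a terse assistant."},  # system prompt
            {"role": "user", "content": "Say hi."},
        ],
        "options": {
            "temperature": 0.2,  # lower = more deterministic output
            "num_ctx": 4096,     # context window length in tokens
        },
        "stream": False,
    }).encode()
    req = urllib.request.Request("http://localhost:11434/api/chat", data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["message"]["content"])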

Theme

Choose from a variety of colors and appearances

See what people are saying

I just discovered Msty and I am in love. Completely. It’s simple and beautiful and there’s a dark mode and you don’t need to be super technical to get it to work. It’s magical!

Alexe (@alexemartel)
This is fantastic! I love how you've lowered the barrier to entry for us to get into working with AI.

@admiralnines
The best Interface I've found yet to work with local models, so thank you for releasing it!

Francesco (@francesco_36500)
...arguably the most user-friendly local LLM experience available today. Msty fits the bill so well that we had to write a HOWTO post around how we got it installed on an M1 MacBook - and in that same post had to essentially disavow our previous suggestion to use ███████

Peter Thomas
Msty has answered most of my demands for a simple local app for comparing frontier & local models and refining prompts with a great interface. Accessible to anyone who can learn to get an API key

Dominik Lukes (@techczech)
This is the client I would recommend to most users who don't want (or can't) mess around with the command prompt or Docker. So, it's already a big step forward. 👍

@niikv
Not sure how I stumbled onto MSTY, but of all Ollama GUI's this is definitely the best so far. It's branching capabilities are more advanced than so many other tools. It's pretty awesome.

Matt Williams (@Technovangelist)
Founding member of the Ollama team
One of the simplest ways I've found to get started with running a local LLM on a laptop (Mac or Windows). I've been using this for the past several days, and am really impressed. Whether you're interested in starting in open source local models, concerned about your data and privacy, or looking for a simple way to experiment as a developer, Msty is a great place to start.

Mason James (@masonjames)
I have tried MANY LLM (some paid ones) UI and I wonder why no one managed to build such beautiful, simple, and efficient before you ... 🙂 keep the good work!

Olivier H. (@notan_ai)
Fantastic app - and in such a short timeframe ❤️ I have been using ███████ up until now, and (now) I prefer Msty.

@BeerPowered
I really like the app so far, so thank you. I also have ███████ installed for some other stuff, the terminal pops up with warnings, I start feeling stressed, but I like that your app is much more opaque. No terminal outputs on your screen. Very normie friendly

Kaya Orsan (@kayaorsan)
I found your app from the Ollama repo, btw, I'm a fan, it's super clean keep up the good work I'll plug it around my circles!

Brent (@bizzy oO)
I self-host, so having a local assistant has been on my list of to-dos for a while. I've used a number of others in this space and they all required many dependencies or technical know-how. I was looking for something that my spouse could also download and easily use. MSTY checks all the boxes for us. Easy setup (now available in Linux flavor!), local storage (security/privacy amirite?), model variety (who doesn't like model variety?), simple clean interface. The only drawback would be your computer and the requirements needed to run some of the larger more powerful models. 5/7 would download again

@/steve
Chiming in to say it looks really good and professional! It’s definitely “enterprise level” in a good way, I was initially searching for a “pricing” tab lol.

@user_7832
My brother isn’t a coder at all and he’s gotten very into ChatGPT over the last year or so, but a few weeks ago he actually cancelled his subscription in favour of Msty. You’re making a good product dude 🙏

@7ormen7
just came here to say i just discovered msty from the official ollama github and that ya'll are doing incredible work 🙏

@soyhenry

Yes, these are real comments from real users. Join our Discord to say hi to them.

Ready to get started?

We are available on Discord if you have any questions or need help.

Questions you may have

Is Msty offline or online?

Msty is offline-first, but you can also easily use popular online models. You can even mix and match them.

Made while sipping lots of ☕️ by the bank of the Scioto River in Columbus, Ohio. If the world runs out of coffee, blame our team at CloudStack, LLC.