The simplest way to use local and online LLMs

Interact with popular AI models with just the click of a button


Use offline models like Llama 2, Mixtral, Qwen, LLaVA, Gemma, or online models like GPT-4, Gemini, Mistral, Groq, Claude, and many more in a unified, easy-to-use interface.

Focus on your productivity and not on the setup.

"I have tried MANY LLM (some paid ones) UI and I wonder why no one managed to build such beautiful, simple, and efficient before you ... πŸ™‚" - Olivier H. (@notan_ai)
Fast Onboarding

We have sweated over the details and deliberately removed AI jargon to make it as easy as possible to get started with AI.

  • Install-Click-Chat
  • Get started in 3 clicks
  • No prior LLM setup experience or knowledge of the jargon needed

"This is fantastic! I love how you've lowered the barrier to entry for us to get into working with AI." - @admiralnines

Split Chats

Get advice from multiple assistants, compare, and pick the best.

  • Sync inputs across multiple models
  • Compare model responses side-by-side
  • Offline vs Online, Free vs Paid, Mixtral vs GPT-4, Google's vs Microsoft's; pick your models, choose the best response

You can regenerate a model's response to get better results. You can even refine your own prompt before sending it to the model. Keep the AI on track by editing in place or removing hallucinating branches.

  • Create multiple conversation branches and remove hallucinating branches
  • Edit in-place to direct the conversation
  • Regenerate AI response or refine your own prompt before sending
Quick Prompts

We have a library of prompts to help you get started. You can also create your own. They are easy to summon and use when you need them.

  • 150+ curated prompts
  • Add your own prompts
  • Bookmark your favorite prompts for even quicker access
Sticky Prompt

Use the sticky prompt feature to quickly complete repetitive tasks.

  • No need to type the same prompt again and again and again
  • Maintain context throughout the conversation
Mix-and-Match Models

Download a variety of models depending on your requirements. Mix-and-match models in a single conversation to get the best results.

  • A wide selection of models to choose from
  • Chat with models offline
  • No API keys needed for local models
  • Use multiple models in a single conversation
And many more features...
Ollama Integration

Use your existing models with Msty

Ultimate Privacy

No personal information ever leaves your machine

Offline Mode

Use Msty when you're off the grid


Folders

Organize your chats into folders

Model Settings

Tweak temperatures, context length, system prompt, etc.


Themes

Choose from a variety of colors and appearances

See what people are saying

I just downloaded Msty and I love the first impression of it. Great work πŸŽ‰. Great work on the site as well - very sleek just like the app and I found the answers to the questions I did have. Can't wait to explore Msty more.

This is fantastic! I love how you've lowered the barrier to entry for us to get into working with AI.

...arguably the most user-friendly local LLM experience available today. Msty fits the bill so well that we had to write a HOWTO post around how we got it installed on an M1 MacBook - and in that same post had to essentially disavow our previous suggestion to use β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ

Peter Thomas
That's what I was looking for. Looks great!

This is the client I would recommend to most users who don't want (or can't) mess around with the command prompt or Docker. So, it's already a big step forward. πŸ‘

I really like the app so far, so thank you. I also have β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ installed for some other stuff, the terminal pops up with warnings, I start feeling stressed, but I like that your app is much more opaque. No terminal outputs on your screen. Very normie friendly

Kaya Orsan (@kayaorsan)
One of the simplest ways I've found to get started with running a local LLM on a laptop (Mac or Windows). I've been using this for the past several days, and am really impressed. Whether you're interested in starting in open source local models, concerned about your data and privacy, or looking for a simple way to experiment as a developer, Msty is a great place to start.

Mason James (@masonjames)
The best Interface I've found yet to work with local models, so thank you for releasing it!

Francesco (@francesco_36500)
I have tried MANY LLM (some paid ones) UI and I wonder why no one managed to build such beautiful, simple, and efficient before you ... πŸ™‚ keep the good work!

Olivier H. (@notan_ai)
Fantastic app - and in such a short timeframe ❀️ I have been using β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ up until now, and (now) I prefer Msty.

I self-host, so having a local assistant has been on my list of to-dos for a while. I've used a number of others in this space and they all required many dependencies or technical know-how. I was looking for something that my spouse could also download and easily use. MSTY checks all the boxes for us. Easy setup (now available in Linux flavor!), local storage (security/privacy amirite?), model variety (who doesn't like model variety?), simple clean interface. The only drawback would be your computer and the requirements needed to run some of the larger more powerful models. 5/7 would download again

I found your app from the Ollama repo, btw, I'm a fan, it's super clean keep up the good work I'll plug it around my circles!

Brent (@bizzy oO)
Chiming in to say it looks really good and professional! It’s definitely β€œenterprise level” in a good way, I was initially searching for a β€œpricing” tab lol.

My brother isn’t a coder at all and he’s gotten very into ChatGPT over the last year or so, but a few weeks ago he actually cancelled his subscription in favour of Msty. You’re making a good product dude πŸ™


Yes, these are real comments from real users. Join our Discord to say hi to them.

Ready to get started?

We are available on Discord if you have any questions or need help.

Questions you may have

Is it possible to use this app offline?

Yes! You can use this app offline. Making Msty work offline is our top priority. However, you'll need to download the models you want to use beforehand.

What about privacy?

All your data remains on your computer including chat history, models, settings, and prompts.

Why am I getting a virus alert on Windows?

This is a false positive and, unfortunately, a big downside of being on Windows. Companies much bigger than us have faced this issue. We are working on ways to get it resolved. In the meantime, you can whitelist Msty in your antivirus software.

How can I download models locally?

Models can be downloaded from the Local AI tab in the sidebar.

Does Msty support GPUs?

Yes, on macOS. On Windows, only Nvidia GPUs are currently supported; AMD GPU support is coming soon.

Can I use online models such as GPT-4 Turbo, Mixtral, Gemini, etc.?

Yes! You can use your OpenAI or Mistral API keys to chat with their models.

What are the memory and storage requirements?

It mostly depends on the model you're using. For example, 3.8 GB of storage and 16 GB of memory are recommended for the Llama 2 model, whereas the TinyLlama model requires only about 700 MB of storage and 2 GB of memory.

How can I get the latest updates?

Msty will automatically download the latest updates when you open it. If there is a new update, you can just restart the app to get the latest version.

Why is Msty a desktop app and not a web app?

We believe that a desktop app provides a better user experience and performance for an app like Msty. It also allows us to offer offline support and stronger privacy, and it means the models you can use are limited only by how powerful your computer is.

Why should I trust Msty?

We are a small team of developers who are passionate about AI and privacy. We have worked on projects before that have been used by thousands of people, such as this. There are real faces behind the product. Come chat with us on our Discord server to get to know us better.

I have some feedback. How can I get in touch?

Great! Please join our Discord server and share your thoughts. We'd love to hear from you!

Msty is offline-first, but you can also easily use popular online models. You can even mix and match them.

Made while sipping lots of β˜•οΈ by the bank of the Scioto River in Columbus, Ohio. If the world runs out of coffee, blame our CloudStack, LLC team.