How to use existing Ollama models with Msty

To use your existing Ollama models with Msty, you can either:

  1. Onboard with Ollama models during initial setup, or
  2. Set Msty's model download location to the one used by Ollama (if you have already onboarded in Msty)

#1 If setting up Msty for the first time on your machine

When you install Msty and onboard with Local Text AI, it will ask whether you would like to use your existing models from Ollama.

Local Text AI onboarding with existing Ollama models

If you would like to use the models you downloaded with Ollama, click 'Yes'.

#2 If you've onboarded already and would like to switch to Ollama models

If you've onboarded already and would like to use your existing models from Ollama, you can edit Msty's model download location and point it at Ollama's models directory.

To edit the models path, go to Local AI > Text Module > Page Actions > Edit Models Path.

Navigate to Edit Models Path
Edit Models Path dialog

If you haven't changed Ollama's default models directory using the OLLAMA_MODELS environment variable, the models should be located under:

OS        Location
Mac       /Users/<username>/.ollama/models
Windows   C:\Users\<username>\.ollama\models
Linux     /usr/share/ollama/.ollama/models

If you have changed Ollama's models directory using the OLLAMA_MODELS environment variable, use the path you set there instead of the default locations above.
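To see which path to enter in the Edit Models Path dialog, you can resolve it in a terminal. The following is a minimal sketch for macOS/Linux shells; it assumes the per-user default of ~/.ollama/models (Linux service installs typically use /usr/share/ollama/.ollama/models instead, as shown in the table above):

```shell
# Use OLLAMA_MODELS if it is set; otherwise fall back to the
# per-user default location (~/.ollama/models).
models_dir="${OLLAMA_MODELS:-$HOME/.ollama/models}"

echo "Point Msty's models path at: $models_dir"

# Warn if the directory doesn't exist yet (e.g. Ollama never ran).
[ -d "$models_dir" ] || echo "Note: $models_dir does not exist on this machine."
```

If the script prints a directory that doesn't exist, double-check where your Ollama installation actually stores its models before applying the path in Msty.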

That's it! Once you apply the path, the Text Module will restart and you'll be able to use your existing Ollama models with Msty.

Interact with any AI model with just a click of a button