Msty Blog

Msty 1.2

Vapor Mode, Response Metrics, Custom Real Time Query, LaTeX support, Context Shield, and plenty more!

Msty 1.1

Msty 1.1 introduces unique new features, improvements, and bug fixes.

Why 1.0 and What's Next?

Msty first shipped about six months ago, and we have been working hard to improve it with each release. We have added so many features and refinements since then that we feel Msty is now ready for a 1.0 release.

Msty 1.0

Msty 1.0 is here! Packed with new features and improvements, this is our best release yet.

Vision Models - Claude vs. GPT 4 Turbo vs. Llava

A comparison of three popular vision models: Claude, GPT-4 Turbo, and Llava. We compare their performance on image interpretation, profile picture assessment, and identifying complex objects, and we also compare the cost of using each model.

Introducing New Prompts Library

The prompts library in Msty is now more useful and user-friendly: better organized and easier to navigate. Prompts are grouped into categories by type and purpose, namely system prompts, user prompts, and refinement prompts.

How to use existing Ollama models with Msty

To use your existing Ollama models with Msty, either onboard with Ollama models during initial setup, or, if you have already onboarded in Msty, point Msty's model download location to the directory Ollama uses.
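As a minimal sketch of the second option: Ollama stores models under `~/.ollama/models` by default (or wherever the `OLLAMA_MODELS` environment variable points), so that is the directory to set as Msty's model download location. The exact settings path inside Msty is an assumption here and may differ by version.

```shell
# Resolve the directory where Ollama keeps its models:
# honor OLLAMA_MODELS if set, otherwise fall back to the documented default.
OLLAMA_MODELS_DIR="${OLLAMA_MODELS:-$HOME/.ollama/models}"

# In Msty, open the model settings (exact menu location is an assumption)
# and set the model download location to this directory.
echo "Point Msty's model download location to: $OLLAMA_MODELS_DIR"
```

Both apps then read from the same store, so models downloaded through Ollama show up in Msty without being duplicated on disk.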

How to update Msty's Text Module Service

Msty installs and runs a service as part of its Text Module to serve Local AI models. We periodically release updates to the module, sometimes more frequently than we release app updates. Because applying an update stops and restarts the service, these updates must be performed manually by the user.