Msty Blog
How, What, Why, and Which of Real-Time Data in Msty
The real-time data feature lets you bring data from external web sources into your conversations with Msty. In this post, we'll explain how this feature works under the hood and how you can use it in your conversations.
Why 1.0 and What's Next?
Msty first launched about six months ago, and we have been working hard to make it better with each release. We have added so many features and improvements that we feel Msty is now ready for a 1.0 release.
How to use Gemma 2 in Msty
Google just released its new Gemma 2 model in 9B and 27B variants. Let's use them in Msty.
Claude 3.5 Sonnet vs GPT-4o
Anthropic's new model in the 3.5 family is here. Can it beat OpenAI's GPT-4o?
Multiverse AI Conversations with Split Chats
Split chats in Msty allow you to hold multiple conversations with different AI models in the same chat window.
Audio Transcription in Msty
Msty's audio transcription feature is packed with useful capabilities for all your transcription needs.
Llama 3 vs. GPT-4 vs. Gemini Pro
Let's look at how Llama 3, GPT-4, and Gemini Pro compare at some common tasks.
Vision Models - Claude vs. GPT-4 Turbo vs. LLaVA
A comparison of three popular vision models: Claude, GPT-4 Turbo, and LLaVA. We will compare their performance on image interpretation, profile picture assessment, and identifying complex objects. We will also compare the cost of using these models.
Introducing New Prompts Library
The prompts library in Msty is now more useful, better organized, and easier to navigate. Prompts are grouped into categories based on their type and purpose: system prompts, user prompts, and refinement prompts.
CodeGemma Vs CodeLlama
We compare Google's CodeGemma model with Meta's CodeLlama model to see how they perform on various coding tasks.
How to use existing Ollama models with Msty
To use your existing Ollama models with Msty, either onboard with Ollama models during initial setup, or (if you have already onboarded in Msty) point Msty's model download location to the directory Ollama uses.
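As a rough sketch of the second option: Ollama stores its models under `~/.ollama/models` by default, and this location can be overridden with the `OLLAMA_MODELS` environment variable. The snippet below only resolves that directory on a typical setup so you can paste it into Msty's model download location setting; it is an illustration, not Msty's own logic.

```shell
# Resolve the directory where Ollama stores its models.
# Defaults to ~/.ollama/models unless OLLAMA_MODELS overrides it.
OLLAMA_DIR="${OLLAMA_MODELS:-$HOME/.ollama/models}"

# Print the path so it can be copied into Msty's settings.
echo "$OLLAMA_DIR"
```

On Windows the default location differs, so check Ollama's documentation for your platform before pointing Msty at a directory.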
How to update Msty's Text Module Service
Msty installs and runs a service as part of its Text Module to serve local AI models. We periodically release updates to the module, sometimes more frequently than we release app updates. Because applying an update stops and restarts the service, these updates must be performed manually.
LLaVA Integration in Msty
Msty integrates LLaVA (Large Language and Vision Assistant) in a unique way that allows you to use it on demand.