Add support for Mistral Nemo model
What does this MR do and why?
Add support for the Mistral NeMo model.
According to https://ollama.com/library/mistral-nemo:latest:
Mistral NeMo offers a large context window of up to 128k tokens. Its reasoning, world knowledge, and coding accuracy are state-of-the-art in its size category. As it relies on standard architecture, Mistral NeMo is easy to use and a drop-in replacement in any system using Mistral 7B.
A larger context window is crucial for features like Duo Chat.
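For reference, a minimal sketch of exercising the model through Ollama's HTTP chat API, assuming a local Ollama instance on the default port 11434 that has already pulled `mistral-nemo`; the prompt and the `num_ctx` value are illustrative only and not part of this MR's code:

```python
# Minimal sketch: query a locally running Ollama instance that has pulled
# mistral-nemo (`ollama pull mistral-nemo`). The endpoint, prompt, and
# num_ctx value are assumptions for illustration.
import requests

response = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "mistral-nemo",
        "messages": [
            {"role": "user", "content": "Summarize this merge request in one sentence."}
        ],
        # Raise the context window to make use of the model's larger 128k limit.
        "options": {"num_ctx": 32768},
        "stream": False,
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["message"]["content"])
```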