Tag: Ollama
-
Review Generative AI Orca Mini 3B Model
How to Run Orca Mini 3B on Your Machine Using the Alpaca Ollama Client
If you’ve ever wanted to explore local AI models without needing expensive cloud infrastructure, the Orca Mini 3B model is a great starting point.

-
Review Generative AI Phi 2.7B Model
How to Run Phi 2.7B Locally Using Alpaca (Ollama Client)
If you’re interested in experimenting with powerful AI models on your own machine, Phi 2.7B is a great starting point.

-
Review Generative AI Llava 7B Model
Curious about running open-source AI models right from your Linux desktop?

-
Review Generative AI Phi3 14B Model
🔍 Reviewing Phi-3 LLM on Linux Using Alpaca & Ollama
In this article, I’ll walk you through my experience using Phi-3, a lightweight large language model (LLM) developed by Microsoft, running locally on Linux via Alpaca, a user-friendly front-end for Ollama packaged conveniently as a Flatpak.

-
Review Generative AI Qwen2 7B Model
🚀 Exploring Qwen2: A Powerful Open LLM – Installation, Testing with Alpaca + Screencast!
