Ollama

Run LLMs locally with Ollama’s simple app for macOS, Windows, and Linux. Chat with models, process files, and enjoy privacy with no cloud dependencies.

Published: Aug 5, 2025, 14:40. Updated: Sep 16, 2025, 22:45.

Ollama is a versatile AI interface that enables users to run large language models (LLMs) locally on their macOS, Windows, or Linux machines. Developed by the Ollama team, it provides a native GUI for easier model management and interaction. The app emphasizes privacy, speed, and usability, allowing users to download, manage, and interact with various models without relying on cloud services.

Key Features

  • Run LLMs locally with a native GUI
  • Drag and drop files for text processing
  • Multimodal support including image input
  • Chat with AI models through a simple interface
  • Pull and switch between multiple LLMs
  • Customizable context length for documents
  • Enhanced privacy with local AI use
  • Supports command-line for advanced users
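Beyond the GUI and command line, Ollama also serves a local REST API (by default on port 11434) that the app and third-party tools talk to. As a minimal sketch, the helper below only builds the JSON payload for the documented /api/chat endpoint; actually sending it requires a running Ollama instance, and the model name "llama3.2" is just an example of a model you might have pulled.

```python
import json

# Ollama listens on http://localhost:11434 by default; a chat turn is a
# JSON POST to /api/chat. This helper only constructs the payload --
# sending it (e.g. with urllib.request) needs a running Ollama server.
def build_chat_request(model, prompt, stream=False):
    return {
        "model": model,  # example model name; use any model you've pulled
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,  # False = return one complete response object
    }

payload = build_chat_request("llama3.2", "Why is the sky blue?")
print(json.dumps(payload))
```

With `stream` left at False the server replies with a single JSON object whose `message.content` field holds the assistant's answer, which is usually the simplest mode for scripting.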

Download Ollama

Download the latest version of Ollama for free using the direct links below. You’ll find app store listings and, where available, installer files for macOS and Windows. All files come from official sources and are original, unmodified, and safe to use.

Last updated on: 16 September 2025. Version: 0.11.11.

What's new in this version
  • Added support for CUDA 13
  • Improved memory usage when using gpt-oss in Ollama's app
  • Improved scrolling in Ollama's app when submitting long prompts
  • Cmd +/- now zooms and shrinks text in Ollama's app
  • Assistant messages can now be copied in Ollama's app
  • Improved memory estimates for hybrid and recurrent models
  • Added dimensions field to embed requests
  • Enabled new memory estimates in Ollama's new engine by default
  • Ollama will no longer load split vision models in the Ollama engine
  • Bug fixes
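The new dimensions field mentioned in the changelog lets an embed request ask for a specific embedding size (for models that support it). As a hedged sketch of what such a request body looks like against the /api/embed endpoint, the function below only assembles the payload; the model name "embeddinggemma" is illustrative, not prescribed by this release.

```python
# Sketch of an /api/embed request body. "dimensions" is the field the
# 0.11.11 changelog adds; it is optional, so omit it for models that
# don't support configurable embedding sizes.
def build_embed_request(model, text, dimensions=None):
    req = {"model": model, "input": text}
    if dimensions is not None:
        req["dimensions"] = dimensions
    return req

req = build_embed_request("embeddinggemma", "hello world", dimensions=256)
```

As with chat, this payload would be POSTed to the local server (http://localhost:11434/api/embed) once Ollama is running and the model has been pulled.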

Installation

Download files are available in different formats depending on your operating system. Make sure to follow the appropriate installation guide: EXE for Windows, DMG for macOS.

Linux users can install Ollama with a single command in the terminal: curl -fsSL https://ollama.com/install.sh | sh

IhorLev: I write about artificial intelligence, technology, and software. I enjoy breaking down complex things and explaining them simply, with a touch of self-irony, since I’m always learning along with my readers.