Ollama
Run LLMs locally with Ollama’s simple app for macOS, Windows, and Linux. Chat with models, process files, and enjoy privacy with no cloud dependencies.

Ollama is a versatile AI interface that enables users to run large language models (LLMs) locally on their macOS, Windows, or Linux machines. Developed by the Ollama team, it provides a native GUI for easier model management and interaction. The app emphasizes privacy, speed, and usability, allowing users to download, manage, and interact with various models without relying on cloud services.
Key Features
- Run LLMs locally with a native GUI
- Drag and drop files for text processing
- Multimodal support including image input
- Chat with AI models using simple interface
- Pull and switch between multiple LLMs
- Customizable context length for documents
- Enhanced privacy with local AI use
- Supports command-line for advanced users
Download Ollama
Download the latest version of Ollama for free using the direct links below. You’ll find app store listings and, where available, installer files for macOS and Windows. All files come from official sources, are original, unmodified, and safe to use.
Last updated on: 16 September 2025. Version: 0.11.11.
- Download Ollama 0.11.11 exe (1.1 GB) [Windows 10+]
- Download Ollama 0.11.11 dmg (45.59 MB) [macOS 12+]
What's new in this version
- Added support for CUDA 13
- Improved memory usage when using gpt-oss in Ollama's app
- Improved scrolling in Ollama's app when submitting long prompts
- Cmd +/- now zooms and shrinks text in Ollama's app
- Assistant messages can now be copied in Ollama's app
- Improved memory estimates for hybrid and recurrent models
- Added dimensions field to embed requests
- Enabled new memory estimates in Ollama's new engine by default
- Ollama will no longer load split vision models in the Ollama engine
- Bug fixes
Installation
Download files are available in different formats depending on your operating system. Make sure to follow the appropriate installation guide: EXE for Windows, DMG for macOS.
Linux users can install Ollama with a single command in the terminal: curl -fsSL https://ollama.com/install.sh | sh