# Install pre-built version of llama.cpp

| Install via | Windows | Mac | Linux |
|-------------|---------|-----|-------|
| Winget      | ✅      |     |       |
| Homebrew    |         | ✅  | ✅    |
| MacPorts    |         | ✅  |       |
| Nix         |         | ✅  | ✅    |
 9
## Winget (Windows)

```sh
winget install llama.cpp
```

The package is automatically updated with new `llama.cpp` releases. More info: https://github.com/ggml-org/llama.cpp/issues/8188
17
## Homebrew (Mac and Linux)

```sh
brew install llama.cpp
```

The formula is automatically updated with new `llama.cpp` releases. More info: https://github.com/ggml-org/llama.cpp/discussions/7668
25
## MacPorts (Mac)

```sh
sudo port install llama.cpp
```

See also: https://ports.macports.org/port/llama.cpp/details/
33
## Nix (Mac and Linux)

On flake-enabled installs:

```sh
nix profile install nixpkgs#llama-cpp
```

Or, on non-flake installs:

```sh
nix-env --file '<nixpkgs>' --install --attr llama-cpp
```

This expression is automatically updated within the [nixpkgs repo](https://github.com/NixOS/nixpkgs/blob/nixos-24.05/pkgs/by-name/ll/llama-cpp/package.nix#L164).
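
## Verify the installation

Whichever method you used, the package should put the `llama.cpp` command-line tools on your `PATH`. A quick sanity check might look like the following (a sketch: the binary names `llama-cli` and `llama-server` are those shipped by recent releases; older packages may use different names such as `main`):

```sh
# Print version and build info for the CLI tool
# (assumes a recent release that ships `llama-cli`)
llama-cli --version

# Confirm the HTTP server binary is present
llama-server --help
```

If these commands are not found, check that your package manager's binary directory (e.g. Homebrew's prefix) is on your `PATH`.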