## Running MUSA CI in a Docker Container

Assuming `$PWD` is the root of the `llama.cpp` repository, follow these steps to set up and run MUSA CI in a Docker container:

### 1. Create a local directory to store cached models, configuration files, and the Python venv:

```bash
mkdir -p $HOME/llama.cpp/ci-cache
```

### 2. Create a local directory to store CI run results:

```bash
mkdir -p $HOME/llama.cpp/ci-results
```
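
If you prefer a single command, both directories from steps 1 and 2 can be created at once using Bash brace expansion:

```shell
# Equivalent to steps 1 and 2: ci-{cache,results} expands to both names.
mkdir -p "$HOME"/llama.cpp/ci-{cache,results}
```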

### 3. Start a Docker container and run the CI:

```bash
docker run --privileged -it \
    -v $HOME/llama.cpp/ci-cache:/ci-cache \
    -v $HOME/llama.cpp/ci-results:/ci-results \
    -v $PWD:/ws -w /ws \
    mthreads/musa:rc4.3.0-devel-ubuntu22.04-amd64
```

Inside the container, execute the following commands:

```bash
# Install the build dependencies and helper tools the CI script needs
apt update -y && apt install -y bc cmake ccache git python3.10-venv time unzip wget
# Mark the bind-mounted repository as safe for Git inside the container
git config --global --add safe.directory /ws
# Run the CI with the MUSA backend enabled, using the mounted results and cache dirs
GG_BUILD_MUSA=1 bash ./ci/run.sh /ci-results /ci-cache
```
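
For repeated runs, the interactive steps above can be collapsed into one non-interactive invocation. The sketch below is an assumption, not part of the official CI setup: it only assembles and prints the combined `docker run` command (a dry run), so you can review it before executing it yourself.

```shell
# Hypothetical helper: build the full non-interactive CI command as a string
# and print it for review. Nothing is executed here.
IMAGE="mthreads/musa:rc4.3.0-devel-ubuntu22.04-amd64"
SETUP="apt update -y && apt install -y bc cmake ccache git python3.10-venv time unzip wget"
RUN_CI="GG_BUILD_MUSA=1 bash ./ci/run.sh /ci-results /ci-cache"

CI_CMD="docker run --privileged --rm \
  -v \$HOME/llama.cpp/ci-cache:/ci-cache \
  -v \$HOME/llama.cpp/ci-results:/ci-results \
  -v \$PWD:/ws -w /ws \
  $IMAGE bash -c '$SETUP && git config --global --add safe.directory /ws && $RUN_CI'"

echo "$CI_CMD"
```

Note the `--rm` flag, added here so the throwaway container is cleaned up after a one-shot run; drop it if you want to keep the container around for debugging.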

This setup ensures that the CI runs in an isolated Docker environment while preserving cached files and results across runs.