name: Bug (misc.)
description: Something is not working the way it should (and it's not covered by any of the above cases).
title: "Misc. bug: "
labels: ["bug-unconfirmed"]
body:
  - type: markdown
    attributes:
      value: >
        Thanks for taking the time to fill out this bug report!
        This issue template is intended for miscellaneous bugs that don't fit into any other category.
        If you encountered the issue while using an external UI (e.g. ollama),
        please reproduce your issue using one of the examples/binaries in this repository.
  - type: textarea
    id: version
    attributes:
      label: Name and Version
      description: Which version of our software is affected? (You can use `--version` to get a version string.)
      placeholder: |
        $./llama-cli --version
        version: 2999 (42b4109e)
        built with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu
    validations:
      required: true
  - type: dropdown
    id: operating-system
    attributes:
      label: Operating systems
      description: Which operating systems do you know to be affected?
      multiple: true
      options:
        - Linux
        - Mac
        - Windows
        - BSD
        - Other? (Please let us know in description)
    validations:
      required: false
  - type: dropdown
    id: module
    attributes:
      label: Which llama.cpp modules do you know to be affected?
      multiple: true
      options:
        - Documentation/Github
        - libllama (core library)
        - llama-cli
        - llama-server
        - llama-bench
        - llama-quantize
        - Python/Bash scripts
        - Test code
        - Other (Please specify in the next section)
    validations:
      required: false
  - type: textarea
    id: command
    attributes:
      label: Command line
      description: >
        Please provide the exact commands you entered, if applicable. For example: `llama-server -m ... -c ...`, `llama-cli -m ...`, etc.
        This will be automatically formatted into code, so no need for backticks.
      render: shell
    validations:
      required: false
  - type: textarea
    id: info
    attributes:
      label: Problem description & steps to reproduce
      description: >
        Please give us a summary of the problem and tell us how to reproduce it (if applicable).
    validations:
      required: true
  - type: textarea
    id: first_bad_commit
    attributes:
      label: First Bad Commit
      description: >
        If the bug was not present on an earlier version and it's not trivial to track down: when did it start appearing?
        If possible, please do a git bisect and identify the exact commit that introduced the bug.
    validations:
      required: false
  - type: textarea
    id: logs
    attributes:
      label: Relevant log output
      description: >
        If applicable, please copy and paste any relevant log output, including any generated text.
        If you are encountering problems specifically with the `llama_params_fit` module, always upload `--verbose` logs as well.
        For very long logs (thousands of lines), please upload them as files instead.
        On Linux you can redirect console output into a file by appending ` > llama.log 2>&1` to your command.
      value: |
        <details>
        <summary>Logs</summary>
        <!-- Copy-pasted short logs go into the "console" area here -->

        ```console

        ```
        </details>

        <!-- Long logs that you upload as files go here, outside the "console" area -->
    validations:
      required: false