From d8bc33b57e1fb80d10def874a54e91bed84df79b Mon Sep 17 00:00:00 2001
From: Mitja Felicijan
Date: Wed, 18 Feb 2026 01:53:14 +0100
Subject: Update readme

---
 README.md | 24 ++++++++++++------------
 1 file changed, 12 insertions(+), 12 deletions(-)

diff --git a/README.md b/README.md
index 240fcfb..fee82f8 100644
--- a/README.md
+++ b/README.md
@@ -1,8 +1,7 @@
 # llmnpc
 
-Command-line LLM inference and simple context retrieval powered by
-[llama.cpp](https://github.com/ggerganov/llama.cpp) to test viability of using
-LLM's to drive NPC behaviour.
+Command-line tooling for NPC-focused LLM experiments with lightweight context
+retrieval, powered by [llama.cpp](https://github.com/ggerganov/llama.cpp).
 
 ## Building
 
@@ -26,8 +25,8 @@ LLM's to drive NPC behaviour.
 3. Build binaries:
 
    ```bash
-   make build/prompt
    make build/context
+   make build/npc
    ```
 
 ## Usage
 
@@ -42,14 +41,14 @@ produces a binary vector database file.
 ```bash
 ./context -m flan-t5-small -i context.txt -o context.vdb
 ```
 
-### Run a prompt with retrieved context
+### Run an NPC query with retrieved context
 
-`prompt` reads the context text file, embeds the query, selects the top 3
-matching lines by cosine similarity, and builds a prompt from those lines.
+`npc` loads a vector database, embeds the prompt, selects the top 3 matching
+lines by cosine similarity, and runs the NPC system prompt against that context.
 
 ```bash
-./prompt -p "What is machine learning?" -c context.txt
-./prompt -m flan-t5-small -p "What is machine learning?" -c context.txt
+./npc -m flan-t5-small -p "Who is Gandalf?" -c context.vdb
+./npc -m flan-t5-small -p "Who is Frodo?" -c context.vdb
 ```
 ### context options
@@ -63,13 +62,14 @@
 | `-v, --verbose` | Enable llama.cpp logging |
 | `-h, --help` | Show help message |
 
-### prompt options
+### npc options
 
 | Flag | Description |
 |------|-------------|
-| `-m, --model` | Model to use (default: first model in config) |
+| `-m, --model` | Model to use (required) |
 | `-p, --prompt` | Prompt text (required) |
-| `-c, --context` | Context text file (required) |
+| `-c, --context` | Context vector database file (.vdb) (required) |
+| `-l, --list` | List available models |
 | `-v, --verbose` | Enable llama.cpp logging |
 | `-h, --help` | Show help message |
 
-- 
cgit v1.2.3
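
The README text in this patch says `npc` selects the top 3 context lines by cosine similarity. As an editorial illustration only (not part of the patch, and not llmnpc's actual implementation), a minimal Python sketch of that retrieval step, assuming line embeddings are plain float vectors and `top_k_lines` is a hypothetical helper name:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length float vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k_lines(query_vec, line_vecs, k=3):
    # Indices of the k context lines most similar to the query,
    # highest similarity first (hypothetical helper, not llmnpc's API).
    ranked = sorted(range(len(line_vecs)),
                    key=lambda i: cosine(query_vec, line_vecs[i]),
                    reverse=True)
    return ranked[:k]
```

For example, with query vector `[1.0, 0.0]` and line vectors `[[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]]`, `top_k_lines` returns `[0, 2, 1]`: the exact match first, the near match second, the orthogonal line last.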