Diffstat (limited to 'llama.cpp/tools/server/webui/docs/flows/data-flow-simplified-model-mode.md')

 -rw-r--r--  llama.cpp/tools/server/webui/docs/flows/data-flow-simplified-model-mode.md | 45
 1 file changed, 45 insertions(+), 0 deletions(-)
diff --git a/llama.cpp/tools/server/webui/docs/flows/data-flow-simplified-model-mode.md b/llama.cpp/tools/server/webui/docs/flows/data-flow-simplified-model-mode.md
new file mode 100644
index 0000000..07b3621
--- /dev/null
+++ b/llama.cpp/tools/server/webui/docs/flows/data-flow-simplified-model-mode.md
@@ -0,0 +1,45 @@
+```mermaid
+%% MODEL Mode Data Flow (single model)
+%% Detailed flows: ./flows/server-flow.mmd, ./flows/models-flow.mmd, ./flows/chat-flow.mmd
+
+sequenceDiagram
+    participant User as 👤 User
+    participant UI as 🧩 UI
+    participant Stores as 🗄️ Stores
+    participant DB as 💾 IndexedDB
+    participant API as 🌐 llama-server
+
+    Note over User,API: 🚀 Initialization (see: server-flow.mmd, models-flow.mmd)
+
+    UI->>Stores: initialize()
+    Stores->>DB: load conversations
+    Stores->>API: GET /props
+    API-->>Stores: server config + modalities
+    Stores->>API: GET /v1/models
+    API-->>Stores: single model (auto-selected)
+
+    Note over User,API: 💬 Chat Flow (see: chat-flow.mmd)
+
+    User->>UI: send message
+    UI->>Stores: sendMessage()
+    Stores->>DB: save user message
+    Stores->>API: POST /v1/chat/completions (stream)
+    loop streaming
+        API-->>Stores: SSE chunks
+        Stores-->>UI: reactive update
+    end
+    API-->>Stores: done + timings
+    Stores->>DB: save assistant message
+
+    Note over User,API: 🔄 Regenerate
+
+    User->>UI: regenerate
+    Stores->>DB: create message branch
+    Note right of Stores: same streaming flow
+
+    Note over User,API: ⏹️ Stop
+
+    User->>UI: stop
+    Stores->>Stores: abort stream
+    Stores->>DB: save partial response
+```
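The initialization lane in the diagram (GET /props, then GET /v1/models with auto-selection of the single model) can be sketched in TypeScript. This is a minimal illustration, not the webui's actual store code: `initialize` and `pickSingleModel` are hypothetical helpers, and the shape `{ data: [{ id }] }` assumes the OpenAI-compatible `/v1/models` response.

```typescript
// Sketch of the initialization step: fetch server props and the model list,
// then auto-select the model. Hypothetical helpers, not the webui's real code.

interface ModelsResponse {
  data: { id: string }[]; // assumed OpenAI-compatible /v1/models shape
}

// In MODEL mode llama-server serves a single model, so auto-select it.
function pickSingleModel(res: ModelsResponse): string | null {
  return res.data.length === 1 ? res.data[0].id : null;
}

async function initialize(baseUrl: string) {
  // GET /props: server configuration and supported modalities
  const props = await (await fetch(`${baseUrl}/props`)).json();
  // GET /v1/models: OpenAI-compatible model list
  const models: ModelsResponse = await (await fetch(`${baseUrl}/v1/models`)).json();
  return { props, model: pickSingleModel(models) };
}
```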
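The chat-flow and stop lanes (POST /v1/chat/completions with `stream: true`, a loop of SSE chunks, abort on stop, save the partial response) can likewise be sketched. This assumes llama-server's OpenAI-compatible SSE framing, where each event is a `data: {...}` line and the stream ends with `data: [DONE]`; `parseSSEChunk` and `streamChat` are hypothetical helpers, not the webui's actual implementation.

```typescript
// Sketch of the streaming chat flow from the diagram. Hypothetical helpers.

interface StreamDelta {
  content: string;
  done: boolean;
}

// Parse one SSE text chunk into content deltas. Each event line looks like
//   data: {"choices":[{"delta":{"content":"Hi"}}]}
// and the stream terminates with
//   data: [DONE]
function parseSSEChunk(chunk: string): StreamDelta[] {
  const deltas: StreamDelta[] = [];
  for (const line of chunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data: ")) continue;
    const payload = trimmed.slice("data: ".length);
    if (payload === "[DONE]") {
      deltas.push({ content: "", done: true });
      continue;
    }
    const parsed = JSON.parse(payload);
    deltas.push({ content: parsed.choices?.[0]?.delta?.content ?? "", done: false });
  }
  return deltas;
}

// Drives the "loop streaming" box; the AbortSignal covers the Stop lane:
// aborting mid-stream returns the partial text accumulated so far,
// which the store would then persist to IndexedDB.
async function streamChat(messages: object[], signal: AbortSignal): Promise<string> {
  const res = await fetch("/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages, stream: true }),
    signal,
  });
  let text = "";
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  try {
    while (true) {
      const { value, done } = await reader.read();
      if (done) break;
      for (const d of parseSSEChunk(decoder.decode(value, { stream: true }))) {
        if (d.done) return text;
        text += d.content; // a reactive store update would fire here
      }
    }
  } catch (e) {
    if ((e as Error).name !== "AbortError") throw e;
    // aborted by the user: fall through and return the partial response
  }
  return text;
}
```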
