Documentation for how system prompts are dynamically assembled, structured, and sent to LLMs.
The system prompt is the foundational instruction set that defines the AI's persona, capabilities, tools, and operational constraints. In SafeAppeals, this prompt is dynamically generated at runtime based on:
- The active ChatMode (case_manager, research, drafting, etc.)
- Live workspace context (opened files, active file)

The construction of the prompt is orchestrated by the browser process before the message is sent to the main process.
```mermaid
graph TD
    subgraph "Browser Process (Services)"
        Service[ConvertToLLMMessageService<br>src/.../browser/convertToLLMMessageService.ts]
    end
    subgraph "Common (Prompt Logic)"
        Entry[chat_systemMessage<br>src/.../common/prompt/prompts.ts]
        subgraph "Core Instructions"
            SysPrompt[getSystemPrompt<br>src/.../common/prompt/systemPrompt.ts]
        end
        subgraph "Tool Definitions"
            ToolGen[systemToolsXMLPrompt<br>src/.../common/prompt/prompts.ts]
            ToolsObj[builtinTools Object<br>src/.../common/prompt/prompts.ts]
            Schemas[toolSchemas.ts<br>src/.../common/prompt/toolSchemas.ts]
        end
    end

    %% Execution Flow
    Service -->|1. Calls with Context| Entry
    Entry -->|2. Request Text| SysPrompt
    Entry -->|3. Request XML| ToolGen

    %% Internal Dependencies
    ToolGen -->|4. Read Definitions| ToolsObj
    ToolsObj -.->|5. Import Description| Schemas

    %% Return Flow
    SysPrompt -->|Return| Entry
    ToolGen -->|Return| Entry
    Entry -->|Return Full String| Service
```
ConvertToLLMMessageService
Location: src/vs/workbench/contrib/void/browser/convertToLLMMessageService.ts
This service is the entry point. It gathers live data from the IDE:
- Opened files (openedURIs)
- The active file (activeURI)

It then calls chat_systemMessage with this context.
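A minimal sketch of this hand-off, assuming a simple context object; PromptContext and buildSystemMessage are illustrative names, not the service's real API:

```typescript
// Hypothetical sketch of the context-gathering step; the real logic lives in
// convertToLLMMessageService.ts.
type ChatMode = 'case_manager' | 'research' | 'drafting';

interface PromptContext {
  chatMode: ChatMode;
  openedURIs: string[];  // every file currently open in the IDE
  activeURI?: string;    // the file the user is focused on, if any
}

// Assumed signature for the entry point exported by prompts.ts.
declare function chat_systemMessage(ctx: PromptContext): string;

// The browser-process service forwards live IDE state to the common prompt
// logic and gets back the fully assembled system prompt string.
function buildSystemMessage(ctx: PromptContext): string {
  return chat_systemMessage(ctx);
}
```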
prompts.ts
Location: src/vs/workbench/contrib/void/common/prompt/prompts.ts
This file contains the logic to stitch everything together.
- chat_systemMessage: The main function that combines the base prompt with tool definitions.
- systemToolsXMLPrompt: Generates the XML block for available tools and guidelines.
- builtinTools: Defines the name, description, and parameters for every tool.
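The sketch below shows how these three pieces could plausibly be stitched together. The signatures, the ToolDefinition shape, and the XML layout are assumptions based on the descriptions above, not the actual prompts.ts code:

```typescript
// Illustrative sketch of the stitching logic only.
type ChatMode = 'case_manager' | 'research' | 'drafting';

interface ToolDefinition {
  name: string;
  description: string;
  params: Record<string, { type: string; description: string }>;
}

// Assumed shape: every built-in tool keyed by its name.
declare const builtinTools: Record<string, ToolDefinition>;
// Assumed export from systemPrompt.ts (see the next section).
declare function getSystemPrompt(mode: ChatMode): string;

// Renders each tool definition as an XML block the model can parse.
function systemToolsXMLPrompt(tools: Record<string, ToolDefinition>): string {
  return Object.values(tools)
    .map(tool => {
      const params = Object.entries(tool.params)
        .map(([name, p]) => `    <param name="${name}" type="${p.type}">${p.description}</param>`)
        .join('\n');
      return `  <tool name="${tool.name}">\n    <description>${tool.description}</description>\n${params}\n  </tool>`;
    })
    .join('\n');
}

// Combines the mode-specific base prompt with the generated tool XML.
function chat_systemMessage(ctx: { chatMode: ChatMode }): string {
  return [
    getSystemPrompt(ctx.chatMode),
    '<tools>',
    systemToolsXMLPrompt(builtinTools),
    '</tools>',
  ].join('\n');
}
```

Keeping the tool XML generation separate from the base prompt means the tool list can change without touching the mode-specific instructions.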
systemPrompt.ts
Location: src/vs/workbench/contrib/void/common/prompt/systemPrompt.ts
This file contains the static text and logic for the AI's behavior. It exports getSystemPrompt, which returns the prompt string based on the ChatMode.
Sections include:
- Mode-specific persona and behavior instructions (case_manager, research, etc.)
- Guidelines for the tool-calling format (<function_calls>)
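A rough sketch of how getSystemPrompt might select behavior per mode; the instruction text below is placeholder wording, not the shipped prompt:

```typescript
// Illustrative sketch of mode-based prompt selection; the real text and
// structure live in systemPrompt.ts.
type ChatMode = 'case_manager' | 'research' | 'drafting';

// Hypothetical per-mode persona snippets.
const MODE_INSTRUCTIONS: Record<ChatMode, string> = {
  case_manager: 'You help manage appeal cases: track deadlines, statuses, and next steps.',
  research: 'You research relevant policies and precedents and cite your sources.',
  drafting: 'You draft and revise appeal documents in a clear, formal tone.',
};

const TOOL_CALL_GUIDELINES =
  'When you need a tool, respond with a <function_calls> block in the XML format listed under <tools>.';

export function getSystemPrompt(mode: ChatMode): string {
  return [MODE_INSTRUCTIONS[mode], TOOL_CALL_GUIDELINES].join('\n\n');
}
```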
toolSchemas.ts
Location: src/vs/workbench/contrib/void/common/prompt/toolSchemas.ts
Contains complex JSON schemas for tools like edit_document (DOCX/XLSX editing operations). These schemas are imported by prompts.ts to populate the tool descriptions.
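For illustration, a schema of this kind might look roughly like the following; the operation kinds and field names are hypothetical, not the real edit_document schema:

```typescript
// Hypothetical slice of a schema for the edit_document tool; the fields below
// are assumptions, not the actual toolSchemas.ts content.
export const editDocumentSchema = {
  type: 'object',
  properties: {
    uri: { type: 'string', description: 'Workspace URI of the DOCX/XLSX file to edit.' },
    operations: {
      type: 'array',
      description: 'Ordered list of edit operations to apply.',
      items: {
        type: 'object',
        properties: {
          kind: { type: 'string', enum: ['replaceText', 'insertParagraph', 'setCell'] },
          target: { type: 'string', description: 'Paragraph index or cell reference (e.g. "B2").' },
          value: { type: 'string', description: 'New content for the target.' },
        },
        required: ['kind', 'target', 'value'],
      },
    },
  },
  required: ['uri', 'operations'],
} as const;

// prompts.ts could then import this object and fold its descriptions into the
// builtinTools entry for edit_document.
```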
The final string sent to the LLM follows this order: