# Configuring the Prompt Library

CodeCompanion enables you to leverage prompt templates to quickly interact with your codebase. These prompts can be built-in or custom-built, and CodeCompanion uses a prompt library to manage and organize them.
> [!IMPORTANT]
> Prompts can be pure Lua tables, residing in your configuration, or markdown files stored in your filesystem.
## Adding Prompts

> [!NOTE]
> See the Creating Prompts section to learn how to create your own.

There are two ways to add prompts to the prompt library. You can either define them directly in your configuration file as Lua tables, or you can store them as markdown files in your filesystem and reference them in your configuration.
```lua
require("codecompanion").setup({
  prompt_library = {
    ["Docusaurus"] = {
      strategy = "chat",
      description = "Write documentation for me",
      prompts = {
        {
          role = "user",
          content = [[Just some prompt that will write docs for me.]],
        },
      },
    },
  },
})
```

### Refreshing Markdown Prompts
If you add or modify markdown prompts whilst your Neovim session is running, you can refresh the prompt library to pick up the changes with:

```vim
:CodeCompanionActions refresh
```

## Creating Prompts
As mentioned earlier, prompts can be created in two ways: as Lua tables or as markdown files.
> [!NOTE]
> Markdown prompts are new in v18.0.0. They provide a cleaner, more maintainable way to define prompts, with support for external Lua files for dynamic content.
### Why Markdown?

Markdown prompts offer several advantages:

- **Cleaner syntax** - No Lua string escaping or concatenation
- **Better readability** - Natural formatting with proper indentation
- **Easier editing** - Edit in any markdown editor with syntax highlighting
- **Reusability** - Share Lua helper files across multiple prompts
- **Version control friendly** - Easier to diff and review changes

For complex prompts with multiple messages or dynamic content, markdown files are significantly easier to maintain than Lua tables.
### Basic Structure

At their core, prompts define a series of messages sent to an LLM. Let's start with a simple example:
````md
---
name: Explain Code
strategy: chat
description: Explain how code works
---

## system
You are an expert programmer who excels at explaining code clearly and concisely.

## user
Please explain the following code:

```${context.filetype}
${shared.code}
```
````

### Markdown Format
Markdown prompts consist of two main parts:

- **Frontmatter** - YAML metadata between `---` delimiters that defines the prompt's configuration
- **Prompt sections** - Markdown headings (`## system`, `## user`) that define the role and content of each message
Required frontmatter fields:

- `name` - The display name in the Action Palette
- `description` - Description shown in the Action Palette
- `strategy` - The strategy to use (`chat`, `inline`, `workflow`)
Optional frontmatter fields:

- `opts` - Additional options (see the Options section)
- `context` - Pre-loaded context (see the Prompts with Context section)
Prompt sections:

- `## system` - System messages that set the LLM's behaviour
- `## user` - User messages containing your requests
### Options

Both markdown and Lua prompts support a wide range of options to customise behaviour:
````md
---
name: Generate Tests
strategy: inline
description: Generate unit tests
opts:
  alias: tests
  auto_submit: true
  modes:
    - v
  placement: new
  stop_context_insertion: true
---

## system
Generate comprehensive unit tests for the provided code.

## user
```${context.filetype}
${shared.code}
```
````

Common options:
- `adapter` - Specify a different adapter/model:

  ```yaml
  ---
  name: My Prompt
  strategy: chat
  description: Uses a specific model
  opts:
    adapter:
      name: ollama
      model: deepseek-coder:6.7b
  ---
  ```

- `alias` - Allows the prompt to be triggered via `:CodeCompanion /{alias}`
- `auto_submit` - Automatically submit the prompt to the LLM
- `default_rules` - Specify a default rule group to load with the prompt
- `ignore_system_prompt` - Don't send the default system prompt with the request
- `intro_message` - Custom intro message for the chat buffer UI
- `is_slash_cmd` - Make the prompt available as a slash command in chat
- `modes` - Only show in specific modes (`{ "v" }` for visual mode)
- `placement` - For the inline strategy: `new`, `replace`, `add`, `before`, `chat`
- `pre_hook` - Function to run before the prompt is executed (Lua only)
- `stop_context_insertion` - Prevent automatic context insertion
- `user_prompt` - Get user input before actioning the response
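Several of these options can be combined. For example, a visual-mode prompt exposed both via an alias and as a chat slash command might look like this (an illustrative sketch that uses only the fields documented above):

````md
---
name: Review Code
strategy: chat
description: Review the selected code
opts:
  alias: review
  is_slash_cmd: true
  auto_submit: true
  modes:
    - v
---

## user
Please review this code:

```${context.filetype}
${shared.code}
```
````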
### Using Placeholders

Placeholders allow you to inject dynamic content into your prompts. In markdown prompts, use the `${placeholder.name}` syntax:
#### Context Placeholders

The context object contains information about the current buffer:
```md
---
name: Buffer Info
strategy: chat
description: Show buffer information
---

## user
I'm working in buffer ${context.bufnr} which is a ${context.filetype} file.
```

Available context fields:
```lua
{
  bufnr = 7,
  buftype = "",
  cursor_pos = { 10, 3 },
  end_col = 3,
  end_line = 10,
  filetype = "lua",
  is_normal = false,
  is_visual = true,
  lines = { "local function fire_autocmd(status)", "..." },
  mode = "V",
  start_col = 1,
  start_line = 8,
  winnr = 1000
}
```

#### External Lua Files
For markdown prompts, you can reference functions and values from external Lua files placed in the same directory as your prompt. This is useful for complex logic or reusable components:
Example directory structure:
```
.prompts/
├── commit.md
├── commit.lua
├── shared.lua
└── utils.lua
```

`shared.lua`:
```lua
return {
  code = function(args)
    local actions = require("codecompanion.helpers.actions")
    return actions.get_code(args.context.start_line, args.context.end_line)
  end,
}
```

`commit.lua`:
```lua
return {
  diff = function(args)
    return vim.system({ "git", "diff", "--no-ext-diff", "--staged" }, { text = true }):wait().stdout
  end,
}
```

`commit.md`:
````md
---
name: Commit message
strategy: chat
description: Generate a commit message
opts:
  alias: commit
---

## user
You are an expert at following the Conventional Commit specification. Given the git diff listed below, please generate a commit message for me:

```diff
${commit.diff}
```
````

In this example, `${commit.diff}` references the `diff` function from `commit.lua`. The plugin automatically:
- Detects the dot notation (`commit.`)
- Loads `commit.lua` from the same directory
- Calls the `diff` function
- Replaces `${commit.diff}` with the result
Multiple files example:
````md
---
name: Code Review
strategy: chat
description: Review code changes
---

## user
Please review this code:

```${context.filetype}
${shared.code}
```

Here's the git diff:

```diff
${utils.git_diff}
```
````

This prompt can reference functions from both `shared.lua` and `utils.lua` in the same directory.
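The `utils.lua` helper backing `${utils.git_diff}` isn't shown above; it could follow the same pattern as `commit.lua` (a sketch — the `git_diff` name is only implied by the placeholder):

```lua
-- utils.lua (illustrative): same shape as commit.lua above
return {
  git_diff = function(args)
    -- Run git diff and return its stdout for substitution into the prompt
    return vim.system({ "git", "diff", "--no-ext-diff" }, { text = true }):wait().stdout
  end,
}
```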
Function signature:

External Lua functions receive an `args` table:
```lua
return {
  my_function = function(args)
    -- args.context - Buffer context
    -- args.item - The full prompt item
    return "some value"
  end,
  static_value = "I'm just a string",
}
```

#### Built-in Helpers
You can also reference built-in values using dot notation:
- `${context.bufnr}` - Current buffer number
- `${context.filetype}` - Current filetype
- `${context.start_line}` - Visual selection start
- `${context.end_line}` - Visual selection end
And many more from the context object.
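For instance, a prompt body could combine these to reference the visual selection's boundaries directly (illustrative; `${shared.code}` assumes a `shared.lua` helper like the one shown earlier):

````md
## user
Explain lines ${context.start_line}-${context.end_line} of this ${context.filetype} buffer:

```${context.filetype}
${shared.code}
```
````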
### Advanced Example

Here's a complete example showing the power of markdown prompts:

`unit_tests.md`:
````md
---
name: Unit tests
strategy: inline
description: Generate unit tests for the selected code
opts:
  alias: tests
  auto_submit: true
  modes:
    - v
  placement: new
  stop_context_insertion: true
---

## system
When generating unit tests, follow these steps:

1. Identify the programming language.
2. Identify the purpose of the function or module to be tested.
3. List the edge cases and typical use cases that should be covered in the tests and share the plan with the user.
4. Generate unit tests using an appropriate testing framework for the identified programming language.
5. Ensure the tests cover:
   - Normal cases
   - Edge cases
   - Error handling (if applicable)
6. Provide the generated unit tests in a clear and organized manner without additional explanations or chat.

## user
Please generate unit tests for this code from buffer ${context.bufnr}:

```${context.filetype}
${shared.code}
```
````

`shared.lua`:
```lua
return {
  code = function(args)
    local actions = require("codecompanion.helpers.actions")
    return actions.get_code(args.context.start_line, args.context.end_line)
  end,
}
```

This prompt:
- Only appears in visual mode
- Automatically submits to the LLM
- Places results in a new buffer
- Uses a reusable `shared.code` function
- Includes detailed instructions for the LLM
## Advanced Configuration

### Conditionals

Lua only:

You can conditionally control when prompts appear in the Action Palette, or conditionally include specific prompt messages, using `condition` functions.

Item-level conditions (controls visibility in the Action Palette):
```lua
["Open chats ..."] = {
  name = "Open chats ...",
  strategy = " ",
  description = "Your currently open chats",
  condition = function(context)
    return #require("codecompanion").buf_get_chat() > 0
  end,
  picker = {
    ---
  }
}
```

Prompt-level conditions (controls individual messages):
```lua
["Visual Only"] = {
  strategy = "chat",
  description = "Only appears in visual mode",
  prompts = {
    {
      role = "user",
      content = "This prompt only appears when you're in visual mode.",
      condition = function(context)
        return context.is_visual
      end,
    },
  },
}
```

> [!NOTE]
> Conditionals are not supported in markdown prompts since they require Lua functions. Use the `modes` option in frontmatter instead to control visibility by mode.
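As a sketch, the "Visual Only" example above could be approximated in a markdown prompt with the documented `modes` option:

```md
---
name: Visual Only
strategy: chat
description: Only appears in visual mode
opts:
  modes:
    - v
---

## user
This prompt only appears when you're in visual mode.
```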
### Prompts with Context

Pre-load a chat buffer with context from files, symbols, or URLs:
```md
---
name: Test Context
strategy: chat
description: Add some context
context:
  - type: file
    path:
      - lua/codecompanion/health.lua
      - lua/codecompanion/http.lua
  - type: symbols
    path: lua/codecompanion/strategies/chat/init.lua
  - type: url
    url: https://raw.githubusercontent.com/olimorris/codecompanion.nvim/refs/heads/main/lua/codecompanion/commands.lua
---

## user
I'll think of something clever to put here...
```

Context items appear at the top of the chat buffer. URLs are automatically cached for you.
### Using Pre-hooks

Pre-hooks allow you to run custom logic before a prompt is executed. This is particularly useful for creating new buffers or setting up the environment.

Lua only:
```lua
["Boilerplate HTML"] = {
  strategy = "inline",
  description = "Generate some boilerplate HTML",
  opts = {
    ---@return number
    pre_hook = function()
      local bufnr = vim.api.nvim_create_buf(true, false)
      vim.api.nvim_set_current_buf(bufnr)
      vim.api.nvim_set_option_value("filetype", "html", { buf = bufnr })
      return bufnr
    end,
  },
  prompts = {
    {
      role = "system",
      content = "You are an expert HTML programmer",
    },
    {
      role = "user",
      content = "Please generate some HTML boilerplate for me. Return the code only and no markdown codeblocks",
    },
  },
}
```

For the inline strategy, the plugin will detect a number being returned from the `pre_hook` and assume that it is the buffer number into which any code should be streamed.
## Others

### Hiding Built-in Prompts

You can hide the built-in prompts from the Action Palette by setting the following configuration option:
```lua
require("codecompanion").setup({
  display = {
    action_palette = {
      opts = {
        show_prompt_library_builtins = false,
      },
    },
  },
})
```