# just-prompt MCP server
[[Model Context Protocol (MCP)]] server for prompting other [[Large Language Models (LLMs)]] from the current one. It provides a unified interface for all supported models and can prompt multiple models in parallel.
Supports [[OpenAI]]'s models, [[Anthropic]]'s [[30 Areas/33 Permanent notes/33.02 Content/Claude|Claude]], Google's [[Gemini]], [[Groq]], [[DeepSeek]], and [[Ollama]].
## Tools
This MCP server provides the following tools (a sample client call follows the list):
- `prompt`: send a prompt to multiple LLMs
- Parameters
- `text`: the prompt text
- `models_prefixed_by_provider` (optional): list of models with provider prefixes. If not defined, uses default models
- `prompt_from_file`: send a prompt from a file to multiple LLMs
- Parameters
- `abs_file_path`: absolute path to the file containing the prompt
- `models_prefixed_by_provider` (optional): list of models with provider prefixes. If not defined, uses default models
- `ceo_and_board`: prompt multiple 'board member' models and have a 'CEO' model make a decision based on their responses
- Parameters
- `abs_file_path`: absolute path to the file containing the prompt
- `models_prefixed_by_provider` (optional): list of models with provider prefixes. If not defined, uses default models
- `abs_output_dir` (default: `'.'`): absolute directory path to save the response files and CEO decision
- `ceo_model` (default: `openai:o3`): model to use for the CEO decision using format `provider:model`
- `list_providers`: list available LLM providers
- Parameters
- None
- `list_models`: list all available models for a specific LLM provider
- Parameters
- `provider`: provider to list models for (e.g., `openai`)
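For illustration, a minimal sketch of invoking the `prompt` tool from a Python client built on the official `mcp` SDK. The prompt text and model list are placeholders, and the launch command mirrors the configuration shown later in this note:
```
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch just-prompt over stdio (same command as in the Configuration section).
server = StdioServerParameters(command="uv", args=["--directory", ".", "run", "just-prompt"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Call the `prompt` tool with an explicit model list.
            result = await session.call_tool(
                "prompt",
                {
                    "text": "Summarize the trade-offs of vector databases.",
                    "models_prefixed_by_provider": ["openai:gpt-4o-mini"],
                },
            )
            print(result.content)

asyncio.run(main())
```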
## Provider prefixes
Every model must be prefixed with the provider name; use the short alias for faster referencing. A sketch of how such a prefix can be resolved follows the list.
- `o` or `openai`: OpenAI
- `o:gpt-4o-mini`
- `openai:gpt-4o-mini`
- `a` or `anthropic`: Anthropic
- `a:claude-3-5-haiku`
- `anthropic:claude-3-5-haiku`
- `g` or `gemini`: Google Gemini
- `g:gemini-2.5-pro-exp-03-25`
- `gemini:gemini-2.5-pro-exp-03-25`
- `q` or `groq`: Groq
- `q:llama-3.1-70b-versatile`
- `groq:llama-3.1-70b-versatile`
- `d` or `deepseek`: DeepSeek
- `d:deepseek-coder`
- `deepseek:deepseek-coder`
- `l` or `ollama`: Ollama
- `l:llama3.1`
- `ollama:llama3.1`
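The short aliases map one-to-one onto full provider names. A minimal sketch of resolving a prefixed model string; the alias table mirrors the list above, and the helper is hypothetical, not part of just-prompt's actual API:
```
# Hypothetical helper; mirrors the alias table above, not just-prompt's own code.
PREFIX_ALIASES = {
    "o": "openai",
    "a": "anthropic",
    "g": "gemini",
    "q": "groq",
    "d": "deepseek",
    "l": "ollama",
}

def split_provider_model(spec: str) -> tuple[str, str]:
    """Split 'provider:model' on the first colon and expand short aliases."""
    prefix, _, model = spec.partition(":")
    return PREFIX_ALIASES.get(prefix, prefix), model

assert split_provider_model("o:gpt-4o-mini") == ("openai", "gpt-4o-mini")
assert split_provider_model("ollama:llama3.1") == ("ollama", "llama3.1")
```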
## Examples
Using an input file and writing each model's response to an output directory:
```
prompt_from_file_to_file(
    abs_file_path = "foo.md",
    models_prefixed_by_provider = "openai:o3-mini, anthropic:claude-3-7-sonnet-20250219:4k, gemini:gemini-2.0-flash-thinking-exp",
    abs_output_dir = "bar/"
)
```
The `:4k` suffix on the Claude model is a just-prompt model suffix that sets the thinking-token budget.
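Similarly, a `ceo_and_board` call using the parameters documented above (paths and models are placeholders):
```
ceo_and_board(
    abs_file_path = "foo.md",
    models_prefixed_by_provider = "openai:o3-mini, anthropic:claude-3-7-sonnet-20250219",
    abs_output_dir = "bar/",
    ceo_model = "openai:o3"
)
```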
## Configuration
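A typical MCP client registration (e.g., in a `.mcp.json` or Claude Desktop config), launching the server over stdio via `uv` with a set of default models: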
```
"mcpServers": {
  "just-prompt": {
    "type": "stdio",
    "command": "uv",
    "args": [
      "--directory",
      ".",
      "run",
      "just-prompt",
      "--default-models",
      "anthropic:claude-3-7-sonnet-20250219,openai:o3-mini,gemini:gemini-2.0-flash"
    ],
    "env": {}
  }
}
```
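Provider API keys are not set in the empty `env` block here; just-prompt typically reads them from the shell environment or a `.env` file (e.g., `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GEMINI_API_KEY`).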
## References
- Official website and source code: https://github.com/disler/just-prompt