A lightweight, offline-first desktop GUI for managing a CSV library of prompt "roles", composing/running prompts against multiple LLM backends (OpenAI API, Ollama, LM Studio), auto-categorizing prompts, and building runnable drag-and-drop prompt workflows.
- Prompt Manager & Launcher: Import from CSV, browse with categories/tags, search/filter, CRUD operations, favorites, and history
- Smart Prompt Composer: Dynamic placeholders with types and defaults, parameter presets, template preview
- API Connector: Pluggable connectors for OpenAI API, Ollama, LM Studio with streaming support
- Auto-Categorization: LLM-powered prompt analysis and category/tag suggestions
- Workflow Builder: Drag-and-drop node editor for building prompt pipelines (planned)
- Cross-platform: Works on macOS and Windows
Prerequisites: Python 3.11 or higher
Install dependencies:
pip install -r requirements.txt
Run the application:
python main.py
On first launch, the application will automatically import the sample prompts from assets.csv. The database file prompt_studio.db will be created in the application directory.
The included sample prompts are sourced from the Awesome ChatGPT Prompts dataset by fka on Hugging Face, available at: https://huggingface.co/datasets/fka/awesome-chatgpt-prompts
This dataset is licensed under CC0-1.0 (Creative Commons Zero), making it freely available for any use.
- Go to File → Import CSV...
- Select your CSV file with the following expected columns:
  - name (required): The prompt name/title
  - content or prompt (required): The actual prompt text
  - category (optional): Category for organization
  - tags (optional): Comma-separated tags
  - description (optional): Prompt description
  - placeholders_schema (optional): JSON schema for dynamic placeholders
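For reference, a minimal import file might look like this (the rows below are illustrative examples, not taken from assets.csv):
name,content,category,tags,description
Linux Terminal,Act as a Linux terminal and reply only with terminal output.,Development,"linux,terminal",Simulates a shell session
Study Planner,Act as a study planner and build a weekly schedule for {{ subject }}.,Education,"planning,education",Prompt with a placeholder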
- Select a prompt from the center list
- Fill in parameters in the right panel (if the prompt has placeholders)
- Choose a backend from the toolbar (OpenAI, Ollama, LM Studio)
- Select a model for the chosen backend
- Click Run or press Ctrl+Enter
Keyboard shortcuts:
- Ctrl+K: Quick open (fuzzy search prompts)
- Ctrl+Enter: Run current prompt
- Escape: Stop execution
- F5: Refresh prompt list
- Ctrl+N: New prompt (planned)
Prompts support Jinja2 templating with dynamic placeholders:
Act as a {{ expertise }} and help me with {{ task }}.
Constraints: {{ constraints or "none" }}
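A minimal standalone sketch of how such a template renders with Jinja2 (the variable values are illustrative):
from jinja2 import Template
template = Template(
    'Act as a {{ expertise }} and help me with {{ task }}.\n'
    'Constraints: {{ constraints or "none" }}'
)
print(template.render(expertise="data engineer", task="schema design"))
# "constraints" was not supplied, so the template falls back to "none"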
Supported placeholder types:
- str: Short text input
- text: Long text (textarea)
- int: Integer input
- float: Float input
- bool: Boolean checkbox
- choice: Single choice dropdown
- multichoice: Multiple choice checkboxes
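The exact placeholders_schema format is defined by the application; the snippet below is only a hypothetical illustration of how typed placeholders with defaults could be declared, not the authoritative schema:
import json
# Hypothetical shape only -- consult the app's importer for the real schema
schema = {
    "expertise": {"type": "str", "default": "software engineer"},
    "task": {"type": "text"},
    "tone": {"type": "choice", "choices": ["precise", "creative"]},
}
print(json.dumps(schema))  # value that would go in the placeholders_schema CSV column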
API keys are stored securely using the system keyring:
- OpenAI: Stored under service "PromptStudio", account "openai_api_key"
- Other backends: Local servers (Ollama, LM Studio) don't require API keys
You can set API keys programmatically:
import keyring
keyring.set_password("PromptStudio", "openai_api_key", "your_api_key_here")
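To read the key back (for example, to confirm it was stored), use the same keyring API:
import keyring
print(keyring.get_password("PromptStudio", "openai_api_key"))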
- OpenAI API: Default endpoint is https://api.openai.com/v1
- Ollama: Default endpoint is http://localhost:11434
- LM Studio: Default endpoint is http://localhost:1234/v1
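To confirm the local backends are reachable before running prompts, you can hit their default endpoints directly (a sketch assuming the requests package is available; the URLs are the defaults listed above):
import requests
# Ollama lists installed models at /api/tags; LM Studio exposes the OpenAI-style /v1/models
print(requests.get("http://localhost:11434/api/tags", timeout=5).json())
print(requests.get("http://localhost:1234/v1/models", timeout=5).json())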
Settings can be modified through the Settings dialog (planned) or directly in the database.
prompt_studio/
├── models/ # Database models and ORM
├── ui/ # PySide6 user interface components
├── backends/ # LLM backend connectors
├── utils/ # Utilities (CSV import, templating)
└── __init__.py
main.py # Application entry point
requirements.txt # Python dependencies
assets.csv # Sample prompts data
To add a new LLM backend:
- Implement the LLMBackend protocol in backends/llm_backends.py (a sketch follows below)
- Add your backend to the BackendManager
- The UI will automatically detect and include it
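The exact protocol methods live in backends/llm_backends.py; the class below is a hypothetical sketch of what a minimal connector could look like, with method names that are assumptions rather than the project's actual interface:
from typing import Iterator

class EchoBackend:
    # Hypothetical example -- match the real LLMBackend protocol in backends/llm_backends.py
    name = "echo"

    def list_models(self) -> list[str]:
        # A real connector would ask its server which models are available
        return ["echo-1"]

    def generate(self, model: str, prompt: str) -> Iterator[str]:
        # A real connector would stream tokens from the LLM; this one echoes the prompt back
        yield prompt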
Use PyInstaller to create standalone executables:
# For current platform
pyinstaller --windowed --onefile main.py
# For macOS app bundle
pyinstaller --windowed --onefile --name "Prompt Studio" main.py
- Settings dialog with API key management
- Quick open fuzzy search dialog
- Export functionality (JSON, CSV)
- Workflow builder with drag-and-drop nodes
- Auto-categorization using LLMs
- History search and filtering
- Dark/light theme toggle
- Plugin system for custom backends
- Prompt templates marketplace
- Batch processing capabilities
This project is open source. See LICENSE file for details.
Contributions are welcome! Please feel free to submit pull requests or open issues for bugs and feature requests.