📢  New Launch: Chat template testing

Level up your prompt management

Test, collaborate on, version, and deploy prompts from a single place with PromptHub.

Request access
GDPR Compliant
Loved by thousands
Schedule a demo

The ultimate prompt playground

Put an end to constant copying and pasting

Utilize variables to simplify prompt creation


Say goodbye to spreadsheets

Easily compare outputs side-by-side when tweaking prompts


Bring your datasets and test prompts at scale with batch testing

Make sure your prompts stay consistent by testing them across different models, variables, and parameters

"Seeing many outputs with batch testing gives me more confidence in the statistical significance of the changes I make If you're not batch testing your prompts, you're missing out on some low-hanging fruit."
Jade Samadi
Founder, Smart Recover

The easiest way to test chat prompts

Stream two conversations and test different models, system messages, or chat templates


Test across different models

Compare outputs side-by-side

Supported providers: OpenAI, Anthropic, Microsoft

Collaboration features made for writing prompts

Write better prompts with better versioning

Commit prompts, create branches, and collaborate seamlessly


Automatically track what changed

We detect prompt changes, so you can focus on outputs


Open merge requests for review

Review changes as a team, approve new versions, and keep everyone on the same page


Retrieve prompts programmatically via our API

Run a deployed prompt by making a POST request to the https://app.prompthub.us/api/v1/projects/{id}/run endpoint.

Inject variables with your app data and pass along any metadata.

{
  "user_id": 42,
  "user_email": user.email,
  "variables": {
    "type": myVariable,
    "subject": user.subject
  }
}

Return the head of a prompt by using a GET request to the https://app.prompthub.us/api/v1/projects/{id}/head endpoint.

This ensures your app always uses the most up-to-date prompt, without any engineering effort.

  "data": {
      "id": "1951",
      "project_id": "49",
       "branch_id": "235",
      "provider": "OpenAI",
      "model": "gpt-3.5-turbo",
      "prompt": "Write a song",
      "hash": "GzeuWmzN",
      "commit_title": "Updated temperature",
      "prompt_tokens": 135,
      "branch": {
          "name": "Master"
      "configuration": {
          "id": "15",
          "max_tokens": 4096,
          "temperature": 0.5

Return a list of all your prompts by making a GET request to the https://app.prompthub.us/api/v1/teams/{id}/projects endpoint.

Filter by group, model, or provider, and manage your prompts programmatically.

      "id": "1",
      "name": "LinkedIn Post Generator",
       "description": "Autogenerate posts",
      "head": {
        "id": "1",

      "id": "2",
      "name": "Feedback Classifier",
       "description": "Classify feedback",
      "head": {
        "id": "2",

Deploy your prompts as a shareable form with a few clicks. Control access, embed anywhere, and distribute the power of your prompts.

Deploy your prompts into your workflows through our Zapier integration. Pass variables, log requests, and centralize your prompts.

Helping teams get better outputs


We quickly outgrew our homegrown prompt engineering solution. That's when we got set up with PromptHub, and it streamlined our workflow with an easy-to-use UI, versioning, logging, and a straightforward API, all out of the box. PromptHub saved us the time we would've spent building in-house and let us focus on our core product.

John Yu
Co-Founder, ChainDefender.ai

Easily monitor requests, costs, and latencies

See all requests in one place

See how users are interacting with your prompts


Easily re-run and debug requests


Meaningful insights, automatically


Professionally built templates, ready for you

Let the LLM enhance your prompt
Chain of Density
Generate human-level summaries
Step-back prompting
Take a step back for better results
Skeleton of thought
Win more RFPs
LinkedIn post generator
Content optimized for LinkedIn
Product description writer
E-commerce product management
Product Hunt Tagline
Taglines based on successful launches
High impact keyword analysis
Find your next SEO goldmine
Tree of thoughts
Traverse many paths for a solution
Multi-persona collaboration
A new prompting method
Add emotional stimuli -> better outputs
Growth Hacking Techniques
10 growth ideas for your company
Meta tags for product pages
Polish up product SEO
User discovery survey
Ask the right questions
Pitch presentation
Deliver a pitch that cuts through noise

Join the waitlist

Organize your prompts, test them thoroughly, and get better outputs

Got questions? Schedule a demo with the founder