Adding AI Prompts in Tadabase
AI Prompts in Tadabase let you integrate ChatGPT-powered responses into your workflows, making it easy to automate tasks, generate content, and enhance your app's interactivity. This guide walks you through setting up and configuring AI Prompts, including how to define prompts, use dynamic values, and map responses to your database fields.
Adding an AI Prompt is a two-step process: first create (or install) a prompt and configure its settings, and only then can you use the prompt within your app. In this article, we'll walk through how to add an AI Prompt.
Getting Started
To begin using AI Prompts, navigate to the AI Prompts section in your Tadabase app. If you haven't created any prompts yet, you'll see an introduction with options to install a template or build a custom prompt.
Click Add AI Prompt to start setting up your first prompt.
Setting Up Your AI Prompt
When adding a new AI Prompt, you'll need to configure a few essential settings:
1. Name
The Name is how your AI Prompt will be referenced within Tadabase. It doesn’t affect the AI’s response but is used internally in Action Links and Record Rules to call the prompt when needed. Choose something descriptive so it’s easy to recognize later.
2. User Prompt
The User Prompt is where you define what the AI should do. This can include placeholders—called Dynamic Values—which will be replaced with actual data when the prompt runs.
For example:
Write a product description for {Product Name} that highlights its key benefits.
In this case, {Product Name} will be replaced by the actual product name when the prompt is triggered. You can add multiple placeholders to make your prompts more flexible and reusable.
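If you're curious how the replacement behaves, here's a minimal sketch in Python (not Tadabase's actual implementation; the record data and prompt text are purely illustrative):

```python
import re

# Hypothetical record data; in Tadabase the values come from your table fields.
record = {"Product Name": "Trailblazer Hiking Boots"}

user_prompt = "Write a product description for {Product Name} that highlights its key benefits."

# Replace each {Placeholder} with the matching record value (unknown placeholders are left untouched).
filled = re.sub(r"\{([^{}]+)\}", lambda m: str(record.get(m.group(1), m.group(0))), user_prompt)

print(filled)
# Write a product description for Trailblazer Hiking Boots that highlights its key benefits.
```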
3. System Prompt
The System Prompt is a static instruction that is always included in the request, helping shape the AI's responses. Use this field to provide guidelines on tone, style, or output format.
Example:
Always respond in a professional tone and provide responses in under 200 words.
This ensures consistency across AI-generated outputs.
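Behind the scenes, Tadabase sends both prompts to ChatGPT together. Conceptually, the System Prompt and the filled-in User Prompt travel in a chat-style payload along these lines (a sketch only; the model name and exact structure are assumptions, not Tadabase's real request):

```python
request_payload = {
    "model": "gpt-4o-mini",  # illustrative model name; Tadabase manages the model for you
    "messages": [
        # The System Prompt is always included and shapes every response.
        {"role": "system",
         "content": "Always respond in a professional tone and provide responses in under 200 words."},
        # The User Prompt is the task itself, with Dynamic Values already filled in.
        {"role": "user",
         "content": "Write a product description for Trailblazer Hiking Boots that highlights its key benefits."},
    ],
}
```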
4. Status
The Status setting lets you activate or deactivate a prompt. If a prompt is inactive, it won’t be available for use in workflows or forms. This is useful if you need to temporarily disable a prompt without deleting it.
Configuring Dynamic Values
Dynamic Values allow you to insert placeholders into the User Prompt. These are automatically detected and listed in the Prompt Variables tab, where you can set default values.
How Dynamic Values Work
Each placeholder in your prompt (e.g., {Product Name}) is treated as a variable. When the AI Prompt runs, the system replaces these placeholders with real data from your app.
Setting Default Values
If a dynamic value isn’t provided when the AI Prompt is triggered, the Default Value will be used instead. This ensures that the prompt always has valid input, even when certain fields are left blank.
Example:
- {Product Name} → Default Value: "Sample Product"

If no actual product name is provided, "Sample Product" will be used in the AI request.
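In code terms, the fallback is a simple "use the supplied value, otherwise use the default" lookup. A short sketch, with illustrative field names:

```python
defaults = {"Product Name": "Sample Product"}

def resolve(name, supplied):
    """Return the supplied value for a Dynamic Value, or fall back to its Default Value."""
    value = supplied.get(name)
    return str(value) if value not in (None, "") else defaults.get(name, "")

print(resolve("Product Name", {}))                                  # Sample Product
print(resolve("Product Name", {"Product Name": "Solar Lantern"}))   # Solar Lantern
```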
Defining Expected AI Responses
Once you've set up your prompt and dynamic values, the next step is defining what kind of response you expect from the AI.
Response Configuration Options
You can add multiple expected responses, each with specific rules and formatting. Here’s what you can configure:
- Type – Determines the format of the AI response. Choose from String, Number, or Boolean, depending on the kind of data you expect.
- Name – This is the label for the response and helps you map AI-generated outputs to database fields.
- Description – Provides additional instructions to guide the AI in generating the best possible response.
- Allowed Responses – If the response type is String, you can define a set list of acceptable responses (e.g., "Yes, No, Maybe"). This ensures that AI outputs align with predefined choices.
- Required – Ensures that the response is always included in the AI's reply, even if it's blank. This is useful for maintaining consistent data formatting in your workflows.
Example setup:
- Type: String
- Name: Sentiment
- Description: "Analyze the tone of the text and return either Positive, Neutral, or Negative."
- Allowed Responses: Positive, Neutral, Negative
- Required: Yes
With this setup, the AI will always return a response, and it will be limited to one of the three predefined options.
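One way to picture an expected response is as a small validation rule. The sketch below is purely illustrative (it is not Tadabase's internal format) and checks an AI reply against the Sentiment setup above:

```python
expected_response = {
    "type": "String",
    "name": "Sentiment",
    "description": "Analyze the tone of the text and return either Positive, Neutral, or Negative.",
    "allowed_responses": ["Positive", "Neutral", "Negative"],
    "required": True,
}

def is_valid(reply: dict) -> bool:
    """Check an AI reply against the expected-response rules."""
    value = reply.get(expected_response["name"])
    if expected_response["required"] and value is None:
        return False
    if expected_response["allowed_responses"] and value not in expected_response["allowed_responses"]:
        return False
    return True

print(is_valid({"Sentiment": "Positive"}))  # True
print(is_valid({"Sentiment": "Angry"}))     # False
```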
Reviewing AI Prompt Logs
Once your AI Prompts are running, you can monitor their usage in the Logs tab. This section provides insight into how prompts are being executed, including:
- Request Data – What data was sent to the AI.
- System Prompt Data – The static instructions included in the request.
- Tokens Used – How many tokens were consumed for the request. This helps track usage and optimize prompt efficiency.
Logs are useful for troubleshooting and refining your AI Prompts to get better results.
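As a rough mental model, each log entry captures what was sent and what it cost. The structure below is only a sketch of the kind of information the Logs tab surfaces, not the actual log format:

```python
log_entry = {
    "request_data": "Write a product description for Trailblazer Hiking Boots that highlights its key benefits.",
    "system_prompt_data": "Always respond in a professional tone and provide responses in under 200 words.",
    "tokens_used": 187,  # illustrative figure; compare entries over time to tune prompt efficiency
}
```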
Limits and Future Enhancements
Currently, there are no enforced limits on AI Prompt usage. However, limits will be introduced in the future to ensure fair usage and prevent excessive resource consumption. More details will be shared as limits are implemented.
Next Steps (coming soon)
Now that you know how to create and configure AI Prompts, the next step is learning how to use them in different parts of Tadabase, such as Record Rules and Action Links.
These integrations allow you to trigger AI responses dynamically, enhancing automation and interactivity in your app. More details on these implementations will follow.