Batch Operations in Tadabase
Introduction to Batch Operations
Batch operations allow you to create, update, or delete multiple records simultaneously—sometimes thousands at once. Instead of processing records one by one, batch operations handle them in bulk, saving enormous amounts of time and effort.
Imagine updating the status of 5,000 customer records manually. It would take days. With batch operations, you can complete it in minutes. This capability is essential for any application that scales beyond a few hundred records.
When to Use Batch Operations
Batch operations are perfect for:
- Initial Data Loading - Creating thousands of records when launching
- Mass Updates - Changing a field value across many records
- Data Cleanup - Correcting errors or standardizing data
- Status Changes - Updating record status based on criteria
- Price Updates - Applying new pricing across product catalogs
- Archiving - Moving old records to archived status
- Bulk Deletion - Removing multiple records at once
- Scheduled Maintenance - Regular automated updates
Batch Operation Methods
Tadabase provides several methods for batch operations, each suited for different scenarios.
Method 1: Import for Batch Create/Update
Use the import feature (covered in the previous article) for:
- Creating many records from CSV/Excel
- Updating existing records via upsert
- Most user-friendly approach
- Good for occasional batch operations
- Handles large volumes (around 10,000 records per import)
Method 2: REST API Batch Operations
Use the REST API for:
- Programmatic batch operations
- Integration with external systems
- Automated processes
- Complex conditional logic
- Very large datasets (with pagination)
Method 3: Scheduled Tasks
Use scheduled tasks for:
- Automated recurring batch operations
- Nightly data maintenance
- Regular status updates
- Scheduled archiving
- Time-based changes
Method 4: Action Links with Filters
Use action links with filters for:
- User-initiated batch operations
- Updating every record in a filtered set
- Interactive batch processing
- Controlled by page filters
Bulk Create Operations
There are two main ways to create many records at once efficiently.
Via Import
The simplest method for bulk create:
- Prepare CSV - Create file with all records to create
- Import File - Use import feature in Builder
- Map Fields - Connect columns to table fields
- Choose "Create New" - Select create option
- Execute - All records created at once
Advantages:
- No technical knowledge required
- Visual confirmation before creating
- Error handling built-in
- Can handle connection fields
Limitations:
- Manual process (not automated)
- Tied to when an import is run or scheduled
- Less control over logic
Via REST API
For programmatic bulk creation:
POST https://api.tadabase.io/api/v1/data-tables/{tableId}/records/batch
Headers:
X-Tadabase-App-id: {appId}
X-Tadabase-App-Key: {appKey}
X-Tadabase-App-Secret: {appSecret}
Content-Type: application/json
Body:
{
  "items": [
    {
      "field_123": "Value 1",
      "field_456": "Value 2"
    },
    {
      "field_123": "Value 3",
      "field_456": "Value 4"
    }
  ]
}
Advantages:
- Fully automated
- Can include complex logic
- Integration-friendly
- Real-time processing
Limitations:
- Requires API knowledge
- Limited to 100 records per request (chunk larger jobs, as sketched below)
- Need to implement error handling
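Because each request is capped at 100 records, larger jobs have to be split into chunks. Here is a minimal Python sketch using the requests library, built on the endpoint and headers shown above; the table ID and app credentials are placeholders to substitute with your own:

import requests

# Placeholders: substitute your own table ID and app credentials.
BATCH_URL = "https://api.tadabase.io/api/v1/data-tables/{tableId}/records/batch"
HEADERS = {
    "X-Tadabase-App-id": "{appId}",
    "X-Tadabase-App-Key": "{appKey}",
    "X-Tadabase-App-Secret": "{appSecret}",
    "Content-Type": "application/json",
}

def batch_create(records, chunk_size=100):
    """Create records in chunks of at most 100 per request."""
    for start in range(0, len(records), chunk_size):
        chunk = records[start:start + chunk_size]
        response = requests.post(BATCH_URL, headers=HEADERS, json={"items": chunk})
        response.raise_for_status()  # stop on the first failed request
        print(f"Created records {start + 1} through {start + len(chunk)}")

# Example: 250 records become three requests (100 + 100 + 50).
batch_create([{"field_123": f"Value {i}", "field_456": "Batch load"} for i in range(250)])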
Best Practices for Bulk Create
- Validate Data First - Check for required fields, correct formats
- Test with Small Batch - Try 10-20 records first
- Backup Before - Create backup in case you need to rollback
- Check for Duplicates - Ensure you're not creating duplicate records
- Consider Record Rules - Rules will fire for each record (may slow process)
- Monitor Progress - Watch for errors during creation
Bulk Update Operations
Updating many records simultaneously is one of the most powerful batch operations.
Scenario: Quarterly Price Update
You need to increase prices by 10% for all products in the "Electronics" category.
Method 1: Import Update
Using import for updates:
Step 1: Export Current Data
- Export Products table
- Filter for Category = "Electronics"
- Include Product ID and Current Price
Step 2: Modify in Excel
- In Excel, add a formula such as =B2*1.1 (assuming the current price is in column B)
- Calculate new prices for all rows
- Keep Product ID for matching
Step 3: Import Update
- Import file with Product ID and New Price columns
- Choose "Update Existing Records"
- Match on Product ID
- Map New Price to Price field
- Execute update
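If you prefer scripting to spreadsheet formulas, the same transformation is a few lines of Python. This sketch assumes the export file has Product ID and Price columns; adjust the names to match your actual headers:

import csv

# Read the exported products, apply the 10% increase, and write a file
# ready for re-import via "Update Existing Records".
with open("electronics_export.csv", newline="") as src, \
     open("electronics_update.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=["Product ID", "Price"])
    writer.writeheader()
    for row in reader:
        new_price = round(float(row["Price"]) * 1.10, 2)
        writer.writerow({"Product ID": row["Product ID"], "Price": new_price})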
Method 2: Scheduled Task Update
Automated updates via scheduled tasks:
Create Scheduled Task
- Builder > Scheduled Tasks > Add New
- Name: "Quarterly Price Increase"
- Schedule: Run manually or set date
- Add conditions to select records
- Add update action
- Execute
Configure Update Action
In the update action:
- Table - Select Products table
- Filters - Category equals "Electronics"
- Field Updates - Price = Price * 1.10 (using an equation)
- Limit - Set limit or leave unlimited for all matching records
Method 3: API Batch Update
For programmatic updates:
PATCH https://api.tadabase.io/api/v1/data-tables/{tableId}/records/batch
Headers:
X-Tadabase-App-id: {appId}
X-Tadabase-App-Key: {appKey}
X-Tadabase-App-Secret: {appSecret}
Content-Type: application/json
Body:
{
  "items": [
    {
      "id": "record_id_1",
      "field_123": "Updated Value 1"
    },
    {
      "id": "record_id_2",
      "field_123": "Updated Value 2"
    }
  ]
}
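As with batch create, updates are limited to 100 records per request, so compute the new values client-side and send them in chunks. A sketch of the 10% price increase, reusing the imports and constants from the bulk-create sketch; field_price is a placeholder field ID:

def batch_update(updates, chunk_size=100):
    """PATCH updates in chunks; each item carries the record id plus the changed fields."""
    for start in range(0, len(updates), chunk_size):
        chunk = updates[start:start + chunk_size]
        response = requests.patch(BATCH_URL, headers=HEADERS, json={"items": chunk})
        response.raise_for_status()

# Example: records fetched earlier, with ids and current prices.
electronics = [{"id": "record_id_1", "price": 199.00},
               {"id": "record_id_2", "price": 49.50}]
batch_update([{"id": r["id"], "field_price": round(r["price"] * 1.10, 2)}
              for r in electronics])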
Conditional Bulk Updates
Update records only if certain conditions are met.
Example: Expire Old Subscriptions
Update subscription status to "Expired" for all subscriptions where End Date is in the past:
Via Scheduled Task:
- Create daily scheduled task
- Filter: End Date is before today
- Filter: Status is not "Expired"
- Update: Status = "Expired"
Example: Auto-Archive Old Records
Move records older than 2 years to archived status:
Via Scheduled Task:
- Schedule to run monthly
- Filter: Created Date is more than 2 years ago
- Filter: Status is not "Archived"
- Update: Status = "Archived", Archived Date = Today
Bulk Update Best Practices
- Always Backup First - Critical for updates affecting many records
- Test Filter Logic - Verify filters select correct records
- Start Small - Test update on 10 records first
- Use Staging Field - Write to a temporary field first, verify, then copy to the final field
- Log Changes - Record what was updated and when
- Verify Results - Check random sample after update
- Consider Rules - Record rules will fire (may want to temporarily disable)
Bulk Delete Operations
Deleting multiple records requires extra caution.
When to Bulk Delete
Appropriate use cases:
- Removing test data
- Deleting expired records
- Cleaning up duplicate records
- Removing spam submissions
- Data retention compliance
Danger: Bulk Delete is Permanent
Important Warnings:
- Deleted records cannot be recovered without backup
- Connected records may be affected
- Deletion cascades to child records (depending on settings)
- All file attachments will be deleted
- Audit trails are lost (unless logged separately)
Soft Delete Alternative
Instead of actually deleting, consider soft delete:
- Add "Deleted" checkbox field or "Status" dropdown
- Update records to mark as deleted instead of deleting
- Filter out "deleted" records in views
- Can always restore if needed
- Maintains audit trail
Bulk Delete via Builder
Manual bulk delete:
- Navigate to table in Builder
- Apply filters to show only records to delete
- Verify filter results carefully
- Select records (if bulk selection is available)
- Click Delete button
- Confirm deletion
Bulk Delete via Scheduled Task
Automated deletion:
- Create scheduled task
- Set appropriate schedule
- Add filter conditions (very carefully)
- Add "Delete Record" action
- Test with manual execution first
- Monitor logs after activation
Example: Delete Old Logs
Automatically delete log entries older than 90 days:
Scheduled Task Configuration:
- Schedule: Daily at 2:00 AM
- Table: Activity Logs
- Filter: Created Date is more than 90 days ago
- Action: Delete Record
- Limit: 1000 records per run (to avoid performance issues)
Bulk Delete via API
Programmatic deletion:
DELETE https://api.tadabase.io/api/v1/data-tables/{tableId}/records/batch
Headers:
X-Tadabase-App-id: {appId}
X-Tadabase-App-Key: {appKey}
X-Tadabase-App-Secret: {appSecret}
Content-Type: application/json
Body:
{
  "record_ids": ["id1", "id2", "id3"]
}
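In keeping with the export-before-deleting practice listed below, a cautious sketch logs the IDs locally before each request and deletes in chunks (again reusing the imports and constants from the earlier sketches):

import json

def batch_delete(record_ids, log_path="deletion_log.json", chunk_size=100):
    """Log the IDs being removed, then delete in chunks."""
    with open(log_path, "w") as log:
        json.dump({"deleted_ids": record_ids}, log)  # a real backup should export the full records first
    for start in range(0, len(record_ids), chunk_size):
        chunk = record_ids[start:start + chunk_size]
        response = requests.delete(BATCH_URL, headers=HEADERS, json={"record_ids": chunk})
        response.raise_for_status()

batch_delete(["id1", "id2", "id3"])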
Bulk Delete Best Practices
- Backup Before Deleting - Non-negotiable for bulk deletes
- Test Filters Extensively - Export filtered records and review before deleting
- Start Very Small - Delete 5 records, verify, then scale up
- Use Soft Delete - Preferred method unless hard delete required
- Export Before Deleting - Save copy of records being deleted
- Check Dependencies - Verify no critical dependencies
- Limit Batch Size - Delete in batches of 500-1000, not all at once
- Log Deletions - Record what was deleted and when
- Monitor After - Check for any issues after bulk delete
Batch Operations with Scheduled Tasks
Scheduled tasks are powerful for automated batch operations.
Common Scheduled Batch Operations
Example 1: Nightly Status Update
- Table: Orders
- Schedule: Daily at 1:00 AM
- Filter: Status = "Pending" AND Created Date more than 7 days ago
- Action: Update Status to "Expired"
Example 2: Monthly Subscription Renewal
- Table: Subscriptions
- Schedule: 1st of each month at 3:00 AM
- Filter: Next Billing Date = Today AND Status = "Active"
- Action: Update Next Billing Date to +1 month, Log renewal
Example 3: Weekly Data Cleanup
- Table: Temporary Records
- Schedule: Every Sunday at 2:00 AM
- Filter: Is Temp = Yes AND Created Date more than 7 days ago
- Action: Delete record
Example 4: Daily Reminder Emails
- Table: Tasks
- Schedule: Daily at 8:00 AM
- Filter: Due Date = Today AND Status = "Open" AND Reminder Sent = No
- Action: Send email reminder, Update Reminder Sent = Yes
Building Complex Batch Logic
Combine multiple actions in sequence:
- Filter Records - Apply conditions to select records
- Update Field 1 - Make first update
- Update Field 2 - Make second update
- Send Notification - Email stakeholders
- Log Action - Create log entry
Error Handling in Scheduled Tasks
Configure error handling:
- Email on Failure - Send email if task fails
- Retry Logic - Attempt again if first execution fails
- Error Logging - Log errors to separate table
- Partial Success - Continue processing even if some records fail
Monitoring Scheduled Batch Operations
Track task execution:
- Review task execution logs regularly
- Check how many records were processed
- Monitor for errors or failures
- Verify expected results
- Set up alerts for failures
Performance Considerations
Batch operations on large datasets require optimization.
Batch Size Limits
Understand practical limits:
- Import - 10,000 records recommended per import
- API - 100 records per API request
- Scheduled Tasks - No hard limit, but consider performance
- Page Actions - Depends on filtered result size
Optimizing Large Batch Operations
Strategies for 10,000+ records:
Strategy 1: Split into Batches
Instead of one large operation:
- Create multiple smaller batches
- Process 1,000-2,000 records at a time
- Use filters to segment (by date, category, etc.)
- Run batches sequentially or during off-hours
Strategy 2: Schedule During Off-Peak
Run large batches when system load is low:
- Schedule for night or early morning
- Avoid peak usage hours
- Reduces impact on active users
Strategy 3: Disable Non-Critical Rules
Temporarily disable rules during batch:
- Record rules fire for each record (slows process)
- Disable non-essential rules before batch
- Re-enable after batch completes
- Only do this if the rules are not critical for data integrity
Strategy 4: Use API with Pagination
For very large datasets:
- Retrieve records via API in pages of 100
- Process each page
- Update via batch API
- Continue until all records processed
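A minimal sketch of that loop, assuming the records endpoint accepts limit and page query parameters and returns an items array (check the current API docs to confirm); it reuses the imports and HEADERS from the earlier sketches:

RECORDS_URL = "https://api.tadabase.io/api/v1/data-tables/{tableId}/records"

def fetch_all_records(page_size=100):
    """Page through the table until an empty page comes back."""
    page, records = 1, []
    while True:
        response = requests.get(RECORDS_URL, headers=HEADERS,
                                params={"limit": page_size, "page": page})
        response.raise_for_status()
        items = response.json().get("items", [])
        if not items:
            break
        records.extend(items)
        page += 1
    return records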
Measuring Performance
Track batch operation metrics:
- Total records processed
- Time to complete
- Records per second
- Success rate
- Error count
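One simple way to capture these numbers is to wrap the run in a timing harness. A minimal sketch, where process_one stands in for whatever per-record work you are doing:

import time

def run_with_metrics(records, process_one):
    """Process every record and report the metrics listed above."""
    start = time.monotonic()
    successes = errors = 0
    for record in records:
        try:
            process_one(record)
            successes += 1
        except Exception:
            errors += 1
    elapsed = time.monotonic() - start
    total = successes + errors
    if total:
        print(f"{total} records in {elapsed:.1f}s ({total / elapsed:.1f} records/sec), "
              f"success rate {successes / total:.1%}, {errors} errors")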
Error Handling & Validation
Robust error handling ensures reliable batch operations.
Pre-Validation
Validate before executing batch:
- Data Type Check - Ensure all values match field types
- Required Fields - Verify required fields have values
- Constraints - Check against unique constraints
- Dependencies - Verify connected records exist
- Business Rules - Validate against custom rules
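For API-driven batches, these checks can run before anything is sent. A sketch using illustrative field names (field_123 as a required field, field_price as a numeric field); adapt the rules to your schema:

def validate_record(record):
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    if not record.get("field_123"):
        problems.append("missing required field field_123")
    price = record.get("field_price")
    if price is not None and not isinstance(price, (int, float)):
        problems.append("field_price must be numeric")
    return problems

records = [{"field_123": "Acme Widget", "field_price": 19.99},
           {"field_price": "oops"}]
issues = {}
for i, record in enumerate(records):
    problems = validate_record(record)
    if problems:
        issues[i] = problems
if issues:
    print("Fix before importing:", issues)  # only send the batch once this is empty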
Handling Partial Failures
When some records fail:
Approach 1: All or Nothing
If any record fails, roll back all:
- Ensures consistency
- More complex to implement
- May waste processing on valid records
- Use for critical operations
Approach 2: Best Effort
Process valid records, log failures:
- Maximum records processed
- Simpler to implement
- Generate error report for failed records
- Use for non-critical operations
Error Logging
Log errors for troubleshooting:
- Record ID that failed
- Error message
- Timestamp
- Operation type (create, update, delete)
- User or process that initiated the operation
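Combining the best-effort approach with this logging, a sketch that processes every item and returns an error report containing those fields:

import datetime

def best_effort(items, apply_one, operation="update"):
    """Process every item; collect failures instead of stopping at the first one."""
    failures = []
    for item in items:
        try:
            apply_one(item)
        except Exception as exc:
            failures.append({
                "record_id": item.get("id"),
                "error": str(exc),
                "timestamp": datetime.datetime.now().isoformat(),
                "operation": operation,
            })
    return failures  # write this to a log table or file as the error report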
Rollback Strategies
If batch operation goes wrong:
Option 1: Restore from Backup
If you created backup before batch:
- Restore entire table from backup
- Most reliable method
- May lose changes made after backup
Option 2: Reverse Batch
Create opposite batch operation:
- For create: delete created records
- For update: update back to original values
- For delete: can't reverse without backup
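Reversing an update only works if you captured the original values first. A sketch that snapshots the fields you are about to change and later turns the snapshot back into a batch-update payload:

import json

def snapshot(records, fields, path="pre_update_snapshot.json"):
    """Save the original values of the fields you are about to change."""
    original = [{"id": r["id"], **{f: r.get(f) for f in fields}} for r in records]
    with open(path, "w") as fh:
        json.dump(original, fh)

def build_reverse_batch(path="pre_update_snapshot.json"):
    """Load the snapshot as a ready-made "items" list for the batch update endpoint."""
    with open(path) as fh:
        return json.load(fh)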
Option 3: Manual Correction
For small number of affected records:
- Manually correct each record
- Time-consuming but precise
- Good for spot fixes
Best Practices Summary
Key practices for successful batch operations:
Before Execution
- Create Backup - Non-negotiable for updates and deletes
- Test with Sample - Always test with 10-20 records first
- Validate Data - Check data quality before processing
- Verify Filters - Export filtered records to confirm correct selection
- Document Plan - Write down what you're doing and why
- Schedule Downtime - If operation affects active users
During Execution
- Monitor Progress - Watch for errors or issues
- Watch Performance - Ensure system remains responsive
- Be Ready to Stop - Know how to cancel if needed
- Log Everything - Record all actions and results
After Execution
- Verify Results - Check that changes are correct
- Check Sample Records - Review random records for accuracy
- Monitor for Issues - Watch for user-reported problems
- Document Results - Record what happened
- Review Errors - Address any failed records
- Update Processes - Improve based on learnings
Practical Exercise
Complete a full batch operation workflow.
Exercise: Bulk Status Update System
Scenario: You manage a task tracking application and need to automatically mark overdue tasks.
Step 1: Prepare Data
- Create Tasks table with fields: Title, Due Date, Status (dropdown: Open, In Progress, Complete, Overdue)
- Add 20 sample tasks with various due dates (some past, some future)
- All tasks currently have Status = "Open" or "In Progress"
Step 2: Manual Batch Update
- Export tasks where Due Date is before today
- In Excel, change Status to "Overdue" for all rows
- Import file with "Update Existing Records"
- Match on Task ID
- Verify updates in table
Step 3: Create Scheduled Task
- Create scheduled task: "Mark Overdue Tasks"
- Schedule: Daily at 9:00 AM
- Filters:
  - Due Date is before today
  - Status is not "Complete"
  - Status is not "Overdue"
- Action: Update Status to "Overdue"
- Add second action: Send email notification to task owner
Step 4: Test Execution
- Manually run scheduled task
- Check execution log
- Verify tasks updated correctly
- Confirm emails sent
Step 5: Create Reverse Process
- Create second scheduled task: "Clear Overdue Status"
- Filters:
  - Status is "Overdue"
  - Due Date is after today (the due date was changed)
- Action: Update Status to "Open"
- This handles cases where due dates are extended
Step 6: Add Logging
- Create Status Change Log table
- Add action to scheduled task to log each status change
- Include: Task ID, Old Status, New Status, Changed Date, Changed By
Next Steps
You now understand batch operations in Tadabase, including bulk creates, updates, and deletes. You can process thousands of records efficiently and safely with proper validation, error handling, and rollback strategies.
In the next article, you'll learn about logging and auditing—creating comprehensive audit trails, tracking changes, and maintaining compliance records.
Next: Continue to Logging and Auditing to learn audit trail creation and compliance tracking.
