Understanding Validation Scenarios and Timing
Introduction
Business rules execute at specific moments during record lifecycle events - when users create records, modify fields, delete data, or post transactions. Understanding when validation scenarios trigger determines whether rules prevent invalid data entry, validate changes before persistence, or enforce constraints during posting operations.
Incorrect scenario selection leads to rules executing too early (blocking valid interim states), too late (allowing invalid data to persist), or too frequently (degrading performance). Common timing issues include validation running before related data exists, rules blocking necessary intermediate steps, performance impacts from excessive re-validation, and missing validation at critical control points.
QUALIA Rule Engine provides validation scenarios that correspond to Business Central trigger points: OnInsert (record creation), OnModify (field changes), OnDelete (record removal), OnValidate (field-specific validation), BeforePost (pre-posting validation), AfterPost (post-posting processing), and Scheduled (periodic batch validation).
Scenario selection criteria:
Data availability (which fields and related records exist at trigger point)
User experience (immediate feedback vs. deferred validation)
Performance impact (frequency of execution)
Business requirement (prevent entry vs. warn vs. post-process)
Transaction context (real-time user action vs. batch processing)
Part 1: OnInsert Scenario
When Records Are Created
OnInsert scenario executes when new records are created, before any field modifications occur beyond initial default values.
Timing: Immediately after record creation, before user modifies additional fields
Available Data:
Record being inserted with default field values
System-populated fields (No., timestamps)
Related records accessible via source references
User context (current user, date, time)
Use Cases:
Applying default values based on business rules
Validating initial state requirements
Copying data from related records
Initializing calculated fields
Enforcing creation authorization
Validation Set: Sales Order - Initial Setup - OnInsert
Rule 1: Set Default Priority Based on Customer
Table: Sales Header (36)
Source References:
Condition:
Action - Assign:
Placeholders Used in This Example:
[36:2] - Sales Header (Table 36): Sell-to Customer No. (Field 2) - Links order to customer
[18:1] - Customer (Table 18): No. (Field 1) - Customer number for linking
[18:CustomVIPStatus] - Customer (Table 18): Custom VIP Status field - Boolean flag indicating VIP customer
[36:CustomPriority] - Sales Header (Table 36): Custom Priority field - Target field for assignment
Fixed value: 'High' - Static text value assigned on insert
Why OnInsert: Default value needed immediately on record creation before user modifies other fields.
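To make the timing concrete, here is a minimal Python sketch of this rule's logic. This is not QUALIA syntax; the dictionary keys mirror the placeholders above and are purely illustrative.

```python
# Hedged sketch of the OnInsert rule: if the linked customer is flagged VIP,
# default the new order's priority to 'High'. Field names are assumptions.

def apply_insert_defaults(order, customer):
    """Condition [18:CustomVIPStatus] is true -> Assign [36:CustomPriority] = 'High'."""
    if customer.get("CustomVIPStatus"):       # evaluated right after record creation
        order["CustomPriority"] = "High"      # fixed value assigned on insert
    return order

vip_order = apply_insert_defaults({"No": "SO-1001"}, {"CustomVIPStatus": True})
std_order = apply_insert_defaults({"No": "SO-1002"}, {"CustomVIPStatus": False})
```

Note that the default lands before the user touches any other field, which is exactly what the OnInsert timing provides.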
Rule 2: Validate Customer Account Status
Condition:
Action - Message:
Why OnInsert: Prevent order creation for blocked customers immediately.
OnInsert Considerations
Advantages:
Establishes valid initial state
Sets defaults before user modifications
Provides immediate feedback on creation constraints
Prevents invalid records from persisting
Limitations:
Limited data available (only defaults populated)
May block creation for records requiring later field population
Cannot reference line-level data (header created before lines)
Best Practices:
Use for default value assignment
Validate master data relationships (customer, vendor, item exist)
Check authorization for record creation
Avoid validating fields user hasn't populated yet
Part 2: OnModify Scenario
When Fields Change
OnModify scenario executes when existing records are modified, after field changes are applied.
Timing: After field modification, before record saved to database
Available Data:
Current field values (new data)
Previous field values (old data using {Table:Field} syntax)
Related records via source references
Calculated fields and aggregates
Use Cases:
Detecting significant field changes
Updating dependent fields automatically
Validating field relationships
Tracking modifications with timestamps
Enforcing change authorization
Validation Set: Customer - Credit Limit Changes - OnModify
Rule 1: Track Credit Limit Increases
Table: Customer (18)
Condition:
Action - Assign (Multiple):
Placeholders Used in This Example:
{18:59} - Customer (Table 18): Credit Limit (LCY) (Field 59) - OLD value before modification (curly braces)
[18:59] - Customer (Table 18): Credit Limit (LCY) (Field 59) - NEW current value (square brackets)
Change detection: {18:59} <> [18:59] - First condition checks if credit limit changed
Increase check: [18:59] > {18:59} - Second condition checks if it's an increase (not decrease)
AND logic: Both conditions must be true via separate scenarios
[18:CustomPreviousCreditLimit] - Customer (Table 18): Custom Previous Credit Limit field - Target for old value
[T] - System placeholder: Today's date - Assigned to change date field
[CurrentUser] - System placeholder: Current user ID - Assigned to change user field
[18:CustomCreditLimitChangeDate] - Customer (Table 18): Custom date field - Tracks when limit changed
[18:CustomCreditLimitChangeUser] - Customer (Table 18): Custom user field - Tracks who changed limit
Multiple Assign actions: All three assignments execute when conditions met
Note: Old value placeholders such as {18:59} are only available on Modify triggers.
Why OnModify: Access to both old and new values enables change detection and history tracking.
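The old/new comparison can be sketched in Python as follows. Again, this is not engine code: {18:59} maps to old["CreditLimit"], [18:59] to new["CreditLimit"], and the custom field names are assumptions taken from the placeholder list above.

```python
from datetime import date

# Illustrative OnModify change tracking: detect an increase and stamp
# the previous value, the change date, and the changing user.

def track_credit_limit_increase(old, new, user, today=None):
    changed = old["CreditLimit"] != new["CreditLimit"]    # {18:59} <> [18:59]
    increased = new["CreditLimit"] > old["CreditLimit"]   # [18:59] > {18:59}
    if changed and increased:
        new["CustomPreviousCreditLimit"] = old["CreditLimit"]
        new["CustomCreditLimitChangeDate"] = today or date.today()  # [T]
        new["CustomCreditLimitChangeUser"] = user                   # [CurrentUser]
    return new

rec = track_credit_limit_increase({"CreditLimit": 10000}, {"CreditLimit": 15000}, "ALICE")
```

A decrease leaves the tracking fields untouched, matching the second condition above.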
Rule 2: Validate Significant Credit Increases
Condition:
Action - Confirmation:
Placeholders Used in This Example:
{18:59} - Customer (Table 18): Credit Limit (LCY) (Field 59) - OLD value before modification
[18:59] - Customer (Table 18): Credit Limit (LCY) (Field 59) - NEW current value
Change detection: {18:59} <> [18:59] - Checks if credit limit changed
Significant increase: [18:59] > {18:59} * 1.5 - Checks if new limit more than 150% of old (over 50% increase)
Increase amount: [18:59 - {18:59}] - Calculated dollar increase
Percentage calculation: [[18:59 / {18:59} - 1] * 100] - Percentage increase
Revert formula: [18:59] = {18:59} - If user clicks No, revert to old value
Message type: Confirmation - User must approve or decline
Why OnModify: Validation occurs when limit changes, with access to previous value for comparison.
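The threshold and percentage arithmetic can be worked through in a short sketch. The function below is illustrative only; the 1.5 factor mirrors the [18:59] > {18:59} * 1.5 condition.

```python
# Hedged sketch: flag a credit limit change when the new limit exceeds
# 150% of the old one, and return the percentage increase for the dialog.

def significant_increase(old_limit, new_limit, factor=1.5):
    if old_limit and new_limit != old_limit and new_limit > old_limit * factor:
        return True, round((new_limit / old_limit - 1) * 100, 2)
    return False, 0.0
```

For example, raising a limit from 10,000 to 16,000 is a 60% increase and triggers the confirmation; raising it to exactly 15,000 (150%) does not, because the condition requires more than a 50% increase.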
Difference Between OnModify and OnValidate
OnModify:
Triggers on any field change
Sees all field changes together
Executes once per record save
Suitable for: Cross-field validation, general change tracking, aggregate-based rules
OnValidate (specific field):
Triggers when specific field changed
Sees only that field's change
Executes immediately when field loses focus
Suitable for: Field-specific validation, immediate feedback, cascading updates
Example - OnModify Approach:
Validation Set: Sales Header - General Changes - OnModify
Condition:
Example - OnValidate Approach:
Validation Set: Sales Header - Order Date Validation - OnValidate (Field: Order Date)
Condition:
Use OnModify when: Need to see multiple field changes, perform aggregate calculations, or validate relationships across fields.
Use OnValidate when: Need immediate per-field feedback, validate single field constraints, or trigger cascading updates from specific field.
Part 3: OnDelete Scenario
When Records Are Removed
OnDelete scenario executes when records are being deleted, before deletion occurs.
Timing: After delete initiated, before record removed from database
Available Data:
Record being deleted (all fields)
Related records via source references
Can detect dependencies and constraints
Use Cases:
Preventing deletion of records with dependencies
Requiring confirmation for critical deletions
Validating deletion authorization
Creating delete audit trails
Archiving data before deletion
Validation Set: Customer - Deletion Protection - OnDelete
Rule 1: Prevent Deletion with Open Orders
Table: Customer (18)
Source References:
Condition:
Action - Message:
Placeholders Used in This Example:
[18:1] - Customer (Table 18): No. (Field 1) - Customer number for linking
[36:2] - Sales Header (Table 36): Sell-to Customer No. (Field 2) - Links orders to customer
[36:1] - Sales Header (Table 36): Document Type (Field 1) - Filter for 'Order' type
[36:120] - Sales Header (Table 36): Status (Field 120) - Filter for not 'Posted' status
COUNT(36:*) - Aggregate function: Counts matching Sales Header records
Condition: COUNT(36:*) > 0 - Checks if customer has any open orders
[18:2] - Customer (Table 18): Name (Field 2) - Customer name in message
LIST(36:3) - Aggregate function: Lists all matching order numbers (Field 3 = No.)
Source Reference filtering: Filters [36:1] is 'Order' and [36:120] is not 'Posted' configured in linked table setup
Message type: Error - Blocks deletion operation
Why OnDelete: Validates dependencies before allowing deletion.
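The COUNT(36:*) > 0 guard amounts to a filtered count over the linked table. Here is a hedged Python equivalent; the key names follow the placeholders above and are not real Business Central API calls.

```python
# Sketch of the deletion guard: block the delete when any open orders
# reference the customer, and collect their numbers for the error message.

def can_delete_customer(customer_no, sales_headers):
    open_orders = [h["No"] for h in sales_headers
                   if h["SellToCustomerNo"] == customer_no
                   and h["DocumentType"] == "Order"      # [36:1] is 'Order'
                   and h["Status"] != "Posted"]          # [36:120] is not 'Posted'
    return len(open_orders) == 0, open_orders            # LIST(36:3) equivalent

headers = [
    {"No": "SO-1", "SellToCustomerNo": "C-10", "DocumentType": "Order", "Status": "Open"},
    {"No": "SO-2", "SellToCustomerNo": "C-10", "DocumentType": "Order", "Status": "Posted"},
]
allowed, blocking = can_delete_customer("C-10", headers)
```

The returned list plays the role of LIST(36:3) in the error message shown to the user.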
Rule 2: Archive Before Deletion
Validation Set: Item - Deletion Archival - OnDelete
Condition:
Action - Assign (to Archive Table):
Why OnDelete: Captures complete record state before removal.
Part 4: BeforePost Scenario
Pre-Posting Validation
BeforePost scenario executes during the posting process, after the user initiates posting but before transactions are finalized.
Timing: During posting procedure, before ledger entries created
Available Data:
Complete document (header and all lines)
All related records
Final calculated totals
Posting parameters
Use Cases:
Final validation before posting
Calculating posting-time values
Enforcing posting authorization
Validating document completeness
Checking regulatory compliance
Validation Set: Sales Order - Final Validation - BeforePost
Rule 1: Validate Minimum Margin
Table: Sales Header (36)
Source References:
Condition:
Action - Confirmation:
Placeholders Used in This Example:
[36:3] - Sales Header (Table 36): No. (Field 3) - Order number for linking and display
[37:3] - Sales Line (Table 37): Document No. (Field 3) - Links lines to header
[37:6] - Sales Line (Table 37): No. (Field 6) - Item number for linking
[27:1] - Item (Table 27): No. (Field 1) - Item number for linking
SUM(37:22) - Aggregate sum of Sales Line: Unit Price (Field 22) - Total sales amount
[27:22] - Item (Table 27): Unit Cost (LCY) (Field 22) - Cost per item
SUM(37:15) - Aggregate sum of Sales Line: Quantity (Field 15) - Total quantity
SUM(27:22) * SUM(37:15) - Calculated expression: Total cost (cost per item × total quantity)
Revenue minus cost: SUM(37:22) - SUM(27:22) * SUM(37:15) - Total profit amount
Margin percentage: ([SUM(37:22) - SUM(27:22) * SUM(37:15)]) / SUM(37:22) - Profit as percentage of revenue
Condition: Margin < 0.15 - Checks if margin less than 15%
Multiple nested aggregates: Combines sales line and item data across multiple related records
Message type: Confirmation - User must approve or decline posting
Why BeforePost: All lines finalized, complete margin calculation possible, last opportunity to prevent posting.
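The margin check can be sketched per line in Python. Note this sketch computes revenue and cost line by line rather than with the nested SUM() expressions above; the field names are assumptions drawn from the placeholder list.

```python
# Hedged sketch of the BeforePost margin validation: margin = (revenue - cost) / revenue,
# flagged for confirmation when it falls below the 15% threshold.

def margin_below_threshold(lines, unit_costs, threshold=0.15):
    revenue = sum(l["UnitPrice"] * l["Quantity"] for l in lines)
    cost = sum(unit_costs[l["ItemNo"]] * l["Quantity"] for l in lines)
    margin = (revenue - cost) / revenue if revenue else 0.0
    return margin < threshold, margin

lines = [{"ItemNo": "I-1", "UnitPrice": 100.0, "Quantity": 10}]
costs = {"I-1": 90.0}
flag, margin = margin_below_threshold(lines, costs)
```

With 1,000 in revenue against 900 in cost, the margin is 10% and the confirmation fires. BeforePost is the right moment because all lines are final, so the totals are complete.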
Rule 2: Enforce Posting Authorization
Condition:
Action - Message:
Why BeforePost: Final authorization check before financial impact occurs.
BeforePost vs. OnRelease
BeforePost:
Executes during posting
After all order changes finalized
Validates final posting state
Prevents ledger entry creation
OnRelease (custom scenario):
Executes when order released
Before picking/shipping operations
Validates order ready for fulfillment
Controls warehouse workflow initiation
Use BeforePost when: Need access to complete final document state, validate before financial entries, enforce posting-specific constraints.
Use OnRelease when: Control workflow progression, validate operational readiness, authorize warehouse processing.
Part 5: AfterPost Scenario
Post-Posting Processing
AfterPost scenario executes after posting completes successfully.
Timing: After ledger entries created, before user control returns
Available Data:
Posted document
Created ledger entries
Updated master data (inventory, balances)
Use Cases:
Sending posting notifications
Creating follow-up tasks
Updating analytical fields
Triggering external integrations
Generating reports
Validation Set: Sales Order - Post-Posting Actions - AfterPost
Rule 1: Send Order Confirmation
Table: Sales Header (36)
Source References:
Condition:
Action - Email:
Placeholders Used in This Example:
[36:2] - Sales Header (Table 36): Sell-to Customer No. (Field 2) - Links to customer
[18:1] - Customer (Table 18): No. (Field 1) - Customer number for linking
[18:CustomSendConfirmations] - Customer (Table 18): Custom Send Confirmations field - Boolean opt-in flag
[18:102] - Customer (Table 18): E-Mail (Field 102) - Customer email address
[36:3] - Sales Header (Table 36): No. (Field 3) - Order number in email
[T] - System placeholder: Today's date - Posted date
[Posted Invoice No.] - System placeholder: Posted document number - Generated after posting
Condition: [18:CustomSendConfirmations] is true - Only send if customer opted in
Why AfterPost: Confirms successful posting, includes posted document numbers and details.
Rule 2: Update Customer Analytics
Condition:
Action - Assign:
Placeholders Used in This Example:
[T] - System placeholder: Today's date - Assigned to last order date
[18:CustomLastOrderDate] - Customer (Table 18): Custom Last Order Date field - Tracks most recent order
[18:CustomOrderCount] - Customer (Table 18): Custom Order Count field - Running total of orders
Increment expression: [18:CustomOrderCount] + 1 - Adds 1 to current count
[18:CustomTotalOrderValue] - Customer (Table 18): Custom Total Order Value field - Cumulative order value
[36:109] - Sales Header (Table 36): Amount Including VAT (Field 109) - Current order amount
Accumulation expression: [18:CustomTotalOrderValue] + [36:109] - Adds current order to running total
[18:CustomAvgOrderValue] - Customer (Table 18): Custom Average Order Value field - Calculated average
Average calculation: [18:CustomTotalOrderValue] / [18:CustomOrderCount] - Total divided by count
Multiple Assign actions: All four assignments execute after successful posting
Always-execute condition: No condition specified, so the rule runs for every posted order
Why AfterPost: Guaranteed order posted successfully before updating statistics.
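The four assignments form a simple running-statistics update, sketched below. The field names are assumptions mirroring the placeholder list; this is not QUALIA or AL code.

```python
# Hedged sketch of the AfterPost analytics update: last order date,
# order count, cumulative value, and derived average, in that order.

def update_customer_analytics(customer, order_amount, today):
    customer["CustomLastOrderDate"] = today                                    # [T]
    customer["CustomOrderCount"] = customer.get("CustomOrderCount", 0) + 1
    customer["CustomTotalOrderValue"] = (customer.get("CustomTotalOrderValue", 0.0)
                                         + order_amount)                      # + [36:109]
    customer["CustomAvgOrderValue"] = (customer["CustomTotalOrderValue"]
                                       / customer["CustomOrderCount"])
    return customer

cust = update_customer_analytics(
    {"CustomOrderCount": 4, "CustomTotalOrderValue": 4000.0}, 1000.0, "2025-01-15")
```

Order matters: the average must be computed after both the count and the total have been updated, which is why the assignments are sequenced this way.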
Part 6: Scheduled Scenario
Periodic Batch Processing
Scheduled scenarios execute on defined intervals (hourly, daily, weekly) independent of user actions.
Timing: Background processing at scheduled times
Available Data:
All records meeting filter criteria
Cross-record analysis possible
Historical data for trending
Use Cases:
Periodic data cleanup
Batch notifications and alerts
Scheduled reports
Data consistency checks
Expired record processing
Validation Set: Items - Daily Inventory Review - Scheduled Daily (2 AM)
Rule 1: Low Inventory Alerts
Table: Item (27)
Condition:
Action - Email:
Placeholders Used in This Example:
[27:Inventory] - Item (Table 27): Inventory field - Current on-hand quantity
[27:15] - Item (Table 27): Reorder Point (Field 15) - Threshold for reordering
Condition: [27:Inventory] <= [27:15] - Checks if inventory at or below reorder point
Additional condition: [27:15] > 0 - Only includes items with reorder point configured
AND logic: Both conditions must be true via scenario
[T] - System placeholder: Today's date - In email subject line
[27:1] - Item (Table 27): No. (Field 1) - Item number in report
[27:3] - Item (Table 27): Description (Field 3) - Item description in report
[27:16] - Item (Table 27): Reorder Quantity (Field 16) - Suggested order quantity
[27:CustomLastSaleDate] - Item (Table 27): Custom Last Sale Date field - Last activity date
FOR EACH loop: Iterates through all items matching condition, generating list in email body
Static recipient: Fixed email address for inventory manager
Why Scheduled: Daily batch notification avoids real-time alerts for every inventory movement.
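The scheduled filter reduces to two conditions applied across all items. A hedged Python equivalent, with illustrative key names:

```python
# Sketch of the daily low-inventory scan: items with a configured reorder
# point whose on-hand quantity has reached or fallen below it.

def low_inventory_items(items):
    return [i for i in items
            if i["ReorderPoint"] > 0                   # [27:15] > 0
            and i["Inventory"] <= i["ReorderPoint"]]   # [27:Inventory] <= [27:15]

items = [
    {"No": "I-1", "Inventory": 5, "ReorderPoint": 10},
    {"No": "I-2", "Inventory": 50, "ReorderPoint": 10},
    {"No": "I-3", "Inventory": 0, "ReorderPoint": 0},   # no reorder point: skipped
]
alerts = low_inventory_items(items)
```

The resulting list is what the FOR EACH loop iterates over when building the email body.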
Rule 2: Expired Record Cleanup
Validation Set: Sales Quotes - Expired Quote Processing - Scheduled Daily
Condition:
Action - Assign:
Placeholders Used in This Example:
[36:1] - Sales Header (Table 36): Document Type (Field 1) - Filter for 'Quote' type
[36:20] - Sales Header (Table 36): Document Date (Field 20) - Quote date for age calculation
[T] - System placeholder: Today's date - Used in date comparison and assignment
Date arithmetic: [T] - 90 - Calculates date 90 days ago
Age check: [36:20] < [T] - 90 - Checks if quote date older than 90 days
[36:120] - Sales Header (Table 36): Status (Field 120) - Filter for 'Open' status
AND logic: All three conditions must be true (quote type, age, open status)
[36:CustomStatus] - Sales Header (Table 36): Custom Status field - Target for 'Expired' assignment
[36:CustomExpiredDate] - Sales Header (Table 36): Custom Expired Date field - Target for date stamp
Multiple Assign actions: Both assignments execute when conditions met
LIST function: Generates list of expired quote numbers for email report
Why Scheduled: Batch processing avoids checking every quote individually, processes all expired quotes together.
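The [T] - 90 date arithmetic and the two assignments can be sketched as a batch pass. This is illustrative Python, not engine code; field names follow the placeholder list above.

```python
from datetime import date, timedelta

# Hedged sketch of the expired-quote batch: compute the cutoff once,
# then apply both Assign actions to every matching quote.

def expire_old_quotes(quotes, today, max_age_days=90):
    cutoff = today - timedelta(days=max_age_days)        # [T] - 90
    expired = []
    for q in quotes:
        if (q["DocumentType"] == "Quote" and q["Status"] == "Open"
                and q["DocumentDate"] < cutoff):          # [36:20] < [T] - 90
            q["CustomStatus"] = "Expired"                 # Assign 1
            q["CustomExpiredDate"] = today                # Assign 2
            expired.append(q["No"])                       # feeds the LIST report
    return expired

quotes = [
    {"No": "Q-1", "DocumentType": "Quote", "Status": "Open",
     "DocumentDate": date(2024, 1, 1)},
    {"No": "Q-2", "DocumentType": "Quote", "Status": "Open",
     "DocumentDate": date(2025, 1, 1)},
]
expired = expire_old_quotes(quotes, today=date(2025, 1, 31))
```

Computing the cutoff once per run, rather than per quote, is the performance point: the batch touches each record exactly once.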
Part 7: Scenario Selection Guidelines
Decision Matrix
Choose OnInsert when:
Need to set defaults on new records
Validate creation authorization
Prevent invalid record creation
Initialize calculated fields
Choose OnModify when:
Track field changes with old/new values
Validate cross-field relationships
Update dependent fields
Detect significant modifications
Choose OnDelete when:
Validate deletion allowed
Check for dependencies
Archive before removal
Require deletion confirmation
Choose BeforePost when:
Validate complete document before posting
Enforce posting authorization
Calculate final values
Prevent invalid postings
Choose AfterPost when:
Send notifications after successful posting
Update analytical fields
Trigger external systems
Generate completion reports
Choose Scheduled when:
Periodic batch processing
Data cleanup and maintenance
Scheduled reports
Off-peak performance-intensive operations
Performance Considerations
High-frequency scenarios (execute often):
OnModify (every field change)
OnValidate (specific field changes)
Optimization strategies:
Narrow conditions to reduce rule execution
Minimize aggregate calculations
Use appropriate source reference filters
Consider scheduled batch processing for non-critical validations
Low-frequency scenarios (execute rarely):
OnInsert (record creation only)
OnDelete (record removal only)
BeforePost (posting operations)
AfterPost (posting operations)
Scheduled (defined intervals)
Optimization less critical for these scenarios due to infrequent execution.
Testing Checklist
When testing scenario timing:
Verify rule executes at expected moment
Confirm required data available at execution time
Test with typical user workflows
Validate performance under load
Check behavior with related record changes
Test error handling and rollback
Verify user experience (feedback timing)
Confirm no unintended execution triggers
Test scheduled scenarios run on schedule
Validate batch processing handles volume
Summary and Key Takeaways
This guide covered QUALIA validation scenarios and their timing in Microsoft Dynamics 365 Business Central:
OnInsert - Record creation: default values, initial validation, creation authorization
OnModify - Field changes: change detection, dependent field updates, modification tracking
OnDelete - Record removal: dependency validation, deletion protection, archival
BeforePost - Pre-posting: final validation, complete document review, posting authorization
AfterPost - Post-posting: notifications, analytical updates, external triggers
Scheduled - Batch processing: periodic cleanup, scheduled reports, maintenance tasks
Timing selection impact:
User experience (immediate vs. deferred feedback)
Data availability (which fields and relationships accessible)
Performance (execution frequency and overhead)
Business control (prevent vs. warn vs. post-process)
Implementation exercise: Map business requirements to scenarios:
List business validation requirements
Identify required data for each validation
Determine acceptable timing for user feedback
Assess performance impact of execution frequency
Select appropriate scenario for each rule
Test with realistic workflows
Refine timing based on user experience
Related topics:
Blog 017: Multi-Condition Validation (condition logic across scenarios)
Blog 024: Aggregate Calculations (performance in different scenarios)
Blog 026: Assign Actions (field updates across scenarios)
Blog 032: Testing and Debugging (scenario-specific testing)
This blog is part of the QUALIA Rule Engine series for Microsoft Dynamics 365 Business Central. Follow along as we explore business rule automation patterns.