Data Migration Strategy & Execution
Part 4 of 11 in the Business Central Implementation Series
Published: December 2025 | Reading Time: 14 minutes
Introduction
Data migration is often the most underestimated yet critical phase of Business Central implementation. It's the bridge between your legacy systems and your new ERP platform—the process that brings your business history, relationships, and operational data into your configured Business Central environment.
While it might be tempting to view data migration as a simple "lift and shift" operation, successful migrations require careful planning, rigorous data cleansing, systematic validation, and meticulous execution. Poor data migration can cripple an otherwise excellent implementation, while a well-executed migration sets the stage for confident go-live and long-term success.
This comprehensive guide provides a strategic framework for planning and executing Business Central data migration, from initial assessment through final cutover validation.
📋 Business Central Data Migration Steps (8-Phase Process)
Assessment Phase: Analyze source data quality, volume, and structure; identify data cleansing needs
Planning Phase: Define migration scope (master data, open transactions, historical data), select tools, and establish cutover timeline
Design Phase: Create detailed data mapping (source → target fields), transformation rules, and validation criteria
Tool Selection: Choose migration method: Configuration Packages (Excel), API-based tools (Azure Data Factory, Power Automate), or custom AL extensions
Data Cleansing: Remove duplicates, standardize formats, validate master data, and fix legacy data issues before migration
Testing Phase: Execute test migrations in Sandbox environment, validate data accuracy, and reconcile to source system
Cutover Execution: Freeze legacy system, perform final data extraction, load into Production Business Central, and validate critical balances
Post-Migration Validation: Reconcile opening balances, verify customer/vendor aging, test transactions, and confirm reporting accuracy
Typical Timeline: 3-6 weeks for the data migration phase, depending on data volume and complexity.
💡 Pricing & Timeline Note
All cost estimates and timelines in this article reflect typical Business Central implementations as of January 2026.
Geographic Context: Estimates based on Western Europe and North America markets
Regional Variation: Implementation costs vary significantly by region (typically 30-60% lower in Eastern Europe, Asia-Pacific, and Latin America)
Microsoft Licensing: Verify current prices at aka.ms/BCPricing as these change periodically
Effort-Based Budgeting: Use the consulting hours estimates with your local partner's rates for accurate budgeting
These are reference estimates for planning purposes. Request detailed quotes from Microsoft Solutions Partners for your specific requirements.
Data Migration Planning and Strategy
Effective migration begins with comprehensive planning that balances completeness with practicality.
Defining Migration Scope
Not all data deserves migration—strategic decisions about what to migrate and what to archive are essential.
Master Data vs. Transactional Data:
Master Data (Always migrate):
Customers and customer contacts
Vendors and vendor contacts
Items and inventory records
Chart of accounts
Fixed assets
Employees (if using Business Central for HR/payroll)
Price lists and discounts
Bill of materials (for manufacturing)
Transactional Data (Selective migration):
Open Transactions (Must migrate):
Open customer invoices and credit memos
Open vendor invoices and credit memos
Open purchase orders
Open sales orders
Open bank transactions
Historical Transactions (Evaluate carefully):
Posted invoices and receipts
Payment history
General ledger history
Inventory transactions
Closed orders and quotes
Historical Data Decisions:
Full History Migration:
Pros: Complete business history available, comprehensive reporting, audit trail continuity
Cons: Extended migration time, larger database, more validation required, higher cost
Best for: Regulated industries, litigation concerns, extensive historical analysis needs
Selective History Migration:
Pros: Faster migration, cleaner starting point, reduced complexity
Cons: Limited historical reporting, potential gaps in audit trail
Best for: Most implementations, especially when legacy system remains accessible for reference
Opening Balance Migration:
Pros: Quickest approach, minimal data volume, clean start
Cons: No transaction-level detail, limited historical insight
Best for: Small businesses, companies with accessible legacy systems for historical queries
Recommended Approach:
Master data: Complete and current
Open transactions: All open items
Posted transactions: Last 12-24 months
G/L balances: Opening balances at go-live, with optional detailed history
Archived data: Keep legacy system in read-only mode for historical reference
Migration Timing and Cutover Strategy
Strategic timing minimizes business disruption and ensures accuracy.
Cutover Timing Considerations:
Month-End/Quarter-End Cutover:
Advantages: Clean financial period break, easier reconciliation, aligns with reporting cycles
Challenges: Often busiest time for finance team, pressure to close quickly in both systems
Mid-Period Cutover:
Advantages: Less time pressure, finance team availability, easier to handle issues
Challenges: Partial-period data in each system, more complex reconciliation
Year-End Cutover:
Advantages: Fresh start for new fiscal year, simplified reporting, clean tax year
Challenges: Extended preparation time, holiday season complications, limited year-end support availability
Recommended Timing:
Start of fiscal year or quarter (if business cycle permits)
Beginning of month (second-best option)
Avoid peak business periods (holiday seasons, fiscal year-end closing)
Allow adequate post-cutover stabilization before period-end close
Parallel Run Strategy:
Consider running both systems temporarily:
Benefits:
Confidence building through comparison
Safety net for mission-critical processes
Gradual transition reduces user stress
Validates Business Central configuration through live data
Challenges:
Double data entry workload
Reconciliation complexity
Extended timeline
Potential user confusion
When to Use Parallel Runs:
High-risk implementations with complex processes
Lack of confidence in data migration quality
User anxiety about new system
Regulatory or contractual requirements
When to Skip Parallel Runs:
Simple, straightforward implementations
Strong confidence in migration quality
Adequate sandbox testing completed
Business cannot sustain dual-entry workload
Identifying Data Sources and Legacy Systems
Comprehensive inventory of data sources ensures nothing is missed.
Source System Assessment:
Primary Source Systems:
Legacy ERP or accounting system
CRM system
Inventory management system
Point of sale systems
Time tracking systems
Project management applications
Manufacturing execution systems
Supplementary Data Sources:
Excel spreadsheets and databases
Email archives
Paper records requiring manual entry
Third-party data providers
Legacy backup files
Source System Analysis:
For each source system, document:
System name and version
Data owner and administrator
Available export formats
Data extraction capabilities
Data quality level
Last updated date
Accessibility and availability
Retention after go-live
Data Source Challenges:
Common Issues:
Outdated legacy systems with limited export capability
Undocumented custom systems
Multiple disconnected spreadsheets as "system of record"
Tribal knowledge not documented anywhere
Inconsistent data across different sources
Missing data requiring reconstruction
Mitigation Strategies:
Engage IT team early for technical extraction support
Identify business experts who understand data relationships
Allocate adequate time for data discovery
Budget for data archaeology if needed
Consider professional data extraction services for complex legacy systems
Data Cleansing and Validation Principles
Clean data is the foundation of Business Central success.
Data Quality Assessment
Begin with comprehensive quality evaluation:
Data Quality Dimensions:
Completeness:
Required fields populated
No missing critical information
All related records present
Full address information
Accuracy:
Data reflects reality
No obvious errors (e.g., negative quantities where illogical)
Calculations are correct
Dates are valid
Consistency:
Same entity described identically across records
Formats are standardized
Codes follow conventions
Related data matches
Uniqueness:
No duplicate records
Unique identifiers are truly unique
Master data not replicated
Validity:
Data conforms to Business Central constraints
Values fall within acceptable ranges
Relationships are valid (e.g., item exists in category)
Codes match lookup tables
Timeliness:
Data is current
Obsolete records identified
Status reflects current reality
Data Profiling Exercise:
Analyze source data systematically:
Volume Analysis:
Record counts by entity type
Active vs. inactive records
Historical date ranges
Growth trends
Field Population Analysis:
Percentage of records with each field populated
Common patterns in missing data
Mandatory field compliance
Data Distribution Analysis:
Value frequency distributions
Outlier identification
Pattern detection
Anomaly flagging
Relationship Analysis:
Orphaned records (child without parent)
Missing relationships
Referential integrity violations
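Much of this profiling can be scripted against a staging extract. A minimal sketch with pandas, assuming the legacy data has been loaded into a DataFrame (the column names and sample values here are hypothetical):

```python
import pandas as pd

# Hypothetical extract of legacy customer records; column names are assumptions.
customers = pd.DataFrame({
    "CUST_ID":   ["C001", "C002", "C003", "C003", "C004"],
    "CUST_NAME": ["Acme Ltd", "beta corp", None, "Beta Corp", "Gamma GmbH"],
    "EMAIL":     ["sales@acme.example", None, None, "info@beta.example", None],
})

# Field population analysis: percentage of records with each field populated.
population = customers.notna().mean().mul(100).round(1)

# Uniqueness check: identifiers that should be unique but appear more than once.
dupes = customers[customers.duplicated("CUST_ID", keep=False)]

print(population.to_dict())  # e.g. {'CUST_ID': 100.0, 'CUST_NAME': 80.0, 'EMAIL': 40.0}
print(len(dupes))            # 2 rows share CUST_ID 'C003'
```

The same pattern extends to relationship analysis: a merge against the parent table with `indicator=True` surfaces orphaned child records.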
Data Cleansing Strategies
Transform problematic source data into high-quality Business Central data.
Common Cleansing Activities:
Deduplication:
Identify duplicate customer/vendor/item records
Establish master record selection criteria
Merge or eliminate duplicates
Update transactional references
Standardization:
Address formats (USPS standards, international conventions)
Phone number formats
Name capitalization and spelling
Unit of measure consistency
Code formats
Enrichment:
Add missing required fields
Complete partial records
Lookup and add tax registration numbers
Geocode addresses
Categorize uncategorized items
Validation and Correction:
Correct obvious errors
Validate against external sources
Fix calculation errors
Resolve date inconsistencies
Correct negative values where inappropriate
Obsolete Data Handling:
Flag inactive customers/vendors
Archive discontinued items
Mark closed accounts
Identify and segregate test data
Data Transformation:
Convert legacy codes to Business Central conventions
Map legacy categories to new taxonomy
Transform data types as needed
Restructure hierarchical relationships
Cleansing Workflow:
Extract source data to staging area
Profile to identify issues
Define cleansing rules and transformations
Execute automated cleansing where possible
Review exceptions and edge cases
Manual correction of remaining issues
Validate cleansed data quality
Document cleansing decisions and rules
Cleansing Tools:
Excel for simple transformations
SQL scripts for bulk operations
Data quality tools (OpenRefine, Trifacta)
Custom scripts (Python, PowerShell)
Business Central data import validation
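Several of these activities are straightforward to automate in a staging area. A small pandas sketch, assuming a hypothetical vendor extract (field names are invented, and US-style phone formatting is an assumption):

```python
import pandas as pd
import re

# Hypothetical legacy vendor extract; field names and values are assumptions.
vendors = pd.DataFrame({
    "VEND_ID":   ["V10", "V11", "V11", "V12"],
    "VEND_NAME": ["  acme supplies ", "Beta Parts", "Beta Parts", "GAMMA tools"],
    "PHONE":     ["(555) 123-4567", "555.987.6543", "555.987.6543", "5551112222"],
})

def standardize_phone(raw: str) -> str:
    """Keep digits only, then format as NNN-NNN-NNNN (US-style assumption)."""
    digits = re.sub(r"\D", "", raw)
    return f"{digits[0:3]}-{digits[3:6]}-{digits[6:10]}"

cleaned = vendors.copy()
cleaned["VEND_NAME"] = cleaned["VEND_NAME"].str.strip().str.title()
cleaned["PHONE"] = cleaned["PHONE"].map(standardize_phone)

# Deduplicate on the unique identifier, keeping the first occurrence.
cleaned = cleaned.drop_duplicates(subset="VEND_ID", keep="first")

print(cleaned["VEND_NAME"].tolist())  # ['Acme Supplies', 'Beta Parts', 'Gamma Tools']
```

In practice the master-record selection criteria for deduplication are rarely as simple as "keep first"; the point is that every rule you apply should live in a script so it is repeatable across test migrations.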
Data Mapping Methodology
Precisely define how source data maps to Business Central structures.
Data Mapping Documentation:
Create comprehensive mapping specifications:
Mapping Document Structure:
For each Business Central entity:
Entity Information:
Business Central table name
Import method (RapidStart, Excel, API, custom)
Load sequence order
Dependencies
Field Mappings:
| Business Central Field | Source System | Source Field | Transformation Rule | Validation Rule | Default Value | Required |
|---|---|---|---|---|---|---|
| No. | Legacy ERP | CUST_ID | Format: remove dashes | Must be unique | Auto-number | Yes |
| Name | Legacy ERP | CUST_NAME | Title case | Max 100 chars | - | Yes |
| Address | Legacy ERP | CUST_ADDR1 | Standardize format | - | - | No |
Transformation Rules: Document each transformation rule alongside its field mapping (format conversions, code translations, and default values).
Lookup Tables:
Create mapping reference tables:
Example: Payment Terms Mapping
| Legacy Code | Legacy Description | BC Code | BC Description |
|---|---|---|---|
| NET30 | Net 30 Days | 30 DAYS | Payment within 30 days |
| 2-10N30 | 2/10 Net 30 | 2%10/NET30 | 2% discount if paid within 10 days, otherwise net 30 |
| COD | Cash on Delivery | CASH | Payment on delivery |
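In a migration script, a lookup table like the one above typically becomes a simple dictionary applied during transformation. A sketch using the codes from the table (failing fast on unmapped codes is a design choice here, so gaps surface during test migrations rather than at cutover):

```python
# Payment terms lookup taken from the mapping table above.
PAYMENT_TERMS_MAP = {
    "NET30":   "30 DAYS",
    "2-10N30": "2%10/NET30",
    "COD":     "CASH",
}

def map_payment_terms(legacy_code: str) -> str:
    """Translate a legacy code; raise on unmapped codes so gaps surface early."""
    try:
        return PAYMENT_TERMS_MAP[legacy_code]
    except KeyError:
        raise ValueError(f"Unmapped payment terms code: {legacy_code!r}")

print(map_payment_terms("NET30"))  # 30 DAYS
```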
Mapping Validation:
Test mapping specifications thoroughly:
Sample data conversion
Edge case testing
Null value handling
Data type compatibility
Length and format constraints
Business rule validation
Master Data Migration
Establish the foundational data entities that enable transactions.
Customer and Vendor Migration
Customer Data Migration:
Customer Master Fields:
Critical Fields:
Customer number (unique identifier)
Name and search name
Address information (Bill-to, Ship-to)
Contact information (phone, email, website)
Customer posting group
Gen. business posting group
VAT business posting group
Payment terms code
Currency code
Credit limit
Blocked status
Important Optional Fields:
Salesperson code
Customer price group
Customer discount group
Shipping agent
Location code (default)
Dimensions (default values)
Payment method code
Language code
Customer Contacts:
Contact name and title
Direct phone/email
Role/responsibility
Primary contact designation
Ship-to Addresses:
Additional delivery locations
Address details
Contact information
Location-specific settings
Customer Best Practices:
Deduplicate ruthlessly—one customer, one record
Standardize naming conventions
Validate addresses (use address validation services)
Complete tax registration numbers for B2B customers
Set reasonable credit limits
Block inactive customers appropriately
Establish clear customer numbering convention
Vendor Data Migration:
Vendor Master Fields:
Similar structure to customers:
Vendor number
Name and search name
Address and contact information
Vendor posting group
Payment terms and methods
Currency code
Default dimensions
Tax information
Vendor-Specific Considerations:
1099 reporting flags (US)
Preferred payment method
Remittance email addresses
Purchase from vendor restrictions
Vendor evaluation ratings
Insurance and license expiration dates
Item and Inventory Migration
Item Master Data:
Essential Item Fields:
Item number (unique identifier)
Description and description 2
Base unit of measure
Item category code
Type (Inventory, Service, Non-Inventory)
Costing method (FIFO, LIFO, Average, Standard, Specific)
Costing Method Selection Guidance:
Your choice of costing method has significant financial and operational implications:
FIFO (First-In, First-Out):
Assumes oldest inventory sold first
Best for: Perishable goods, GAAP compliance in most countries, price trend visibility
Impact: Lower COGS (and higher reported profit) in inflationary periods, because older, cheaper inventory is consumed first
Note: Required or preferred in many jurisdictions
Average Cost:
Recalculates average cost per unit after each purchase
Best for: Commodities, large volumes of similar items, stable pricing environments
Impact: Smooths cost fluctuations, simpler than FIFO for high-volume scenarios
Consideration: Business Central calculates average automatically
Standard Cost:
Uses predefined cost regardless of actual purchase price
Best for: Manufacturing environments, budget-based costing, variance analysis
Impact: Requires regular standard cost updates, generates variance entries
Use case: Manufacturing with detailed cost accounting needs
LIFO (Last-In, First-Out):
Assumes newest inventory sold first
Important: LIFO is not permitted under IFRS, making it unavailable for statutory reporting in many jurisdictions (including the EU, Australia, and others)
Business Central supports LIFO primarily for U.S. customers where tax benefits exist
Impact: Lower taxable income in inflationary periods (higher recent costs matched to revenue)
Declining use: Many U.S. companies are moving away from LIFO due to its complexity
Specific Cost:
Tracks cost of each individual unit (often with lot/serial tracking)
Best for: High-value items, unique items, lot-specific traceability needs
Impact: Most accurate but administratively intensive
Migration Consideration: Changing costing methods during migration is complex. Evaluate early in requirements gathering phase and align with accounting team.
Unit cost and unit price
Gen. product posting group
Inventory posting group
VAT product posting group
Replenishment system
Vendor number (primary vendor)
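To make the financial difference between the costing methods above concrete, here is a toy comparison of FIFO and average cost for a single item in a rising-price environment (quantities and costs are invented):

```python
# Toy comparison of FIFO vs. average cost for one item (illustrative only).
# Two purchase layers, then a sale of 15 units while prices are rising.
purchases = [(10, 5.00), (10, 6.00)]   # (quantity, unit cost), oldest first
sold_qty = 15

# FIFO: consume the oldest layers first.
remaining, fifo_cogs = sold_qty, 0.0
for qty, cost in purchases:
    take = min(qty, remaining)
    fifo_cogs += take * cost
    remaining -= take

# Average cost: one blended unit cost across all layers.
total_qty = sum(q for q, _ in purchases)
avg_unit = sum(q * c for q, c in purchases) / total_qty
avg_cogs = sold_qty * avg_unit

print(fifo_cogs)  # 80.0  (10 * 5.00 + 5 * 6.00)
print(avg_cogs)   # 82.5  (15 * 5.50)
```

The gap widens with price volatility, which is why the choice deserves early alignment with the accounting team rather than a default during data load.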
Inventory-Specific Fields:
Reorder point
Reorder quantity
Lead time calculation
Safety stock quantity
Lot size
Item tracking code (for lot/serial tracking)
Manufacturing Fields (if applicable):
Production BOM number
Routing number
Manufacturing policy
Rescheduling policy
Order tracking policy
Item Attributes and Categorization:
Assign to appropriate item categories
Define item attributes (color, size, etc.)
Create item variants where needed
Set up unit of measure conversions
Inventory Quantities:
Opening Balance Approach:
Option 1: Item Journal Entry
Single journal entry per item per location
Quantity and value
Posted at cutover
Simplest approach
Option 2: Detailed Transaction History
Import historical inventory transactions
Builds audit trail
Allows lot/serial number history
More complex but comprehensive
Inventory Validation:
Reconcile quantities to physical counts
Verify valuation calculations
Validate lot/serial number assignments
Cross-check with financial G/L balances
Item Master Best Practices:
Standardize item descriptions and naming
Establish clear item numbering scheme
Eliminate obsolete and duplicate items
Complete categorization before migration
Validate units of measure carefully
Determine costing method strategically
Consider barcode requirements
Fixed Assets Migration
Fixed Asset Master Data:
Critical Fields:
Fixed asset number
Description and description 2
FA class code and FA subclass code
FA location code
Responsible employee
Acquisition date and cost
Depreciation book code
Depreciation method
Depreciation starting date
Number of depreciation years
Straight-line percentage
Ending book value
Salvage value
Depreciation Books:
Book-specific depreciation methods
Tax vs. financial reporting books
Integration to G/L flags
Depreciation calculation parameters
Fixed Asset Best Practices:
Verify acquisition dates and costs
Reconcile accumulated depreciation
Validate remaining useful life
Set up multiple depreciation books if needed
Assign FA locations and responsible persons
Establish FA numbering and categorization
Plan for ongoing depreciation calculation
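As a sanity check during fixed asset migration, the expected accumulated depreciation under the straight-line method can be recomputed and reconciled against the legacy FA ledger. A minimal sketch with hypothetical values:

```python
# Straight-line depreciation sketch: recompute expected accumulated
# depreciation for reconciliation. All values here are hypothetical.
acquisition_cost = 50_000.00
salvage_value = 2_000.00
useful_life_years = 8
years_in_service = 3   # elapsed before go-live

annual_depreciation = (acquisition_cost - salvage_value) / useful_life_years
expected_accumulated = annual_depreciation * years_in_service

print(annual_depreciation)   # 6000.0 per year
print(expected_accumulated)  # 18000.0 to reconcile against the legacy FA ledger
```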
Chart of Accounts Migration
G/L Account Migration:
Typically configuration rather than data migration, but may include:
Opening balances at go-live
Historical balance details (for trending)
Budget data
Opening Balances:
Timing: Last closed period before go-live
Approach:
Extract trial balance from legacy system
Map legacy accounts to Business Central chart of accounts
Create opening balance journal
Include dimensions if tracking historical dimensional data
Balance to ensure debits equal credits
Post in Business Central at cutover
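The balancing step in the approach above is easy to automate before posting. A sketch with hypothetical account numbers, amounts, and mapping:

```python
# Sketch: map a legacy trial balance to BC accounts and confirm the opening
# journal nets to zero. Account numbers and the mapping are hypothetical.
legacy_trial_balance = [
    ("1200", 125_000.00),   # A/R (debit, positive)
    ("2100", -80_000.00),   # A/P (credit, negative)
    ("3000", -45_000.00),   # Equity
]
account_map = {"1200": "10400", "2100": "20100", "3000": "30100"}

journal_lines = [
    {"bc_account": account_map[acct], "amount": amt}
    for acct, amt in legacy_trial_balance
]

total = sum(line["amount"] for line in journal_lines)
# A valid opening journal must net to zero (debits equal credits).
assert abs(total) < 0.01, f"Opening journal out of balance by {total:.2f}"
print(len(journal_lines), "lines, in balance")
```

Using a dictionary for the account map also guarantees that any legacy account missing from the mapping fails loudly (with a KeyError) instead of being silently dropped.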
Historical G/L Detail:
If migrating detailed G/L history:
Maintain transaction dates from source system
Preserve source document numbers
Include all dimension values
Validate period balances match trial balances
Consider summary entries for older periods
Using RapidStart Services and Configuration Packages
Business Central's built-in migration tools streamline data import.
RapidStart Services Overview:
RapidStart provides structured data import capabilities:
Template-based data definition
Excel-based data preparation
Validation before import
Batch import processing
Error handling and correction
Modern API-Based Migration Approaches:
While RapidStart/Configuration Packages remain valid for simple migrations, consider modern API-based tools for larger or more complex scenarios:
Azure Data Factory (ADF):
Cloud-based ETL (Extract, Transform, Load) service
Pre-built connectors for Business Central APIs
Handles large data volumes efficiently
Supports complex transformations
Schedule-based or event-driven migrations
Built-in monitoring and error handling
Best for: Large-scale migrations (100K+ records), complex legacy systems, ongoing synchronization needs
Power Automate (for simpler scenarios):
Low-code integration platform
Business Central connector built-in
Good for incremental migrations
Visual workflow designer
Best for: Smaller datasets, incremental data loads, non-technical users
Custom AL Extensions (for specialized needs):
AL codeunit-based migration logic
Full control over validation and transformation
Integration with BC business logic
Best for: Complex business rules, unique data structures, heavy validation requirements
Tool Selection Guidance:
Simple migrations (< 10K records, straightforward mapping): Configuration Packages / RapidStart
Medium migrations (10K-100K records, some complexity): Power Automate or Azure Data Factory
Complex migrations (100K+ records, complex transformations, legacy systems): Azure Data Factory + Custom AL
Ongoing synchronization needs: Azure Data Factory or Power Automate with scheduled flows
Configuration Package Process:
1. Create Configuration Package:
Define package code and description
Select tables to include
Choose fields for each table
Set processing order
Define validation rules
2. Export Package to Excel:
Generate Excel template
Pre-populated with validation rules
Includes field descriptions
Structured for easy data entry
3. Populate Excel Template:
Copy/paste data from source systems
Apply transformations
Validate against rules
Review for completeness
4. Import Package:
Import Excel file back to Business Central
System validates data
Review and correct errors
Apply package to import data
5. Validate Imported Data:
Review import log
Verify record counts
Spot-check sample records
Confirm relationships
Configuration Package Best Practices:
Start with small pilot packages
Test import in sandbox environment
Use field mapping to handle differences
Leverage data templates for consistency
Import in correct sequence (masters before transactions)
Keep original Excel files for documentation
Package Sequencing:
Proper import order prevents dependency errors:
General Setup (currencies, payment terms, shipping agents)
Posting Groups
Customers and Vendors
Items
Chart of Accounts
Open Documents
Opening Balances
Excel Templates and Data Import Tools
Alternative import methods for various scenarios.
Excel-Based Import Methods:
Configuration Packages (covered above): Best for structured, repeatable imports
Edit in Excel:
Export BC data to Excel for bulk editing (via Office Add-in)
Make changes in Excel, then publish back to Business Central
Suitable for small-to-medium updates (hundreds of records)
Available for supported pages (lists with Excel export enabled)
Not true real-time sync—requires explicit publish action
CSV Import via Data Migration:
Standard import for specific entities
Customer and vendor CSV templates
Item CSV templates
Bank statement imports
API-Based Import:
Programmatic data import
For large volumes or complex logic
Enables automated migration scripts
Requires development skills
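For API-based imports, the core of the work is translating legacy fields into the payload shape the standard Business Central customers API expects. A hedged sketch (the tenant, environment, and legacy field names are placeholders; verify entity field names against the current Business Central API reference):

```python
# Sketch of an API-based load: map a legacy record to a payload for the
# Business Central v2.0 customers API. The endpoint pattern is the standard
# one; tenant/environment/company values and legacy field names are placeholders.
BASE_URL = ("https://api.businesscentral.dynamics.com/v2.0/"
            "{tenant}/{environment}/api/v2.0/companies({company_id})/customers")

def build_customer_payload(legacy: dict) -> dict:
    """Translate hypothetical legacy field names into customers API fields."""
    return {
        "number": legacy["CUST_ID"].replace("-", ""),  # transformation: remove dashes
        "displayName": legacy["CUST_NAME"].strip().title(),
        "addressLine1": legacy.get("CUST_ADDR1", ""),
        "email": legacy.get("EMAIL", ""),
    }

payload = build_customer_payload(
    {"CUST_ID": "C-1001", "CUST_NAME": "acme industries", "EMAIL": "ap@acme.example"}
)
print(payload["number"], payload["displayName"])  # C1001 Acme Industries
# In a real run, the payload would be POSTed to BASE_URL with an OAuth bearer token.
```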
Tool Selection Criteria:
| Criteria | RapidStart | Edit in Excel | CSV Import | API |
|---|---|---|---|---|
| Data Volume | High | Low | Medium | High |
| Complexity | High | Low | Medium | High |
| Automation | Manual | Manual | Semi-Auto | Automated |
| Skill Required | Low | Low | Low | High |
| Flexibility | Medium | Low | Low | High |
Data Migration Testing Phases
Rigorous testing ensures migration success before cutover.
Testing Approach:
Phase 1: Unit Testing
Test individual entities in isolation:
Single customer import
Single vendor import
Small item set
Sample transactions
Validation:
Data format correctness
Field mapping accuracy
Default value application
Validation rule compliance
Phase 2: Integration Testing
Test related entities together:
Customers with ship-to addresses and contacts
Items with inventory quantities
Transactions with related master data
G/L accounts with opening balances and dimensions
Validation:
Relationship integrity
Reference validation
Calculation correctness
Cross-entity consistency
Phase 3: Volume Testing
Test with production-scale data volumes:
Full customer/vendor databases
Complete item catalog
All historical transactions (if migrating)
Full G/L detail
Validation:
Performance acceptability
System resource utilization
Import duration
Data integrity at scale
Phase 4: User Acceptance Testing
Business users validate migrated data:
Sample customer records reviewed
Item information verified
Open orders examined
Financial balances confirmed
Validation:
Business user confidence
Data usability
Completeness assessment
Corrections identified
Testing Best Practices:
Test in sandbox environment only
Use production-like data volumes
Document and fix errors systematically
Retest after corrections
Obtain formal sign-off before cutover
Keep detailed test logs
Cutover Planning and Execution
The final migration to production requires meticulous planning.
Cutover Plan Components:
Data Privacy and GDPR Compliance Considerations:
Before migrating personal data into Business Central, address privacy and regulatory requirements:
GDPR and Data Protection:
Data Minimization: Migrate only personal data necessary for business operations; archive or delete obsolete customer/employee records
Consent Validation: Ensure you have legal basis to migrate personal data (contractual necessity, legitimate interest, consent)
Right to Erasure: Implement processes to handle "right to be forgotten" requests post-migration
Data Retention Policies: Establish and document retention policies; don't migrate data beyond legal retention requirements
Business Central Privacy Features:
Classified Fields: Use Business Central's data classification features to tag personal/sensitive fields
Data Subject Requests: Utilize BC's built-in tools for handling data subject access requests (DSAR)
Audit Trail: Enable change logs for personal data modifications
Data Encryption: Microsoft encrypts BC data at rest (Azure SQL encryption) and in transit (TLS)
Other Regulatory Considerations:
SOX Compliance: If publicly traded (U.S.), ensure migration audit trails support SOX controls
Industry-Specific: Healthcare (HIPAA), Financial Services (PCI-DSS), others may have specialized data handling requirements
Data Residency: Microsoft Azure regions for BC data; ensure compliance with local data residency laws
Migration Checklist for Privacy:
Identify all personal data fields being migrated (names, emails, phone numbers, addresses, etc.)
Validate legal basis for migration under GDPR or applicable regulations
Anonymize or delete personal data in migrated historical transactions unless retention is legally required
Document data processing activities in GDPR-required Records of Processing Activities (ROPA)
Train migration team on data privacy requirements
Engage legal/compliance team review before cutover
Pre-Cutover Activities (Days/Weeks Before):
Final data cleansing
Final mapping validation
Sandbox migration rehearsal
Cutover runbook finalization
Go/no-go criteria establishment
Communication to stakeholders
Backup verification
Resource allocation confirmation
Cutover Weekend/Period Activities:
Hour 0-2: System Freeze and Final Extraction
Freeze legacy system for transactions
Extract final production data
Perform final data quality checks
Stage data for import
Hour 2-6: Data Import
Execute master data import
Import open transactions
Post opening balances
Validate import logs
Correct any errors
Hour 6-10: Validation and Reconciliation
Verify record counts
Validate key balances
Reconcile to legacy system
Test sample transactions
Confirm user access
Hour 10-12: Go-Live Preparation
Final smoke tests
User notification
Enable production access
Begin monitoring
Post-Cutover Activities:
Intensive user support (hypercare)
Transaction monitoring
Issue tracking and resolution
Daily reconciliation
Legacy system archival (after stabilization)
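The daily reconciliation mentioned above can start as a simple per-entity count comparison between the final legacy extract and what Business Central reports. A sketch with invented counts:

```python
# Daily reconciliation sketch: compare record counts from the legacy extract
# with counts read back from Business Central. All counts are hypothetical.
legacy_counts = {"customers": 1842, "vendors": 613, "items": 9210, "open_sales_orders": 77}
bc_counts     = {"customers": 1842, "vendors": 613, "items": 9208, "open_sales_orders": 77}

discrepancies = {
    entity: (legacy_counts[entity], bc_counts.get(entity, 0))
    for entity in legacy_counts
    if legacy_counts[entity] != bc_counts.get(entity, 0)
}
print(discrepancies)  # {'items': (9210, 9208)} -> two items to investigate
```

Counts alone will not catch value differences, so pair this with balance reconciliation (A/R, A/P, inventory value, trial balance) during hypercare.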
Cutover Runbook:
Detailed, step-by-step instructions including:
Each activity with estimated duration
Responsible person
Prerequisites
Specific commands or procedures
Validation checkpoints
Rollback procedures (if needed)
Communication protocols
Escalation contacts
Example Runbook Entry:
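A hypothetical entry, following the structure above (the step number, duration, owner, and checkpoint are illustrative only):

| Step | Activity | Est. Duration | Responsible | Prerequisites | Validation Checkpoint |
|---|---|---|---|---|---|
| 4.2 | Import open customer invoices via configuration package | 45 min | Migration lead | Customer master loaded (step 3.1) | Open A/R total matches final legacy extract to the cent |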
Handling Opening Balances
Opening balances establish your starting point in Business Central.
Financial Opening Balances:
Trial Balance Import:
Extract final trial balance from legacy system
Map to Business Central chart of accounts
Include all dimension values
Create general journal entry
Balance debits and credits
Post at go-live date
Customer Opening Balances:
Two approaches:
Approach 1: Detail Level
Import each open invoice
Maintains full transaction history
Enables aging reports
Preserves due dates and terms
Approach 2: Summary Level
Single balance per customer
Faster import
Less detailed history
Suitable when detail isn't required
Vendor Opening Balances:
Similar approach options:
Detailed: Import each open bill
Summary: Single balance per vendor
Inventory Opening Balances:
Quantity on hand by item and location
Inventory value
Lot/serial number details (if tracking)
Posted via item journal
Opening Balance Best Practices:
Balance to legacy system trial balance
Verify customer and vendor balances with statements
Perform physical inventory count for quantities
Document opening balance date clearly
Retain backup of legacy balances for reference
Reconcile to tax returns and statutory filings
Data Archival Strategies for Legacy Data
Plan for long-term access to historical data not migrated.
Legacy System Retention:
Read-Only Access:
Maintain legacy system in read-only mode
Enable queries for historical research
Keep available for audits
Define retention period
Data Export and Archival:
Export complete database to neutral format
Store in accessible location
Document data structure and relationships
Include user guide for data access
Reporting Archive:
Generate and save key historical reports
Export to PDF for long-term retention
Index for easy retrieval
Include financial statements, tax returns, audit reports
Archival Timeline:
Immediate (Go-Live):
Legacy system in read-only mode
Full backup created
Short-Term (3-6 months):
Verify Business Central stability
Confirm no critical data gaps
Keep legacy system easily accessible
Medium-Term (6-12 months):
Archive to offline storage
Decommission legacy system
Establish data request process
Long-Term (1+ years):
Archived data on secure storage
Annual access verification
Compliance with retention policies
Archival Best Practices:
Comply with legal and regulatory retention requirements
Maintain data readability (avoid obsolete formats)
Document archive contents and access procedures
Test data restoration periodically
Secure archived data appropriately
Frequently Asked Questions (FAQ)
How do you migrate data to Business Central?
Business Central data migration follows a structured 8-step methodology:
Step 1: Data Assessment
Inventory all source systems (ERP, CRM, spreadsheets, databases)
Analyze data volume: Record counts for customers, vendors, items, transactions
Evaluate data quality: Completeness, accuracy, consistency, duplicates
Identify data cleansing requirements
Document business rules and transformations
Step 2: Define Migration Scope
Decide what to migrate:
Master Data (Always):
Customers, vendors, items
Chart of accounts
Fixed assets, employees
Price lists, payment terms
Open Transactions (Required):
Open sales orders, purchase orders
Unpaid customer/vendor invoices
Open bank transactions
Historical Data (Selective):
Option A: Last 12-24 months of transactions
Option B: Opening balances only (keep legacy system for history)
Option C: Full history (rare—regulated industries only)
Step 3: Select Migration Tools
For Small-Medium Datasets (<10,000 records):
Excel Configuration Packages: Business Central's built-in tool
Download template from BC
Populate Excel with source data
Import and post in BC
Best for: Master data, opening balances
For Large Datasets (>10,000 records):
Azure Data Factory: ETL platform for complex transformations
Power Automate: Automated data flows using BC connectors
Third-Party Tools: RapidStart, Scribe, KingswaySoft
For Complex Scenarios:
Custom AL Extensions: Programmatic migration via AL code
APIs (OData/REST): Direct API calls for real-time migration
Step 4: Create Data Mapping
Document field-level transformations:
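The original mapping table is specific to each legacy system, but the idea can be sketched in code. A minimal illustration follows; every source field name (`CUST_NO`, `PHONE`, `TERMS`) and every transformation rule here is an assumption for illustration, not the author's actual mapping:

```python
# Illustrative field-level mapping from a hypothetical legacy system
# to Business Central customer fields. All field names and rules are assumptions.
FIELD_MAP = {
    "CUST_NO":   ("No.",                lambda v: v.strip().upper()),
    "CUST_NAME": ("Name",               lambda v: v.strip()[:100]),  # BC Name is 100 chars
    "PHONE":     ("Phone No.",          lambda v: "".join(c for c in v if c.isdigit() or c == "+")),
    "TERMS":     ("Payment Terms Code", lambda v: {"N30": "30 DAYS", "N60": "60 DAYS"}.get(v, v)),
}

def transform_record(source: dict) -> dict:
    """Apply the mapping and transformation rules to one source record."""
    return {target: fn(source[src]) for src, (target, fn) in FIELD_MAP.items() if src in source}

legacy = {"CUST_NO": "c0042 ", "CUST_NAME": "  Contoso Ltd ",
          "PHONE": "(555) 010-9999", "TERMS": "N30"}
print(transform_record(legacy))
```

Keeping the mapping in one declarative structure makes the transformation rules reviewable by the business team, not just the developer.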
Step 5: Data Cleansing
Clean legacy data before migration:
Duplicates: Merge duplicate customer/vendor/item records
Standardization: Consistent formats for phone numbers, addresses, names
Validation: Complete required fields (addresses, tax IDs, payment terms)
Accuracy: Verify balances, quantities, pricing
Currency: Convert multi-currency if needed
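Duplicate detection is the cleansing step that benefits most from a script. The sketch below groups customers by a normalized (name, postal code) key; the normalization rules and matching key are assumptions to adapt per dataset:

```python
# Hedged sketch: duplicate detection by normalized name + postal code.
import re
from collections import defaultdict

def normalize(name: str) -> str:
    """Lowercase, strip punctuation and common legal suffixes before matching."""
    name = re.sub(r"[^\w\s]", "", name.lower())
    return re.sub(r"\b(inc|ltd|llc|corp|co)\b", "", name).strip()

def find_duplicates(customers: list[dict]) -> list[list[dict]]:
    """Group records sharing a normalized (name, postcode) key."""
    groups = defaultdict(list)
    for c in customers:
        groups[(normalize(c["name"]), c.get("post_code", ""))].append(c)
    return [g for g in groups.values() if len(g) > 1]

records = [
    {"name": "Contoso, Inc.", "post_code": "98052"},
    {"name": "Contoso Inc",   "post_code": "98052"},
    {"name": "Fabrikam Ltd",  "post_code": "10115"},
]
print(find_duplicates(records))  # one group: the two Contoso records
```

Candidate groups still need human review before merging; automated matching only surfaces likely duplicates.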
Step 6: Test Migration (Iterative)
Test 1: Unit testing with sample data (50-100 records)
Test 2: Volume testing with full dataset in Sandbox
Test 3: Integration testing (create transactions, post to G/L)
Test 4: User Acceptance Testing and sign-off
Step 7: Cutover Execution (Go-Live Weekend)
Freeze legacy system (read-only or full freeze)
Extract final data from source systems
Load data into Production Business Central
Run validation reports
Reconcile to legacy system
Go/No-Go decision
Step 8: Post-Migration Validation
Reconcile opening balances (A/R, A/P, inventory, G/L)
Verify customer/vendor aging
Test transaction posting
Daily reconciliation for first week
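Opening-balance reconciliation is mechanical enough to script. A minimal sketch, assuming balances have already been exported from both systems as account-to-amount maps; the account names and the 0.01 rounding tolerance are assumptions:

```python
# Sketch: reconcile opening balances between a legacy extract and Business Central.
def reconcile(legacy: dict[str, float], bc: dict[str, float], tol: float = 0.01) -> list[str]:
    """Return human-readable discrepancies above the rounding tolerance."""
    issues = []
    for account in sorted(set(legacy) | set(bc)):
        diff = bc.get(account, 0.0) - legacy.get(account, 0.0)
        if abs(diff) > tol:
            issues.append(f"{account}: legacy {legacy.get(account, 0.0):.2f}, "
                          f"BC {bc.get(account, 0.0):.2f}, diff {diff:+.2f}")
    return issues

legacy_balances = {"A/R": 125_430.50, "A/P": -48_210.00, "Inventory": 310_775.25}
bc_balances     = {"A/R": 125_430.50, "A/P": -48_110.00, "Inventory": 310_775.25}
for line in reconcile(legacy_balances, bc_balances):
    print(line)  # flags the A/P difference of +100.00
```

Running the same script daily during the first week gives the repeatable check the list above calls for.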
What tools are used for Business Central data migration?
Business Central supports multiple migration tool options:
1. Excel Configuration Packages (Built-In, FREE)
Best For: Small-medium datasets (<10,000 records per entity)
Advantages:
✅ No additional cost (included in BC)
✅ Easy to use (Excel-based)
✅ Good for master data and opening balances
✅ Built-in validation
Limitations:
❌ Slow for large datasets
❌ Limited transformation capabilities
❌ Manual process (not automated)
2. Azure Data Factory (Microsoft ETL Platform)
Best For: Large datasets, complex transformations, multi-source migration
Advantages:
✅ Handles millions of records
✅ Complex transformations (joins, aggregations, lookups)
✅ Automated and scheduled
✅ Incremental loads
✅ Logging and monitoring
Limitations:
❌ Requires Azure subscription (cost: ~$1-2/hour)
❌ Technical complexity
❌ Setup time (2-4 weeks)
3. Power Automate (Low-Code Automation)
Best For: Moderate datasets, ongoing integrations, API-based migration
Advantages:
✅ Low-code (no programming required)
✅ Pre-built BC connectors
✅ Good for incremental migrations
Limitations:
❌ Flow execution limits (5,000 actions/day)
❌ Slower than ADF for bulk loads
4. Custom AL Extensions (Code-Based)
Best For: Highly customized data, complex business logic
Advantages:
✅ Full control over migration logic
✅ Complex validations and transformations
✅ Reusable for future migrations
Limitations:
❌ Development cost ($20K-$50K)
❌ Requires AL developer expertise
❌ Maintenance overhead
Tool Selection Guide:
| Data Volume | Complexity | Recommended Tool |
|---|---|---|
| <5,000 records | Simple | Excel Configuration Packages |
| 5K-50K records | Moderate | Power Automate or RapidStart |
| >50K records | Complex | Azure Data Factory |
| Unique legacy system | High | Custom AL Extension |
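The selection guide above can be expressed as a small decision function. The thresholds mirror the table and should be treated as rules of thumb, not hard limits:

```python
# Sketch of the tool-selection guide as code; thresholds follow the table above.
def recommend_tool(record_count: int, complexity: str) -> str:
    """Recommend a migration tool from approximate volume and complexity."""
    if complexity == "high":
        return "Custom AL Extension"
    if record_count > 50_000 or complexity == "complex":
        return "Azure Data Factory"
    if record_count >= 5_000:
        return "Power Automate or RapidStart"
    return "Excel Configuration Packages"

print(recommend_tool(3_000, "simple"))
print(recommend_tool(20_000, "moderate"))
print(recommend_tool(250_000, "complex"))
```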
How long does Business Central data migration take?
Data migration timelines vary based on data volume, quality, and complexity:
Typical Timeline (representative six-week plan):
Week 1: Assessment & Planning
Data inventory and volume analysis
Migration scope definition
Tool selection
Project plan creation
Week 2: Mapping & Preparation
Field-level mapping
Transformation rules documentation
Data cleansing in source systems
Migration tool setup
Weeks 3-4: Testing & Refinement
Test 1: Sample data migration
Fix mapping errors
Test 2: Full volume migration to Sandbox
Test 3: Integration testing
User acceptance testing
Week 5: Cutover Preparation
Cutover runbook finalization
Rollback plan development
Final data cleansing
Dress rehearsal
Week 6: Cutover & Validation
Friday PM: Freeze legacy system
Saturday: Execute production migration
Sunday: Validation and reconciliation
Monday: Go-Live
Timeline by Organization Size:
Small (1 location, <10K records): 3-4 weeks
Medium (2-5 locations, 10K-100K records): 5-8 weeks
Large (multi-location, >100K records): 10-16 weeks
Factors That Extend Timeline:
🔴 Poor source data quality (adds 2-4 weeks)
🔴 Multiple legacy systems (adds 1-2 weeks each)
🔴 Complex transformations (adds 2-3 weeks)
🔴 Full historical data migration (adds 3-6 weeks)
Factors That Accelerate:
🟢 Clean, well-structured source data
🟢 Single source system
🟢 Opening balances only (no history)
🟢 Experienced migration team
What is GDPR compliance for Business Central data migration?
GDPR (General Data Protection Regulation) requires specific data privacy protections:
GDPR Principles Affecting Migration:
1. Data Minimization
Only migrate necessary business data
Don't migrate dormant records (inactive >5 years)
Archive rather than migrate old employee records
2. Right to Be Forgotten
Honor deletion requests before migration
Exclude deleted individuals from migration
Implement BC's "GDPR Deletion" functionality
3. Cross-Border Data Transfer
EU personal data must stay within the EU unless an approved transfer safeguard applies
Business Central Online: Choose EU data residency in Admin Center
Migration Tools: Ensure GDPR-compliant processing (e.g., EU-hosted ADF runtime)
4. Data Encryption
Use HTTPS/TLS for API-based migrations
Encrypt data files during transfer
Business Central Online: Automatic encryption at rest
5. Access Controls
Limit migration team access to minimum necessary
Use BC permission sets (don't grant SUPER to everyone)
Remove temporary migration accounts post-go-live
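The data-minimization and right-to-be-forgotten principles can be applied as a filter over the extract before anything is loaded. A hedged sketch follows; the five-year dormancy cutoff mirrors the guideline above, and the field names (`id`, `last_activity`) are assumptions:

```python
# Hedged sketch: apply data minimization and deletion requests before export.
from datetime import date, timedelta

def filter_for_migration(contacts: list[dict], deletion_requests: set[str],
                         today: date, dormant_years: int = 5) -> list[dict]:
    """Exclude deletion-requested individuals and dormant records before migration."""
    cutoff = today - timedelta(days=dormant_years * 365)
    return [c for c in contacts
            if c["id"] not in deletion_requests and c["last_activity"] >= cutoff]

contacts = [
    {"id": "C001", "last_activity": date(2025, 3, 1)},   # active -> migrate
    {"id": "C002", "last_activity": date(2017, 6, 15)},  # dormant -> archive instead
    {"id": "C003", "last_activity": date(2025, 1, 10)},  # deletion requested -> exclude
]
kept = filter_for_migration(contacts, deletion_requests={"C003"},
                            today=date(2025, 12, 1))
print([c["id"] for c in kept])  # ['C001']
```

Records excluded here should be archived or securely destroyed per the retention policy, not silently dropped.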
GDPR Migration Checklist:
Pre-Migration:
Data Protection Impact Assessment completed
Lawful basis documented for each data category
Deletion requests honored
Data minimization applied
EU data residency configured
During Migration:
Encrypted transfer methods used
Access restricted to authorized team only
Migration tools GDPR-compliant
Audit logging enabled
Post-Migration:
Personal data fields verified
BC GDPR features configured
Data retention policies set
Legacy system data securely destroyed
Business Central GDPR Features:
Data Classification: Tag personal data fields (Personal, Sensitive)
Customer Consent: Track marketing consent
GDPR Deletion: Delete customer/contact data on request
Retention Policies: Automatic deletion of old records
Audit Trails: Track access/modifications to personal data
Penalties: Up to €20 million or 4% of global annual revenue, whichever is higher
Deliverables: Data Migration Phase Outputs
Complete this phase with comprehensive data successfully migrated:
1. Data Migration Plan
Comprehensive plan including:
Migration scope and approach
Source system inventory
Cutover strategy and timing
Resource requirements
Risk mitigation strategies
2. Data Mapping Worksheets
Detailed field-level mapping:
Source to target mapping
Transformation rules
Validation rules
Lookup tables
3. Validation Checklist
Testing and validation criteria:
Unit test cases
Integration test cases
Volume test results
UAT sign-off
4. Cutover Runbook
Step-by-step execution guide:
Pre-cutover activities
Cutover procedures
Validation checkpoints
Rollback procedures
Conclusion: From Legacy to Business Central
Data migration is the bridge that carries your business history into your Business Central future. While challenging, a methodical approach centered on data quality, comprehensive testing, and careful execution delivers clean, reliable data that empowers confident business operations in your new system.
Key Takeaways:
✓ Plan Strategically: Balance migration completeness with practicality—not all data deserves migration
✓ Cleanse Thoroughly: Invest in data quality—garbage in becomes garbage out
✓ Map Precisely: Document every transformation clearly
✓ Test Rigorously: Multiple testing phases catch issues before production impact
✓ Execute Methodically: Follow your cutover runbook precisely
✓ Validate Extensively: Reconcile everything to legacy system
With clean, accurate data successfully migrated into your configured Business Central environment, you're positioned for the next phase: Customization, Extensions & Integration, where you'll extend Business Central capabilities to meet unique business requirements.
Next in Series: Blog 5: Customization, Extensions & Integration - Learn how to extend Business Central through custom development, AppSource extensions, and integrations with other systems.
Questions or Comments? Share your data migration experiences and lessons learned in the comments below.
This is Part 4 of an 11-part series on Business Central Implementation. Subscribe to receive notifications when new articles are published.
Tags: #BusinessCentral #DataMigration #ERPImplementation #DataQuality #Dynamics365 #MigrationStrategy
Related Content
Planning Your Business Central Implementation
Requirements Gathering & Process Mapping
System Configuration & Setup
Data Migration Strategy & Execution
Customization, Extensions & Integration
AI & Copilot Capabilities
Training, Change Management & User Adoption
Go-Live, Hypercare & Continuous Improvement
Migrating from Legacy ERP to Business Central: A Proven Roadmap
Business Central Support & Optimization: Maximizing Your ERP Investment
Get Your FREE Dynamics 365 Demo
Transform your business operations with Microsoft Dynamics 365 Business Central
Experience the transformative power of Microsoft Dynamics 365 Business Central for yourself! Request a free demo today and see how our solutions can streamline your operations and drive growth for your business.
Our team will guide you through a personalized demonstration tailored to your specific needs.