Updated for Winter '26
Salesforce Data Architect Exam Tips (Winter '26): How to Pass
The Salesforce Data Architect exam tests your ability to design scalable, high-performance Salesforce data models. These tips focus on large data volume strategy, master data management, and the architectural thinking the exam rewards over product feature recall.
Written and reviewed by Krishna Mohan — ADM-201, PD1, PD2, App Builder & Consultant certified.
Exam At a Glance
60
Questions
105 min
Time Limit
63%
Passing Score
$200
Exam Fee
Quick Answer: What Data Architect Tests
- Data modelling decisions — Object relationships, field strategy, external IDs, data normalisation vs. denormalisation trade-offs at scale.
- Large data volume (LDV) strategy — When and how to use skinny tables, custom indexes, query optimisation, and the impact of record volume on performance.
- Master data management — Duplicate management, data governance frameworks, external system integration patterns, and data quality strategy.
Highest-Weight Exam Sections
Data Management + Data Modelling + MDM = 68%. These three sections are your highest-ROI study areas.
Scenario Strategy: How to Approach Data Architect Questions
Data Architect questions describe a data challenge (scale, quality, integration, or performance) and ask which strategy or feature solves it. The correct answer always considers long-term performance at scale — not just immediate functionality.
- For LDV questions: identify whether the bottleneck is SOQL performance, storage, or UI response time — each has a different solution (custom index, skinny table, or query optimisation).
- For data model questions: prefer normalised models in transactional systems; consider denormalisation only when query performance at millions of records is explicitly required.
- For MDM questions: external IDs are the key tool for upsert operations and cross-system record matching — know when to use a single external ID vs. multiple external IDs per object.
- For data migration questions: the Salesforce Data Loader vs. Bulk API vs. REST API choice depends on record volume and real-time vs. batch requirement — know the thresholds.
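The external-ID matching idea in the MDM bullet above can be sketched locally. This is an illustrative Python sketch, not Salesforce API code: the field name `ERP_Id__c`, the record shapes, and the sample data are all invented, and the function only mirrors what an upsert on an External ID field decides server-side (update when the key already exists, insert when it does not).

```python
def match_by_external_id(salesforce_rows, erp_rows, key="ERP_Id__c"):
    """Split incoming ERP rows into updates (external ID already known
    in Salesforce) and inserts (new key), mirroring upsert semantics."""
    existing = {row[key] for row in salesforce_rows if row.get(key)}
    updates = [r for r in erp_rows if r[key] in existing]
    inserts = [r for r in erp_rows if r[key] not in existing]
    return updates, inserts

# Invented sample data: one ERP row matches an existing Salesforce record,
# one carries a key Salesforce has never seen.
sf_rows = [{"Id": "001A", "ERP_Id__c": "E-100"},
           {"Id": "001B", "ERP_Id__c": "E-200"}]
erp_rows = [{"ERP_Id__c": "E-200", "Name": "Acme"},
            {"ERP_Id__c": "E-300", "Name": "Globex"}]
updates, inserts = match_by_external_id(sf_rows, erp_rows)
```

This is also why a stable, unique external ID per source system matters: without one, the partition above degrades into fuzzy matching on names and addresses.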
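The load-mechanism choice in the data migration bullet can likewise be reduced to a rule of thumb. The thresholds below are illustrative assumptions, not official Salesforce limits; the general guidance is Bulk API for large asynchronous batch volumes and REST for small real-time calls.

```python
def choose_load_api(record_count, realtime):
    """Pick a data-movement mechanism from volume and latency needs.
    Thresholds are illustrative, not official Salesforce limits."""
    if realtime:
        return "REST API"          # small, synchronous, record-at-a-time
    if record_count > 10_000:      # assumed cutoff for batch scale
        return "Bulk API"          # asynchronous, built for very large loads
    return "Data Loader (SOAP batches)"  # small-to-medium batch loads
```

The exam rarely asks for a number; it asks whether the scenario is batch-at-scale or real-time-per-record, which is the first branch of this function.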
Mock-Test Benchmark Before Booking
76%+ on 3 timed full mocks before booking
The 63% passing score sounds achievable, but Data Architect questions are deeply scenario-based and require applied knowledge of data at scale. Candidates who only study documentation — without LDV project experience — consistently struggle with LDV and MDM sections. Hands-on practice is essential.
3 Concepts That Fail Most Data Architect Candidates
These are not the hardest topics — they are the ones where candidates are most confidently wrong. Learn the distinction early.
1. Skinny Tables vs Custom Indexes — Different Performance Tools, Different Use Cases
Skinny tables are Salesforce-managed read-only copies of frequently queried subsets of large objects — they speed full-table reads. Custom indexes speed lookups on specific fields in WHERE clauses. Candidates request custom indexes when they need skinny tables and vice versa. Use skinny tables when reports and queries need to scan millions of records; use custom indexes when queries filter on specific non-standard fields.
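A toy sketch of the distinction, assuming in-memory Python lists stand in for database rows and the field name `Region__c` is invented: the precomputed map plays the role of a custom index (fast selective lookups on one field), while iterating every row is the full-scan workload skinny tables are built to make cheaper.

```python
# 10,000 invented rows; "Region__c" is a hypothetical filter field.
records = [{"Id": i, "Region__c": f"R{i % 50}"} for i in range(10_000)]

# Full scan: every WHERE evaluation touches every row. At Salesforce scale
# this is the report/aggregate workload skinny tables exist to cheapen.
scan_hits = [r["Id"] for r in records if r["Region__c"] == "R7"]

# Custom-index analogue: a precomputed value-to-rows map turns the same
# selective filter into a direct lookup instead of a scan.
index = {}
for r in records:
    index.setdefault(r["Region__c"], []).append(r["Id"])
indexed_hits = index["R7"]
```

Both approaches return the same rows; they differ only in how much data must be touched to find them, which is exactly the trade-off the exam scenario is probing.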
2. External Objects vs Big Objects — Real-Time vs Archive
External Objects surface data from external systems via Salesforce Connect without importing it — data stays in the source system and is queried in real time. Big Objects store massive historical volumes inside Salesforce for archival — they are write-once and have limited query capabilities (only indexed fields). Candidates recommend External Objects for large historical data — the exam expects Big Objects for archive data already in Salesforce.
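The decision rule above can be captured in a few lines. This is a minimal sketch of the reasoning, with invented function name and return strings, not a Salesforce API.

```python
def recommend_storage(data_lives_externally, archival):
    """Illustrative decision rule; names and strings are invented."""
    if data_lives_externally:
        # Source system stays authoritative; query in place via Salesforce Connect.
        return "External Object"
    if archival:
        # Write-once history kept inside Salesforce; query on indexed fields only.
        return "Big Object"
    return "Standard or custom object"
```

The first question to ask in the scenario is always where the data lives today; only after that does the archive-vs-operational question apply.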
3. Data Masking vs Field Encryption vs Classic Encryption — Three Security Layers
Classic Encryption (legacy) encrypts individual text fields using AES 128 — it is visible to users with "View Encrypted Data" permission. Shield Platform Encryption encrypts data at rest at the database level using AES 256 — transparent to users with access. Data Masking replaces sensitive values with synthetic data in sandbox environments. The exam tests which to apply for a given security scenario — database-level encryption = Shield; field-level hiding = Classic; sandbox data protection = Data Masking.
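Data Masking's "synthetic but consistent" property can be illustrated with a deterministic masking function. This is a hedged sketch, not the logic of Salesforce's actual Data Mask product: hashing keeps the output stable, so masked records still match across objects, while the real value never reaches the sandbox.

```python
import hashlib

def mask_email(value, domain="example.invalid"):
    """Replace a real address with a deterministic synthetic one. The
    hash makes the output stable per input, so masked rows still join
    correctly, but the original value is not recoverable."""
    token = hashlib.sha256(value.strip().lower().encode()).hexdigest()[:10]
    return f"user_{token}@{domain}"
```

Determinism is the design point: if Contact and Case both store the same address, both mask to the same synthetic value, so sandbox test scenarios that depend on matching keep working.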
Frequently Asked Questions
- What is the Salesforce Data Architect exam format?
The Salesforce Data Architect exam has 60 multiple-choice questions, a 105-minute time limit, a 63% passing score, and a $200 fee. It is a component exam for the Application Architect role-based credential.
- What are the highest-weight Salesforce Data Architect exam sections?
- Data Management (26%), Data Modelling (24%), and Master Data Management (18%) together account for 68% of the Data Architect exam. Large data volume strategy and data governance are the most nuanced topics and require real project experience.
- What prerequisites do I need for the Salesforce Data Architect exam?
There are no hard prerequisites, but Salesforce recommends Platform App Builder as foundational knowledge before the Data Architect exam. Real experience with Salesforce data modelling, data loading, SOQL optimisation, and large-scale implementations is strongly recommended.
- What is the hardest part of the Salesforce Data Architect exam?
- Large Data Volume (LDV) strategy is consistently the hardest section — skinny tables, custom indexes, query optimisation, and division management require hands-on experience that is difficult to learn from study materials alone. Master data management and data governance concepts are also highly scenario-based.
- What concepts do most Data Architect candidates get wrong?
- The most commonly misunderstood topics for the Data Architect exam are: (1) Skinny Tables vs Custom Indexes — Different Performance Tools, Different Use Cases; (2) External Objects vs Big Objects — Real-Time vs Archive; (3) Data Masking vs Field Encryption vs Classic Encryption — Three Security Layers. Candidates are most confidently wrong on these — learn the distinctions early to avoid losing marks on questions you expect to get right.
- Why do most Data Architect candidates fail questions about Skinny Tables vs Custom Indexes?
- Skinny tables are Salesforce-managed read-only copies of frequently queried subsets of large objects — they speed full-table reads. Custom indexes speed lookups on specific fields in WHERE clauses. Candidates request custom indexes when they need skinny tables and vice versa. Use skinny tables when reports and queries need to scan millions of records; use custom indexes when queries filter on specific non-standard fields.
- Why do most Data Architect candidates fail questions about External Objects vs Big Objects?
- External Objects surface data from external systems via Salesforce Connect without importing it — data stays in the source system and is queried in real time. Big Objects store massive historical volumes inside Salesforce for archival — they are write-once and have limited query capabilities (only indexed fields). Candidates recommend External Objects for large historical data — the exam expects Big Objects for archive data already in Salesforce.
- Why do most Data Architect candidates fail questions about Data Masking vs Field Encryption vs Classic Encryption?
- Classic Encryption (legacy) encrypts individual text fields using AES 128 — it is visible to users with "View Encrypted Data" permission. Shield Platform Encryption encrypts data at rest at the database level using AES 256 — transparent to users with access. Data Masking replaces sensitive values with synthetic data in sandbox environments. The exam tests which to apply for a given security scenario — database-level encryption = Shield; field-level hiding = Classic; sandbox data protection = Data Masking.
After this exam, consider Application Architect or System Architect next.