Free Practice Questions for Databricks Certified Data Analyst Associate Certification

    🔄 Last checked for updates February 16th, 2026

    Study with 390 exam-style practice questions designed to help you prepare for the Databricks Certified Data Analyst Associate. All questions are aligned with the latest exam guide and include detailed explanations to help you master the material.

    Start Practicing

    Random Questions

    Practice with randomly mixed questions from all topics

    Question Mix: All Topics
    Format: Random Order

    Domain Mode

    Practice questions from a specific topic area

    Exam Information

    Exam Details

    Key information about Databricks Certified Data Analyst Associate

    Official study guide:

    View

    Prerequisites:

    None are required; however, related course attendance and six months of hands-on experience as a Data Analyst are highly recommended.

    Delivery method:

    Online proctored or test center proctored

    Recertification:

    Recertification is required every two years to maintain your certified status. To recertify, you must take the full exam that is currently live.

    Time limit:

    90 minutes

    Number of questions:

    45 scored multiple-choice questions

    Certification validity:

    2 years

    Exam Topics & Skills Assessed

    Skills measured (from the official study guide)

    Domain 1: Understanding of Databricks Data Intelligence Platform

    Subdomain 1.1: Describe the core components of the Databricks Data Intelligence Platform

    Describe the core components of the Databricks Data Intelligence Platform, including Mosaic AI, Delta Live Tables, Lakeflow Jobs, the Data Intelligence Engine, Delta Lake, Unity Catalog, and Databricks SQL.

    Subdomain 1.2: Understand catalogs, schemas, managed and external tables, access controls, views, certified tables, and lineage

    Understand catalogs, schemas, managed and external tables, access controls, views, certified tables, and lineage within the Catalog Explorer interface.

    Subdomain 1.3: Describe the role and features of Databricks Marketplace

    Describe the role and features of Databricks Marketplace.

    Domain 2: Managing Data

    Subdomain 2.1: Use Unity Catalog to discover, query, and manage certified datasets

    Use Unity Catalog to discover, query, and manage certified datasets.

    Subdomain 2.2: Use the Catalog Explorer to tag a data asset and view its lineage

    Use the Catalog Explorer to tag a data asset and view its lineage.

    Subdomain 2.3: Perform data cleaning on Unity Catalog Tables in SQL

    Perform data cleaning on Unity Catalog Tables in SQL, including removing invalid data or handling missing values.
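    As a rough sketch of what such cleaning might look like (the `sales.orders` table and its columns are hypothetical):

    ```sql
    -- Remove rows that are clearly invalid (missing key or negative amount).
    DELETE FROM sales.orders
    WHERE order_id IS NULL
       OR amount < 0;

    -- Handle remaining missing values at query time with COALESCE.
    SELECT order_id,
           COALESCE(amount, 0)        AS amount,
           COALESCE(region, 'unknown') AS region
    FROM sales.orders;
    ```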

    Domain 3: Importing Data

    Subdomain 3.1: Explain the approaches for bringing data into Databricks

    Explain the approaches for bringing data into Databricks, covering ingestion from S3, data sharing with external systems via Delta Sharing, API-driven data intake, the Auto Loader feature, and Marketplace.

    Subdomain 3.2: Use the Databricks Workspace UI to upload a data file to the platform

    Use the Databricks Workspace UI to upload a data file to the platform.

    Domain 4: Executing queries using Databricks SQL and Databricks SQL Warehouses

    Subdomain 4.1: Utilize Databricks Assistant within a Notebook or SQL Editor

    Utilize Databricks Assistant within a Notebook or SQL Editor to facilitate query writing and debugging.

    Subdomain 4.2: Explain the role a SQL Warehouse plays in query execution

    Explain the role a SQL Warehouse plays in query execution.

    Subdomain 4.3: Query cross-system analytics by joining data

    Query cross-system analytics by joining data from a Delta table and a federated data source.

    Subdomain 4.4: Create a materialized view

    Create a materialized view, including knowing when to use Streaming Tables versus Materialized Views, and differentiating between dynamic and materialized views.
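    For reference, a minimal materialized view definition might look like the following (table and column names are hypothetical):

    ```sql
    -- A materialized view precomputes and stores its results, which are then
    -- kept up to date by refreshes; a dynamic view re-runs its query on every access.
    CREATE MATERIALIZED VIEW sales.daily_revenue AS
    SELECT order_date,
           SUM(amount) AS revenue
    FROM sales.orders
    GROUP BY order_date;
    ```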

    Subdomain 4.5: Perform aggregate operations

    Perform aggregate operations such as count, approximate count distinct, mean, and summary statistics.
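    A quick sketch of these aggregates in a single query (the `sales.orders` table is hypothetical):

    ```sql
    SELECT COUNT(*)                           AS row_count,
           approx_count_distinct(customer_id) AS approx_customers,
           AVG(amount)                        AS mean_amount,
           MIN(amount)                        AS min_amount,
           MAX(amount)                        AS max_amount
    FROM sales.orders;
    ```

    `approx_count_distinct` trades exactness for speed on large tables, which is why the exam distinguishes it from a plain `COUNT(DISTINCT ...)`.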

    Subdomain 4.6: Write queries to combine tables using various join operations

    Write queries to combine tables using various join operations (inner, left, right, and so on) with single or multiple keys, and apply set operations such as UNION and UNION ALL, understanding how the join types differ.
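    The join and set-operation distinctions might look like this in practice (all table and column names are hypothetical):

    ```sql
    -- Inner join: only orders with a matching customer survive.
    -- A LEFT JOIN would keep every order, with NULLs where no customer matches.
    SELECT o.order_id, c.customer_name
    FROM sales.orders o
    INNER JOIN sales.customers c
      ON o.customer_id = c.customer_id;

    -- UNION removes duplicate rows; UNION ALL keeps every row.
    SELECT region FROM sales.store_orders
    UNION ALL
    SELECT region FROM sales.web_orders;
    ```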

    Subdomain 4.7: Perform sorting and filtering operations on a table

    Perform sorting and filtering operations on a table.

    Subdomain 4.8: Create managed tables and external tables

    Create managed and external tables in Unity Catalog, including tables built by joining data from multiple sources (e.g., CSV, Parquet, Delta tables) into unified datasets.
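    The managed/external distinction comes down to who owns the storage location. A minimal sketch (catalog, schema, and bucket names are hypothetical):

    ```sql
    -- Managed table: Unity Catalog manages the underlying storage.
    CREATE TABLE main.sales.orders_managed (
      order_id BIGINT,
      amount   DOUBLE
    );

    -- External table: the data lives at a location you manage yourself.
    CREATE TABLE main.sales.orders_external (
      order_id BIGINT,
      amount   DOUBLE
    )
    LOCATION 's3://example-bucket/orders/';
    ```

    Dropping a managed table removes its data; dropping an external table leaves the files at the external location in place.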

    Subdomain 4.9: Use Delta Lake's time travel to access and query historical data versions

    Use Delta Lake's time travel to access and query historical data versions.
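    Time travel is driven by the Delta table's version history. A sketch (the table name is hypothetical):

    ```sql
    -- Inspect the table's change history (version numbers, timestamps, operations).
    DESCRIBE HISTORY main.sales.orders;

    -- Query the table as of a past version or timestamp.
    SELECT * FROM main.sales.orders VERSION AS OF 12;
    SELECT * FROM main.sales.orders TIMESTAMP AS OF '2026-01-01';
    ```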

    Domain 5: Analyzing Queries

    Subdomain 5.1: Understand the Features, Benefits, and Supported Workloads of Photon

    Understand the Features, Benefits, and Supported Workloads of Photon.

    Subdomain 5.2: Identify poorly performing queries in the Databricks Data Intelligence Platform

    Identify poorly performing queries in the Databricks Data Intelligence Platform using tools such as Query Insights and the Query Profiler log.

    Subdomain 5.3: Utilize Delta Lake to audit and view history

    Utilize Delta Lake to audit and view history, validate results, and compare historical results or trends.

    Subdomain 5.4: Utilize query history and caching to reduce development time and query latency

    Utilize query history and caching to reduce development time and query latency.

    Subdomain 5.5: Apply Liquid Clustering to improve query speed

    Apply Liquid Clustering to improve query speed when filtering large tables on specific columns.
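    As a sketch, clustering keys are declared with `CLUSTER BY` (table and column names are hypothetical):

    ```sql
    -- Cluster a new table on the columns most often used in filters.
    CREATE TABLE main.sales.events (
      event_date DATE,
      user_id    BIGINT,
      payload    STRING
    )
    CLUSTER BY (event_date, user_id);

    -- Clustering keys can be changed later without rewriting existing data files.
    ALTER TABLE main.sales.events CLUSTER BY (user_id);
    ```

    Unlike traditional partitioning, the clustering keys can evolve as query patterns change.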

    Subdomain 5.6: Fix a query to achieve the desired results

    Fix a query to achieve the desired results.

    Domain 6: Working with Dashboards and Visualizations in Databricks

    Subdomain 6.1: Build dashboards using AI/BI Dashboards

    Build dashboards using AI/BI Dashboards, including multi-tabs/page layouts, multiple data sources/datasets, and widgets (visualizations, text, images).

    Subdomain 6.2: Create visualizations in notebooks and the SQL editor

    Create visualizations in notebooks and the SQL editor.

    Subdomain 6.3: Work with parameters in SQL queries and dashboards

    Work with parameters in SQL queries and dashboards, including defining, configuring, and testing parameters.

    Subdomain 6.4: Configure permissions through the UI to share dashboards

    Configure permissions through the UI to share dashboards with workspace users/groups, external users through shareable links, and embed dashboards in external apps.

    Subdomain 6.5: Schedule an automatic dashboard refresh

    Schedule an automatic dashboard refresh.

    Subdomain 6.6: Configure an alert with a desired threshold and destination

    Configure an alert with a desired threshold and destination.

    Subdomain 6.7: Identify the effective visualization type to communicate insights clearly

    Identify the effective visualization type to communicate insights clearly.

    Domain 7: Developing, Sharing, and Maintaining AI/BI Genie spaces

    Subdomain 7.1: Describe the purpose, key features, and components of AI/BI Genie spaces

    Describe the purpose, key features, and components of AI/BI Genie spaces.

    Subdomain 7.2: Create Genie spaces

    Create Genie spaces by defining reasonable sample questions and domain-specific instructions, choosing SQL warehouses, curating Unity Catalog datasets (tables, views, and so on), and vetting queries as Trusted Assets.

    Subdomain 7.3: Assign permissions via the UI and distribute Genie spaces

    Assign permissions via the UI and distribute Genie spaces using embedded links and external app integrations.

    Subdomain 7.4: Optimize AI/BI Genie spaces

    Optimize AI/BI Genie spaces by tracking user questions, response accuracy, and feedback; updating instructions and trusted assets based on stakeholder input; validating accuracy with benchmarks; refreshing Unity Catalog metadata.

    Domain 8: Data Modeling with Databricks SQL

    Subdomain 8.1: Apply industry-standard data modeling techniques

    Apply industry-standard data modeling techniques, such as star, snowflake, and data vault schemas, to analytical workloads.

    Subdomain 8.2: Understand how industry-standard models align with the Medallion Architecture

    Understand how industry-standard models align with the Medallion Architecture.

    Domain 9: Securing Data

    Subdomain 9.1: Use Unity Catalog roles and sharing settings to ensure workspace objects are secure

    Use Unity Catalog roles and sharing settings to ensure workspace objects are secure.

    Subdomain 9.2: Understand how the three-level namespace works in Unity Catalog

    Understand how the three-level namespace (catalog / schema / table or volume) works in Unity Catalog.
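    In queries, the three levels appear as a dotted name, or can be set as session defaults (the `main` catalog and `sales` schema here are hypothetical):

    ```sql
    -- Fully qualified three-level name: catalog.schema.table
    SELECT * FROM main.sales.orders;

    -- Or set the default catalog and schema, then use the bare table name.
    USE CATALOG main;
    USE SCHEMA sales;
    SELECT * FROM orders;
    ```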

    Subdomain 9.3: Apply best practices for storage and management to ensure data security

    Apply best practices for storage and management to ensure data security, including table ownership and PII protection.
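    Two common building blocks for this are grants and masking views. A sketch, assuming hypothetical `analysts` and `admins` groups and a hypothetical `email` PII column:

    ```sql
    -- Grant read access on a table to a workspace group.
    GRANT SELECT ON TABLE main.sales.orders TO `analysts`;

    -- Dynamic view that masks a PII column for anyone outside the admins group.
    CREATE VIEW main.sales.orders_masked AS
    SELECT order_id,
           CASE WHEN is_account_group_member('admins') THEN email
                ELSE 'REDACTED'
           END AS email
    FROM main.sales.orders;
    ```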

    Techniques & products

    Databricks Data Intelligence Platform
    Mosaic AI
    Delta Live Tables
    Lakeflow Jobs
    Data Intelligence Engine
    Delta Lake
    Unity Catalog
    Databricks SQL
    Catalog Explorer
    Databricks Marketplace
    S3 ingestion
    Delta Sharing
    API-driven data intake
    Auto Loader
    Databricks Workspace UI
    Databricks Assistant
    SQL Warehouse
    Materialized Views
    Streaming Tables
    Photon
    Query Insights
    Query Profiler log
    Liquid Clustering
    AI/BI Dashboards
    AI/BI Genie spaces
    Star schema
    Snowflake schema
    Data Vault schemas
    Medallion Architecture
    CSV
    Parquet
    Data cleaning
    Data management
    Data import
    Query execution
    Query optimization
    Data analysis
    Dashboards
    Visualizations
    Data modeling
    Data security
    Access controls
    Data lineage
    Time travel
    SQL queries
    Aggregate operations
    Join operations
    Filtering
    Sorting
    Alerts
    Permissions
    PII protection

    CertSafari is not affiliated with, endorsed by, or officially connected to Databricks Inc. Full disclaimer