Free Practice Questions for Snowflake DEA-C02 Certification
Study with 657 exam-style practice questions designed to help you prepare for the Snowflake SnowPro Advanced: Data Engineer (DEA-C02) exam. All questions are aligned with the latest exam guide and include detailed explanations to help you master the material.
Random Questions
Practice with randomly mixed questions from all topics
Domain Mode
Practice questions from a specific topic area
Exam Information
Exam Details
Key information about Snowflake SnowPro Advanced: Data Engineer (DEA-C02)
associate (intermediate)
Through the Snowflake Continuing Education (CE) program (eligible ILT courses or a higher-level certification)
August 22, 2025
Active SnowPro Core Certified credential
Data Engineers and Software Engineers with 2+ years of data engineering experience, including practical Snowflake usage, RESTful APIs, SQL, semi-structured data, and cloud-native concepts
2 years
Exam Topics & Skills Assessed
Skills measured (from the official study guide)
Domain 1: Data Movement
Subdomain 1.1: Given a data set, load data into Snowflake.
- Outline considerations for data loading
- Define data loading features and potential impact
Subdomain 1.2: Ingest data of various formats through the mechanics of Snowflake.
- Required file formats
- Ingestion of structured, semi-structured, and unstructured data
- Implementation of stages and file formats
Subdomain 1.3: Troubleshoot data ingestion.
- Identify causes of ingestion errors
- Determine resolutions for ingestion errors
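A common troubleshooting loop is to dry-run the load, inspect the rejected rows, and then pick an `ON_ERROR` policy. This is a sketch with hypothetical stage and table names (`my_stage`, `raw_events`):

```sql
-- Dry run: report parsing/conversion errors without loading any data
COPY INTO raw_events FROM @my_stage
  VALIDATION_MODE = 'RETURN_ERRORS';

-- Rows rejected by the most recent COPY into this table
SELECT * FROM TABLE(VALIDATE(raw_events, JOB_ID => '_last'));

-- Keep loading good rows instead of aborting on the first bad one
COPY INTO raw_events FROM @my_stage
  ON_ERROR = 'CONTINUE';
```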
Subdomain 1.4: Design, build, and troubleshoot continuous data pipelines.
- Stages
- Tasks
- Streams
- Snowpipe (for example, auto-ingest as compared to the REST API)
- Snowpipe Streaming
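As a concrete illustration, a continuous pipeline wires these objects together. The sketch below uses hypothetical names (`my_stage`, `raw_events`, `events`, `etl_wh`) and assumes a cloud storage integration and event notifications are already configured:

```sql
-- External stage over cloud storage (in practice, add STORAGE_INTEGRATION)
CREATE STAGE my_stage
  URL = 's3://my-bucket/events/'
  FILE_FORMAT = (TYPE = JSON);

CREATE TABLE raw_events (payload VARIANT);

-- Auto-ingest Snowpipe: loads files as storage event notifications arrive
CREATE PIPE events_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_events FROM @my_stage;

-- Stream tracks changes (CDC) on the landing table
CREATE STREAM raw_events_stream ON TABLE raw_events;

-- Task consumes the stream, but only when there is new data
CREATE TASK transform_events
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW_EVENTS_STREAM')
AS
  INSERT INTO events
  SELECT payload:id::NUMBER, payload:ts::TIMESTAMP_NTZ
  FROM raw_events_stream;

ALTER TASK transform_events RESUME;  -- tasks are created suspended
```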
Subdomain 1.5: Analyze and differentiate types of data pipelines.
- Create User-Defined Functions (UDFs)
- Design and use the Snowflake SQL API
- Create data pipelines in Snowpark
Subdomain 1.6: Install, configure, and use connectors to connect to Snowflake.
- Kafka connectors
- Spark connectors
- Python connectors
Subdomain 1.7: Design and build data sharing solutions.
- Implement a data share
- Create and manage views
- Implement row-level filtering
- Share data using the Snowflake Marketplace
- Share data using a listing
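A typical share combines a secure view (for row-level filtering keyed to the consumer account) with share grants. Object names, the entitlement table, and the consumer account below are hypothetical:

```sql
-- Secure view hides the base table and filters rows per consumer account
CREATE SECURE VIEW sales_db.public.sales_v AS
  SELECT s.region, s.amount
  FROM sales_db.public.sales s
  JOIN sales_db.public.account_map m
    ON m.snowflake_account = CURRENT_ACCOUNT()
   AND m.region = s.region;

CREATE SHARE sales_share;
GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
GRANT SELECT ON VIEW sales_db.public.sales_v TO SHARE sales_share;

-- Attach a consumer account (or publish via a listing / the Marketplace)
ALTER SHARE sales_share ADD ACCOUNTS = xy12345;
```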
Subdomain 1.8: Outline when to use external tables and define how they work.
- Manage external tables
- Manage Iceberg tables
- Perform general table management
- Manage schema evolution
- Unload data
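An external table reads files in place from a stage. The sketch below (hypothetical stage and path layout) shows the common pattern of deriving a partition column from `metadata$filename`, plus unloading with `COPY INTO <stage>`:

```sql
-- Assumes files are laid out as logs/<yyyy>/<mm>/<dd>/...
CREATE EXTERNAL TABLE ext_logs (
  event_date DATE AS TO_DATE(SPLIT_PART(metadata$filename, '/', 2)
                    || '-' || SPLIT_PART(metadata$filename, '/', 3)
                    || '-' || SPLIT_PART(metadata$filename, '/', 4))
)
PARTITION BY (event_date)
LOCATION = @my_stage/logs/
AUTO_REFRESH = TRUE
FILE_FORMAT = (TYPE = PARQUET);

-- Unload query results back to a stage
COPY INTO @my_stage/exports/
FROM (SELECT * FROM events)
FILE_FORMAT = (TYPE = PARQUET);
```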
Domain 2: Performance Optimization
Subdomain 2.1: Troubleshoot underperforming queries.
- Identify underperforming queries
- Outline telemetry around the operation
- Increase efficiency
- Identify the root cause
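Information-schema telemetry makes slow queries easy to find; comparing `partitions_scanned` to `partitions_total` shows whether pruning is working. The query ID placeholder is illustrative:

```sql
-- Slowest recent queries, with pruning effectiveness
SELECT query_id, query_text, total_elapsed_time,
       partitions_scanned, partitions_total
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
ORDER BY total_elapsed_time DESC
LIMIT 10;

-- Per-operator statistics for one suspect query (rows, spilling, pruning)
SELECT * FROM TABLE(GET_QUERY_OPERATOR_STATS('<query_id>'));
```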
Subdomain 2.2: Given a scenario, configure a solution for the best performance.
- Scale out compared to scale up
- Virtual warehouse properties (for example, size, multi-cluster)
- Query complexity
- Micro-partitions and the impact of clustering
- Materialized views
- Search optimization service
- Query acceleration service
- Snowpark-optimized warehouses
- Caching features
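Most of these levers are single `ALTER` statements. The sketch below (hypothetical warehouse and table names) contrasts scaling up, scaling out, and the storage-side services:

```sql
-- Scale up: a bigger cluster for complex single queries
ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'LARGE';

-- Scale out: a multi-cluster warehouse for concurrency
ALTER WAREHOUSE bi_wh SET
  MIN_CLUSTER_COUNT = 1,
  MAX_CLUSTER_COUNT = 4,
  SCALING_POLICY = 'STANDARD';

-- Improve micro-partition pruning on a large table
ALTER TABLE events CLUSTER BY (event_date);

-- Speed up selective point lookups
ALTER TABLE events ADD SEARCH OPTIMIZATION ON EQUALITY(user_id);

-- Offload parts of large scans to serverless compute
ALTER WAREHOUSE bi_wh SET ENABLE_QUERY_ACCELERATION = TRUE;
```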
Subdomain 2.3: Monitor continuous data pipelines.
- Snowflake objects
- Tasks
- Streams
- Snowpipe Streaming
- Alerts
- Notifications
- Data quality and data metric function monitoring
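Pipeline health can be checked with information-schema table functions, and alerts can react to a condition on a schedule. Object names (`RAW_EVENTS`, `monitor_wh`, `pipeline_incidents`) are hypothetical:

```sql
-- Recent task runs, including failures
SELECT name, state, error_message, scheduled_time
FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY())
ORDER BY scheduled_time DESC;

-- Load history for a Snowpipe target table (last 24 hours)
SELECT file_name, status, first_error_message
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
  TABLE_NAME => 'RAW_EVENTS',
  START_TIME => DATEADD('hour', -24, CURRENT_TIMESTAMP())));

-- Scheduled alert: run a check and act when it returns rows
CREATE ALERT failed_task_alert
  WAREHOUSE = monitor_wh
  SCHEDULE = '60 MINUTE'
  IF (EXISTS (
    SELECT 1 FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY())
    WHERE state = 'FAILED'
      AND scheduled_time > DATEADD('hour', -1, CURRENT_TIMESTAMP())))
  THEN INSERT INTO pipeline_incidents VALUES (CURRENT_TIMESTAMP());

ALTER ALERT failed_task_alert RESUME;  -- alerts are created suspended
```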
Domain 3: Storage & Data Protection
Subdomain 3.1: Implement and manage data recovery features in Snowflake.
- Time Travel
- Impact of streams
- Fail-safe
- Cross-region and cross-cloud replication
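Time Travel queries, restores, and retention are all SQL-level operations (Fail-safe, by contrast, is operator-managed and not directly queryable). Table names and the query ID placeholder are illustrative:

```sql
-- Read the table as it was 30 minutes ago
SELECT * FROM orders AT(OFFSET => -60 * 30);

-- Or as it was just before a specific statement ran
SELECT * FROM orders BEFORE(STATEMENT => '<query_id>');

-- Recover a dropped table within its retention window
UNDROP TABLE orders;

-- Retention is configurable (up to 90 days on Enterprise Edition)
ALTER TABLE orders SET DATA_RETENTION_TIME_IN_DAYS = 14;
```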
Subdomain 3.2: Use system functions to analyze micro-partitions.
- Clustering depth
- Cluster keys
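Both measures come from system functions; a smaller clustering depth generally means better pruning. Table and column names are illustrative:

```sql
-- Average depth of overlapping micro-partitions for the given columns
SELECT SYSTEM$CLUSTERING_DEPTH('events', '(event_date)');

-- Fuller report: depth histogram, overlaps, and notes
SELECT SYSTEM$CLUSTERING_INFORMATION('events', '(event_date)');
```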
Subdomain 3.3: Use Time Travel and cloning to create new development environments.
- Clone objects
- Validate changes before promoting
- Roll back changes
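A zero-copy clone, optionally combined with Time Travel, gives a writable development copy without duplicating storage. `prod_db` and `dev_db` are hypothetical:

```sql
-- Dev environment cloned from production as of one hour ago
CREATE DATABASE dev_db CLONE prod_db
  AT(TIMESTAMP => DATEADD('hour', -1, CURRENT_TIMESTAMP())::TIMESTAMP_LTZ);

-- ...develop and validate in dev_db, then promote (or roll back) by swapping
ALTER TABLE prod_db.public.orders SWAP WITH dev_db.public.orders;
```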
Domain 4: Data Governance
Subdomain 4.1: Monitor data.
- Apply object tagging and classifications
- Use data classification to monitor data
- Manage data lineage and object dependencies
- Monitor data quality
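Tags, automatic classification, and the dependency view cover most of this subdomain. Names are hypothetical, and `SYSTEM$CLASSIFY` requires appropriate privileges:

```sql
-- Manual tagging
CREATE TAG cost_center;
ALTER TABLE customers SET TAG cost_center = 'marketing';

-- Automatic classification: detects and (optionally) tags sensitive columns
CALL SYSTEM$CLASSIFY('mydb.public.customers', {'auto_tag': true});

-- Lineage / dependency monitoring
SELECT referencing_object_name, referenced_object_name
FROM SNOWFLAKE.ACCOUNT_USAGE.OBJECT_DEPENDENCIES;
```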
Subdomain 4.2: Establish and maintain data protection.
- Implement column-level security
- Use in conjunction with Dynamic Data Masking
- Use in conjunction with external tokenization
- Use projection policies
- Use data masking with Role-Based Access Control (RBAC) to secure sensitive data
- Explain the options available to support row-level security using Snowflake row access policies
- Use aggregation policies
- Use DDL to manage Dynamic Data Masking and row access policies
- Use best practices to create and apply data masking policies
- Use Snowflake Data Clean Rooms to share data
- Use the web app
- Use the Snowflake developer API
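Masking policies (column-level) and row access policies (row-level) are the two workhorses here. The sketch uses hypothetical roles, tables, and an entitlement table:

```sql
-- Column-level security: Dynamic Data Masking
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
       ELSE '***MASKED***' END;
ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;

-- Row-level security: row access policy driven by an entitlement table
CREATE ROW ACCESS POLICY region_policy AS (region STRING) RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'ADMIN'
  OR EXISTS (SELECT 1 FROM security.entitlements e
             WHERE e.role_name = CURRENT_ROLE()
               AND e.region = region);
ALTER TABLE sales ADD ROW ACCESS POLICY region_policy ON (region);
```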
Domain 5: Data Transformation
Subdomain 5.1: Define User-Defined Functions (UDFs) and outline how to use them.
- Snowpark UDFs (for example, Java, Python, Scala)
- Secure UDFs
- SQL UDFs
- JavaScript UDFs
- User-Defined Table Functions (UDTFs)
- User-Defined Aggregate Functions (UDAFs)
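One example each of a SQL UDF, a Python UDF, and a SQL UDTF; function names and the Python runtime version are illustrative:

```sql
-- SQL UDF
CREATE FUNCTION area(radius FLOAT)
  RETURNS FLOAT
  AS 'PI() * radius * radius';

-- Python UDF (runs on the Snowpark runtime)
CREATE FUNCTION py_upper(s STRING)
  RETURNS STRING
  LANGUAGE PYTHON
  RUNTIME_VERSION = '3.11'
  HANDLER = 'run'
AS $$
def run(s):
    return s.upper() if s is not None else None
$$;

-- SQL UDTF: returns a set of rows
CREATE FUNCTION split_rows(v STRING)
  RETURNS TABLE (item STRING)
  AS $$
    SELECT value::STRING FROM TABLE(FLATTEN(INPUT => SPLIT(v, ',')))
  $$;
```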
Subdomain 5.2: Define and create external functions.
- Secure external functions
- Work with external functions
Subdomain 5.3: Design, build, and leverage stored procedures.
- Snowpark stored procedures (for example, Java, Python, Scala)
- SQL Scripting stored procedures
- JavaScript stored procedures
- Transaction management
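A SQL Scripting procedure showing variables, bind syntax, and exception-driven transaction handling; table and column names are hypothetical:

```sql
CREATE PROCEDURE purge_old_rows(days INTEGER)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  deleted INTEGER;
BEGIN
  BEGIN TRANSACTION;
  DELETE FROM events
   WHERE load_ts < DATEADD('day', -1 * :days, CURRENT_TIMESTAMP());
  deleted := SQLROWCOUNT;  -- rows affected by the last DML statement
  COMMIT;
  RETURN 'Deleted ' || deleted || ' rows';
EXCEPTION
  WHEN OTHER THEN
    ROLLBACK;
    RAISE;  -- re-raise so the caller sees the failure
END;
$$;

CALL purge_old_rows(90);
```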
Subdomain 5.4: Handle and transform semi-structured data.
- Traverse and transform semi-structured data to structured data
- Transform structured data to semi-structured data
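Path traversal plus `FLATTEN` goes from `VARIANT` to columns, and `OBJECT_CONSTRUCT`/`ARRAY_AGG` go back; `raw_events` and its payload shape are hypothetical:

```sql
-- Semi-structured -> structured: path syntax and LATERAL FLATTEN
SELECT e.payload:customer.id::NUMBER AS customer_id,
       f.value:sku::STRING          AS sku,
       f.value:qty::NUMBER          AS qty
FROM raw_events e,
     LATERAL FLATTEN(INPUT => e.payload:items) f;

-- Structured -> semi-structured: rebuild JSON per customer
SELECT OBJECT_CONSTRUCT('customer_id', customer_id,
                        'skus', ARRAY_AGG(sku)) AS doc
FROM order_lines
GROUP BY customer_id;
```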
Subdomain 5.5: Handle and process unstructured data.
- Use unstructured data
- URL types
- Use directory tables
- Use the REST API
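Directory tables list staged files, and the URL functions differ in lifetime and audience (scoped URLs expire, file URLs are stable, pre-signed URLs work without a Snowflake session). `docs_stage` is hypothetical:

```sql
-- Internal stage with a directory table
CREATE STAGE docs_stage DIRECTORY = (ENABLE = TRUE);
ALTER STAGE docs_stage REFRESH;  -- sync the directory table with the files

SELECT relative_path, size, last_modified
FROM DIRECTORY(@docs_stage);

-- Pre-signed URL valid for one hour, usable outside Snowflake
SELECT GET_PRESIGNED_URL(@docs_stage, relative_path, 3600)
FROM DIRECTORY(@docs_stage);
```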
Subdomain 5.6: Use Snowpark for data transformation.
- Understand Snowpark architecture
- Query and filter data using the Snowpark library
- Perform data transformations using Snowpark (for example, aggregations)
- Manipulate Snowpark DataFrames