MS PDF VIEWER Snowsetanswers 2
SET 1
Question 1Incorrect
A role is created and owns 2 tables. This role is then dropped. Who will now own the two
tables?
Correct answer
SYSADMIN
Overall explanation
Question 2Correct
What information is included in the display in the Query Profile? (Choose two.)
Overall explanation
Question 3Correct
Overall explanation
Scaling up adds more compute power to a warehouse by improving its components. At some point scaling up is no longer possible because the components cannot be enhanced further. Scaling out adds more warehouses to work in parallel. You can see it in the following image:
Question 4Skipped
What can the Snowflake SCIM API be used to manage? (Choose two.)
Correct selection
Users
Network policies
Correct selection
Roles
Integrations
Session policies
Overall explanation
Snowflake is compatible with SCIM 2.0, an open standard for automating user provisioning. The SCIM API allows us to programmatically manage roles and users within the Snowflake platform, making it easier to automate identity and access management tasks.
Question 5Correct
Shared disk
Overall explanation
Built from the ground up for the cloud, Snowflake’s unique multi-cluster shared data
architecture delivers the performance, scale, elasticity, and concurrency today’s
organizations require.
Question 6Skipped
Network policies
Correct answer
External tokenization
Internal tokenization
Overall explanation
• External Tokenization
Question 7Skipped
A developer is granted ownership of a table that has a masking policy. The developer’s role is
not able to see the masked data.
Will the developer be able to modify the table to read the masked data?
No, because masking policies must always reference specific access roles.
Yes, because a table owner has full control and can unset masking policies.
Correct answer
No, because ownership of a table does not include the ability to change masking policies.
Overall explanation
Object owners (i.e. the role that has the OWNERSHIP privilege on the object) do not have
the privilege to unset masking policies.
Object owners cannot view column data in which a masking policy applies.
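A minimal sketch of how a masking policy is attached to a column (policy, role, and table names are illustrative); note that unsetting it requires the APPLY MASKING POLICY privilege, which ownership of the table alone does not confer:

```sql
-- Sketch: mask email values for every role except ANALYST_FULL.
CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('ANALYST_FULL') THEN val
    ELSE '*********'
  END;

ALTER TABLE employees MODIFY COLUMN email
  SET MASKING POLICY email_mask;
```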
Question 8Incorrect
Correct answer
Total invocations
Partitions scanned
Bytes written
Overall explanation
Total invocations — number of times that an external function was called. (This can be
different from the number of external function calls in the text of the SQL statement due to
the number of batches that rows are divided into, the number of retries (if there are
transient network problems), etc.)
Question 9Incorrect
Which of the following commands are not blocking operations? (Choose two.)
DELETE
UPDATE
INSERT
MERGE
Correct selection
COPY
Overall explanation
COMMIT operations (including both AUTOCOMMIT and explicit COMMIT) lock resources,
but usually only briefly. UPDATE, DELETE, and MERGE statements hold locks that generally
prevent them from running in parallel with other UPDATE, DELETE, and MERGE statements.
Most INSERT and COPY statements write only new partitions. Those statements often can
run in parallel with other INSERT and COPY operations, and sometimes can run in parallel
with an UPDATE, DELETE, or MERGE statement.
Question 10Skipped
What is the Snowflake recommended Parquet file size when querying from external tables to
optimize the number of parallel scanning operations?
Correct answer
256-512 MB
100-250 MB
16-128 MB
1-16 MB
Overall explanation
Do not confuse this question with the size recommendation for COPY operations in
Snowflake (100MB-250MB).
Question 11Incorrect
What activities can a user with the ORGADMIN role perform? (Choose two.)
Correct selection
Overall explanation
A user with the ORGADMIN role can perform the following actions:
• Creating an Account.
Note: Once an account is created, ORGADMIN can view the account properties but does not
have access to the account data.
Question 12Skipped
Which of the below APIs are NOT Snowpipe REST APIs? (Choose two.)
insertFiles
insertReport
Correct selection
loadFiles
loadHistoryScan
Correct selection
insertHistoryScan
Overall explanation
You can make calls to REST endpoints to get information. For example, by calling the
following insertReport endpoint, you can get a report of files submitted via insertFiles:
GET https://<account_id>.snowflakecomputing.com/v1/data/pipes/<pipeName>/insertReport
Question 13Incorrect
What is the MINIMUM role required to set the value for the parameter
ENABLE_ACCOUNT_DATABASE_REPLICATION?
ACCOUNTADMIN
SECURITYADMIN
Correct answer
ORGADMIN
SYSADMIN
Overall explanation
To enable replication for accounts, a user with the ORGADMIN role uses the
SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER function to set the
ENABLE_ACCOUNT_DATABASE_REPLICATION parameter to true.
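A minimal sketch of the call (the account identifier is a placeholder):

```sql
-- Sketch: must be run as ORGADMIN for each account in the replication group.
USE ROLE ORGADMIN;
SELECT SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER(
  'myorg.account1',
  'ENABLE_ACCOUNT_DATABASE_REPLICATION',
  'true');
```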
Question 14Incorrect
Which Snowflake feature allows a user to track sensitive data for compliance, discovery,
protection, and resource usage?
Internal tokenization
Correct answer
Tags
Comments
Overall explanation
Tags enable data stewards to monitor sensitive data for compliance, discovery, protection,
and resource usage use cases through either a centralized or decentralized data governance
management approach.
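A sketch of tag-based tracking (tag, table, and value names are illustrative):

```sql
-- Sketch: define a tag for PII columns and apply it.
CREATE TAG IF NOT EXISTS pii_type ALLOWED_VALUES 'EMAIL', 'PHONE', 'SSN';

ALTER TABLE customers MODIFY COLUMN email
  SET TAG pii_type = 'EMAIL';

-- Discover where the tag is applied via the account usage view.
SELECT * FROM SNOWFLAKE.ACCOUNT_USAGE.TAG_REFERENCES
WHERE TAG_NAME = 'PII_TYPE';
```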
Question 15Incorrect
What is used to denote a pre-computed data set derived from a SELECT query specification
and stored for later use?
Correct answer
Materialized view
Secure view
View
External table
Overall explanation
Question 16Correct
What property from the Resource Monitors lets you specify whether you want to control the
credit usage of your entire account or a specific set of warehouses?
Monitor Level.
Credit Quota.
Notification.
Schedule.
Overall explanation
The monitor level is a property that specifies whether the resource monitor is used to
monitor the credit usage for your entire account or individual warehouses.
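A sketch showing both monitor levels (monitor and warehouse names are placeholders); assigning the monitor to a warehouse scopes it to that warehouse, while assigning it to the account monitors account-wide usage:

```sql
-- Sketch: a resource monitor with notify/suspend triggers.
CREATE OR REPLACE RESOURCE MONITOR wh_monitor
  WITH CREDIT_QUOTA = 100
  TRIGGERS ON 75 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- Warehouse level:
ALTER WAREHOUSE my_wh SET RESOURCE_MONITOR = wh_monitor;
-- Account level instead:
-- ALTER ACCOUNT SET RESOURCE_MONITOR = wh_monitor;
```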
Question 17Incorrect
A virtual warehouse initially suffers from poor performance as a result of queries from
multiple concurrent processes that are queuing. Over time, the problem resolved.
Correct answer
Overall explanation
Multi-cluster warehouses are designed specifically for handling queuing and performance
issues related to large numbers of concurrent users and/or queries. In addition, multi-cluster
warehouses can help automate this process if your number of users/queries tends to fluctuate.
Question 18Skipped
NODE
Correct selection
VALUE
Correct selection
KEY
LEVEL
ROOT
Overall explanation
Question 19Incorrect
JSON
ORC
XML
Avro
Correct selection
Parquet
Overall explanation
Not all semi-structured formats supported for data upload are supported for data unload.
Question 20Correct
ORGADMIN
SHAREADMIN
ACCOUNTADMIN
SECURITYADMIN
Overall explanation
CREATE SHARE: Account :Only the ACCOUNTADMIN role has this privilege by default. The
privilege can be granted to additional roles as needed.
Question 21Incorrect
Correct answer
REPEAT
FOR
LOOP
WHILE
Overall explanation
A REPEAT loop iterates until a condition is true. In a REPEAT loop, the condition is tested
immediately after executing the body of the loop. As a result, the body of the loop always
executes at least once.
A WHILE loop iterates while a condition is true. In a WHILE loop, the condition is tested
immediately before executing the body of the loop. If the condition is false before the first
iteration, then the body of the loop does not execute even once.
Question 22Incorrect
Table View
Correct selection
Regular
Materialized View
Secure View
External View
Private View
Overall explanation
You can see the differences between them in the following image:
Question 23Incorrect
Tableau
DataRobot
dbt
Correct answer
Alation
Overall explanation
Question 24Correct
IBM.
AWS.
Azure.
Overall explanation
A Snowflake account can only be hosted on Amazon Web Services, Google Cloud Platforms,
and Microsoft Azure for now.
Question 25Correct
Overall explanation
The top-most container is the customer organization. All databases for your Snowflake
account are contained in the account object. Securable objects such as tables, views, stages,
and UDFs are contained in a schema object, which is, in turn, contained in a database. You
can see the complete Snowflake hierarchy in the following image (via docs.snowflake.com):
Question 26Incorrect
What strategies can be used to optimize the performance of a virtual warehouse? (Choose
two.)
Correct selection
Reduce queuing.
Overall explanation
• Reduce queuing (if the warehouse is not multi-cluster, make it multi-cluster; otherwise, add another warehouse)
Question 27Correct
Overall explanation
Snowpipe enables loading data when the files are available in any (internal/external) stage.
We use it when we have a small volume of frequent data and want to load it continuously
(micro-batches).
Question 28Incorrect
Correct answer
The Snowflake Web Interface (UI) in the Account -> Billing & Usage section
Overall explanation
Question 29Correct
What is the MOST performant file format for loading data in Snowflake?
Parquet
CSV (Unzipped)
CSV (Gzipped)
ORC
Overall explanation
Loading from Gzipped CSV is several times faster than loading from ORC and Parquet at an
impressive 15 TB/Hour. While 5-6 TB/hour is decent if your data is originally in ORC or
Parquet, don’t go out of your way to create ORC or Parquet files from CSV in the hope that they will load into Snowflake faster.
Loading data into fully structured (columnarized) schema is ~10-20% faster than landing it
into a VARIANT.
Question 30Incorrect
Which Snowflake object stores a generated identity and access management (IAM) entity for
your external cloud storage, along with an optional set of allowed or blocked storage
locations (Amazon S3, Google Cloud Storage, or Microsoft Azure)?
Correct answer
Storage Integration.
Storage Schema.
Security Integration.
User Stage.
Overall explanation
A storage integration is a Snowflake object that stores a generated identity and access
management (IAM) entity for your external cloud storage. This option will enable users to
avoid supplying credentials when creating stages or when loading or unloading data.
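A sketch for S3 (the role ARN, bucket, and object names are placeholders):

```sql
-- Sketch: a storage integration plus a stage that uses it.
CREATE OR REPLACE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::001234567890:role/my_snowflake_role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/load/');

-- Stages created with the integration need no credentials.
CREATE OR REPLACE STAGE my_s3_stage
  URL = 's3://my-bucket/load/'
  STORAGE_INTEGRATION = s3_int;
```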
Question 31Incorrect
When should you consider disabling auto-suspend for a Virtual Warehouse? (Choose two.)
Correct selection
Correct selection
When users will be using compute at different times throughout a 24/7 period
When you do not want to have to manually turn on the Warehouse each time a user
needs it
Overall explanation
Question 32Correct
Which Snowflake feature records changes made to a table so actions can be taken using that
change data capture?
Task
Materialized View
Pipe
Stream
Overall explanation
Note that a stream itself does not contain any table data. A stream only stores an offset for
the source object and returns CDC records by leveraging the versioning history for the
source object.
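A minimal sketch of the pattern (table and stream names are illustrative; the target audit table is assumed to match the stream's columns, including the metadata columns):

```sql
-- Sketch: capture DML changes on a table.
CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

-- Each change record carries METADATA$ACTION and METADATA$ISUPDATE.
SELECT * FROM orders_stream;

-- Consuming the stream in a DML statement advances its offset.
INSERT INTO orders_audit SELECT * FROM orders_stream;
```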
Question 33Incorrect
How can a Snowflake user configure a virtual warehouse to support over 100 users if their
company has Enterprise Edition?
Correct answer
Overall explanation
Question 34Correct
How can a user improve the performance of a single large complex query in Snowflake?
Overall explanation
Resizing a warehouse generally improves query performance, particularly for larger, more
complex queries. For query complexity Scale Up , for concurrency and query load tuning
scale out.
Question 35Correct
A query executed a couple of hours ago, which spent more than 5 minutes to run, is
executed again, and it returned the results in less than a second. What might have
happened?
Snowflake used the persisted query results from the query result cache.
Snowflake used the persisted query results from the metadata cache.
A new Snowflake version has been released in the last two hours, improving the speed of
the service.
Snowflake used the persisted query results from the warehouse cache.
Overall explanation
The query result cache stores the results of our queries for 24 hours, so as long as we
perform the same query and the data hasn’t changed in the storage layer, it will return the
same result without using the warehouse and without consuming credits.
Question 36Incorrect
How should a virtual warehouse be configured if a user wants to ensure that additional
multi-clusters are resumed with the shortest delay possible?
Correct answer
Overall explanation
Only if the system estimates there’s enough query load to keep the cluster busy for at least 6
minutes.
Question 37Correct
16
128
32
Overall explanation
Question 38Correct
Tables
Schemas
Stages
Databases
Overall explanation
Question 39Correct
Which of the following roles are NOT System-Defined Roles in Snowflake? (Choose two.)
SECURITYADMIN
STORAGEADMIN
VIEWER
SYSADMIN
USERADMIN
Overall explanation
The PUBLIC role is also a System-Defined role. You can see the differences between them in
the following table:
Question 40Incorrect
Correct answer
Overall explanation
It is NOT a parameter for CREATE [OR REPLACE] PIPE.
Load History
The load history for Snowpipe operations is stored in the metadata of the pipe object. When
a pipe is recreated, the load history is dropped. In general, this condition only affects users if
they subsequently execute an ALTER PIPE … REFRESH statement on the pipe. Doing so could
load duplicate data from staged files in the storage location for the pipe if the data was
already loaded successfully and the files were not deleted subsequently.
Question 41Correct
The micro-partitions are stored in compressed cloud storage and the cloud storage handles
compression.
The text data in a micro-partition is compressed with GZIP but other types are not
compressed.
Overall explanation
Snowflake automatically determines the most efficient compression algorithm for the
columns in each micro-partition.
Question 42Correct
Zero-copy clones increase storage costs as cloning the table requires storing its data twice
At the instant a clone is created, all micro-partitions in the original table and the clone are fully shared
All zero-copy clone objects inherit the privileges of their original objects
Overall explanation
Using Zero-Copy cloning, you can create a snapshot of any table, schema, or database. The
cloned object is independent and can be modified without modifying the original. It does
NOT duplicate data; it duplicates the metadata of the micro-partitions, making it not
consume storage.
Question 43Correct
What is the recommended Snowflake data type to store semi-structured data like JSON?
LOB
VARIANT
RAW
VARCHAR
Overall explanation
Semi-structured data is saved as Variant type in Snowflake tables, with a maximum limit size
of 16MB, and it can be queried using JSON notation. You can store arrays, objects, etc.
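A sketch of storing and querying JSON in a VARIANT column (table and field names are illustrative):

```sql
-- Sketch: load JSON into a VARIANT and query it with path notation.
CREATE OR REPLACE TABLE raw_events (v VARIANT);

INSERT INTO raw_events
  SELECT PARSE_JSON('{"user": {"name": "Ada", "age": 36}, "tags": ["a","b"]}');

SELECT v:user.name::STRING AS name,
       v:tags[0]::STRING   AS first_tag
FROM raw_events;
```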
Question 44Correct
Which item in the Data Warehouse migration process does not apply in Snowflake?
Migrate Indexes
Migrate Users
Migrate Schemas
Overall explanation
Question 45Skipped
trim_space = true
compression = auto
Correct answer
strip_outer_array = true
strip_outer_array = false
Overall explanation
strip_outer_array = true removes the outer array structure and copies the file into multiple table rows instead of a single row. Since each table row is limited to a maximum of 16MB, this solution avoids the size limit.
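A sketch of the option in a COPY command (table and stage names are placeholders):

```sql
-- Sketch: load a JSON file that is one large array into one row per element.
COPY INTO my_json_table
FROM @my_stage/data.json
FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE);
```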
Question 46Correct
Storage.
Cloud Services.
Compute.
Overall explanation
The Cloud Services layer is a collection of services coordinating activities across Snowflake.
It's in charge of Authentication, Infrastructure management, Metadata management, Query
parsing and optimization, and Access control.
Question 47Incorrect
Snowflake recommends, as a minimum, that all users with the following role(s) should be
enrolled in Multi-Factor Authentication (MFA):
SECURITYADMIN, ACCOUNTADMIN
Correct answer
ACCOUNTADMIN
Overall explanation
Question 48Correct
Set the Minimum Clusters and Maximum Clusters settings to the different values
Set the Minimum Clusters and Maximum Clusters settings to the same value
Overall explanation
If you set the minimum cluster count less than the maximum cluster count, then the
warehouse runs in Auto-scale mode.
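A sketch of the two modes (warehouse name is a placeholder):

```sql
-- Sketch: differing min/max cluster counts give Auto-scale mode;
-- equal values would give Maximized mode instead.
CREATE OR REPLACE WAREHOUSE my_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3
  SCALING_POLICY = 'STANDARD';
```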
Question 49Incorrect
What is the recommended approach for unloading data to a cloud storage location from
Snowflake?
Correct answer
Unload the data to a user stage, then upload the data to cloud storage.
Unload the data to a local file system, then upload it to cloud storage.
Overall explanation
The best approach is to use the COPY INTO <location> command to copy the data from the
Snowflake database table into one or more files in a Snowflake or external stage.
After that, you can download the files from the stage to a local file system (not before).
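A sketch of the recommended flow (stage, table, and local paths are placeholders):

```sql
-- Sketch: unload to an internal stage, then pull the files down locally.
COPY INTO @my_stage/unload/
FROM my_table
FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP');

-- From SnowSQL on your machine:
GET @my_stage/unload/ file:///tmp/exports/;
```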
Question 50Correct
What technique does Snowflake use to limit the number of micro-partitions retrieved as
part of a query?
Computing.
Clustering.
Pruning.
Indexing.
Overall explanation
Question 51Skipped
What are characteristics of directory tables when used with unstructured data? (Choose
two.)
Correct selection
A directory table can be added explicitly to a stage when the stage is created.
Correct selection
A directory table is a separate database object that can be layered explicitly on a stage.
Overall explanation
A directory table is an implicit object layered on a stage (not a separate database object) and
is conceptually similar to an external table because it stores file-level metadata about the
data files in the stage. A directory table has no grantable privileges of its own.
Both external (external cloud storage) and internal (Snowflake) stages support directory tables.
Question 52Skipped
A Snowflake user runs a query for 36 seconds on a size 2XL virtual warehouse.
Snowflake will charge for 60 seconds at the rate of 64 credits per hour.
Correct answer
Snowflake will charge for 60 seconds at the rate of 32 credits per hour.
Snowflake will charge for 36 seconds at the rate of 32 credits per hour.
Snowflake will charge for 36 seconds at the rate of 64 credits per hour.
Overall explanation
Question 53Skipped
Which command can be used to delete staged files from a Snowflake stage when the files
are no longer needed?
Correct answer
REMOVE
TRUNCATE TABLE
DROP
DELETE
Overall explanation
REMOVE
Removes files from either an external (external cloud storage) or internal (i.e. Snowflake)
stage.
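A sketch of the command (stage path and pattern are illustrative):

```sql
-- Sketch: list staged files, then delete the ones matching a pattern.
LIST @my_stage/load/;
REMOVE @my_stage/load/ PATTERN = '.*\.csv\.gz';
```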
Question 54Skipped
Correct answer
Overall explanation
Question 55Skipped
A medium (M) warehouse has auto-suspend configured after 15 minutes. You have noticed
that all of the queries that run on this warehouse finish within a minute. What will you do to
optimize compute costs?
Correct answer
Overall explanation
By reducing the auto-suspend setting (e.g., to 60 seconds), the warehouse will automatically suspend after 60 seconds of inactivity, significantly reducing credit consumption.
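The change itself is a one-line statement (warehouse name is a placeholder):

```sql
-- Sketch: suspend the warehouse after 60 seconds of inactivity.
ALTER WAREHOUSE my_wh SET AUTO_SUSPEND = 60;
```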
Question 56Skipped
Table
Aggregate
Correct answer
Scalar
Window
Overall explanation
A scalar function is a function that returns one value per invocation; in most cases, you can
think of this as returning one value per row. This contrasts with Aggregate Functions, which
return one value per group of rows.
Question 57Skipped
Correct answer
Overall explanation
SPLIT_TO_TABLE
This table function splits a string (based on a specified delimiter) and flattens the results into
rows.
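A minimal sketch of the function:

```sql
-- Sketch: flatten a delimited string into rows; SEQ and INDEX
-- columns are also available on the result.
SELECT t.value
FROM TABLE(SPLIT_TO_TABLE('a,b,c', ',')) AS t;
-- Returns three rows: 'a', 'b', 'c'.
```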
Question 58Skipped
A Snowflake query took 40 minutes to run. The results indicate that ‘Bytes spilled to local
storage’ was a large number.
Correct answer
The warehouse is too small. Increase the size of the warehouse to reduce the spillage.
The warehouse is too large. Decrease the size of the warehouse to reduce the spillage.
The warehouse consists of a single cluster. Use a multi-cluster warehouse to reduce the
spillage.
Overall explanation
The warehouse size should be increased (scale up). A multi-cluster warehouse would only help with managing concurrency.
Question 59Skipped
The users in a reader account can query data that has been shared with the reader
account and can perform DML tasks.
Correct answer
The SHOW MANAGED ACCOUNTS command will view all the reader accounts that have
been created for an account.
A reader account can consume data from the provider account that created it and combine
it with its own data.
Overall explanation
Lists the managed accounts created for your account. Currently used by data providers to
create reader accounts for their consumers.
To view all the reader accounts that have been created for your account, use the SHOW
MANAGED ACCOUNTS command.
Regarding the option "A reader account can consume data from the provider account that created it and combine it with its own data":
Reader accounts (formerly known as “read-only accounts”) enable providers to share data
with consumers who are not already Snowflake customers, without requiring the consumers
to become Snowflake customers.
A reader account is intended primarily for querying data shared by the provider of the
account. You can work with data, for example, by creating materialized views.
• A reader account cannot unload data using a storage integration. However, you can use the COPY INTO <location> command with your connection credentials to unload data into a cloud storage location.
Question 60Skipped
12am-5am
After replication
Correct answer
Overall explanation
Question 61Skipped
XS
Correct answer
XXS
Overall explanation
The minimum configuration for a Snowflake Warehouse is X-Small (XS), which consumes one
credit/hour. You can see the different sizes in the following image:
Question 62Skipped
What authentication method does the Kafka connector use within Snowflake?
Correct answer
OAuth
Overall explanation
Question 63Skipped
Create small data files and stage them in cloud storage frequently.
The number of load operations that run in parallel can exceed the number of data files to
be loaded.
Correct answer
The number of data files that are processed in parallel is determined by the virtual
warehouse.
Overall explanation
Question 64Skipped
Which Snowflake object can be used to record DML changes made to a table?
Task
Snowpipe
Correct answer
Stream
Stage
Overall explanation
Question 65Skipped
Enterprise.
Business Critical.
Correct answer
Standard.
Overall explanation
We can increase the Time Travel functionality to 90 days if we have (at least) the Snowflake
Enterprise Edition.
Question 66Skipped
What feature of Snowflake Continuous Data Protection can be used for maintenance of
historical data?
Network policies
Access control
Fail-safe
Correct answer
Time Travel
Overall explanation
Snowflake Time Travel enables accessing historical data that has been changed or deleted at
any point within a defined period. It is a powerful CDP (Continuous Data Protection) feature
which ensures the maintenance and availability of your historical data.
Question 67Skipped
What are common issues found by using the Query Profile? (Choose two.)
Correct selection
Correct selection
Identifying queries that will likely run very slowly before executing them
Overall explanation
Question 68Skipped
A deterministic query is run at 8am, takes 5 minutes, and the results are cached. Which of
the following statements are true? (Choose two.)
Correct selection
The same exact query will return the precomputed results if the underlying data hasn't
changed and the results were last accessed within previous 24 hour period
The same exact query will return the precomputed results even if the underlying data has
changed as long as the results were last accessed within the previous 24 hour period
Correct selection
The 24 hours timer on the precomputed results gets renewed every time the exact query is
executed
The exact query will ALWAYS return the precomputed result set for the
RESULT_CACHE_ACTIVE = time period
Overall explanation
Question 69Skipped
You have two virtual warehouses in your Snowflake account. If one of them updates the data
in the storage layer, when will the other one see it?
Once all the compute resources are provisioned for the second warehouse.
Correct answer
Immediately.
Overall explanation
All the warehouses of your account share the storage layer, so if the data is updated, all the
warehouses will be able to see it. You can see this behavior in the following image:
Question 70Skipped
Zero-copy clones
Internal stages
Correct selection
Fail-safe
Incremental backups
Correct selection
Time Travel
Overall explanation
Question 71Skipped
Which pages are included in the Activity area of Snowsight? (Choose two.)
Sharing settings
Correct selection
Copy History
Contacts
Correct selection
Query History
Overall explanation
Question 72Skipped
Correct answer
Snowflake tables are the physical instantiation of data loaded into Snowflake
Overall explanation
Question 73Skipped
Correct answer
Overall explanation
Question 74Skipped
Files have been uploaded to a Snowflake internal stage. The files now need to be deleted.
PURGE
DELETE
MODIFY
Correct answer
REMOVE
Overall explanation
Question 75Skipped
Materialized view
Temporary table
Transient table
Correct answer
Secure view
Overall explanation
Transient and temporary tables contribute to the overall storage charges that Snowflake bills your account while they exist, until explicitly dropped.
Materialized views impact your costs for both storage and compute resources:
• Storage: Each materialized view stores query results, which adds to the monthly
storage usage for your account.
Question 76Skipped
Which common query problems can the Query Profile help a user identify and troubleshoot?
(Choose two.)
Correct selection
When there are Common Table Expressions (CTEs) without a final SELECT statement
Correct selection
Overall explanation
• Exploding joins
Question 77Skipped
SYSADMIN
Database owner
Correct answer
Object owner
Schema owner
Overall explanation
In regular (i.e. non-managed) schemas, object owners (i.e. a role with the OWNERSHIP
privilege on an object) can grant access on their objects to other roles, with the option to
further grant those roles the ability to manage object grants
With managed access schemas, object owners lose the ability to make grant decisions. Only
the schema owner (i.e. the role with the OWNERSHIP privilege on the schema) or a role with
the MANAGE GRANTS privilege can grant privileges on objects in the schema, including
future grants, centralizing privilege management.
Question 78Skipped
Correct answer
Overall explanation
Question 79Skipped
Which Snowflake object helps evaluate virtual warehouse performance impacted by query
queuing?
Information_schema.warehouse_metering_history
Correct answer
Account_usage.query_history
Resource monitor
Information_schema.warehouse_load_history
Overall explanation
Warehouse query load measures the average number of queries that were running or
queued within a specific interval. You can customize the time period and time interval during
which to evaluate warehouse performance by querying the Account Usage QUERY_HISTORY
View.
Question 80Skipped
Correct answer
LOGIN_HISTORY
ACCESS_HISTORY
QUERY_HISTORY
SESSIONS
Overall explanation
Question 81Skipped
How does Snowflake allow a data provider with an Azure account in central Canada to share
data with a data consumer on AWS in Australia?
The data provider uses the GET DATA workflow in the Snowflake Data Marketplace to
create a share between Azure Central Canada and AWS Asia Pacific.
The data consumer and data provider can form a Data Exchange within the same
organization to create a share from Azure Central Canada to AWS Asia Pacific.
Correct answer
The data provider must replicate the database to a secondary account in AWS Asia Pacific
within the same organization then create a share to the data consumer's account
The data provider in Azure Central Canada can create a direct share to AWS Asia Pacific, if
they are both in the same organization.
Overall explanation
Question 82Skipped
Which Snowflake feature can be used to find sensitive data in a table or column?
Correct answer
Data classification
External functions
Masking policies
Overall explanation
Data Classification allows categorizing potentially personal and/or sensitive data to support
compliance and privacy regulations.
Question 83Skipped
Which command is used to take away staged files from a Snowflake stage after a successful
data ingestion?
DELETE
TRUNCATE
Correct answer
REMOVE
DROP
Overall explanation
Staged files can be deleted from a Snowflake stage (user stage, table stage, or named stage)
using the following methods:
• Files that were loaded successfully can be deleted from the stage during a load by
specifying the PURGE copy option in the COPY INTO <table> command.
• After the load completes, use the REMOVE command to remove the files in the
stage.
Question 84Skipped
Correct answer
Any user specified in the GET REST API call with sufficient privileges
Any role specified in the GET REST API call with sufficient privileges
Overall explanation
Scoped URL
Only the user who generated the scoped URL can use the URL to access the referenced file.
Question 85Skipped
During periods of warehouse contention, which parameter controls the maximum length of
time a warehouse will hold a query for processing?
QUERY_TIMEOUT_IN_SECONDS
STATEMENT_TIMEOUT_IN_SECONDS
MAX_CONCURRENCY_LEVEL
Correct answer
STATEMENT_QUEUED_TIMEOUT_IN_SECONDS
Overall explanation
STATEMENT_QUEUED_TIMEOUT_IN_SECONDS
Amount of time, in seconds, a SQL statement (query, DDL, DML, etc.) remains queued for a
warehouse before it is canceled by the system.
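A sketch of setting the parameter (warehouse name is a placeholder); it can also be set at the account, user, or session level:

```sql
-- Sketch: cancel queries queued longer than 5 minutes on this warehouse.
ALTER WAREHOUSE my_wh SET STATEMENT_QUEUED_TIMEOUT_IN_SECONDS = 300;
```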
Question 86Skipped
A multi-step query that displays each processing step in the same panel.
Correct answer
A graphical representation of the main components of the processing plan for a query.
A pre-computed data set derived from a query specification and stored for later use.
A collapsible panel in the operator tree pane that lists nodes by execution time in
descending order for a query.
Overall explanation
Query Profile, available through the classic web interface, provides execution details for a
query. For the selected query, it provides a graphical representation of the main components
of the processing plan for the query, with statistics for each component, along with details
and statistics for the overall query.
Question 87Skipped
Snowflake provides two mechanisms to reduce data storage costs for short-lived tables.
These mechanisms are: (Choose two.)
Correct selection
Temporary Tables
Provisional Tables
Materialized views
Permanent Tables
Correct selection
Transient Tables
Overall explanation
Question 88Skipped
Which is true of Snowflake network policies? A Snowflake network policy: (Choose two.)
Correct selection
Only ACCOUNTADMIN role or a role with the global CREATE NETWORK POLICY privilege
can create network policies
Correct selection
Overall explanation
Question 89Skipped
What object will you use to schedule a merge statement in Snowflake so that it runs every
hour?
Table.
Stream.
Correct answer
Task.
Pipe.
Overall explanation
Snowflake tasks are schedulable scripts that are run inside your Snowflake environment. No
event source can trigger a task; instead, a task runs on a schedule. In this case, it will run
every hour.
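A minimal sketch of such a task (the warehouse, table, and column names are placeholders):

```sql
CREATE TASK hourly_merge
  WAREHOUSE = my_wh
  SCHEDULE = '60 MINUTE'
AS
  MERGE INTO target t
  USING source s ON t.id = s.id
  WHEN MATCHED THEN UPDATE SET t.val = s.val
  WHEN NOT MATCHED THEN INSERT (id, val) VALUES (s.id, s.val);

-- Tasks are created suspended; resume to start the schedule
ALTER TASK hourly_merge RESUME;
```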
Question 90Skipped
Which function should be used to authorize users to access rows in a base table when using
secure views with Secure Data Sharing?
CURRENT_ROLE()
Correct answer
CURRENT_ACCOUNT()
CURRENT_USER()
CURRENT_SESSION()
Overall explanation
When using secure views with Secure Data Sharing, use the CURRENT_ACCOUNT function to
authorize users from a specific account to access rows in a base table.
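A common pattern, sketched here with placeholder table and column names: a mapping table records which consumer account may see which rows, and the secure view filters on CURRENT_ACCOUNT():

```sql
CREATE SECURE VIEW shared_sales AS
  SELECT s.*
  FROM sales s
  JOIN sharing_access a ON s.access_id = a.access_id
  WHERE a.snowflake_account = CURRENT_ACCOUNT();
```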
Question 91Skipped
Correct answer
Overall explanation
The following types of URLs are available to access files in cloud storage:
Scoped URL
Encoded URL that permits temporary access to a staged file without granting privileges to
the stage. The URL expires when the persisted query result period ends (i.e. the results
cache expires), which is currently 24 hours.
File URL
URL that identifies the database, schema, stage, and file path to a set of files. A role that has
sufficient privileges on the stage can access the files.
Pre-signed URL
Simple HTTPS URL used to access a file via a web browser. A file is temporarily accessible to
users via this URL using a pre-signed access token. The expiration time for the access token is
configurable.
Question 92Skipped
Which system-defined Snowflake role has permission to rename an account and specify
whether the original URL can be used to access the renamed account?
ACCOUNTADMIN
Correct answer
ORGADMIN
SYSADMIN
SECURITYADMIN
Overall explanation
An organization administrator (i.e. a user granted the ORGADMIN role) can rename an
account.
When an account is renamed, Snowflake creates a new account URL that is used to access
the account. During the renaming, the administrator can accept the default to save the
original account URL so users can continue to use it, or they can delete the original URL to
force users to use the new URL.
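As a hedged sketch of the command an ORGADMIN would run (the account names are placeholders):

```sql
-- Rename the account; SAVE_OLD_URL = TRUE keeps the original account URL usable
ALTER ACCOUNT old_account_name RENAME TO new_account_name SAVE_OLD_URL = TRUE;
```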
Question 93Skipped
A company’s security audit requires generating a report listing all Snowflake logins (e.g., date
and user) within the last 90 days.
FROM ACCOUNT_USAGE.USERS;
FROM ACCOUNT_USAGE.ACCESS_HISTORY;
FROM table(information_schema.login_history_by_user())
Correct answer
FROM ACCOUNT_USAGE.LOGIN_HISTORY;
Overall explanation
login_history_by_user function returns login activity within the last 7 days only.
Question 94Skipped
A Snowflake user wants to temporarily bypass a network policy by configuring the user
object property MINS_TO_BYPASS_NETWORK_POLICY.
Correct answer
Overall explanation
Question 95Skipped
Which function returns the name of the warehouse of the current session?
WAREHOUSE()
RUNNING_WAREHOUSE()
Correct answer
CURRENT_WAREHOUSE()
ACTIVE_WAREHOUSE()
Overall explanation
I’m not a big fan of learning commands by heart, and they are unlikely to appear on the
exam, but this one may be useful. You have other commands to show the current database
and schema, as you can see by executing the following command:
Question 96Skipped
What is the Snowflake multi-clustering feature for virtual warehouses used for?
Correct answer
Overall explanation
Multi-cluster warehouses enable you to scale compute resources to manage your user and
query concurrency needs as they change, such as during peak and off hours.
Question 97Skipped
Which command will we use to download the files from the stage/location loaded through
the COPY INTO <LOCATION> command?
PUT.
UNLOAD.
INSERT INTO.
Correct answer
GET.
Overall explanation
We will use the GET command to DOWNLOAD files from a Snowflake internal stage (named
internal stage, user stage, or table stage) into a directory/folder on a client machine. You
need to use SnowSQL to use this command.
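For example, run from SnowSQL (the stage name and local path are placeholders):

```sql
GET @my_int_stage/unload/ file:///tmp/data/;
```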
Question 98Skipped
Where can we see the amount of storage used by Snowflake's Fail-Safe functionality in the
User Interface?
Correct answer
Overall explanation
You can see the Fail-Safe usage in the "Account Usage" section, as we can see in the
following image:
Question 99Skipped
Which file format will keep floating-point numbers from being truncated when data is
unloaded?
Correct answer
Parquet
ORC
JSON
CSV
Overall explanation
The data types FLOAT, DOUBLE, and REAL are all approximate: each is stored as DOUBLE, which has a precision/scale of roughly (15,9). When floating-point columns are unloaded to CSV or JSON files, Snowflake truncates the values to approximately (15,9), so the precision is not always exact. Snowflake cannot precisely represent an arbitrary value in double precision; this is standard floating-point behavior across the industry. Please refer to the related Tech Note.
When floating-point number columns are unloaded to CSV or JSON files, Snowflake
truncates the values to approximately (15,9).
The values are not truncated when unloading floating-point number columns to Parquet
files.
Question 100Skipped
The database administrator must define the clustering methodology for each Snowflake
table
Correct answer
Clustering represents the way data is grouped together and stored within Snowflake's
micro-partitions
The clustering key must be included on the COPY command when loading data into
Snowflake
Overall explanation
Question 101Skipped
Which MINIMUM set of privileges is required to temporarily bypass an active network policy
by configuring the user object property MINS_TO_BYPASS_NETWORK_POLICY?
Only the role with the OWNERSHIP privilege on the network policy
Correct answer
Only Snowflake Support can set the value for this object property
Overall explanation
Question 102Skipped
Correct selection
ORGADMIN only
Correct selection
SYSADMIN only
Overall explanation
Only security administrators (i.e. users with the SECURITYADMIN role) or higher or a role
with the global CREATE NETWORK POLICY privilege can create network policies. Ownership
of a network policy can be transferred to another role.
Question 103Skipped
Which Snowflake edition supports Protected Health Information (PHI) data (in accordance
with HIPAA and HITRUST CSF regulations), and has a dedicated metadata store and pool of
compute resources?
Correct answer
Standard
Business Critical
Enterprise
Overall explanation
Virtual Private Snowflake offers dedicated metadata store and pool of compute resources
(used in virtual warehouses).
Question 104Skipped
What is the default access of a securable object until other access is granted?
Read access
Correct answer
No access
Full access
Write access
Overall explanation
Securable object: An entity to which access can be granted. Unless allowed by a grant,
access is denied.
Question 105Skipped
Which columns are available in the output of a Snowflake directory table? (Choose two.)
FILE_NAME
STAGE_NAME
CATALOG_NAME
Correct selection
RELATIVE_PATH
Correct selection
LAST_MODIFIED
Overall explanation
LAST_MODIFIED: The date and time the file was last modified.
FILE_URL: The Snowflake-hosted file URL to the file. The other columns listed are not
available in the output of a Snowflake directory table.
Question 106Skipped
Correct answer
Overall explanation
You scale up to accommodate complex queries and you scale out to accommodate
concurrent queries.
Question 107Skipped
Correct answer
Overall explanation
Remember:
Question 108Skipped
8KB
500MB
Correct answer
16MB
50MB
Overall explanation
The limit is 16 MB per row for a value captured in a VARIANT column; the same maximum applies to the VARCHAR data type.
Question 109Skipped
ALTER
Correct answer
COPY INTO
Overall explanation
Question 110Skipped
By putting the Snowflake URL on the allowed list for get method responses
Correct answer
Overall explanation
Network policies provide options for managing network configurations to the Snowflake
service.
Network policies allow restricting access to your account based on user IP address.
Effectively, a network policy enables you to create an IP allowed list, as well as an IP blocked
list, if desired.
Using network policies is one of the best practices related to network security.
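A minimal sketch (the policy name and IP ranges are placeholders):

```sql
CREATE NETWORK POLICY corp_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Activate the policy for the whole account
ALTER ACCOUNT SET NETWORK_POLICY = corp_policy;
```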
Question 111Skipped
Correct answer
Overall explanation
Snowflake collects rich statistics on data allowing it not to read unnecessary parts of a table
based on the query filters. (Pruning)
Question 112Skipped
The VALIDATE table function has which parameter as an input argument for a Snowflake
user?
CURRENT_STATEMENT
LAST_QUERY_ID
UUID_STRING
Correct answer
JOB_ID
Overall explanation
Syntax:
Question 113Skipped
10
1000
Correct answer
100
Overall explanation
Snowflake tasks are schedulable scripts that are run inside your Snowflake environment.
Users can define a simple tree-like structure of tasks that starts with a root task and is linked
together by task dependencies. Child tasks run only after their parent task finishes.
A single task can have a maximum of 100 predecessor tasks and 100 child tasks.
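A dependency is declared with the AFTER clause; the task and warehouse names below are placeholders:

```sql
CREATE TASK child_task
  WAREHOUSE = my_wh
  AFTER root_task
AS
  INSERT INTO audit_log SELECT CURRENT_TIMESTAMP();
```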
Question 114Skipped
What is the minimum Snowflake edition that customers planning on storing protected
information in Snowflake should consider for regulatory compliance?
Enterprise
Premier
Correct answer
Standard
Overall explanation
PII and HIPAA compliance are only supported for Business Critical Edition or higher, although the question could have been more specific.
Question 115Skipped
Which of the following roles is recommended to be used to create and manage users and
roles?
Correct answer
SECURITYADMIN
ACCOUNTADMIN
PUBLIC
SYSADMIN
Overall explanation
Question 116Skipped
Which Snowflake table type is only visible to the user who creates it, can have the same
name as permanent tables in the same schema, and is dropped at the end of the session?
User
Local
Correct answer
Temporary
Transient
Overall explanation
Question 117Skipped
What aspect of an executed query is represented by the remote disk I/O statistic of the
Query Profile in Snowflake?
Time spent reading and writing data from and to remote storage when the data being
accessed does not fit into the executing virtual warehouse node memory
Time spent caching the data to remote storage in order to buffer the data being extracted
and exported
Time spent scanning the table partitions to filter data based on the predicate
Correct answer
Time spent reading and writing data from and to remote storage when the data being
accessed does not fit into either the virtual warehouse memory or the local disk
Overall explanation
For some operations (e.g. duplicate elimination for a huge data set), the amount of memory
available for the compute resources used to execute the operation might not be sufficient to
hold intermediate results. As a result, the query processing engine will start spilling the data
to local disk. If the local disk space is not sufficient, the spilled data is then saved to remote
disks.
This spilling can have a profound effect on query performance (especially if remote disk is
used for spilling). Performance degrades drastically when a warehouse runs out of memory
while executing a query because memory bytes must “spill” onto local disk storage. If the
query requires even more memory, it spills onto remote cloud-provider storage, which
results in even worse performance.
Remote Disk I/O is the Query Profile metric that captures time spent reading and writing data from and to remote storage (i.e., S3 or Azure Blob storage). This includes spilling to remote disk as well as reading your data sets.
Question 118Skipped
Privileges are only inherited by the direct child role in the hierarchy.
Correct answer
Privileges are inherited by any roles above that role in the hierarchy.
Privileges are inherited by any roles at the same level in the hierarchy.
Privileges are only inherited by the direct parent role in the hierarchy.
Overall explanation
The privileges associated with a role are inherited by any roles above that role in the
hierarchy.
Question 119Skipped
Scaling rhythmically
Correct answer
Scaling out
Scaling up
Scaling max
Overall explanation
Question 120Skipped
Which Snowflake feature will allow small volumes of data to continuously load into
Snowflake and will incrementally make the data available for analysis?
Correct answer
CREATE PIPE
TABLE STREAM
INSERT INTO
COPY INTO
Overall explanation
Question 121Skipped
Correct selection
Fail-safe
Correct selection
Client redirect
Clustering
Overall explanation
Question 122Skipped
Increasing the size of a virtual warehouse will always improve data loading performance.
Each virtual warehouse is an independent compute cluster that shares compute resources
with other warehouses.
Correct answer
All virtual warehouses share the same compute resources so performance degradation of
one warehouse can significantly affect all the other warehouses.
Overall explanation
Query Processing
Query execution is performed in the processing layer. Snowflake processes queries using
“virtual warehouses”. Each virtual warehouse is an MPP compute cluster composed of
multiple compute nodes allocated by Snowflake from a cloud provider.
Question 123Skipped
Correct selection
Correct selection
Warehouses can only be used for querying and cannot be used for data loading.
Overall explanation
Warehouses can be started and stopped at any time. They can also be resized at any time,
even while running, to accommodate the need for more or less compute resources, based
on the type of operations being performed by the warehouse.
You can expect a lot of questions about Virtual Warehouses in the exam.
Question 124Skipped
Snowflake will return an error when a user attempts to share which object?
Correct answer
Standard views
Secure views
Tables
Overall explanation
For data security and privacy reasons, only secure views are supported in shares at this time.
If a standard view is added to a share, Snowflake returns an error.
Question 125Skipped
Correct selection
Uncompressed
User-defined partitions
Correct selection
Micro-partitions
Overall explanation
All data in Snowflake tables is automatically divided into micro-partitions, which are
contiguous units of storage. Each micro-partition contains between 50 MB and 500 MB of
uncompressed data (note that the actual size in Snowflake is smaller because data is always
stored compressed). Groups of rows in tables are mapped into individual micro-partitions,
organized in a columnar fashion.
Question 126Skipped
Regardless of which notation is used, what are considerations for writing the column name
and element names when traversing semi-structured data?
Correct answer
Overall explanation
Regardless of which notation you use, the column name is case-insensitive but element
names are case-sensitive.
For example, (src is a column name) in the following list, the first two paths are equivalent,
but the third is not:
src:salesperson.name
SRC:salesperson.name
SRC:Salesperson.Name
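Sketched as queries (the car_sales table name follows the documentation example; results assume the JSON keys are lowercase):

```sql
SELECT src:salesperson.name FROM car_sales;  -- matches
SELECT SRC:salesperson.name FROM car_sales;  -- same result: the column name is case-insensitive
SELECT SRC:Salesperson.Name FROM car_sales;  -- NULL: element names are case-sensitive
```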
Question 127Skipped
What happens to historical data when the retention period for an object ends?
Correct answer
Overall explanation
When the retention period ends for an object, the historical data is moved into Snowflake
Fail-safe.
Question 128Skipped
Storage integration
User
Correct answer
Stage
Role
Overall explanation
Question 129Skipped
What is the purpose of using the OBJECT_CONSTRUCT function with the COPY INTO
command?
Correct answer
Convert the rows in a relational table to a single VARIANT column and then unload the
rows into a file.
Convert the rows in a source file to a single VARIANT column and then load the rows from
the file to a variant table.
Reorder the rows in a relational table and then unload the rows into a file.
Reorder the data columns according to a target table definition and then unload the rows
into the table.
Overall explanation
An OBJECT can contain semi-structured data and can be used to create hierarchical data
structures.
You can use the OBJECT_CONSTRUCT function combined with the COPY command to
convert the rows in a relational table to a single VARIANT column and unload the rows into a
file.
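A minimal unload sketch (the stage and table names are placeholders):

```sql
COPY INTO @my_stage/out/
FROM (SELECT OBJECT_CONSTRUCT(*) FROM my_table)
FILE_FORMAT = (TYPE = JSON);
```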
Question 130Skipped
Which of the following are options when creating a Virtual Warehouse? (Choose two.)
Auto-disable
Auto-enable
Auto-resize
Correct selection
Auto-suspend
Correct selection
Auto-resume
Auto-drop
Overall explanation
Question 131Skipped
In which layer of its architecture does Snowflake store its metadata statistics?
Compute Layer
Storage Layer
Correct answer
Database Layer
Overall explanation
Question 132Skipped
It specifies whether Snowflake overwrites the encryption key you used to upload the files.
Correct answer
It specifies whether Snowflake overwrites an existing file with the same name during
upload.
It specifies whether Snowflake should overwrite the gzip compression algorithm with
another one that you provide.
Overall explanation
If a file called "mydata.csv" already exists in the stage, PUT will not upload it again. With the
OVERWRITE option, it will replace the existing file:
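For example, from SnowSQL (the file path and stage name are placeholders):

```sql
PUT file:///tmp/mydata.csv @my_int_stage OVERWRITE = TRUE;
```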
Question 133Skipped
Correct selection
Correct selection
Overall explanation
Question 134Skipped
Correct answer
Overall explanation
Question 135Skipped
Query results are stored in the Result Cache for how long after they are last accessed,
assuming no data changes have occurred?
12 hours
Correct answer
24 hours
1 Hour
3 Hours
Overall explanation
Question 136Skipped
Correct selection
File keys
Correct selection
Overall explanation
Question 137Skipped
A Snowflake user wants to share data using my_share with account xy12345.
Correct answer
Overall explanation
ALTER SHARE [ IF EXISTS ] my_share ADD ACCOUNTS = xy12345;
Question 138Skipped
Which clients does Snowflake support Multi-Factor Authentication (MFA) token caching for?
(Choose two.)
Correct selection
Python connector
Correct selection
ODBC driver
GO driver
Spark connector
Node.js driver
Overall explanation
Snowflake supports MFA token caching with the following drivers and connectors on macOS
and Windows. This feature is not supported on Linux.
Question 139Skipped
A redirect of client connections to Snowflake accounts in the same regions for data
replication.
A redirect of client connections to Snowflake accounts in the same regions for business
continuity.
Correct answer
Overall explanation
Client Redirect enables redirecting your client connections to Snowflake accounts in different
regions for business continuity and disaster recovery, or when migrating your account to
another region or cloud platform.
Question 140Skipped
Which stream type can be used for tracking the records in external tables?
Correct answer
Insert-only
Append-only
Standard
External
Overall explanation
Question 141Skipped
Correct answer
Overall explanation
Multi-cluster warehouses are best utilized for scaling resources to improve concurrency for
users/queries. They are not as beneficial for improving the performance of slow-running
queries or data loading. For these types of operations, resizing the warehouse provides
more benefits.
Question 142Skipped
Which of the following is true of Snowpipe via REST API? (Choose two.)
Correct selection
Snowflake automatically manages the compute required to execute the Pipe's COPY INTO
commands
Correct selection
Overall explanation
Question 143Skipped
What are the types of data consumer accounts available in Snowflake? (Choose two.)
Subscriber account
Shared Account
Public Account
Correct selection
Full Account
Correct selection
Reader Account
Overall explanation
There are two types of data consumers. The first is Full Accounts: consumers with existing Snowflake accounts, where the consumer account pays for the queries it runs. The second is Reader Accounts: consumers without Snowflake accounts, where the producer account pays for all the compute credits their warehouses use. You can see this behavior in the following diagram:
Question 144Skipped
It is read-only and prevents the shared data from being updated by the provider
Correct answer
It provides limited access to the data share and is therefore cheaper for the data provider
Overall explanation
Question 145Skipped
The first user assigned to a new account, ACCOUNTADMIN, should create at least one
additional user with which administrative privilege?
PUBLIC
Correct answer
USERADMIN
SYSADMIN
ORGADMIN
Overall explanation
By default, when your account is provisioned, the first user is assigned the ACCOUNTADMIN
role. This user should then create one or more additional users who are assigned the
USERADMIN role. All remaining users should be created by the user(s) with the USERADMIN
role or another role that is granted the global CREATE USER privilege.
Question 146Skipped
What value provides information about disk usage for operations where intermediate results
do not fit in memory in a Query Profile?
Network
Correct answer
Spilling
IO
Pruning
Overall explanation
Spilling — information about disk usage for operations where intermediate results do not fit
in memory
Question 147Skipped
Correct answer
Overall explanation
Business intelligence (BI) tools enable analyzing, discovering, and reporting on data to help
make more informed business decisions. They use dashboards, charts, or other graphical
tools to deliver data visualization. We can see the Snowflake ecosystem in the following
image:
Question 148Skipped
Which Snowflake edition (and above) allows until 90 days of Time Travel?
Business Critical.
Standard.
Correct answer
Enterprise.
Overall explanation
By default, Time travel is enabled with a 1-day retention period. However, we can increase it
to 90 days if we have (at least) the Snowflake Enterprise Edition. It requires additional
storage, which will be reflected in your monthly storage charges.
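For example (the table name is a placeholder; a value above 1 day requires Enterprise Edition or higher):

```sql
ALTER TABLE my_table SET DATA_RETENTION_TIME_IN_DAYS = 90;
```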
Question 149Skipped
What type of columns does Snowflake recommend to be used as clustering keys? (Choose
two.)
Correct selection
Correct selection
A VARIANT column
Overall explanation
Question 150Skipped
Which of the following statements are true concerning the Snowflake release process?
(Choose three.)
A customer is assigned a 30 minute window (that can be moved anytime within a week)
during which the system will be unavailable and customer is upgraded
Correct selection
Snowflake deploys patch releases every week, but new feature releases happen once a
month.
Correct selection
Correct selection
It is possible for you as a user to request 24-hour early access to the upcoming releases so
that you can do additional release testing before the release is rolled out.
There is usually some minimal downtime associated with Snowflake during the
deployments.
Overall explanation
Question 151Skipped
Fail-Safe.
Correct answer
Time-Travel.
Zero-Copy Cloning.
Overall explanation
Time-Travel enables accessing historical data (i.e., data that has been changed or deleted) at
any point within a defined period. If we drop a table, we can restore it with time travel. You
can use it with Databases, Schemas & Tables. The following diagram explains how Time-Travel works:
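Restoring a dropped table is a one-liner, sketched here with a placeholder name:

```sql
DROP TABLE my_table;
-- Within the retention period, the table can be brought back
UNDROP TABLE my_table;
```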
Question 152Skipped
Correct answer
100.
10.
Overall explanation
Users can define a simple tree-like structure of tasks that starts with a root task and is linked
together by task dependencies. A tree of tasks can have a maximum of 1000 tasks, including
the root one. Also, each task can have a maximum of 100 children.
Question 153Skipped
What are the available Snowflake scaling modes for configuring multi-cluster virtual
warehouses? (Choose two.)
Correct selection
Maximized
Scale-Out
Correct selection
Auto-Scale
Standard
Economy
Overall explanation
Note, that there are Scaling Policies called "Economy" and "Standard" in "Auto-Scale" mode.
Question 154Skipped
A Snowflake user wants to optimize performance for a query that queries only a small
number of rows in a table. The rows require significant processing. The data in the table
does not change frequently.
Correct answer
Overall explanation
• Query results contain a small number of rows and/or columns relative to the base
table (the table on which the view is defined).
• The query is on an external table (i.e. data sets stored in files in an external stage),
which might have slower performance compared to querying native database tables.
Question 155Skipped
Which command can be used to list all the file formats for which a user has access
privileges?
LIST
Correct answer
Overall explanation
Lists the file formats for which you have access privileges. This command can be used to list
the file formats for a specified database or schema (or the current database/schema for the
session), or your entire account.
SET 2
Question 1Incorrect
Correct answer
Column-level security
Overall explanation
Question 2Incorrect
No, because the size of the cache is independent from the warehouse size.
Correct answer
Possibly, if the warehouse is resized to a smaller size and the cache no longer fits.
Yes, because the compute resource is replaced in its entirety with a new compute
resource.
Yes, because the new compute resource will no longer have access to the cache encryption
key.
Overall explanation
Question 3Incorrect
Which privileges are required for a user to restore an object? (Choose two.)
UPDATE
UNDROP
MODIFY
OWNERSHIP
Correct selection
CREATE
Overall explanation
Question 4Skipped
Authorization to execute CREATE [object] statements comes only from which role?
Application role
Secondary role
Database role
Correct answer
Primary role
Overall explanation
Note that authorization to execute CREATE <object> statements to create objects is provided
by the primary role.
Question 5Skipped
Correct selection
Correct selection
Compute can be scaled up or down without the requirement to add more storage.
Overall explanation
Both can be managed separately according to business needs. Storage and compute are not coupled; that is what the cloud architecture is for.
Question 6Skipped
Correct answer
Overall explanation
Query Profile provides execution details for a query. For the selected query, it provides a
graphical representation of the main components of the processing plan for the query, with
statistics for each component, along with details and statistics for the overall query.
Question 7Skipped
Correct answer
They are a set of rules that control access to Snowflake accounts by specifying the IP
addresses or ranges of IP addresses that are allowed to connect to Snowflake.
They are a set of rules that dictate how Snowflake accounts can be used between multiple
users.
They are a set of rules that define the network routes within Snowflake.
They are a set of rules that define how data can be transferred between different
Snowflake accounts within an organization.
Overall explanation
Question 8Skipped
What SQL command would be used to view all roles that were granted to USER1?
Correct answer
Overall explanation
SHOW GRANTS TO USER user_name
Lists all the roles granted to the user. Note that the PUBLIC role, which is automatically
available to every user, is not listed.
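For the user named in the question, the command would be:

```sql
SHOW GRANTS TO USER USER1;
```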
Question 9Skipped
Which data types can be used in a Snowflake table that holds semi-structured data? (Choose
two.)
VARCHAR
Correct selection
VARIANT
BINARY
TEXT
Correct selection
ARRAY
Overall explanation
Question 10Skipped
Raw format
Correct selection
Compressed format
Correct selection
Columnar format
Zipped format
Binary format
Overall explanation
When data is loaded into Snowflake, Snowflake reorganizes that data into its internal
optimized, compressed, columnar format.
Question 11Skipped
Correct answer
Overall explanation
Question 12Skipped
Tableau
DBeaver
Correct answer
Protegrity
SAP
Overall explanation
Question 13Skipped
Which system functions are available in Snowflake to view/monitor the clustering metadata
for a table? (Choose two.)
SYSTEM$CLUSTERING_STATUS
SYSTEM$CLUSTERING
Correct selection
SYSTEM$CLUSTERING_INFORMATION
Correct selection
SYSTEM$CLUSTERING_DEPTH
SYSTEM$CLUSTERING_METADATA
Overall explanation
The clustering depth measures the average depth of the overlapping micro-partitions for
specified columns in a table (1 or greater). The smaller the cluster depth is, the better
clustered the table is. You can use either of these functions to get the clustering depth of a table.
Question 14Skipped
Which of the following services are NOT provided by the Cloud Services Layer? (Choose
two.)
Infrastructure Management.
Correct selection
Query Execution.
Authentication.
Metadata Management.
Correct selection
Storage.
Overall explanation
The Cloud Services layer is a collection of services coordinating activities across Snowflake.
It's in charge of Authentication, Infrastructure management, Metadata management, Query
parsing and optimization, and Access control.
Question 15Skipped
Correct answer
Data shares are cloud agnostic and can cross regions by default.
Overall explanation
• For data security and privacy reasons, only secure views are supported in shares at
this time. If a standard view is added to a share, Snowflake returns an error.
• With Secure Data Sharing, no actual data is copied or transferred between accounts.
All sharing uses Snowflake’s services layer and metadata store.
• Database replication will be needed to allow data providers to securely share data
with data consumers across different regions and cloud platforms.
Question 16Skipped
Correct answer
Query details such as the objects included and the user who executed the query
Names and owners of the roles that are currently enabled in the session
Details around the privileges that have been granted for all objects in an account
Overall explanation
Question 17Skipped
A user is loading JSON documents composed of a huge array containing multiple records into
Snowflake. The user enables the STRIP_OUTER_ARRAY file format option.
It removes the NULL elements from the JSON object eliminating invalid data and enables
the ability to load the records.
It removes the trailing spaces in the last element of the outer array and loads the records
into separate table columns.
Correct answer
It removes the outer array structure and loads the records into separate table rows.
Overall explanation
STRIP_OUTER_ARRAY removes the outer set of square brackets [ ] when loading the data,
separating the initial array's elements into multiple rows.
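A minimal load sketch (the table and stage names are placeholders):

```sql
COPY INTO my_table
FROM @my_stage/data.json
FILE_FORMAT = (TYPE = JSON STRIP_OUTER_ARRAY = TRUE);
```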
Question 18Skipped
The warehouse cache persists for as long as the warehouse exists, regardless of its
suspension status.
The cache is maintained for the auto_suspend duration and can be restored if the
warehouse is restarted within this limit.
The cache is maintained for up to two hours and can be restored if the warehouse is
restarted within this limit.
Correct answer
The cache is dropped when the warehouse is suspended and is no longer available upon
restart.
Overall explanation
This cache is dropped when the warehouse is suspended, which may result in slower initial
performance for some queries after the warehouse is resumed.
Question 19Skipped
Which Snowflake URL type allows users or applications to download or access files directly
from Snowflake stage without authentication?
File
Scoped
Directory
Correct answer
Pre-signed
Overall explanation
Pre-signed URLs are used to download or access files, via a web browser for example,
without authenticating into Snowflake or passing an authorization token.
Question 20Skipped
What is the MINIMUM configurable idle timeout value for a session policy in Snowflake?
2 minutes
15 minutes
10 minutes
Correct answer
5 minutes
Overall explanation
The timeout period begins upon a successful authentication to Snowflake. If a session policy
is not set, Snowflake uses a default value of 240 minutes (i.e. 4 hours). The minimum
configurable idle timeout value for a session policy is 5 minutes. When the session expires,
the user must authenticate to Snowflake again.
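A hedged sketch (the policy name is a placeholder):

```sql
CREATE SESSION POLICY strict_policy
  SESSION_IDLE_TIMEOUT_MINS = 5
  SESSION_UI_IDLE_TIMEOUT_MINS = 5;

ALTER ACCOUNT SET SESSION POLICY strict_policy;
```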
Question 21Skipped
Which semi-structured file formats are supported when unloading data from a table?
(Choose two.)
XML
Correct selection
Parquet
Avro
Correct selection
JSON
ORC
Overall explanation
Question 22Skipped
The average number of micro-partitions in the table associated with cloned objects.
Correct answer
Overall explanation
Question 23Skipped
A Snowflake user needs to share unstructured data from an internal stage to a reporting tool
that does not have Snowflake access.
BUILD_STAGE_FILE_URL
GET_STAGE_LOCATION
Correct answer
GET_PRESIGNED_URL
BUILD_SCOPED_FILE_URL
Overall explanation
Pre-signed URL function is used to download or access files without authenticating into
Snowflake or passing an authorization token. Pre-signed URLs are open; any user or
application can directly access or download the files. Ideal for business intelligence
applications or reporting tools that need to display the unstructured file contents.
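For example, a sketch of generating a pre-signed URL for a staged file (the stage and file names are hypothetical):

```sql
-- Returns a URL valid for 3600 seconds that the reporting tool
-- can open without authenticating to Snowflake
SELECT GET_PRESIGNED_URL(@reports_stage, 'summary.pdf', 3600);
```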
Question 24Skipped
A Snowflake user is trying to load a 125 GB file using SnowSQL. The file continues to load for
almost an entire day.
The file will stop loading and all data up to that point will be committed.
Correct answer
The file loading could be aborted without any portion of the file being committed.
The file’s number of allowable hours to load can be programmatically controlled to load
easily into Snowflake.
The file will continue to load until all contents are loaded.
Overall explanation
Note
If you must load a large file, carefully consider the ON_ERROR copy option value. Aborting or
skipping a file due to a small number of errors could result in delays and wasted credits. In
addition, if a data loading operation continues beyond the maximum allowed duration of 24
hours, it could be aborted without any portion of the file being committed.
Question 25Skipped
From what stage can a Snowflake user omit the FROM clause while loading data into a
table?
Correct answer
Overall explanation
Note that when copying data from files in a table stage, the FROM clause can be omitted
because Snowflake automatically checks for files in the table stage.
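For example, with files already staged in the table stage of mytable, the FROM clause can be dropped entirely (the table and file format are hypothetical):

```sql
-- Snowflake implicitly reads from the table stage @%mytable
COPY INTO mytable
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```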
Question 26Skipped
What are supported file formats for unloading data from Snowflake? (Choose three.)
AVRO
XML
ORC
Correct selection
CSV
Correct selection
JSON
Correct selection
Parquet
Overall explanation
Question 27Skipped
What storage cost is completely eliminated when a Snowflake table is defined as transient?
Correct answer
Fail-safe
Active
Staged
Time Travel
Overall explanation
Similar to permanent tables, transient tables contribute to the overall storage charges that Snowflake bills your account; however, because transient tables do not utilize Fail-safe, there are no Fail-safe costs (i.e. the costs associated with maintaining the data required for Fail-safe disaster recovery).
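A minimal sketch of creating a transient table (names are hypothetical):

```sql
-- Transient tables skip Fail-safe entirely and allow
-- at most 1 day of Time Travel
CREATE TRANSIENT TABLE staging_orders (
  order_id NUMBER,
  amount   NUMBER
) DATA_RETENTION_TIME_IN_DAYS = 1;
```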
Question 28Skipped
Correct selection
Masking policies
Correct selection
Snowflake-managed keys
Overall explanation
Question 29Skipped
What actions can the resource monitor associated with a warehouse take when it reaches (or is about to reach) the limit? (Choose three.)
Correct selection
Correct selection
Correct selection
Overall explanation
A resource monitor can Notify, Notify & Suspend, and Notify & Suspend Immediately. You
can see these three actions in the following image:
Question 30Skipped
Which of the following are options when creating a Virtual Warehouse? (Choose two.)
Auto-enable
Correct selection
Auto-suspend
Correct selection
Auto-resume
User count
Overall explanation
Question 31Skipped
Correct answer
Standard
Premier
Enterprise
Overall explanation
Question 32Skipped
What can you easily check to see if a large table will benefit from explicitly defining a
clustering key?
Correct answer
Clustering depth.
Clustering status.
Values in a table.
Clustering ratio.
Overall explanation
The clustering depth measures the average depth of the overlapping micro-partitions for
specified columns in a table (1 or greater). The smaller the cluster depth is, the better
clustered the table is. You can get the clustering depth of a Snowflake table using this
command:
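The command referenced above can be sketched as follows (the table and column names are hypothetical):

```sql
-- Average overlap depth of micro-partitions for the given columns;
-- values close to 1 mean the table is well clustered
SELECT SYSTEM$CLUSTERING_DEPTH('sales', '(order_date)');
```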
Question 33Skipped
Correct answer
Overall explanation
A single monitor can be set at the account level to control credit usage for all warehouses in
your account.
Question 34Skipped
Which VALIDATION_MODE value will return the errors across the files specified in a COPY
command, including files that were partially loaded during an earlier load?
RETURN_ERRORS
RETURN_n_ROWS
RETURN_-1_ROWS
Correct answer
RETURN_ALL_ERRORS
Overall explanation
RETURN_ALL_ERRORS
Returns all errors across all files specified in the COPY statement, including files with errors
that were partially loaded during an earlier load because the ON_ERROR copy option was set
to CONTINUE during the load.
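A sketch of using this mode in a COPY command (the stage and table names are hypothetical):

```sql
-- Validates the files and reports every error
-- without loading any data
COPY INTO my_table
  FROM @my_stage
  VALIDATION_MODE = RETURN_ALL_ERRORS;
```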
Question 35Skipped
Correct answer
Fail-safe makes data available for 7 days, recoverable only by Snowflake Support.
Overall explanation
Fail-safe provides a (non-configurable) 7-day period during which historical data may be
recoverable by Snowflake. This period starts immediately after the Time Travel retention
period ends.
Question 36Skipped
How can a user get the MOST detailed information about individual table storage details in
Snowflake?
TABLES view
Correct answer
TABLE_STORAGE_METRICS view
Overall explanation
Question 37Skipped
Which function is used to convert rows in a relational table to a single VARIANT column?
ARRAY_CONSTRUCT
ARRAY_AGG
Correct answer
OBJECT_CONSTRUCT
OBJECT_AGG
Overall explanation
You can use the OBJECT_CONSTRUCT function combined with the COPY command to
convert the rows in a relational table to a single VARIANT column and unload the rows into a
file.
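A hedged sketch of this pattern (the stage and table names are hypothetical):

```sql
-- Each row becomes one JSON object in the unloaded file
COPY INTO @my_stage/unload/
  FROM (SELECT OBJECT_CONSTRUCT(*) FROM my_table)
  FILE_FORMAT = (TYPE = 'JSON');
```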
Question 38Skipped
Correct answer
Overall explanation
There are two types of partners, technology partners, and solution partners. The technology
partners are the ones that integrate their solutions with Snowflake, and they can be divided
into Data Integration, ML & Data Science, Security & Governance, Business Intelligence, SQL
Editors, and Programming Interfaces. In this case, they all belong to the Data Integration
Partners. You can see the Snowflake ecosystem in the following image:
Question 39Skipped
Standard.
Correct answer
All of them.
Enterprise.
Business Critical.
Overall explanation
MFA login is designed primarily for connecting to Snowflake through the web interface, but
it is also fully supported by SnowSQL and the Snowflake JDBC and ODBC drivers. MFA is
available in all the Snowflake editions as another security layer, and you can set it in the
settings tab:
Question 40Skipped
Which Snowflake objects are automatically created by default every time you create a
database? (Choose two.)
Correct selection
The METADATA_SCHEMA.
The DEFAULT_SCHEMA.
Correct selection
The INFORMATION_SCHEMA.
The ANALYTICS_SCHEMA.
Overall explanation
The INFORMATION_SCHEMA contains views for all the objects in the database, as well as
views for account-level objects, and table functions for historical and usage data across your
account. The PUBLIC schema is the default schema for the database, and all objects are, by
default, created inside it if no other schema is specified.
Question 41Skipped
What is generally the FASTEST way to bulk load data files from a stage?
Correct answer
Overall explanation
Of the three bulk load options for identifying/specifying data files to load from a stage,
providing a discrete list of files is generally the fastest; however, the FILES parameter
supports a maximum of 1,000 files, meaning a COPY command executed with the FILES
parameter can only load up to 1,000 files.
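For example, a sketch of providing a discrete file list (names are hypothetical):

```sql
-- FILES accepts at most 1,000 file names per COPY statement
COPY INTO my_table
  FROM @my_stage
  FILES = ('data_01.csv', 'data_02.csv');
```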
Question 42Skipped
Premier
Correct answer
Enterprise
Standard
Overall explanation
Question 43Skipped
OBJECT_HISTORY
Correct answer
ACCESS_HISTORY
LOGIN_HISTORY
VIEWS_HISTORY
Overall explanation
Querying the ACCESS_HISTORY view: this Account Usage view can be used to query the access history of Snowflake objects (e.g. table, view, column) within the last 365 days (1 year).
Question 44Skipped
In Snowflake, the use of federated authentication enables which Single Sign-On (SSO)
workflow activities? (Choose two.)
Authorizing users
Correct selection
Correct selection
Overall explanation
Question 45Skipped
Data storage for individual tables can be monitored using which commands and/or objects?
(Choose two.)
Correct selection
SHOW TABLES;
Correct selection
Overall explanation
Question 46Skipped
Which are the additional columns that the streams create? (Choose three.)
METADATA$IS_DELETED
METADATA$COLUMN_ID
Correct selection
METADATA$ACTION
Correct selection
METADATA$ROW_ID
Correct selection
METADATA$ISUPDATE
METADATA$ISREAD
Overall explanation
Question 47Skipped
What happens to the incoming queries when a warehouse does not have enough resources
to process them?
Correct answer
Queries are queued and executed when the warehouse has resources.
Overall explanation
If the warehouse does not have enough remaining resources to process a query, the query is
queued, pending resources that become available as other running queries complete.
Question 48Skipped
What is the only supported character set for loading and unloading data from all supported
file formats?
ISO-8859-1
UTF-16
WINDOWS-1253
Correct answer
UTF-8
Overall explanation
Question 49Skipped
Correct answer
Data provider
Reader account
Managed account
Data consumer
Overall explanation
A data provider is any Snowflake account that creates shares and makes them available to
other Snowflake accounts to consume.
Question 50Skipped
Schedule, Actions.
Correct answer
Overall explanation
The Credit Quota specifies the number of Snowflake credits allocated to the monitor for the
specified frequency interval.
The Monitor Level specifies whether the resource monitor is used to monitor the credit
usage for your entire account or individual warehouses.
The Schedule indicates when the monitor will start monitoring and when the credits will
reset to 0.
Each action specifies a threshold and the action to perform when the threshold is reached
within the specified interval.
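These properties can be sketched together in a single statement (the monitor and warehouse names are hypothetical):

```sql
CREATE RESOURCE MONITOR monthly_monitor WITH
  CREDIT_QUOTA = 100              -- credits allowed per interval
  FREQUENCY = MONTHLY             -- when used credits reset to 0
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 75  PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND
           ON 110 PERCENT DO SUSPEND_IMMEDIATE;

-- Monitor level: attach it to a single warehouse
ALTER WAREHOUSE my_wh SET RESOURCE_MONITOR = monthly_monitor;
```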
Question 51Skipped
Correct answer
Insert a colon (:) between the VARIANT column name and any first-level element.
Insert a double colon (::) between the VARIANT column name and any second-level
element.
Insert a colon (:) between the VARIANT column name and any second-level element.
Insert a double colon (::) between the VARIANT column name and any first-level element.
Overall explanation
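A minimal sketch of the first-level colon notation (the table and keys are hypothetical):

```sql
-- src is a VARIANT column; a single colon accesses first-level
-- elements, and dot notation descends further into the structure
SELECT src:name, src:address.city FROM customers;
```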
Question 52Skipped
Queries in Snowflake are getting queued on the warehouses and delaying the ETL processes
of the company. What are the possible solution options you can think of, considering we
have the Snowflake Enterprise edition? (Choose two.)
Correct selection
Correct selection
Overall explanation
By resizing the warehouse, your company will scale up, reducing the time to execute big
queries. Using multi-cluster warehouses, you will have more queries running simultaneously
and a high concurrency when they execute, and this is the definition of scaling out. You can
see the differences between the different ways to scale in the following picture:
Question 53Skipped
Which property needs to be added to the ALTER WAREHOUSE command to verify the
additional compute resources for a virtual warehouse have been fully provisioned?
SCALING_POLICY
RESOURCE_MONITOR
QUERY_ACCELERATION_MAX_SCALE_FACTOR
Correct answer
WAIT_FOR_COMPLETION
Overall explanation
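For example, a sketch of a blocking resize (the warehouse name is hypothetical):

```sql
-- The statement does not return until the new compute
-- resources are fully provisioned
ALTER WAREHOUSE my_wh SET
  WAREHOUSE_SIZE = 'X-LARGE'
  WAIT_FOR_COMPLETION = TRUE;
```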
Question 54Skipped
What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (...) statement? (Choose two.)
Correct selection
Correct selection
Overall explanation
Question 55Skipped
Correct answer
Per-second/per-core granularity
Overall explanation
Question 56Skipped
Manage roles.
Correct answer
Manage the data that 3rd party applications upload to the marketplace.
Manage users.
Overall explanation
You can provide and consume listings offered privately or publicly using the Snowflake
Marketplace, discovering and accessing a variety of third-party datasets. Becoming a
provider of listings in Snowflake makes it easier to manage sharing from your account to
other Snowflake accounts.
Question 57Skipped
Which Snowflake feature is used for both querying and restoring data?
Fail-safe
Correct answer
Time Travel
Cluster keys
Cloning
Overall explanation
Question 58Skipped
By default, the COPY INTO statement will separate table data into a set of output files to take
advantage of which Snowflake feature?
Correct answer
Parallel processing
Time Travel
Query acceleration
Overall explanation
By default, the COPY INTO statement will separate table data into a set of output files to take
advantage of Snowflake's parallel processing feature. This means that when data is
unloaded, it can be split into multiple files and each file can be processed simultaneously by
different nodes in the cluster, improving performance. The number of output files can be
controlled by specifying the number of file parts in the COPY INTO statement.
Question 59Skipped
How does a Snowflake user extract the URL of a directory table on an external stage for
further transformation?
Correct answer
Overall explanation
The GET_STAGE_LOCATION function returns the location of a stage, including the URL of the directory table. The syntax for the GET_STAGE_LOCATION function is:
SELECT GET_STAGE_LOCATION(@my_stage)
Question 60Skipped
Correct answer
Overall explanation
Question 61Skipped
When loading data into Snowflake via Snowpipe what is the compressed file size
recommendation?
1000-1500 MB
10-50 MB
Correct answer
100-250 MB
300-500 MB
Overall explanation
We recommend files of at least 10 MB on average, with files in the 100 to 250 MB range offering the best cost-to-performance ratio.
Question 62Skipped
What is used during the FIRST execution of SELECT COUNT(*) FROM ORDER?
Cache result
Correct answer
Metadata-based result
Overall explanation
Some queries include steps that are pure metadata/catalog operations rather than data-processing operations. These steps consist of a single operator. Some examples include:
Metadata-based Result
A query whose result is computed based purely on metadata, without accessing any data.
These queries are not processed by a virtual warehouse. For example:
SELECT CURRENT_DATABASE()
Question 63Skipped
ORGADMIN
ACCOUNTADMIN
SYSADMIN
Correct selection
USERADMIN
Correct selection
SECURITYADMIN
Overall explanation
Custom account roles can be created using the USERADMIN role (or a higher role)
Question 64Skipped
Users with the ACCOUNTADMIN role can execute which of the following commands on
existing users?
Can SHOW users, DEFINE a given user or ALTER, DROP, or MODIFY a user
Correct answer
Overall explanation
Only these operations are allowed: CREATE USER, ALTER USER, DROP USER, DESCRIBE USER.
Question 65Skipped
insertReport
Correct answer
GET /api/files/
insertFiles
loadHistoryScan
Overall explanation
Question 66Skipped
What type of account can be used to share data with a consumer who does not have a
Snowflake account?
Organization
Data provider
Data consumer
Correct answer
Reader
Overall explanation
Data sharing is only supported between Snowflake accounts. As a data provider, you might
want to share data with a consumer who does not already have a Snowflake account or is
not ready to become a licensed Snowflake customer.
To facilitate sharing data with these consumers, you can create reader accounts. Reader
accounts (formerly known as “read-only accounts”) provide a quick, easy, and cost-effective
way to share data without requiring the consumer to become a Snowflake customer.
Question 67Skipped
When after 5-6 consecutive checks the system determines that the load on the most-
loaded cluster could be redistributed.
Correct answer
When after 2-3 consecutive checks the system determines that the load on the least-
loaded cluster could be redistributed.
When after 5-6 consecutive checks the system determines that the load on the least-
loaded cluster could be redistributed.
When after 2-3 consecutive checks the system determines that the load on the most-
loaded cluster could be redistributed.
Overall explanation
Question 68Skipped
As a best practice, clustering keys should only be defined on tables of which minimum size?
Correct answer
Overall explanation
Question 69Skipped
USERADMIN
Correct answer
ACCOUNTADMIN
SYSADMIN
SECURITYADMIN
Overall explanation
ACCOUNTADMIN is the only role that is able to create Shares and Resource Monitors by
default. However, account administrators can choose to enable users with other roles to
view and modify resource monitors using SQL.
Question 70Skipped
Which of the following are not types of streams in Snowflake? (Choose two.)
Append-only.
Correct selection
Update-only.
Insert-only.
Standard.
Correct selection
Merge-only.
Overall explanation
Standard and Append-only streams are supported on tables, directory tables, and views. The Standard one tracks all DML changes to the source table, including inserts, updates, and deletes, whereas the Append-only one tracks row inserts only. The Insert-only stream also tracks row inserts only; the difference from the previous one is that it is only supported on external tables.
Question 71Skipped
Correct answer
Overall explanation
An external function is a type of UDF. Unlike other UDFs, an external function does not
contain its own code; instead, the external function calls code that is stored and executed
outside Snowflake
Question 72Skipped
Which statements are NOT correct about micro-partitions in Snowflake? (Choose two.)
Correct selection
Correct selection
Overall explanation
This definition is a must, and we need to know it perfectly “All data in Snowflake tables are
automatically divided into micro-partitions, which are contiguous units of storage between
50 and 500MB of uncompressed data, organized in a columnar way”.
Question 73Skipped
Correct answer
Use a stored procedure executing multiple SQL statements and invoke the stored
procedure from the task. CREATE TASK mytask .... AS call
stored_proc_multiple_statements_inside();
Create a task for each SQL statement (e.g. resulting in task1, task2, etc.) and string the
series of SQL statements by having a control task calling task1, task2, etc. sequentially.
Include the SQL statements in the body of the task CREATE TASK mytask .. AS INSERT INTO
target1 SELECT .. FROM stream_s1 WHERE .. INSERT INTO target2 SELECT .. FROM
stream_s1 WHERE ..
A stored procedure can have only one DML statement per stored procedure invocation
and therefore the user should sequence stored procedure calls in the task
definition CREATE TASK mytask .... AS call stored_proc1(); call stored_proc2();
Overall explanation
Question 74Skipped
Which of the following commands are valid options for the VALIDATION_MODE parameter
within the Snowflake COPY_INTO command? (Choose two.)
TRUE
Correct selection
RETURN_ALL_ERRORS
RETURN_FIRST_n_ERRORS
Correct selection
RETURN_n_ROWS
RETURN_ERROR_SUM
Overall explanation
Question 75Skipped
Correct selection
Schema
Virtual warehouse
Correct selection
Database
Table
Account
Overall explanation
Question 76Skipped
What is the most granular object that the Time Travel retention period can be defined on?
Database
Account
Correct answer
Table
Schema
Overall explanation
The time travel data retention can be overwritten at the table level "When creating a table,
schema, or database, the account default can be overridden using the
DATA_RETENTION_TIME_IN_DAYS parameter in the command."
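A sketch of overriding the retention period at the table level (the table name is hypothetical; values above 1 require Enterprise Edition or higher):

```sql
-- Set Time Travel retention at creation time
CREATE TABLE orders (id NUMBER)
  DATA_RETENTION_TIME_IN_DAYS = 30;

-- Or change it later
ALTER TABLE orders SET DATA_RETENTION_TIME_IN_DAYS = 90;
```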
Question 77Skipped
What are the responsibilities of Snowflake's Cloud Service layer? (Choose three.)
Correct selection
Correct selection
Authentication
Query execution
Correct selection
Metadata management
Overall explanation
The cloud services layer is a collection of services that coordinate activities across Snowflake.
These services tie together all of the different components of Snowflake in order to process
user requests, from login to query dispatch. The cloud services layer also runs on compute
instances provisioned by Snowflake from the cloud provider.
• Authentication
• Infrastructure management
• Metadata management
• Access control
Question 78Skipped
Which of the following are best practices for loading data into Snowflake? (Choose three.)
Correct selection
Correct selection
Split large files into a greater number of smaller files to distribute the load among the
compute resources in an active warehouse.
Load data from files in a cloud storage service in a different region or cloud platform from
the service or region containing the Snowflake account, to save on cost.
Partition the staged data into large folders with random paths, allowing Snowflake to
determine the best way to load each file.
Correct selection
Aim to produce data files that are between 100 MB and 250 MB in size, compressed.
When planning which warehouse(s) to use for data loading, start with the largest
warehouse possible.
Overall explanation
To optimize the number of parallel operations for a load, we recommend aiming to produce
data files roughly 100-250 MB (or larger) in size compressed.
Fields that contain delimiter characters should be enclosed in quotes (single or double). If
the data contains single or double quotes, then those quotes must be escaped.
Split larger files into a greater number of smaller files to distribute the load among the
compute resources in an active warehouse.
Question 79Skipped
Masking policies
Object tags
Correct answer
OBJECT_DEPENDENCIES view
Overall explanation
Object tags, Masking Policies and Row Access Policies are for Enterprise and above editions.
The only option remaining is Object Dependencies
Question 80Skipped
Which of the following accurately represents how a table fits into Snowflake's logical
container hierarchy?
Correct answer
Overall explanation
Question 81Skipped
Which of the following are characteristics of Snowflake virtual warehouses? (Choose two.)
Correct selection
SnowSQL supports both a configuration file and a command line option for specifying a
default warehouse.
Correct selection
A user cannot specify a default warehouse when using the ODBC driver.
Auto-resume applies only to the last warehouse that was started in a multi-cluster
warehouse.
Overall explanation
Question 82Skipped
Which are the metadata columns for staged files? (Choose two.)
Correct selection
METADATA$FILE_ROW_NUMBER
Correct selection
METADATA$FILENAME
METADATA$FILE_SIZE
METADATA$FILE_ROW_ID
METADATA$FILEFORMAT
Overall explanation
The METADATA$FILENAME column is the name of the staged data file that the current row belongs to. METADATA$FILE_ROW_NUMBER is the row number for each record in the staged data file. This is a way of querying the stage metadata:
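A hedged sketch of such a query (the stage and file format names are hypothetical):

```sql
-- Query metadata columns alongside the file contents
SELECT METADATA$FILENAME, METADATA$FILE_ROW_NUMBER, t.$1, t.$2
FROM @my_stage (FILE_FORMAT => 'my_csv_format') t;
```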
You can see another example (via docs.snowflake.com) in the following image:
Question 83Skipped
What setting in Snowsight determines the databases, tables, and other objects that can be
seen and the actions that can be performed on them?
Masking policy
Correct answer
Active role
Column-level security
Overall explanation
While using Snowsight, you can change the active role in your current session. Your active
role determines the databases, tables, and other objects you can see and the actions you
can perform on them.
Question 84Skipped
The use of which Snowflake table type will reduce costs when working with ETL workflows?
Permanent
Correct answer
Temporary
Transient
External
Overall explanation
Snowflake supports creating temporary tables for storing non-permanent, transitory data
(e.g. ETL data, session-specific data). Temporary tables only exist within the session in which
they were created and persist only for the remainder of the session.
Question 85Skipped
Correct answer
Overall explanation
Question 86Skipped
After how many days does the COPY INTO load metadata expire?
Correct answer
64 days.
1 day.
180 days.
14 days.
Overall explanation
The information about the loaded files is stored in Snowflake metadata. This means the same file will not be loaded again within the next 64 days unless you explicitly force it (with the FORCE = TRUE option in the COPY command). You can see this behavior in the following image:
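Independently of the image, a sketch of forcing a reload (the stage and table names are hypothetical):

```sql
-- Reloads files even if they were already loaded
-- within the last 64 days
COPY INTO my_table
  FROM @my_stage
  FORCE = TRUE;
```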
Question 87Skipped
(
EMP_ID NUMBER,
EMP_NAME VARCHAR(30),
EMP_SALARY NUMBER,
DEPT VARCHAR(20)
);
Correct answer
Overall explanation
A warehouse provides the required resources, such as CPU, memory, and temporary storage,
to perform the following operations in a Snowflake session:
Executing SQL SELECT statements that require compute resources (e.g. retrieving rows from
tables and views).
Queries using Snowflake metadata do not require a Warehouse to be turned on and do not
consume credits.
Question 88Skipped
“%”
“/@”
“@”
Correct answer
“@%”
Overall explanation
Table Stage
The following example uploads a file named data.csv in the /data directory on your local
machine to the stage for a table named mytable.
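The upload described above can be sketched as follows (the local path is hypothetical; @% references the table stage):

```sql
-- Linux/macOS path syntax; on Windows use e.g. file://C:/data/data.csv
PUT file:///data/data.csv @%mytable;
```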
Question 89Skipped
Correct answer
Overall explanation
When you create a multi-cluster warehouse, you need to specify a scaling policy to help you control the credits consumed by the multi-cluster warehouse. There are two types of policies: the "Standard policy" prioritizes starting additional clusters over conserving credits, while the "Economy policy" is more restrictive and prioritizes conserving credits over starting additional clusters. You can set the scaling policy by executing the CREATE WAREHOUSE or ALTER WAREHOUSE command specifying the SCALING_POLICY. For example:
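A sketch of the command referenced above (the warehouse name is hypothetical):

```sql
CREATE WAREHOUSE reporting_wh WITH
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'ECONOMY';  -- or 'STANDARD'
```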
Question 90Skipped
Results cache.
Metadata cache.
Correct answer
Warehouse cache.
Standard cache.
Overall explanation
Every warehouse has an attached Warehouse cache, also known as the SSD or Local cache. While the warehouse runs, table data fetched by queries remains in this cache. When the warehouse is suspended, the cache is lost.
Question 91Skipped
What privilege does a user need in order to receive or request data from the Snowflake
Marketplace?
Correct answer
IMPORT SHARE
CREATE SHARE
IMPORTED PRIVILEGES
Overall explanation
You must use the ACCOUNTADMIN role or another role with the CREATE DATABASE and
IMPORT SHARE privileges to access a listing.
Question 92Skipped
Which SQL command should be used to validate which data was loaded into the stage?
verify @file_stage
Correct answer
list @file_stage
view @file_stage
show @file_stage
Overall explanation
Question 93Skipped
Which Snowflake object returns a set of rows instead of a single, scalar value, and can be
accessed in the FROM clause of a query?
Correct answer
UDTF
UDF
Stored procedure
Overall explanation
User-defined functions (UDFs) let you extend the system to perform operations that are not
available through Snowflake’s built-in, system-defined functions.
UDTFs return a set of rows rather than a single scalar value, and can therefore be referenced in the FROM clause; that is the main difference from UDFs.
Question 94Skipped
By default, how long is the standard retention period for Time Travel across all Snowflake
accounts?
0 days
90 days
Correct answer
1 day
7 days
Overall explanation
By default, Time travel is enabled with a 1-day retention period. However, we can increase it
to 90 days if we have (at least) the Snowflake Enterprise Edition. It requires additional
storage, which will be reflected in your monthly storage charges. You can see how the Time
Travel functionality works in the following image:
Question 95Skipped
Correct answer
It was designed as a hybrid database to allow customers to store data either on premises
or in the cloud
Overall explanation
Question 96Skipped
Which functions can be used to share unstructured data through a secure view? (Choose
two.)
BUILD_STAGE_FILE_URL
GET_ABSOLUTE_PATH
Correct selection
BUILD_SCOPED_FILE_URL
Correct selection
GET_PRESIGNED_URL
GET_RELATIVE_PATH
Overall explanation
Question 97Skipped
Which of the following objects can be shared through secure data sharing?
Stored procedure
Masking policy
Task
Correct answer
External table
Overall explanation
• Tables
• External tables
• Secure views
• Secure UDFs
Question 98Skipped
Why would a Snowflake user decide to use a materialized view instead of a regular view?
Correct answer
Overall explanation
Question 99Skipped
What is the main function of Business Intelligence tools (e.g., Tableau or Quicksight)?
Transform data from other source systems and move them into Snowflake stages.
Extract data from other source systems and move them into Snowflake stages.
Correct answer
Overall explanation
Business Intelligence (BI) tools are software applications that enable organizations to collect,
analyze, and present data in a user-friendly way to make better business decisions. BI tools
can help organizations identify trends, patterns, and insights in their data, which can
optimize business processes, improve performance, and gain a competitive advantage. You
can create charts like this one using Tableau from Snowflake data. You should create views to
be able to use VARIANT columns in these types of tools, which are licensed separately.
You can see an example of a Tableau chart in the following image:
Question 100Skipped
How long is the Fail-safe period for temporary and transient tables?
90 days
31 days
7 days
Correct answer
1 day
Overall explanation
Question 101Skipped
IMPORT SHARE
REFERENCES
USAGE
Correct answer
OWNERSHIP
Overall explanation
OWNERSHIP.
The definition of a secure view is only exposed to authorized users (i.e. users who have been
granted the role that owns the view).
Question 102Skipped
The Snowflake Cloud Data Platform is described as having which of the following
architectures?
Shared-nothing
Correct answer
Shared-disk
Overall explanation
Question 103Skipped
What is the lowest Snowflake edition that offers Time Travel up to 90 days?
Correct answer
Enterprise Edition
Premier Edition
Standard Edition
Overall explanation
Question 104Skipped
Which SQL command will download all the data files from an internal table stage named TBL_EMPLOYEE to a local Windows directory named "folder with space" on the C drive of a client machine?
Correct answer
Overall explanation
If the directory path includes special characters, the entire file URI must be enclosed in
single quotes. Note that the drive and path separator is a forward slash (/) in enclosed URIs
(e.g. 'file://C:/temp/load data' for a path in Windows that includes a directory named load
data).
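The command in question can be sketched as follows (the quoting and forward slashes follow the rule above):

```sql
-- Single quotes are required because the path contains a space
GET @%TBL_EMPLOYEE 'file://C:/folder with space/';
```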
Question 105Skipped
45 days
Correct answer
7 days
90 days
1 day
Overall explanation
Question 106Skipped
Correct selection
Solution Partners.
Personalized Partners.
Private Partners.
Standard Partners.
Correct selection
Technology Partners.
Overall explanation
Technology Partners integrate their solutions with Snowflake to get data quickly into
Snowflake and offer software, driver, interfaces, etc. Solution Partners are trusted and
validated experts, like consulting partners. You can see the whole Snowflake ecosystem in
the following image:
Question 107Skipped
Which function will provide the proxy information needed to protect Snowsight?
Correct answer
SYSTEM$ALLOWLIST
SYSTEM$GET_PRIVATELINK
SYSTEM$GET_TAG
SYSTEM$AUTHORIZE_PRIVATELINK
Overall explanation
To determine the fully qualified URL and port for Snowsight, review the
SNOWSIGHT_DEPLOYMENT entry in the return value of the SYSTEM$ALLOWLIST function.
Question 108Skipped
Which validation option is the only one that supports the COPY INTO (location) command?
RETURN__ROWS
RETURN_ERRORS
Correct answer
RETURN_ROWS
RETURN_ALL_ERRORS
Overall explanation
• Loading:
• Unloading:
Question 109Skipped
Correct answer
Overall explanation
Question 110Skipped
When reviewing a query profile, what is a symptom that a query is too large to fit into
memory?
Correct answer
A single join node uses more than 50% of the query time
Overall explanation
Question 111Skipped
When unloading the data for file format type specified (TYPE = 'CSV'), SQL NULL can be
converted to string ‘null’ using which file format option?
SKIP_BYTE_ORDER_MARK
Correct answer
NULL_IF
EMPTY_FIELD_AS_NULL
ESCAPE_UNENCLOSED_FIELD
Overall explanation
When unloading data from tables: Snowflake converts SQL NULL values to the first value in
the list. Be careful to specify a value that you want interpreted as NULL. For example, if you
are unloading data to a file that will get read by another system, make sure to specify a value
that will be interpreted as NULL by that system.
Question 112Skipped
What is the MINIMUM amount of time that the warehouse will incur charges for when it is
restarted?
Correct answer
60 seconds
60 minutes
5 minutes
1 second
Overall explanation
The minimum billing charge for provisioning compute resources is 1 minute (i.e. 60 seconds).
Question 113Skipped
Tasks are created at the Application level and can only be created by the Account Admin
role.
Many Snowflake DDLs are metadata operations only, and CREATE TASK DDL can be
executed without virtual warehouse requirement or task specific grants.
Correct answer
The role must have access to the target schema and the CREATE TASK privilege on the
schema itself.
Overall explanation
Question 114Skipped
10
Correct answer
Unlimited
Overall explanation
Question 115Skipped
By user.
Correct selection
By timestamp.
By backup.
By session.
Correct selection
Correct selection
By offset.
Overall explanation
Querying over historical data is one of the main functionalities of Snowflake Time Travel,
apart from restoring deleted objects. With Snowflake, you don’t need to duplicate or back
up data from key points in the past. Here you have an example of querying the historical
data by query statement ID (the ID shown is a placeholder):
1. SELECT *
2. FROM my_table
3.   BEFORE(STATEMENT => '<query_id>');
Question 116Skipped
What parameter controls if the Virtual Warehouse starts immediately after the CREATE
WAREHOUSE statement?
START_TIME = CURRENT_DATE()
START_AFTER_CREATE = TRUE/FALSE
Correct answer
INITIALLY_SUSPENDED = TRUE/FALSE
Overall explanation
Syntax
[ [ WITH ] objectProperties ]
[ objectParams ]
Where:
objectProperties ::=
MAX_CLUSTER_COUNT = <num>
MIN_CLUSTER_COUNT = <num>
RESOURCE_MONITOR = <monitor_name>
COMMENT = '<string_literal>'
QUERY_ACCELERATION_MAX_SCALE_FACTOR = <num>
Question 117Skipped
What is the default File Format used in the COPY command if one is not specified?
XML
Parquet
JSON
Correct answer
CSV
Overall explanation
Question 118Skipped
Correct answer
Overall explanation
To add, configure, or remove search optimization for a table, you must have the following
privileges:
• You must have ADD SEARCH OPTIMIZATION privilege on the schema that contains
the table.
Question 119Skipped
Which data formats are supported by Snowflake when unloading semi-structured data?
(Choose two.)
Comma-separated JSON
Correct selection
Correct selection
Overall explanation
Question 120Skipped
Which formats are supported for unloading data from Snowflake? (Choose two.)
Correct selection
JSON
Correct selection
XML
ORC
Avro
Overall explanation
Question 121Skipped
Snowpark
Snowsight
Correct answer
SnowCD
SnowSQL
Overall explanation
SnowCD (i.e. Snowflake Connectivity Diagnostic Tool) helps users to diagnose and
troubleshoot their network connection to Snowflake.
Question 122Skipped
Which data types in Snowflake are synonymous for FLOAT? (Choose two.)
Correct selection
DOUBLE
DECIMAL
NUMERIC
NUMBER
Correct selection
REAL
Overall explanation
Question 123Skipped
Avro
XML
Correct answer
XLSX
ORC
Overall explanation
A File Format object describes and stores the format information required to load data into
Snowflake tables. CSV, JSON, Parquet, XML, Avro & ORC are the supported Snowflake file
formats. To import XLSX, you'll have to convert it to CSV first.
Question 124Skipped
How many warehouses do you need to run to have Snowpipe constantly running?
10
100
Correct answer
Overall explanation
Snowpipe enables loading data when the files are available in any (internal/external) stage.
You use it when you have a small volume of frequent data, and you load it continuously
(micro-batches). Snowpipe is serverless, which means that it doesn’t use Virtual
Warehouses. You can see how Snowpipe works in the following diagram:
Question 125Skipped
If you want a multi-cluster warehouse, which is the lowest Snowflake edition that you should
opt for?
Correct answer
Enterprise.
Business Critical.
Standard.
Overall explanation
You can see some differences between the Snowflake editions in the following image:
Question 126Skipped
For which use cases is running a virtual warehouse required? (Choose two.)
Correct selection
Correct selection
Overall explanation
Executing SQL SELECT statements that require compute resources (e.g. retrieving rows from
tables and views).
Question 127Skipped
Snowflake best practice recommends that which role be used to enforce a network policy on
a Snowflake account?
SYSADMIN
ACCOUNTADMIN
Correct answer
SECURITYADMIN
USERADMIN
Overall explanation
Question 128Skipped
A tabular User-Defined Function (UDF) is defined by specifying a return clause that contains
which keyword?
ROW_NUMBER
VALUES
TABULAR
Correct answer
TABLE
Overall explanation
RETURNS TABLE(...)
Specifies that the UDF should return a table. Inside the parentheses, specify name-and-type
pairs for columns (as described below) to include in the returned table.
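As a minimal sketch of a tabular UDF (the orders table and its columns are hypothetical):

```sql
-- A SQL UDTF: RETURNS TABLE(...) declares the output columns.
CREATE OR REPLACE FUNCTION orders_for(cust VARCHAR)
  RETURNS TABLE (order_id NUMBER, amount NUMBER)
  AS
  $$
    SELECT order_id, amount
    FROM orders
    WHERE customer_name = cust
  $$;

-- A tabular UDF is queried through the TABLE() keyword:
SELECT * FROM TABLE(orders_for('ACME'));
```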
Question 129Skipped
Correct answer
Overall explanation
By default, each user and table in Snowflake is automatically allocated an internal stage for
staging data files to be loaded. In addition, you can create named internal stages.
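The three kinds of internal stage mentioned above can be sketched as follows (file paths and object names are illustrative):

```sql
-- User stage: allocated automatically, addressed with @~
PUT file:///tmp/data.csv @~;

-- Table stage: allocated automatically per table, addressed with @%
PUT file:///tmp/data.csv @%my_table;

-- Named internal stage: created explicitly, addressed with @<name>
CREATE STAGE my_stage;
PUT file:///tmp/data.csv @my_stage;
```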
Question 130Skipped
Which of the following are characteristics of schemas used in Snowflake? (Choose two.)
Correct selection
Correct selection
Overall explanation
• A schema is a logical grouping of database objects (tables, views, etc.). Each schema
belongs to a single database.
Question 131Skipped
How do Snowflake data providers share data that resides in different databases?
Materialized views
Correct answer
Secure views
External tables
Overall explanation
Snowflake data providers can share data that resides in different databases by using secure
views. A secure view can reference objects such as schemas, tables, and other views from
one or more databases, as long as these databases belong to the same account.
Question 132Skipped
Role
Session
Correct answer
Account
User
Overall explanation
Question 133Skipped
Correct answer
Overall explanation
To restore objects, we use the command “UNDROP”. We can use it with Databases,
Schemas, or Tables. If we try to restore an object with a name that already exists, Snowflake
will give an error.
Question 134Skipped
Which of the following statements would be used to export/unload data from Snowflake?
EXPORT TO @stage
Correct answer
Overall explanation
Question 135Skipped
What does Snowflake recommend a user do if they need to connect to Snowflake with a tool
or technology that is not listed in Snowflake’s partner ecosystem?
Correct answer
Overall explanation
Question 136Skipped
How does the ACCESS_HISTORY view enhance overall data governance pertaining to read
and write operations? (Choose two.)
Protects sensitive data from unauthorized access while allowing authorized users to access
it at query runtime
Identifies columns with personal information and tags them so masking policies can be
applied to protect sensitive data
Correct selection
Provides a unified picture of what data was accessed and when it was accessed
Determines whether a given row in a table can be accessed by the user by filtering the
data based on a given policy
Correct selection
Shows how the accessed data was moved from the source to the target objects
Overall explanation
Question 137Skipped
When a database is cloned, which objects in the clone inherit all granted privileges from the
source object? (Choose two.)
Database
Correct selection
Schemas
Account
Correct selection
Tables
Overall explanation
Only child objects inherit the privileges. If the source object is a database or schema, the
clone inherits all granted privileges on the clones of all child objects contained in the source
object:
Question 138Skipped
What is the maximum total Continuous Data Protection (CDP) charges incurred for a
temporary table?
7 days
30 days
48 hours
Correct answer
24 hours
Overall explanation
Thus, the maximum total CDP charges incurred for a temporary table are 1 day (or less if the
table is explicitly dropped or dropped as a result of terminating the session). During this
period, Time Travel can be performed on the table.
Question 139Skipped
Which query contains a Snowflake hosted file URL in a directory table for a stage named
bronzestage?
Correct answer
1. list @bronzestage;
Overall explanation
Question 140Skipped
Role
Organization
Account
Correct answer
User
Overall explanation
Question 141Skipped
When floating-point number columns are unloaded to CSV or JSON files, Snowflake
truncates the values to approximately what?
Correct answer
(15,9)
(14,8)
(10,4)
(12,2)
Overall explanation
Question 142Skipped
What versions of Snowflake should be used to manage compliance with Personal Identifiable
Information (PII) requirements? (Choose two.)
Correct selection
Correct selection
Custom Edition
Enterprise Edition
Standard Edition
Overall explanation
Question 143Skipped
Which command should be used to download files from a Snowflake stage to a local folder
on a client's machine?
COPY
PUT
SELECT
Correct answer
GET
Overall explanation
Question 144Skipped
What are the primary authentication methods that Snowflake supports for securing REST API
interactions? (Choose two.)
Federated authentication
Correct selection
OAuth
Correct selection
Overall explanation
Snowflake supports the following methods of authentication while using External API
Authentication:
• Basic authentication.
Question 145Skipped
A user has semi-structured data to load into Snowflake but is not sure what types of
operations will need to be performed on the data.
Based on this situation, what type of column does Snowflake recommend be used?
OBJECT
ARRAY
TEXT
Correct answer
VARIANT
Overall explanation
Snowflake natively supports semi-structured data, which means semi-structured data can be
loaded into relational tables without requiring the definition of a schema in advance.
Snowflake supports loading semi-structured data directly into columns of type VARIANT.
Typically, tables used to store semi-structured data consist of a single VARIANT column. Once
the data is loaded, you can query the data similar to structured data. You can also perform
other tasks, such as extracting values and objects from arrays. For more information, see the
FLATTEN table function.
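A minimal sketch of the pattern described above (the stage and file names are hypothetical):

```sql
-- A single VARIANT column holds the semi-structured records.
CREATE TABLE raw_json (v VARIANT);

COPY INTO raw_json
  FROM @my_stage/records.json
  FILE_FORMAT = (TYPE = 'JSON');

-- Elements are then queried with path notation and a cast.
SELECT v:customer.name::STRING AS customer_name
FROM raw_json;
```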
Question 146Skipped
Correct selection
HIPAA.
Correct selection
FedRAMP.
SC-900.
Correct selection
PCI-DSS.
ISO 9000.
Overall explanation
They won't ask you in-depth questions about this topic in the exam, but it's important to
remember some of the most important ones. You can see other certifications at the
following link.
Question 147Skipped
Which of the following view types are available in Snowflake? (Choose two.)
External view
Correct selection
Materialized view
Layered view
Correct selection
Secure view
Embedded view
Overall explanation
Question 148Skipped
Which command should be used to load data from a file, located in an external stage, into a
table in Snowflake?
GET
PUT
Correct answer
COPY
INSERT
Overall explanation
Question 149Skipped
1000
Correct answer
100
Overall explanation
A child task used to be limited to a single predecessor task. Since September 2022,
Snowflake also supports DAGs of tasks. In a DAG, each non-root task can have dependencies
on multiple predecessor tasks, raising the previous limit to 100 predecessors.
Question 150Skipped
What are the correct settings for column and element names, regardless of which notation is
used while accessing elements in a JSON object?
Both the column name and the element name are case-insensitive.
Both the column name and the element name are case-sensitive.
The column name is case-sensitive and the element names are case-insensitive.
Correct answer
Overall explanation
Regardless of which notation you use, the column name is case-insensitive but element
names are case-sensitive. For example, in the following list, the first two paths are
equivalent, but the third is not:
src:salesperson.name
SRC:salesperson.name
SRC:Salesperson.Name
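Put into runnable form (assuming a car_sales table with a VARIANT column named SRC):

```sql
-- Equivalent: the column name is case-insensitive.
SELECT src:salesperson.name FROM car_sales;
SELECT SRC:salesperson.name FROM car_sales;

-- Not equivalent: element names are case-sensitive, so
-- Salesperson.Name does not match salesperson.name.
SELECT SRC:Salesperson.Name FROM car_sales;
```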
Question 151Skipped
Correct answer
Overall explanation
• A materialized view.
• A non-materialized view.
and more!
Question 152Skipped
Correct selection
EXISTS, ANY / ALL, and IN subqueries in WHERE clauses: these subqueries can be
correlated or uncorrelated
Correct selection
Uncorrelated scalar subqueries in any place that a value expression can be used
EXISTS, ANY / ALL, and IN subqueries in WHERE clauses: these subqueries can be
uncorrelated only
EXISTS, ANY / ALL, and IN subqueries in WHERE clauses: these subqueries can be
correlated only
Overall explanation
• Uncorrelated scalar subqueries in any place that a value expression can be used.
• EXISTS, ANY / ALL, and IN subqueries in WHERE clauses. These subqueries can be
correlated or uncorrelated.
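The two supported subquery forms can be sketched as follows (the employees and departments tables are hypothetical):

```sql
-- Uncorrelated scalar subquery used as a value expression.
SELECT name,
       salary / (SELECT AVG(salary) FROM employees) AS salary_ratio
FROM employees;

-- Correlated EXISTS subquery in a WHERE clause.
SELECT d.name
FROM departments d
WHERE EXISTS (SELECT 1 FROM employees e WHERE e.dept_id = d.id);
```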
Question 153Skipped
Which command line flags can be used to log into a Snowflake account using SnowSQL?
(Choose two.)
-c
-e
Correct selection
-a
-o
Correct selection
-d
Overall explanation
-a, --accountname TEXT
Question 154Skipped
Correct selection
The VALIDATION_MODE parameter supports COPY statements that transform data during
a load
The VALIDATION_MODE option will validate data to be loaded by the COPY statement
while completing the load and will return the rows that could not be loaded without error
Correct selection
The VALIDATION_MODE option will validate data to be loaded by the COPY statement
without completing the load and will return possible errors
The VALIDATION_MODE parameter supports COPY statements that load data from
external stages only
Overall explanation
Question 155Skipped
The MAXIMUM size for a serverless task run is equivalent to what size virtual warehouse?
Large
Medium
4X-Large
Correct answer
2X-Large
Overall explanation
The maximum size for a serverless task run is equivalent to an XXLARGE warehouse.
SET 3
Question 1Skipped
Using the COPY INTO <location> command, to which location is it not possible to unload
data from a table?
Named external stage that references an external location (Amazon S3, Google Cloud
Storage, or Microsoft Azure).
Correct answer
Local Drive.
Overall explanation
Once the data is in the internal stage, you can download it to your local drive using the
GET command. You can also unload data into an external location, as shown in the
following image:
Question 2Skipped
How does Snowflake improve the performance of queries that are designed to filter out a
significant amount of data?
Correct answer
Overall explanation
Question 3Skipped
A team runs the same query daily, generally with a frequency of fewer than 24 hours, and it
takes around 10 minutes to execute. They realized that the underlying data changes because
of an ETL process that runs every morning.
How can they use the results cache to save the 10 minutes that the query is being executed?
Correct answer
After the ETL run, execute the identical queries so that the result remains in the cache.
After the ETL run, increase the warehouse size. Decrease it after the query runs.
After the ETL run, copy the tables to another database for the team to query.
Overall explanation
In this case, because the underlying data changes every morning due to the ETL process, the
results cache may not be useful for the daily query execution. However, suppose the team
executes an identical query immediately after the ETL process runs. In that case, the results
of that query will be stored in the results cache and can be retrieved for subsequent queries.
By doing so, the team can save the 10 minutes that the query is being executed by retrieving
the results from the cache.
Question 4Skipped
Which COPY INTO command outputs the data into one file?
MULTIPLE=FALSE
MAX_FILE_NUMBER=1
FILE_NUMBER=1
Correct answer
SINGLE=TRUE
Overall explanation
Question 5Skipped
What privileges are necessary for a consumer in the Data Exchange to make a request and
receive data? (Choose two.)
Correct selection
IMPORT SHARE
Correct selection
CREATE DATABASE
REFERENCE_USAGE
USAGE
OWNERSHIP
Overall explanation
To access a listing, you must use the ACCOUNTADMIN role or another role with the CREATE
DATABASE and IMPORT SHARE privileges.
Question 6Skipped
While loading data through the COPY command, you can transform the data. Which of the
following operations is NOT supported?
Reorder columns.
Omit columns.
Correct answer
Filters.
Cast.
Truncate columns.
Overall explanation
Question 7Skipped
Which of the below columns will you consider while choosing a cluster key? (Choose two.)
Correct selection
Correct selection
Overall explanation
A column with very low cardinality (e.g., a column indicating only whether a person is male
or female) might yield minimal pruning. On the other hand, a column with very high
cardinality (e.g., a column containing UUID or nanosecond timestamp values) is also typically
not a good candidate to use directly as a clustering key.
Question 8Skipped
A user created a transient table and made several changes to it over the course of several
days. Three days after the table was created, the user would like to go back to the first
version of the table.
Use the FAIL_SAFE parameter for Time Travel to retrieve the data from Fail-safe storage.
Contact Snowflake Support to have the data retrieved from Fail-safe storage.
Correct answer
Overall explanation
Question 9Skipped
What Snowflake database object is derived from a query specification, stored for later use,
and can speed up expensive aggregation on large data sets?
Secure view
External table
Temporary table
Correct answer
Materialized view
Overall explanation
Simple but very frequent question in exams. Materialized views are an important topic.
Question 10Skipped
What option will you specify to delete the stage files after a successful load into a Snowflake
table with the COPY INTO command?
Correct answer
1. PURGE = TRUE
1. TRUNCATE = TRUE
1. REMOVE = TRUE
1. DELETE = TRUE
Overall explanation
If the PURGE option is set to TRUE, Snowflake makes a best effort to remove successfully
loaded data files from the stage. If the purge operation fails for any reason, no error is
currently returned.
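A sketch of the option in context (the stage and table names are illustrative):

```sql
COPY INTO my_table
  FROM @my_stage/data/
  FILE_FORMAT = (TYPE = 'CSV')
  PURGE = TRUE;  -- best-effort removal of files after a successful load
```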
Question 11Skipped
What COPY INTO SQL command should be used to unload data into multiple files?
SINGLE=TRUE
MULTIPLE=TRUE
Correct answer
SINGLE=FALSE
MULTIPLE=FALSE
Overall explanation
Question 12Skipped
Correct answer
Overall explanation
The count() is one of the operations that is resolved in the Metadata Layer.
Question 13Skipped
When unloading data with the COPY INTO command, what is the purpose of the PARTITION
BY parameter option?
Correct answer
To split the output into multiple files, one for each distinct value of the specified
expression.
To delimit the records in the output file using the specified expression.
To include a new column in the output using the specified window function expression.
Overall explanation
The PARTITION BY copy option accepts an expression by which the unload operation
partitions table rows into separate files unloaded to the specified stage.
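A sketch of a partitioned unload (table, stage, and column names are hypothetical):

```sql
-- One set of output files per distinct value of the expression.
COPY INTO @my_stage/unload/
  FROM (SELECT dt, amount FROM sales)
  PARTITION BY ('date=' || TO_VARCHAR(dt, 'YYYY-MM-DD'))
  FILE_FORMAT = (TYPE = 'CSV');
```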
Question 14Skipped
Correct answer
“@~”
“@%”
“@”
“~”
Overall explanation
Each user has a Snowflake personal stage allocated to them by default for storing files, and
no one can access it except the user it belongs to. It is represented with the "@~"
character. In the following example, we upload the file "myfile.csv" to the current user's
stage:
1. PUT file://C:\data\myfile.csv @~
Question 15Skipped
By default, which role allows a user to manage a Snowflake Data Exchange share?
USERADMIN
SYSADMIN
Correct answer
ACCOUNTADMIN
SECURITYADMIN
Overall explanation
By default, the privileges required to create and manage shares are granted only to the
ACCOUNTADMIN role, ensuring that only account administrators can perform these tasks.
Question 16Skipped
What is the minimum Snowflake edition that you need for the Data Sharing capability?
Business Critical
Enterprise
Correct answer
Standard
Overall explanation
Secure Data Sharing lets you share selected objects in a database in your account with other
Snowflake accounts. All the data-sharing features are available for these three types of
editions.
Question 17Skipped
Based on Snowflake recommendations, when creating a hierarchy of custom roles, the top-
most custom role should be assigned to which role?
SECURITYADMIN
Correct answer
SYSADMIN
USERADMIN
ACCOUNTADMIN
Overall explanation
Question 18Skipped
The effects of query pruning can be observed by evaluating which statistics? (Choose two.)
Bytes scanned
Bytes written
Correct selection
Partitions scanned
Correct selection
Partitions total
Overall explanation
Question 19Skipped
Which command is used to unload data from a table or move a query result to a stage?
GET
PUT
MERGE
Correct answer
COPY INTO
Overall explanation
Use the COPY INTO <location> command to copy the data from the Snowflake database
table into one or more files in a Snowflake or external stage.
Question 20Skipped
What computer language can be selected when creating User-Defined Functions (UDFs)
using the Snowpark API?
Swift
Correct answer
Python
JavaScript
SQL
Overall explanation
Question 21Skipped
Which command can be used to list all network policies available in an account?
Correct answer
Overall explanation
Question 22Skipped
Which command will you run to list all users and roles to which a role has been granted?
Correct answer
Overall explanation
“SHOW GRANTS OF ROLE” will list the users, whereas “SHOW GRANTS TO ROLE” will list the
privileges to which this role has access.
Here you can see an example of running the command in my Snowflake account:
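The two commands side by side (the analyst role is illustrative):

```sql
-- Users and roles to which the role has been granted.
SHOW GRANTS OF ROLE analyst;

-- Privileges that have been granted to the role.
SHOW GRANTS TO ROLE analyst;
```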
Question 23Skipped
Which Snowflake edition supports private communication between Snowflake and your
other VPCs through AWS PrivateLink?
All Snowflake editions supports private communication between Snowflake and your
other VPCs through AWS PrivateLink.
Standard.
Enterprise.
Correct answer
Business Critical.
Overall explanation
AWS PrivateLink is an AWS service for creating private VPC endpoints that allow direct,
secure connectivity between your AWS VPCs and the Snowflake VPC without traversing the
public Internet. This feature requires the Business Critical edition or higher.
You can see the differences between the Snowflake editions in the following image:
Question 24Skipped
Which of the following features, associated with Continuous Data Protection (CDP), require
additional Snowflake-provided data storage? (Choose two.)
Data encryption
Tri-Secret Secure
Correct selection
Fail-safe
Correct selection
Time Travel
External stages
Overall explanation
Both Time Travel and Fail-safe require additional data storage, which has associated fees.
Question 25Skipped
Which statements are correct concerning the leveraging of third-party data from the
Snowflake Data Marketplace? (Choose two.)
Data transformations are required when combining Data Marketplace datasets with
existing data in Snowflake.
Correct selection
Correct selection
Overall explanation
Data in the Snowflake Data Marketplace is already formatted and ready to query, and can be
personalized for specific business needs.
Data from the Snowflake Data Marketplace is accessed through Snowflake's Secure Data
Sharing technology, which allows users to access the data without copying or moving it to
their own account.
Loading data into a cloud provider as a consumer account is not required to leverage data
from the Snowflake Data Marketplace, and the data can be accessed and used in a
Snowflake account without restriction.
Data transformations may not be required when combining Data Marketplace datasets with
existing data in Snowflake, as it depends on the specific data being used and how it needs to
be combined or analyzed.
Question 26Skipped
What are ways to create and manage data shares in Snowflake? (Choose two.)
Correct selection
Correct selection
Overall explanation
Question 27Skipped
Which user preferences can be set for a user profile in Snowsight? (Choose two.)
Correct selection
Correct selection
Notifications
Default database
Default schema
Username
Overall explanation
On your profile, you can review and set the following user details:
• Profile photo
• First Name
• Last Name
• Password
• Default experience
• Language: Select the language to use for Snowsight. Snowflake currently supports
the following languages:
• English (US)
• Japanese (日本語)
If your active role has access to set up resource monitor notifications, you can also select a
checkbox to set up Email notifications from resource monitors.
Question 28Skipped
What can be used to view warehouse usage over time? (Choose two.)
Correct selection
Correct selection
Overall explanation
WAREHOUSE_METERING_HISTORY View
This Account Usage view can be used to return the hourly credit usage for a single
warehouse (or all the warehouses in your account) within the last 365 days (1 year). See
this link.
Question 29Skipped
How does the PARTITION BY option affect an expression for a COPY INTO command?
The unload operation partitions table rows into separate files unloaded to the specified
table.
A single file will be loaded with a Snowflake-defined partition key and Snowflake will use
this key for pruning.
Correct answer
The unload operation partitions table rows into separate files unloaded to the specified
stage.
A single file will be loaded with a user-defined partition key and the user can use this
partition key for clustering.
Overall explanation
The PARTITION BY copy option accepts an expression by which the unload operation
partitions table rows into separate files unloaded to the specified stage.
Question 30Skipped
In (at least), how many availability zones does Snowflake replicate your data to?
Two.
One.
Correct answer
Three.
Overall explanation
Cloud storage synchronously and automatically replicates the stored data across multiple
devices and at least three availability zones. You can read more information at the following
link.
Question 31Skipped
Which database objects can be shared with the Snowflake secure data sharing feature?
(Choose two.)
Files
Sequences
Correct selection
External tables
Correct selection
Streams
Overall explanation
Secure Data Sharing lets you share selected objects in a database in your account with other
Snowflake accounts. You can share the following Snowflake database objects:
• Tables
• External tables
• Secure views
• Secure UDFs
Question 32Skipped
Column level security in Snowflake allows the application of a masking policy to a column
within a table or view. Which two features are related to column-level security? (Choose
two.)
Correct selection
Correct selection
External Tokenization
Data Encryption
Conditional Tokenization.
Lock Databases.
Overall explanation
Dynamic Data Masking is a security feature in Snowflake that enables you to mask sensitive
data (for example, credit card numbers or passwords) in real-time, based on the user's
permissions and role. When a user with restricted access attempts to access the masked
data, the data is replaced with a masked value or redacted to ensure that sensitive
information is not exposed. You can see how it works in the image below.
Question 33Skipped
Which service does Snowflake use to provide the Zero-Copy cloning functionality?
Cache.
Correct answer
Overall explanation
Zero-Copy cloning does NOT duplicate data; it duplicates the metadata of the micro-
partitions. For this reason, Zero-Copy cloning doesn’t consume storage. When you modify
some cloned data, it will consume storage because Snowflake has to recreate the micro-
partitions.
Question 34Skipped
Parquet.
JSON.
Avro.
Correct answer
CSV.
Overall explanation
When we talk about loading data, you get the most significant speed at loading CSV files.
However, Snowflake is fast and flexible, and you can also use other formats like JSON or
Parquet. You can see an article talking about it at the following link.
Question 35Skipped
What is the expiration period for a file URL used to access unstructured data in cloud
storage?
The same length of time as the expiration period for the query results cache
Correct answer
Overall explanation
Question 36Skipped
Correct answer
Overall explanation
Question 37Skipped
Correct selection
Correct selection
Overall explanation
Only one SQL statement is allowed to be executed through a task. If you need to execute
multiple statements, wrap them in a stored procedure.
Question 38Skipped
Correct answer
A task can be called using a CALL statement to run a set of predefined SQL commands.
A task can be executed using a SELECT statement to run a predefined SQL command.
Overall explanation
A task can execute any one of the following types of SQL code:
Question 39Skipped
What operations can be performed while loading a simple CSV file into a Snowflake table
using the COPY INTO command? (Choose two.)
Correct selection
Correct selection
Grouping by operations
Overall explanation
• Column reordering, column omission, and casts using a SELECT statement. There is
no requirement for your data files to have the same number and ordering of columns
as your target table.
Question 40Skipped
Correct answer
Standard or higher.
Enterprise or higher.
Overall explanation
Question 41Skipped
Correct answer
Snowflake tables are the physical instantiation of data loaded into Snowflake.
Overall explanation
Question 42Skipped
Which command is used to start configuring Snowflake for Single Sign-On (SSO)?
Correct answer
Overall explanation
Snowflake uses a SAML2 security integration to integrate with the IdP you are using to
implement federated authentication. Use the CREATE SECURITY INTEGRATION command to
start configuring Snowflake for SSO.
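A minimal sketch of such an integration (the IdP URLs and certificate are placeholders):

```sql
CREATE SECURITY INTEGRATION my_idp
  TYPE = SAML2
  ENABLED = TRUE
  SAML2_ISSUER = 'https://idp.example.com'
  SAML2_SSO_URL = 'https://idp.example.com/sso'
  SAML2_PROVIDER = 'CUSTOM'
  SAML2_X509_CERT = '<base64-encoded certificate>';
```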
Question 43Skipped
What is the minimum Snowflake edition required for row level security?
Standard
Correct answer
Enterprise
Business Critical
Overall explanation
Question 44Skipped
You have the following data in a variant column from the table “myTable”. How can you
query the favorite technology that Gonzalo uses?
1. {
3. "favouriteTechnology": "Snowflake",
4. "hobbies":[
5. {"name": "soccer"},
6. {"name": "music"},
7. {"name": "hiking"}
8. ]}
Correct answer
Overall explanation
Question 45Skipped
ALTER ROLE
Correct answer
REVOKE ROLE
USE ROLE
Overall explanation
REVOKE ROLE
Question 46Skipped
A user needs to ingest 1 GB of data that is available in an external stage using a COPY INTO
command.
How can this be done with MAXIMUM performance and the LEAST cost?
Correct answer
Split the file into smaller files of 100-250 MB each, compress and ingest each of the
smaller files.
Split the file into smaller files of 100-250 MB each and ingest each of the smaller files in an
uncompressed format.
Overall explanation
Question 47Skipped
What data type should be used to store JSON data natively in Snowflake?
Object
JSON
Correct answer
VARIANT
String
Overall explanation
Question 48Skipped
Correct answer
The privileges can be revoked by any user-defined role with appropriate privileges.
Overall explanation
System-defined roles cannot be dropped. In addition, the privileges granted to these roles by
Snowflake cannot be revoked.
Question 49Skipped
What is the MINIMUM edition of Snowflake that is required to use a SCIM security
integration?
Correct answer
Standard Edition
Enterprise Edition
Overall explanation
Question 50Skipped
The Snowflake Search Optimization Services supports improved performance of which kind
of query?
Correct answer
Overall explanation
Keyword: selective
Question 51Skipped
Which Snowflake feature allows administrators to identify unused data that may be archived
or deleted?
Object tagging
Data classification
Correct answer
Access history
Overall explanation
Each row in the ACCESS_HISTORY view contains a single record per SQL statement. The
record describes the columns the query accessed directly and indirectly (i.e. the underlying
tables that the data for the query comes from). These records facilitate regulatory
compliance auditing and provide insights on popular and frequently accessed tables and
columns since there is a direct link between the user (i.e. query operator), the query, the
table or view, the column, and the data.
Question 52Skipped
How can a producer share a table with a consumer located in a different region?
Unload all data to a stage and then deploy a pipeline to move data to the consumer's
stage in other region.
Correct answer
Replicate your account to another region and create a share from that region.
Overall explanation
Data sharing works within the same region; however, you can replicate your account to
another region and then share data from that replicated account within that account’s
region. This is also true across cloud platforms. You can see this behavior in the following
image:
Question 53Skipped
Which encryption algorithm is used by Snowflake tables when we load data into them?
SHA 256.
AES 128.
Correct answer
AES 256.
SHA 128.
Overall explanation
All ingested data stored in Snowflake tables, and all files stored in internal stages for data
loading and unloading, are encrypted using AES-256 strong encryption. You can read more
information about the Snowflake security features at the following link.
Question 54Skipped
Correct answer
Overall explanation
Materialized view can be joined with other tables. But you cannot include JOIN in a
materialized view definition
Question 55Skipped
Which of the following objects can be directly restored using the UNDROP command?
(Choose two.)
User
Correct selection
Table
Role
View
Correct selection
Schema
Internal stage
Overall explanation
Account Objects:
UNDROP DATABASE
Database Objects:
UNDROP SCHEMA
UNDROP TABLE
UNDROP TAG
Question 56Skipped
Which chart type is supported in Snowsight for Snowflake users to visualize data with
dashboards?
Correct answer
Heat grid
Pie chart
Area chart
Box plot
Overall explanation
• Bar charts
• Line charts
• Scatterplots
• Heat grids
• Scorecards
Question 57Skipped
90 days
45 days
Correct answer
1 day
7 days
Overall explanation
Default of 1 day.
Question 58Skipped
What does the worksheet and database explorer feature in Snowsight allow users to do?
Correct answer
Overall explanation
Question 59Skipped
Query compilation occurs in which architecture layer of the Snowflake Cloud Data Platform?
Storage layer
Compute layer
Correct answer
Overall explanation
Query compilation is different from query execution. Query execution is performed in the
processing layer, while the services layer authenticates user sessions, provides
management, enforces security functions, performs query compilation and optimization,
caches results, and coordinates all transactions.
Question 60Skipped
Correct answer
Overall explanation
Clone the dropped stage. - Incorrect, you cannot clone a previously dropped object.
Execute the UNDROP command - Incorrect, UNDROP command cannot be used with stages.
Using Time Travel, you can perform the following actions within a defined period of time:
• Query data in the past that has since been updated or deleted.
• Create clones of entire tables, schemas, and databases at or before specific points in
the past.
Question 61Skipped
What are best practice recommendations for using the ACCOUNTADMIN system-defined role
in Snowflake? (Choose two.)
Correct selection
Assign the ACCOUNTADMIN role to at least two users, but as few as possible.
All users granted ACCOUNTADMIN role must also be granted SECURITYADMIN role.
All users granted ACCOUNTADMIN role must be owned by the ACCOUNTADMIN role.
Correct selection
Overall explanation
Question 62Skipped
Correct selection
Snowpark executes as much work as possible by leveraging pushdown for all operations,
including user-defined functions (UDF).
Correct selection
Snowpark does not require that a separate cluster be running outside of Snowflake.
Overall explanation
In comparison to using the Snowflake Connector for Spark, developing with Snowpark
includes the following benefits:
• Support for interacting with data within Snowflake using libraries and patterns
purpose built for different languages without compromising on performance or
functionality.
• Support for authoring Snowpark code using local tools such as Jupyter, VS Code, or
IntelliJ.
• Support for pushdown for all operations, including Snowflake UDFs. This means
Snowpark pushes down all data transformation and heavy lifting to the Snowflake
data cloud, enabling you to efficiently work with data of any size.
Question 63Skipped
If you want a dedicated virtual warehouse, which is the lowest Snowflake edition you should
opt for?
Enterprise.
Business Critical.
Correct answer
Standard.
Overall explanation
In Snowflake, all the Virtual Warehouses are dedicated to the users. If you create a virtual
warehouse, you will only be the one using it.
Question 64Skipped
Which command can be added to the COPY command to make it load all files, whether or
not the load status of the files is known?
Correct answer
FORCE = TRUE
LOAD_UNCERTAIN_FILES = TRUE
LOAD_UNCERTAIN_FILES = FALSE
FORCE = FALSE
Overall explanation
To load files whose metadata has expired, set the LOAD_UNCERTAIN_FILES copy option to
true. The copy option references load metadata, if available, to avoid data duplication, but
also attempts to load files with expired load metadata.
Alternatively, set the FORCE option to load all files, ignoring load metadata if it exists. Note
that this option reloads files, potentially duplicating data in a table.
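A minimal sketch of both options (table, stage, and file format are hypothetical):

```sql
-- Reloads every staged file, ignoring load metadata (may duplicate rows).
COPY INTO my_table
  FROM @my_stage
  FILE_FORMAT = (TYPE = 'CSV')
  FORCE = TRUE;

-- Safer alternative: only attempts files whose load metadata has expired.
COPY INTO my_table
  FROM @my_stage
  FILE_FORMAT = (TYPE = 'CSV')
  LOAD_UNCERTAIN_FILES = TRUE;
```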
Question 65Skipped
There are 300 concurrent users on a production Snowflake account using a single cluster
virtual warehouse. The queries are small, but the response time is very slow.
The application is not using the latest native ODBC driver which is causing latency.
Correct answer
The warehouse is queuing the queries, increasing the overall query execution time.
Overall explanation
High number of concurrent users with a single cluster virtual warehouse. This would be a
good use case for using multi cluster warehouse, which are designed to handle concurrency
problems.
Question 66Skipped
A single user of a virtual warehouse has set the warehouse to auto-resume and auto-
suspend after 10 minutes. The warehouse is currently suspended and the user performs the
following actions:
4. Manually suspends the warehouse as soon as the last query was completed
When the user returns, how much billable compute time will have been consumed?
4 minutes
Correct answer
14 minutes
10 minutes
24 minutes
Overall explanation
The user ran a query for 3 minutes. However, since the warehouse was set to auto-suspend
after 10 minutes, the compute cost was calculated as 3 + 10 minutes. The user then came
back and ran another query for 10 seconds. Even though the query only ran for 10 seconds,
the minimum billable unit is 1 minute. Therefore, the total compute cost was 13 minutes + 1
minute = 14 minutes.
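The arithmetic in this explanation can be sketched in Python. The 3-minute and 10-second queries and the 10-minute auto-suspend come from the scenario; the 60-second minimum per resume is a Snowflake billing rule:

```python
def billed_minutes(run_seconds: float, idle_until_suspend_seconds: float) -> float:
    """Billable time for one resume/suspend cycle: at least 60 seconds,
    covering the actual run time plus any idle time before suspension."""
    return max(60.0, run_seconds + idle_until_suspend_seconds) / 60.0

# First cycle: 3-minute query, then the warehouse idles 10 minutes until auto-suspend.
first = billed_minutes(3 * 60, 10 * 60)   # 13.0
# Second cycle: 10-second query, suspended immediately -> 1-minute minimum applies.
second = billed_minutes(10, 0)            # 1.0
total = first + second                    # 14.0
```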
Question 67Skipped
User1, who has the SYSADMIN role, executed a query on Snowsight. User2, who is in the
same Snowflake account, wants to view the result set of the query executed by User1 using
the Snowsight query history.
If User2 has the SECURITYADMIN role they will be able to see the results.
If User2 has the ACCOUNTADMIN role they will be able to see the results.
Correct answer
User2 will be unable to view the result set of the query executed by User1.
If User2 has the SYSADMIN role they will be able to see the results.
Overall explanation
You can view results only for queries you have executed. If you have privileges to view
queries executed by another user, the Query Detail page displays the details for the query,
but, for data privacy reasons, the page does not display the actual query result.
Question 68Skipped
When unloading data from Snowflake to AWS, what permissions are required? (Choose
two.)
Correct selection
s3:PutObject
s3:CopyObject
Correct selection
s3:DeleteObject
s3:GetBucketAcl
s3:GetBucketLocation
Overall explanation
Snowflake requires the following permissions on an S3 bucket and folder to create new files
in the folder (and any sub-folders):
• s3:DeleteObject
• s3:PutObject
Question 69Skipped
Snowflake's approach to access control combines aspects of two different models. One
model remarks, "each object has an owner, who can in turn grant access to that object".
What is the name of this model?
Correct answer
Overall explanation
Discretionary Access Control (DAC) remarks that "each object has an owner, who can, in
turn, grant access to that object". In contrast, the Role-based Access Control (RBAC) remarks
that "access privileges are assigned to roles, which are in turn assigned to users".
Question 70Skipped
What should be considered when deciding to use a Secure View? (Choose two.)
Correct selection
No details of the query execution plan will be available in the query profiler.
The view definition of a secure view is still visible to users by way of the information
schema.
Correct selection
Secure views do not take advantage of the same internal optimizations as standard views.
Overall explanation
The internals of a secure view are not exposed in Query Profile (in the web interface).
Some of the internal optimizations for views require access to the underlying data in the
base tables for the view. This access might allow data that is hidden from users of the view
to be exposed through user code, such as user-defined functions, or other programmatic
methods. Secure views do not utilize these optimizations, ensuring that users have no access
to the underlying data.
Question 71Skipped
A task went into a loop. How long will the task run before Snowflake finishes it?
4 hours.
15 minutes.
30 minutes.
Correct answer
60 minutes.
Overall explanation
Tasks have a maximum duration of 60 minutes by default. If they haven't finished by then,
they will be automatically terminated. You can configure the time limit on a single task run
before it times out with the option "USER_TASK_TIMEOUT_MS" when creating the task.
However, before significantly increasing the time limit on a task, consider whether to
refactor the SQL statement or increase the warehouse size.
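A sketch of raising the limit (task name, warehouse, schedule, and procedure are hypothetical):

```sql
-- USER_TASK_TIMEOUT_MS raises the default 60-minute cap; here to 2 hours.
CREATE TASK long_running_task
  WAREHOUSE = my_wh
  SCHEDULE = '60 MINUTE'
  USER_TASK_TIMEOUT_MS = 7200000
AS
  CALL my_long_procedure();
```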
Question 72Skipped
What is the minimum Fail-safe retention time period for transient tables?
7 days
Correct answer
0 days
12 hours
1 day
Overall explanation
Transient tables are similar to permanent tables with the key difference that they do not
have a Fail-safe period.
Question 73Skipped
Correct selection
Virtual warehouse
Correct selection
Account
Organization
Schema
Database
Overall explanation
Question 74Skipped
In which Snowflake layer does Snowflake reorganize data into its internal optimized,
compressed, columnar format?
Correct answer
Database Storage
Query Processing
Metadata Management
Cloud Services
Overall explanation
When data is loaded into Snowflake, Snowflake reorganizes that data into its internal
optimized, compressed, columnar format. Snowflake stores this optimized data in cloud
storage.
Question 75Skipped
Which term is used to describe information about disk usage for operations where
intermediate results cannot be accommodated in a Snowflake virtual warehouse memory?
Queue overloading
Pruning
Join explosion
Correct answer
Spilling
Overall explanation
What is disk spilling? When Snowflake warehouse cannot fit an operation in memory, it
starts spilling (storing) data first to the local disk of a warehouse node, and then to remote
storage.
In such a case, Snowflake first tries to temporarily store the data on the warehouse local
disk. Since this means extra I/O operations, any query that requires spilling will take
longer than a similar query running on similar data whose operations fit in memory.
Question 76Skipped
In the Snowflake access control model, which entity owns an object by default?
Correct answer
Overall explanation
To own an object means that a role has the OWNERSHIP privilege on the object. Each
securable object is owned by a single role, which by default is the role used to create the
object.
Question 77Skipped
How does a Snowflake user reference a directory table created on stage mystage in a SQL
query?
Correct answer
Overall explanation
Question 78Skipped
Correct answer
Overall explanation
Snowflake enables sharing of data through named Snowflake objects called shares which
supports data sharing by sharing tables, secure views, and secure UDFs in our Snowflake
database (Data Provider) with other Snowflake accounts (Data Consumer).
Question 79Skipped
Which of the following significantly improves the performance of selective point lookup
queries on a table?
Correct answer
Clustering
Materialized Views
Zero-copy Cloning
Overall explanation
The search optimization service aims to significantly improve the performance of certain
types of queries on tables, including: Selective point lookup queries on tables.
A point lookup query returns only one or a small number of distinct rows.
(...)
Question 80Skipped
When cloning a database containing stored procedures and regular views, that have fully
qualified table references, which of the following will occur?
The cloned views and the stored procedures will reference the cloned tables in the cloned
database.
Correct answer
The stored procedures and views will refer to tables in the source database.
Overall explanation
Question 81Skipped
Correct answer
Overall explanation
Adjusting the available memory of a warehouse can improve performance because a query
runs substantially slower when a warehouse runs out of memory, which results in bytes
“spilling” onto storage.
Question 82Skipped
Correct answer
When reducing the size of a warehouse the compute resources are removed only when
they are no longer being used to execute any current statements.
The warehouse will be suspended while the new compute resource is provisioned and will
resume automatically once provisioning is complete.
Users who are trying to use the warehouse will receive an error message until the resizing
is complete.
When increasing the size of an active warehouse the compute resource for all running and
queued queries on the warehouse are affected.
Overall explanation
Question 83Skipped
Schema
Correct answer
Account
Database
Table
Overall explanation
Question 84Skipped
Why would a Snowflake user create a secure view instead of a standard view?
With a secure view, the underlying data is replicated to a separate storage layer with
enhanced encryption.
Secure views support additional functionality that is not supported for standard views,
such as column masking and row level access policies.
The secure view is only available to end users with the corresponding SECURE_ACCESS
property.
Correct answer
End users are unable to see the view definition, and internal optimizations differ with a
secure view.
Overall explanation
Question 85Skipped
Where can a user find and review the failed logins of a specific user for the past 30 days?
Correct answer
Overall explanation
Question 86Skipped
How can a Snowflake user load duplicate files with a COPY INTO command?
Correct answer
Overall explanation
Definition
Boolean that specifies to load all files, regardless of whether they’ve been loaded previously
and have not changed since they were loaded. Note that this option reloads files, potentially
duplicating data in a table.
Question 87Skipped
A single executable statement can call multiple stored procedures. In contrast, multiple
SQL statements can call the same UDFs.
A single executable statement can call only two stored procedures. In contrast, a single
SQL statement can call multiple UDFs.
Multiple executable statements can call more than one stored procedure. In contrast, a
single SQL statement can call multiple UDFs.
Correct answer
A single executable statement can call only one stored procedure. In contrast, a single SQL
statement can call multiple UDFs.
Overall explanation
Multiple UDFs may be called with one statement; a single stored procedure is called with
one statement
Question 88Skipped
Correct answer
Overall explanation
Question 89Skipped
Correct answer
Duo Security
Authy
One Login
Overall explanation
MFA support is provided as an integrated Snowflake feature, powered by the Duo Security
service, which is managed completely by Snowflake.
Question 90Skipped
Which of the following statements apply to Snowflake in terms of security? (Choose two.)
Correct selection
Correct selection
Snowflake can run within a user's own Virtual Private Cloud (VPC).
Overall explanation
Snowflake is a native, 100% public cloud solution; you cannot host it in your own VPC. All
data micro-partitions are encrypted.
Question 91Skipped
For how long are we billed if our warehouse runs for 48 seconds?
We are not going to be billed if the query scans less than 10 micro-partitions.
We are not going to be billed as the warehouse hasn’t run for 1 minute.
48 seconds.
Correct answer
1 minute.
Overall explanation
When compute resources are provisioned for a warehouse, the minimum billing charge for
provisioning compute resources is 1 minute. Even if your warehouse runs for less than a
minute, we will be billed for a minute.
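The 60-second minimum can be expressed as a one-liner:

```python
def billed_seconds(run_seconds: int) -> int:
    """Each time a warehouse is resumed, it is billed for at least 60 seconds."""
    return max(60, run_seconds)

print(billed_seconds(48))   # 60
```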
Question 92Skipped
How many credits will a medium-size warehouse with 2 clusters running in auto-scale
mode consume over 3 hours, given that the first cluster runs continuously and the second
one runs for 30 minutes during the second hour?
8.
10.
16.
Correct answer
14.
Overall explanation
A medium size warehouse with one cluster consumes four credits per hour. The first cluster
will run continuously for three hours, consuming 12 credits. The second one will run for only
30 minutes, consuming two credits. The total of the warehouse will be 14 credits.
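The same calculation in Python, using the standard per-hour credit rates by warehouse size (each cluster of a multi-cluster warehouse bills independently for the hours it actually runs):

```python
# Credits per hour by warehouse size (standard Snowflake rates).
CREDITS_PER_HOUR = {"XS": 1, "S": 2, "M": 4, "L": 8, "XL": 16}

def credits(size: str, cluster_hours: list) -> float:
    """Total credits: each cluster accrues size-rate credits for its own runtime."""
    return CREDITS_PER_HOUR[size] * sum(cluster_hours)

# First cluster runs the full 3 hours; the second runs only 30 minutes.
print(credits("M", [3.0, 0.5]))   # 14.0
```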
Question 93Skipped
In order to access Snowflake Marketplace listings, who needs to accept the Snowflake
Consumer Terms of Service?
SECURITYADMIN
SYSADMIN
Correct answer
ORGADMIN
ACCOUNTADMIN
Overall explanation
The organization administrator only needs to accept the Snowflake Provider and Consumer
Terms once for your organization. After the terms have been accepted, anyone in your
organization that has a role with the necessary privileges can become a consumer of listings.
Note
You must be an organization administrator (i.e. a user granted the ORGADMIN role) to
accept the terms.
Question 94Skipped
COPY_HISTORY
ROW_ACCESS_POLICIES
QUERY_HISTORY
Correct answer
ACCESS_HISTORY
Overall explanation
Question 95Skipped
REMOVE
Correct selection
PUT
Correct selection
GET
LIST
COPY INTO
Overall explanation
• The command cannot be executed from the Worksheets page in either Snowflake
web interface; instead, use the SnowSQL client or Drivers to upload data files, or
check the documentation for a specific Snowflake client to verify support for this
command.
Question 96Skipped
Which Query Profile result indicates that a warehouse is sized too small?
Correct answer
Overall explanation
Question 97Skipped
Which statistics are displayed in a Query Profile that indicate that intermediate results do
not fit in memory? (Choose two.)
Bytes scanned
Correct selection
Partitions scanned
Correct selection
Overall explanation
• Spilling — information about disk usage for operations where intermediate results
do not fit in memory:
Question 98Skipped
What does the LATERAL modifier for the FLATTEN function do?
Correct answer
Overall explanation
FLATTEN is a table function that produces a lateral view of a VARIANT, OBJECT, or ARRAY
column. The function returns a row for each object, and the LATERAL modifier joins the data
with any information outside of the object.
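A minimal sketch of the pattern (table and column names are hypothetical; hobbies is assumed to hold a JSON array):

```sql
-- Each array element becomes its own row, joined back to the outer row's columns.
SELECT p.name, f.value:name AS hobby
FROM people p,
     LATERAL FLATTEN(INPUT => p.hobbies) f;
```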
Question 99Skipped
What Snowflake features allow virtual warehouses to handle high concurrency workloads?
(Choose two.)
Correct selection
Correct selection
Overall explanation
Question 100Skipped
How can the Query Profile be used to identify the costliest operator of a query?
Correct answer
Find the operator node with the highest fraction of time or percentage of total time.
Look at the number of rows between operator nodes across the operator tree.
Select any node in the operator tree and look at the number of micro-partitions scanned.
Select the TableScan operator node and look at the percentage scanned from cache.
Overall explanation
A collapsible panel in the operator tree pane lists nodes by execution time in descending
order, enabling users to quickly locate the costliest operator nodes in terms of execution
time. The panel lists all nodes that lasted for 1% or longer of the total execution time of the
query (or the execution time for the displayed query step, if the query was executed in
multiple processing steps).
Question 101Skipped
What is a machine learning and data science partner within the Snowflake Partner
Ecosystem?
Power BI
Informatica
Adobe
Correct answer
Data Robot
Overall explanation
Question 102Skipped
Concurrent sizing
Correct answer
Scaling up
Scaling out
Right sizing
Overall explanation
Question 103Skipped
A company needs to allow some users to see Personally Identifiable Information (PII) while
limiting other users from seeing the full value of the PII.
Correct answer
Data encryption
Overall explanation
If you have a table with a column including PII, masking rows will not solve the issue. What
we need is to make the data in this column visible to some, and masked to some. Thus we
need to use dynamic data masking.
Question 104Skipped
A user has enabled the STRIP_OUTER_ARRAY file format option for the COPY INTO {table}
command to remove the outer array structure.
Correct answer
Ensure each unique element stores values of a single native data type.
Overall explanation
Enable the STRIP_OUTER_ARRAY file format option for the COPY INTO <table> command to
remove the outer array structure and load the records into separate table rows.
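A sketch of the option in use (stage and table names are hypothetical):

```sql
-- Each element of the file's outer JSON array is loaded as a separate row.
COPY INTO json_table
  FROM @json_stage
  FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE);
```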
Question 105Skipped
Correct answer
Overall explanation
Question 106Skipped
Which data types does Snowflake support when querying semi-structured data? (Choose
two.)
Correct selection
VARIANT
BLOB
VARCHAR
Correct selection
ARRAY
XML
Overall explanation
Question 107Skipped
When using SnowSQL, which configuration options are required when unloading data from a
SQL query run on a local machine? (Choose two.)
Correct selection
output_format
force_put_overwrite
quiet
Correct selection
output_file
echo
Overall explanation
Question 108Skipped
Which table function should be used to view details on a Directed Acyclic Graph (DAG) run
that is presently scheduled or is executing?
TASK_HISTORY
TASK_DEPENDENTS
Correct answer
CURRENT_TASK_GRAPHS
COMPLETE_TASK_GRAPHS
Overall explanation
CURRENT_TASK_GRAPHS
Returns the status of a graph run that is currently scheduled or is executing. A graph is
currently defined as a single scheduled task or a DAG of tasks composed of a scheduled root
task and one or more child tasks (i.e. tasks that have a defined predecessor task). For the
purposes of this function, root task refers to either the single scheduled task or the root task
in a DAG.
Question 109Skipped
What are the main differences between the Virtual Private Snowflake (VPS) and Business
Critical Editions? (Select TWO)
Correct selection
Snowflake VPS provides a dedicated metadata store and pool of computing resources,
whereas it’s not included in the Business Critical Edition.
Snowflake VPS provides a direct proxy to virtual networks or on-premises data centers
using AWS PrivateLink, whereas it’s not included in the Business Critical Edition.
Correct selection
Snowflake VPS provides a completely separate Snowflake environment, isolated from all
other Snowflake accounts, whereas it’s not included in the Business Critical Edition.
Overall explanation
Tri-Secret secure and AWS PrivateLink are also provided in the Business Critical Edition. You
can see all the differences between Snowflake editions at the following link.
Question 110Skipped
Correct answer
Overall explanation
Question 111Skipped
Which sequence (order) of object privileges should be used to grant a custom role read-only
access on a table?
Correct answer
Overall explanation
When designing a RBAC and assigning grants it is very important to follow the principle of
‘least privilege’.
Question 112Skipped
Which function returns an integer between 0 and 100 when used to calculate the similarity
of two strings?
APPROXIMATE_SIMILARITY
APPROXIMATE_JACCARD_INDEX
Correct answer
JAROWINKLER_SIMILARITY
MINHASH_COMBINE
Overall explanation
Computes the Jaro-Winkler similarity between two input strings. The function returns an
integer between 0 and 100, where 0 indicates no similarity and 100 indicates an exact
match.
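To illustrate what the function computes, here is a plain-Python re-implementation of the textbook Jaro-Winkler algorithm scaled to 0-100. This is an illustrative sketch, not Snowflake's internal code, and edge-case behavior may differ from the SQL function:

```python
def jaro(s1: str, s2: str) -> float:
    """Jaro similarity: matching characters within a window, adjusted for transpositions."""
    if s1 == s2:
        return 1.0
    if not s1 or not s2:
        return 0.0
    window = max(len(s1), len(s2)) // 2 - 1
    match1 = [False] * len(s1)
    match2 = [False] * len(s2)
    matches = 0
    for i, c in enumerate(s1):
        for j in range(max(0, i - window), min(len(s2), i + window + 1)):
            if not match2[j] and s2[j] == c:
                match1[i] = match2[j] = True
                matches += 1
                break
    if matches == 0:
        return 0.0
    # Count transpositions among matched characters taken in order.
    t, k = 0, 0
    for i in range(len(s1)):
        if match1[i]:
            while not match2[k]:
                k += 1
            if s1[i] != s2[k]:
                t += 1
            k += 1
    t //= 2
    m = matches
    return (m / len(s1) + m / len(s2) + (m - t) / m) / 3

def jarowinkler_similarity(s1: str, s2: str) -> int:
    """Boost Jaro by a common-prefix bonus (up to 4 chars), then scale to 0-100."""
    sim = jaro(s1, s2)
    prefix = 0
    for a, b in zip(s1, s2):
        if a != b or prefix == 4:
            break
        prefix += 1
    sim += prefix * 0.1 * (1 - sim)
    return round(sim * 100)

print(jarowinkler_similarity("MARTHA", "MARHTA"))   # 96
```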
Question 113Skipped
SELECT CURRENT_IP_ADDRESS()
SELECT CURRENT_ACCOUNT()
Correct answer
SELECT CURRENT_PROVIDER()
SELECT CURRENT_CLIENT()
Overall explanation
You can see all the different context functions at the following link.
Question 114Skipped
What impacts the credit consumption of maintaining a materialized view? (Choose two.)
Correct selection
Correct selection
Overall explanation
When a base table changes, all materialized views defined on the table are updated by a
background service that uses compute resources provided by Snowflake.
Question 115Skipped
Which tasks are performed in the Snowflake Cloud Services layer? (Choose two.)
Correct selection
Management of metadata
Correct selection
Infrastructure security
Overall explanation
Question 116Skipped
What is the MAXIMUM number of days that Snowflake resets the 24-hour retention period
for a query result every time the result is used?
1 day
10 days
Correct answer
31 days
60 days
Overall explanation
Each time the persisted result for a query is reused, Snowflake resets the 24-hour retention
period for the result, up to a maximum of 31 days from the date and time that the query was
first executed. After 31 days, the result is purged and the next time the query is submitted, a
new result is generated and persisted.
Question 117Skipped
What happens when a network policy includes values that appear in both the allowed and
blocked IP address lists?
Those IP addresses are allowed access to the Snowflake account as Snowflake applies the
allowed IP address list first.
Snowflake issues an alert message and adds the duplicate IP address values to both the
allowed and blocked IP address lists.
Correct answer
Those IP addresses are denied access to the Snowflake account as Snowflake applies the
blocked IP address list first.
Snowflake issues an error message and adds the duplicate IP address values to both the
allowed and blocked IP address lists.
Overall explanation
When a network policy includes values in both the allowed and blocked IP address lists,
Snowflake applies the blocked IP address list first.
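A sketch of the overlap case (policy name and addresses are hypothetical):

```sql
-- 192.168.1.99 falls inside the allowed range but is also explicitly blocked;
-- the blocked list is applied first, so that address is denied.
CREATE NETWORK POLICY my_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');
```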
Question 118Skipped
SnowUI
SnowSQL
SnowCLI
Correct answer
SnowCD
Overall explanation
SnowCD (i.e. Snowflake Connectivity Diagnostic Tool) helps users to diagnose and
troubleshoot their network connection to Snowflake.
Question 119Skipped
What criteria does Snowflake use to determine the current role when initiating a session?
(Choose two.)
If a role was specified as part of the connection and that role has not been granted to the
Snowflake user, the role is automatically granted and it becomes the current role.
Correct selection
If a role was specified as part of the connection and that role has been granted to the
Snowflake user, the specified role becomes the current role.
If a role was specified as part of the connection and that role has not been granted to the
Snowflake user, it will be ignored and the default role will become the current role.
If no role was specified as part of the connection and a default role has not been set for
the Snowflake user, the session will not be initiated and the log in will fail.
Correct selection
If no role was specified as part of the connection and a default role has been defined for
the Snowflake user, that role becomes the current role.
Overall explanation
Question 120Skipped
The following JSON is stored in a VARIANT column called src of the CAR_SALES table:
Correct answer
Overall explanation
Insert a colon : between the VARIANT column name and any first-level element:
<column>:<level1_element>.
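For example, assuming hypothetical keys in the src column (dealership and salesperson are illustrative, not from the question's data):

```sql
-- ':' reaches a first-level element; '.' then traverses into nested objects.
SELECT src:dealership,
       src:salesperson.name
FROM car_sales;
```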
Question 121Skipped
Which database objects can be shared using Snowflake Secure Data Sharing?
Correct answer
Tables, External tables, Secure views, Secure materialized views, Secure UDFs.
Tables.
Overall explanation
Secure Data Sharing lets you share selected objects in a database in your account with other
Snowflake accounts. You can share all the previous database objects.
Question 122Skipped
A warehouse ran for 62 seconds, and it was suspended. After some time, it ran for another
20 seconds. For how many seconds will you be billed?
92 seconds.
Correct answer
122 seconds.
62 seconds.
20 seconds.
Overall explanation
You will be billed for 122 seconds (62 + 60 seconds) because warehouses are billed for a
minimum of one minute. The price would be different if the warehouse wasn't suspended
before executing the second query.
For example, if we had only run a query, and it had only run for 62 seconds, you would be
billed for these 62 seconds. If it had only run for 20 seconds, you would've been billed for 60
seconds.
Question 123Skipped
Which Snowflake edition offers the highest level of security for organizations that have the
strictest requirements?
Standard
Business Critical
Enterprise
Correct answer
Overall explanation
Virtual Private Snowflake offers our highest level of security for organizations that have the
strictest requirements
Question 124Skipped
External tables
Correct answer
Permanent tables
Overall explanation
Question 125Skipped
Which virtual warehouse consideration can help lower compute resource credit
consumption?
Correct answer
Increasing the maximum cluster count parameter for a multi-cluster virtual warehouse
Overall explanation
Question 126Skipped
What is the abbreviated form to get all the files in the stage for the current user?
SHOW @%;
LIST @~;
Correct answer
LS @~;
LS @usr;
Overall explanation
LS (abbrev form)
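For illustration, both forms below are equivalent; @~ refers to the current user's stage:

```sql
-- Full form: list all files in the current user's stage
LIST @~;
-- Abbreviated form: same command
LS @~;
```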
Question 127Skipped
The Provider is charged for compute resources used by the Data Consumer to query the
shared data.
The shared data is copied into the Data Consumer account, so the Consumer can modify it
without impacting the base data of the Provider.
Correct selection
The Data Consumer pays only for compute resources to query the shared data.
Correct selection
The Data Consumer pays for data storage as well as for data computing.
Overall explanation
With Secure Data Sharing, no actual data is copied or transferred between accounts. The
only charges to consumers are for the compute resources (i.e. virtual warehouses) used to
query the shared data.
Any full Snowflake account can both provide and consume shared data.
Question 128Skipped
SYSADMIN
Correct answer
ACCOUNTADMIN
SECURITYADMIN
PUBLIC
Overall explanation
Only the ACCOUNTADMIN role has the "CREATE SHARE" privilege by default. The privilege
can be granted to additional roles as needed.
Question 129Skipped
Correct answer
A tag can have only one masking policy for each data type.
A tag can have multiple masking policies with varying data types.
A tag can have multiple masking policies for each data type.
Overall explanation
The tag can support one masking policy for each data type that Snowflake supports.
Question 130Skipped
Which SQL commands should be used to write a recursive query if the number of levels is
unknown? (Choose two.)
Correct selection
CONNECT BY
LISTAGG
MATCH RECOGNIZE
QUALIFY
Oracle Cloud
Correct selection
WITH
Overall explanation
Question 131Skipped
Correct answer
Any Snowflake user can self-enroll in MFA through the web interface.
Overall explanation
Question 132Skipped
Based on the amount of uncompressed data stored on the last day of the month.
Based on the amount of compressed data stored on the last day of the month.
Correct answer
Overall explanation
Storage costs benefit from the automatic compression of all data stored, and the total
compressed file size is used to calculate the storage bill for an account.
Question 133Skipped
Correct answer
Overall explanation
Question 134Skipped
Correct selection
SnowSQL
Snowflake Marketplace
SnowCD
Correct selection
Snowsight
Overall explanation
Question 135Skipped
When the procedure is defined with an argument that has the same name and type as the
session variable.
Correct answer
Overall explanation
Question 136Skipped
What command will you execute if you want to disable the query cache?
Correct answer
Overall explanation
This command turns off the query result caching feature for the current session. When the
caching is disabled, the results of queries are not stored in the cache, and subsequent
executions of the same query will not use the cached results. This can negatively affect
query performance, especially for queries executed frequently or with long execution times,
but it might be useful for development purposes.
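The session parameter involved is USE_CACHED_RESULT. As a sketch:

```sql
-- Disable the query result cache for the current session
ALTER SESSION SET USE_CACHED_RESULT = FALSE;

-- Re-enable it later
ALTER SESSION SET USE_CACHED_RESULT = TRUE;
```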
Question 137Skipped
Which of the following roles or privileges are required to view the table function
TASK_HISTORY? (Choose three.)
SECURITYADMIN.
Correct selection
Correct selection
ACCOUNTADMIN.
SYSADMIN.
Correct selection
Overall explanation
The function returns the history of task usage for your entire Snowflake account or a
specified task. It returns results for the ACCOUNTADMIN role, the task owner, or a role with
the global MONITOR EXECUTION privilege.
It returns task activity within the last 7 days or the next scheduled execution within the next
8 days.
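For example, a role with sufficient privileges can query the table function like this:

```sql
-- Retrieve recent task runs for the account, newest first
SELECT *
FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY())
ORDER BY SCHEDULED_TIME DESC;
```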
Question 138Skipped
If a transaction disconnects and goes into a detached state, which cannot be committed or
rolled back, how long will Snowflake take to abort the transaction?
Correct answer
4 hours.
60 minutes.
15 minutes.
12 hours.
Overall explanation
If the transaction is left open or not aborted by the user, Snowflake automatically rolls back
the transaction after being idle for four hours.
Question 139Skipped
Which of the following commands cannot be executed from the Snowflake UI? (Choose
two.)
LIST <stages>
COPY INTO.
Correct selection
PUT.
SHOW.
Correct selection
GET.
Overall explanation
These two commands cannot be executed from the Snowflake web interface; instead, you
should use the SnowSQL client to GET or PUT data files.
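A sketch of both commands as run from the SnowSQL client (stage name and file paths are illustrative):

```sql
-- Upload a local file to a named internal stage
PUT file:///tmp/data.csv @my_stage;

-- Download a staged file back to a local directory
GET @my_stage/data.csv file:///tmp/downloads/;
```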
Question 140Skipped
What can a Snowflake user do with the information included in the details section of a
Query Profile?
Correct answer
Overall explanation
Question 141Skipped
The consumer account runs the GRANT IMPORTED PRIVILEGES command on the data share
every 24 hours.
The consumer account acquires the data share through a private data exchange.
Correct answer
The objects in the data share are being deleted and the grant pattern is not re-applied
systematically.
Overall explanation
Any objects that you remove from a share are instantly unavailable to the consumer
accounts that have created databases from the share.
For example, if you remove a table from a share, users in consumer accounts can no longer
query the data in the table as soon as the table is removed from the share.
Question 142Skipped
We need to temporarily store intermediate data, which an ETL process will only use. We
don't need the data outside the ETL process.
If you want to optimize storage cost, what type of table will you create to store this data?
External.
Correct answer
Temporary.
Transient.
Permanent.
Overall explanation
With temporary tables, you can optimize storage costs, as when the Snowflake session ends,
data stored in the table is entirely purged from the system. But they also require storage
costs while the session is active.
A temporary table is purged once the session ends, so the retention period is for 24 hours or
the remainder of the session.
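A minimal sketch, with illustrative names, of the table type this scenario calls for:

```sql
-- Temporary table: visible only to this session, purged when the session ends
CREATE TEMPORARY TABLE etl_intermediate (
    id      INT,
    payload VARIANT
);
```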
Question 143Skipped
What is the minimum Snowflake edition required to use Dynamic Data Masking?
Business Critical
Correct answer
Enterprise
Standard
Overall explanation
Question 144Skipped
Correct answer
Overall explanation
Only if the system estimates there’s enough query load to keep the cluster busy for at least 6
minutes
Question 145Skipped
Correct answer
Overall explanation
You can use the OBJECT_CONSTRUCT function combined with the COPY command to
convert the rows in a relational table to a single VARIANT column and unload the rows into a
file.
Question 146Skipped
Which command can be used to verify if data has been uploaded to the external stage
named my_stage?
show @my_stage
view @my_stage
display @my_stage
Correct answer
list @my_stage
Overall explanation
Question 147Skipped
A permanent table and temporary table have the same name, TBL1, in a schema.
Correct answer
The temporary table will take precedence over the permanent table.
An error will say there cannot be two tables with the same name in a schema.
The table that was created most recently will take precedence over the older table.
The permanent table will take precedence over the temporary table.
Overall explanation
All queries and other operations performed in the session on the table affect only the
temporary table.
Question 148Skipped
Correct answer
Use a smaller number of larger tables rather than a larger number of smaller tables.
Select all columns from tables, even if they are not needed in the query.
Overall explanation
Snowflake makes use of clustering on tables. Users can define a clustering key to enhance
query performance (partition pruning) on large tables. So, the fewer and larger the tables,
the better pruning and clustering will work.
The fewer joins there are between tables, the better performance will be in general.
Other options, like selecting all columns, are the complete opposite of what Snowflake
recommends. Selecting only the required columns is a common query optimization
technique in Snowflake that can improve query performance and reduce resource
consumption.
Subqueries can be useful, but they can also slow down your queries.
Question 149Skipped
What action can a Resource Monitor not take when it hits the limit?
Correct answer
Notify.
Overall explanation
- Notify --> It performs no action but sends an alert notification (email/web UI).
- Notify & Suspend --> It sends a notification and suspends all assigned warehouses after all
statements being executed by the warehouse(s) have completed.
- Notify & Suspend Immediately --> It sends a notification and suspends all assigned
warehouses immediately.
Question 150Skipped
After how many days does the load history of Snowpipe expire?
90 days.
180 days.
Correct answer
14 days.
1 day.
Overall explanation
The load history is stored in the metadata of the pipe for 14 days. Must be requested from
Snowflake via a REST endpoint, SQL table function, or ACCOUNT_USAGE view.
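One way to inspect recent pipe load activity within that window is the COPY_HISTORY table function (table name is illustrative):

```sql
-- Load history for the target table over the last 14 days
SELECT *
FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
    TABLE_NAME => 'MY_TABLE',
    START_TIME => DATEADD(DAY, -14, CURRENT_TIMESTAMP())));
```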
Question 151Skipped
Which of the following file formats are supported by Snowflake? (Choose three.)
Correct selection
Avro.
XLSX.
TXT.
Correct selection
CSV.
Correct selection
XML.
HTML.
Overall explanation
A File Format object describes and stores the format information required to load data into
Snowflake tables. You can specify different parameters, such as the file’s delimiter, if you
want to skip the header or not, etc. Snowflake supports both Structured and Semi-Structured
Data. You can see the different file formats in the following image:
Question 152Skipped
Which services does the Snowflake Cloud Services layer manage? (Choose two.)
Compute resources
Query execution
Correct selection
Authentication
Data storage
Correct selection
Metadata
Overall explanation
Question 153Skipped
When reviewing the load for a warehouse using the load monitoring chart, the chart
indicates that a high volume of queries is always queuing in the warehouse.
According to recommended best practice, what should be done to reduce the queue
volume? (Choose two.)
Limit user access to the warehouse so fewer queries are run against it.
Correct selection
Correct selection
Overall explanation
If the running query load is high or there’s queuing, consider starting a separate warehouse
and moving queued queries to that warehouse. Alternatively, if you are using multi-cluster
warehouses, you could change your multi-cluster settings to add additional clusters to
handle higher concurrency going forward.
Question 154Skipped
Correct selection
Correct selection
Overall explanation
Resizing a warehouse to a larger size is helpful to improve the performance of large, complex
queries against large data sets; and improve performance while loading and unloading
significant amounts of data.
Question 155Skipped
CREATE TABLE newTable CLONE table1;
Correct answer
Snowflake creates a new entry in the metadata store to keep track of the new clone. The
existing micro-partitions of “table1” are mapped to the new table.
“newTable” is created, and Snowflake internally executes a batch job to copy all the data
from “table1”
“newTable” is created, and Snowflake internally executes a pipe to copy all the data from
“table1”
Overall explanation
Zero-Copy cloning does NOT duplicate data; it duplicates the metadata of the
micro-partitions. When you modify some cloned data, it will consume storage.
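The cloning operation itself is a single metadata-only statement:

```sql
-- Zero-copy clone: only metadata is written at creation time;
-- both tables share the same micro-partitions until one is modified
CREATE TABLE newTable CLONE table1;
```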
SET 4
Question 1Skipped
Clustering keys establish a hashed key on each node of a virtual warehouse to optimize
joins at run-time.
Clustering keys create a distributed, parallel data structure of pointers to a table's rows
and columns.
Clustering keys update the micro-partitions in place with a full sort, and impact the DML
operations.
Correct answer
Clustering keys sort the designated columns over time, without blocking DML operations.
Overall explanation
Question 2Skipped
1 day.
2 days.
Correct answer
7 days.
Unlimited.
Overall explanation
Question 3Skipped
A user has 10 files in a stage containing new customer data. The ingest operation completes
with no errors, using the following command:
The next day the user adds 10 files to the stage so that now the stage contains a mixture of
new customer data and updates to the previous data. The user did not remove the 10
original files.
If the user runs the same COPY INTO command what will happen?
Correct answer
All data from only the newly-added files will be appended to the table.
All data from all of the files on the stage will be appended to the table.
Only data about new customers from the new files will be appended to the table.
The operation will fail with the error UNCERTAIN FILES IN STAGE.
Overall explanation
The COPY command maintains load metadata for the target table, so the 10 files loaded on
day 1 will not be loaded again.
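As a sketch (table and stage names are illustrative), re-running the same COPY skips already-loaded files unless you explicitly override the load metadata:

```sql
-- Skips files already recorded in the table's load metadata
COPY INTO customers FROM @my_stage;

-- FORCE reloads all staged files, which may create duplicate rows
COPY INTO customers FROM @my_stage FORCE = TRUE;
```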
Question 4Skipped
A view is defined on a permanent table. A temporary table with the same name is created in
the same schema as the referenced table.
An error stating that the referenced object could not be uniquely identified.
Correct answer
Overall explanation
Similar to the other table types (transient and permanent), temporary tables belong to a
specified database and schema; however, because they are session-based, they aren’t
bound by the same uniqueness requirements. This means you can create temporary and
non-temporary tables with the same name within the same schema.
However, note that the temporary table takes precedence in the session over any other
table with the same name in the same schema.
Question 5Skipped
What tasks can be completed using the COPY command? (Choose two.)
Correct selection
Correct selection
Overall explanation
Question 6Skipped
What does the orange bar on an operator represent when reviewing the Query Profile?
The fraction of data scanned from cache versus remote disk for the operator.
Correct answer
The fraction of time that this operator consumed within the query step.
The cost of the operator in terms of the virtual warehouse CPU utilization.
Overall explanation
Fraction of time that this operator consumed within the query step (e.g. 25% for Aggregate
[5]). This information is also reflected in the orange bar at the bottom of the operator node,
allowing for easy visual identification of performance-critical operators.
Question 7Skipped
A user unloaded a Snowflake table called mytable to an internal stage called mystage.
Which command can be used to view the list of files that has been uploaded to the stage?
Correct answer
list @mystage;
list @%mystage;
list @%mytable;
list @mytable;
Overall explanation
Question 8Skipped
Which command is used to unload data from a Snowflake table to an external stage?
GET
Correct answer
COPY INTO
Overall explanation
Use the COPY INTO <location> command to copy the data from the Snowflake database
table into one or more files in a Snowflake stage.
Question 9Skipped
What action should be taken if a Snowflake user wants to share a newly created object in a
database with consumers?
Correct answer
Use the GRANT privilege ... TO SHARE command to grant the necessary privileges.
Drop the object and then re-add it to the database to trigger sharing.
Recreate the object with a different name in the database before sharing.
Overall explanation
Grants access privileges for databases and other supported database objects (schemas,
UDFs, tables, and views) to a share. Granting privileges on these objects effectively adds the
objects to the share, which can then be shared with one or more consumer accounts.
Question 10Skipped
How many credits will a medium-size warehouse with 2 clusters running in maximized mode
consume in 3 hours?
8.
32.
16.
Correct answer
24.
Overall explanation
A medium size warehouse with one cluster consumes four credits per hour. As we have two
clusters, it will consume eight credits per hour. In three hours, it will consume 24 credits.
Question 11Skipped
The COPY INTO command can unload data from a table directly into which locations?
(Choose two.)
Correct selection
Correct selection
Overall explanation
Question 12Skipped
Account.
Correct answer
Roles.
Groups.
Users.
Overall explanation
Question 13Skipped
We want to generate a JSON object with the data from a table called users_table, composed
of two columns (AGE and NAME), ordered by the name column. How can we do it?
FROM users_table
order by users_object['NAME'];
FROM users_table
order by users_object['NAME'];
FROM users_table
order by users_object['NAME'];
Correct answer
FROM users_table
order by users_object['NAME'];
Overall explanation
Question 14Skipped
There are two Snowflake accounts in the same cloud provider region: one is production and
the other is non-production.
How can data be easily transferred from the production account to the non-production
account?
Create a reader account using the production account and link the reader account to the
non-production account.
Clone the data from the production account to the non-production account.
Create a subscription in the production account and have it publish to the non-production
account.
Correct answer
Create a data share from the production account to the non-production account.
Overall explanation
Zero-Copy Clone would normally be the Snowflake tool for this case, but the question refers
to transferring data between two accounts, even if they are in the same cloud provider
region. It is not possible to use Zero-Copy Clone between different accounts, so we have to
use Data Sharing.
Question 15Skipped
Which permission on a Snowflake virtual warehouse allows the role to resize the
warehouse?
ALTER
USAGE
Correct answer
MODIFY
MONITOR
Overall explanation
Question 16Skipped
Which of the following roles are recommended to create and manage users and roles?
(Choose two.)
SYSADMIN
Correct selection
SECURITYADMIN
Correct selection
USERADMIN
ACCOUNTADMIN
PUBLIC
Overall explanation
Question 17Skipped
Snowflake’s access control framework combines which models for securing data? (Choose
two.)
Correct selection
Correct selection
Overall explanation
Snowflake’s approach to access control combines aspects from both of the following models:
• Discretionary Access Control (DAC): Each object has an owner, who can in turn grant
access to that object.
• Role-based Access Control (RBAC): Access privileges are assigned to roles, which are
in turn assigned to users.
Question 18Skipped
Which command can we use to query the table <my_table> as it was 15 minutes ago?
SELECT *
FROM my_table
Correct answer
SELECT *
FROM my_table
SELECT *
FROM my_table
We should’ve created a backup of that table. Otherwise, we are not able to do it.
Overall explanation
Thanks to the Time Travel functionality, it’s possible to query a table as it was some time
ago. We need to put the time in seconds as the offset parameter, in this case, 15 minutes *
60. We’ll add the “-” symbol as we are querying in the past.
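Putting the explanation together, the Time Travel query looks like this (table name is illustrative):

```sql
-- Query the table as it was 15 minutes (15 * 60 = 900 seconds) ago;
-- the offset is negative because we are looking into the past
SELECT *
FROM my_table AT(OFFSET => -60*15);
```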
Question 19Skipped
The Time Travel retention period of a table is configured to be ten days. You now increase
the retention period to 20 days. What will happen to the table's data after this increase?
(Choose two.)
Any data that is between 10 and 20 days older will have time-travel extended.
Correct selection
Any data which has not reached the ten days time-travel period will now have time-travel
extended for 20 days.
Correct selection
Any data that is ten days older and moved to fail-safe will not have any impact.
Overall explanation
Increasing the time-travel retention period impacts the new data and the data that hasn't
reached the time-travel period. In this case, the new data and the data that hasn't reached
the ten days will be extended to 20 days.
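The change itself is a single ALTER statement (table name is illustrative; retention beyond 1 day requires Enterprise edition or higher):

```sql
-- Increase the Time Travel retention period from 10 to 20 days
ALTER TABLE my_table SET DATA_RETENTION_TIME_IN_DAYS = 20;
```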
Question 20Skipped
CREATE TABLE MY_TABLE (NAME STRING(100));
What would the command "DESC TABLE MY_TABLE;" display as the column type?
Char.
String.
Correct answer
Varchar.
Text.
Overall explanation
Varchar has different synonyms, like STRING , TEXT , NVARCHAR , CHAR , CHARACTER…, but
in the end, they are all VARCHAR type when describing the table. Take a look at the following
example, where all the column types are VARCHAR:
Question 21Skipped
64.
100.
Correct answer
10.
Overall explanation
Question 22Skipped
Which file function generates a Snowflake-hosted file URL to a staged file using the stage
name and relative file path as inputs?
GET_RELATIVE_PATH
GET_STAGE_LOCATION
GET_ABSOLUTE_PATH
Correct answer
BUILD_STAGE_FILE_URL
Overall explanation
Question 23Skipped
What is the impact on queries that are being executed when a resource monitor set to the
“Notify & Suspend” threshold level is exceeded?
Correct answer
Overall explanation
Notify & Suspend: send a notification (to all account administrators with notifications
enabled) and suspend all assigned warehouses after all statements being executed by the
warehouse(s) have completed.
Notify & Suspend Immediately: send a notification (to all account administrators with
notifications enabled) and suspend all assigned warehouses immediately, which cancels any
statements being executed by the warehouses at the time.
Question 24Skipped
What activities can a user with the ORGADMIN role perform? (Choose two.)
Correct selection
Correct selection
Overall explanation
Question 25Skipped
Column charts
Correct selection
Bar charts
Correct selection
Scorecards
Area charts
Radar charts
Overall explanation
• Bar charts.
• Line charts.
• Scatterplots.
• Heat grids.
• Scorecards.
Question 26Skipped
Which parameter allows us to schedule the task_1 to run every day with a CRON expression?
Correct answer
SET SCHEDULE
SET FIXED_TIME
SET INITIALIZATION
SET CRON
Overall explanation
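The SCHEDULE parameter accepts a CRON expression. A sketch (the task name and timing are illustrative; the task must be suspended before it can be altered):

```sql
-- Run task_1 every day at 2:00 AM UTC
ALTER TASK task_1 SET SCHEDULE = 'USING CRON 0 2 * * * UTC';
```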
Question 27Skipped
When unloading data to an external stage, which compression format can be used for
Parquet files with the COPY INTO command?
BROTLI
GZIP
Correct answer
LZO
ZSTD
Overall explanation
Question 28Skipped
Correct answer
Overall explanation
Renaming a view.
Note that you cannot use this command to change the definition for a view. To change the
view definition, you must drop the view and then recreate it.
https://docs.snowflake.com/en/sql-reference/sql/alter-view.html#alter-view
Question 29Skipped
Which parameter can be used to instruct a COPY command to verify data files instead of
loading them into a specified table?
SKIP_BYTE_ORDER_MARK
STRIP_NULL_VALUES
Correct answer
VALIDATION_MODE
REPLACE_INVALID_CHARACTERS
Overall explanation
VALIDATION_MODE: This instructs the command to validate the data files instead of loading
them into the target table, allowing you to perform a dry run before actually loading the
data.
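A sketch of the dry run (table and stage names are illustrative):

```sql
-- Validate staged files and report parsing errors without loading any rows
COPY INTO my_table FROM @my_stage
VALIDATION_MODE = RETURN_ERRORS;
```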
Question 30Skipped
Which Snowflake table is an implicit object layered on a stage, where the stage can be either
internal or external?
Transient table
Temporary table
Correct answer
Directory table
Overall explanation
A directory table is a Snowflake table that is an implicit object layered on a stage, where the
stage can be either internal or external. A directory table allows querying the metadata and
contents of the files in the stage using standard SQL statements.
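For example, once the directory table is enabled on the stage, its metadata can be queried with standard SQL (stage name is illustrative):

```sql
-- Query file metadata from the directory table layered on the stage
SELECT RELATIVE_PATH, SIZE, LAST_MODIFIED
FROM DIRECTORY(@my_stage);
```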
Question 31Skipped
Correct answer
SELECT MAX(ID)
FROM MYTABLE
GROUP BY ID
SELECT COUNT(*)
FROM MYTABLE
SELECT MIN(ID)
FROM MYTABLE
SELECT MAX(ID)
FROM MYTABLE
Overall explanation
You can test it by checking the query profile of each query. Three of the queries use the
metadata cache, whereas the query with the GROUP BY does not.
Question 32Skipped
Alert
Suspend immediately
Correct selection
Notify
Abort
Correct selection
Overall explanation
• Notify & Suspend: send a notification (to all account administrators with notifications
enabled) and suspend all assigned warehouses after all statements being executed by the
warehouse(s) have completed.
• Notify & Suspend Immediately: send a notification (to all account administrators with
notifications enabled) and suspend all assigned warehouses immediately, which cancels any
statements being executed by the warehouses at the time.
• Notify: perform no action, but send an alert notification (to all account administrators
with notifications enabled).
Question 33Skipped
Which of the following operations require the use of a running virtual warehouse? (Choose
two.)
Correct selection
Correct selection
Altering a table
Overall explanation
Question 34Skipped
Which Snowflake object stores DML change made to tables and metadata about each
change?
Account Streams.
Pipes.
Tables.
Correct answer
Table Streams.
Overall explanation
Streams are Snowflake objects that record data manipulation language (DML) changes made
to tables and views, including INSERTS, UPDATES, and DELETES, as well as metadata about
each change. A stream can also be referred to as a “table stream”.
Question 35Skipped
Which function can be used with the COPY INTO statement to convert rows from a relational
table to a single VARIANT column, and to unload rows into a JSON file?
TO_VARIANT
OBJECT_AS
Correct answer
OBJECT_CONSTRUCT
FLATTEN
Overall explanation
You can use the OBJECT_CONSTRUCT function combined with the COPY command to
convert the rows in a relational table to a single VARIANT column and unload the rows into a
file.
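A sketch of the unload (table and stage names are illustrative):

```sql
-- Unload each row as a JSON object built from all columns
COPY INTO @my_stage/unload/
FROM (SELECT OBJECT_CONSTRUCT(*) FROM my_table)
FILE_FORMAT = (TYPE = 'JSON');
```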
Question 36Skipped
INSERT INTO
ALTER TABLE
Correct answer
SELECT
MERGE
Overall explanation
Shared databases are read-only. Users in a consumer account can view/query data, but
cannot insert or update data, or create any objects in the database.
Question 37Skipped
A materialized view can be set up with the auto-refresh feature using the SQL SET
command.
Correct answer
Overall explanation
This is more efficient and less error-prone than manually maintaining the equivalent of a
materialized view at the application level.
Question 38Skipped
A PUT command can be used to stage local files from which Snowflake interface?
Snowsight
Correct answer
SnowSQL
.NET driver
Overall explanation
PUT command Usage The command cannot be executed from the Worksheets page in the
Snowflake web interface; instead, use the SnowSQL client to upload data files, or check the
documentation for the specific Snowflake client to verify support for this command.
Question 39Skipped
What are potential impacts of storing non-native values like dates and timestamps in a
VARIANT column in Snowflake?
Correct answer
Overall explanation
For non-native data (such as dates and timestamps), the values are stored as strings when
loaded into a VARIANT column. Therefore, operations on these values could be slower and
also consume more space than when stored in a relational column with the corresponding
data type.
Question 40Skipped
What statements are true about the Snowflake History Page? (Choose two.)
Correct selection
The History page allows you to view and drill into the details of all queries executed in the
last 24 days.
Correct selection
The History page allows you to view and drill into the details of all queries executed in the
last 14 days.
Overall explanation
You can view results only for queries you have executed. You can also see other users’
queries but cannot see their results. If you have privileges to view queries executed by
another user, the Query Detail page displays the details for the query, but it won’t show the
actual query result.
However, if you have the same role, perform the same query, and the data has not changed,
you’ll use the Query Result Cache and get the same result. In this example, you can see the
execution of a query that uses the Query Results Cache, only spending 141ms to be
executed.
Question 41Skipped
SELECT MAX(AGE)
FROM USERS_TABLE;
Correct answer
SELECT *
FROM USERS_TABLE
WHERE email='[email protected]';
EXPLAIN SELECT *
FROM USERS_TABLE
WHERE email='[email protected]';
SELECT COUNT(*)
FROM USERS_TABLE;
Overall explanation
The first query will use compute power, whereas the others don’t need a warehouse
running. This is an excellent example to see the use of the EXPLAIN command, which returns
the logical execution plan for the specified SQL statement. An explained plan shows the
operations (for example, table scans and joins) that Snowflake would perform to execute the
query.
For example, running the previous command in my Snowflake account, I generated this
result in 101ms:
Although EXPLAIN does not consume any compute credits, the compilation of the query
does consume Cloud Service credits, just as other metadata operations do. The output is the
same as the output of the command EXPLAIN_JSON.
Question 42Skipped
Correct answer
Overall explanation
Snowpipe uses compute resources provided by Snowflake (i.e. a serverless compute model).
Question 43Skipped
Which Snowflake partner category is represented at the top of this diagram (labeled 1)?
Business Intelligence
Correct answer
Data Integration
Overall explanation
Question 44Skipped
Which is the MINIMUM required Snowflake edition that a user must have if they want to use
AWS/Azure Privatelink or Google Cloud Private Service Connect?
Enterprise
Standard
Correct answer
Business Critical
Premium
Overall explanation
Question 45Skipped
Correct answer
Allows users to connect using secure single sign-on (SSO) through an external identity
provider
Disables the ability to use key pair and basic authentication (e.g., username/password)
when connecting
Overall explanation
In a federated environment, user authentication is separated from user access through the
use of one or more external entities that provide independent authentication of user
credentials. The authentication is then passed to one or more services, enabling users to
access the services through SSO.
Question 46Skipped
What table functions in the Snowflake Information Schema can be queried to retrieve
information about directory tables? (Choose two.)
Correct selection
AUTO_REFRESH_REGISTRATION_HISTORY
EXTERNAL_TABLE_FILE_REGISTRATION_HISTORY
EXTERNAL_TABLE_FILES
MATERIALIZED_VIEW_REFRESH_HISTORY
Correct selection
STAGE_DIRECTORY_FILE_REGISTRATION_HISTORY
Overall explanation
Question 47Skipped
Correct answer
The additional compute resources are provisioned when the warehouse is resumed.
The suspended warehouse is resumed and new compute resources are provisioned
immediately.
Overall explanation
Resizing a suspended warehouse does not provision any new compute resources for the
warehouse. It simply instructs Snowflake to provision the additional compute resources
when the warehouse is next resumed, at which time all the usage and credit rules associated
with starting a warehouse apply.
Question 48Skipped
When data is loaded into Snowflake, what formats does Snowflake use internally to store
the data in cloud storage? (Choose two.)
Correct selection
Compressed
Key-value
Correct selection
Columnar
Document
Graph
Overall explanation
When data is loaded into Snowflake, Snowflake reorganizes that data into its internal
optimized, compressed, columnar format.
Question 49Skipped
Correct answer
Both.
Overall explanation
Internal named stages are NEVER cloned, so pipes that reference internal stages are not
cloned.
Question 50Skipped
After how many hours does Snowflake cancel our running SQL statement by default?
10 hours.
1 hour.
24 hours.
Correct answer
48 hours.
Overall explanation
Question 51Skipped
Correct answer
Graphical representation of the main components of the processing plan of the query.
Overall explanation
You can see the query profiler in the following picture. We can see the graphical
representation of the components and some statistics for the overall query and for each
component of the query. Still, unfortunately, there are no hints to improve it, so we need to
become good Snowflake developers to spot bottlenecks by ourselves!
Question 52Skipped
Which role in Snowflake allows a user to enable replication for multiple accounts?
SYSADMIN
Correct answer
ORGADMIN
SECURITYADMIN
ACCOUNTADMIN
Overall explanation
Question 53Skipped
Correct selection
Correct selection
File keys
Overall explanation
Snowflake’s hierarchical key model consists of four levels of keys: the root key, account
master keys, table master keys, and file keys. Each account master key corresponds to one
customer account in Snowflake. Each table master key corresponds to one database table in
a database.
Question 54Skipped
Correct selection
Correct selection
Overall explanation
Question 55Skipped
The bulk data load history that is available upon completion of the COPY statement is stored
where and for how long?
Correct answer
Overall explanation
Stored in the metadata of the target table for 64 days. Available upon completion of the
COPY statement as the statement output.
Question 56Skipped
In a managed access schema, who can grant privileges on objects in the schema to other
roles? (Choose two.)
Correct selection
Correct selection
Overall explanation
In managed access schemas, object owners lose the ability to make grant decisions. Only the
schema owner or a role with the MANAGE GRANTS privilege can grant privileges on objects
in the schema, including future grants, centralizing privilege management.
Question 57Skipped
What is the maximum length of time travel available in the Snowflake Standard Edition?
30 Days
7 Days
Correct answer
1 Day
90 Days
Overall explanation
Question 58Skipped
Gantt charts
Correct selection
Line charts
Pie charts
Flowcharts
Correct selection
Scatterplots
Overall explanation
• Bar charts
• Line charts
• Scatterplots
• Heat grids
• Scorecards
Question 59Skipped
The owner of a task (the one who has the OWNERSHIP privilege) is deleted. What will
happen to the task?
Correct answer
The task will belong to the role that dropped the owner’s role.
Overall explanation
When the owner of a task is deleted, the task is "re-possessed" to the role that dropped the
owner's role. This ensures that ownership moves to a role closer to the role hierarchy's root.
In this case, the task will have to be resumed explicitly by the new owner, as it's
automatically paused.
Question 60Skipped
Which of the following Snowflake capabilities are available in all Snowflake editions?
(Choose two.)
Correct selection
Correct selection
Overall explanation
Question 61Skipped
The return of micro-partition values that overlap with each other to reduce a query's
runtime.
A service that is handled by the Snowflake Cloud Services layer to optimize caching.
Correct answer
The filtering or disregarding of micro-partitions that are not needed to return a query.
Overall explanation
Question 62Skipped
If all virtual warehouse resources are maximized while processing a query workload, what
will happen to new queries that are submitted to the warehouse?
Correct answer
Overall explanation
Question 63Skipped
Which of the following file formats is not supported by Snowflake to unload data?
Correct answer
Avro.
Parquet.
CSV.
JSON.
Overall explanation
A File Format object describes and stores the format information required to load data into
Snowflake tables. You can specify different parameters, such as the file’s delimiter, if you
want to skip the header or not, etc. You can see the different file formats in the following
image:
Question 64Skipped
Which commands should be used to grant the privilege allowing a role to select data from all
current tables and any tables that will be created later in a schema? (Choose two.)
Correct selection
Correct selection
Overall explanation
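A hedged sketch of the two grants the question describes (schema and role names are hypothetical):

```sql
-- Grant SELECT on all tables that currently exist in the schema
GRANT SELECT ON ALL TABLES IN SCHEMA mydb.myschema TO ROLE analyst;

-- Grant SELECT on any tables created in the schema in the future
GRANT SELECT ON FUTURE TABLES IN SCHEMA mydb.myschema TO ROLE analyst;
```

Both statements are needed: the first covers existing tables, the second sets up a future grant for tables created later.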
Question 65Skipped
What is the COPY INTO command option default for unloading data into multiple files?
SINGLE = NULL
SINGLE = TRUE
SINGLE = 0
Correct answer
SINGLE = FALSE
Overall explanation
The COPY INTO <location> command provides a copy option (SINGLE) for unloading data into
a single file or multiple files. The default is SINGLE = FALSE (i.e. unload into multiple files).
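As a minimal sketch (stage, table, and size values are hypothetical), the default behavior can be spelled out explicitly:

```sql
-- SINGLE = FALSE is the default: Snowflake splits the unload across multiple files
COPY INTO @my_stage/unload/data_
  FROM my_table
  FILE_FORMAT = (TYPE = CSV)
  SINGLE = FALSE
  MAX_FILE_SIZE = 16777216;  -- optional cap per output file, in bytes
```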
Question 66Skipped
Correct answer
Overall explanation
Question 67Skipped
Correct selection
Correct selection
Overall explanation
Your current role must have SELECT privilege on the source table to create a clone. In
addition, to clone a schema or an object within a schema, you’d need privileges on the
container object(s) for both the source and the clone. You’d need the OWNERSHIP privilege
for pipes, streams, and tasks.
You can see the necessary privileges to create a clone in the following table:
Question 68Skipped
Who can activate a network policy for users in a Snowflake account? (Choose two.)
PUBLIC
Correct selection
Correct selection
ACCOUNTADMIN
USERADMIN
SYSADMIN
Overall explanation
Only security administrators (i.e. users with the SECURITYADMIN role) or higher or a role
with the global ATTACH POLICY privilege can activate a network policy for an account. Once
the policy is associated with your account, Snowflake restricts access to your account based
on the allowed list and blocked list.
Question 69Skipped
Snowsight
Correct answer
SnowSQL
SnowCD
Overall explanation
SnowSQL is the command line client for connecting to Snowflake to execute SQL queries and
perform all DDL and DML operations, including loading data into and unloading data out of
database tables.
SnowCD (i.e. Snowflake Connectivity Diagnostic Tool) helps users to diagnose and
troubleshoot their network connection to Snowflake.
The Snowflake SQL API is a REST API that you can use to access and update data in a
Snowflake database.
Question 70Skipped
Which programming languages are supported for Snowflake User-Defined Functions (UDFs)?
(Choose two.)
Correct selection
JavaScript
TypeScript
C#
Correct selection
Python
PHP
Overall explanation
Question 71Skipped
A JSON file that contains lots of dates and arrays needs to be processed in Snowflake. The
user wants to ensure optimal performance while querying the data.
Store the data in a table with a VARIANT data type. Query the table.
Store the data in a table with a VARIANT data type and include STRIP_NULL_VALUES while
loading the table. Query the table.
Store the data in an external stage and create views on top of it. Query the views.
Correct answer
Flatten the data and store it in structured data types in a flattened table. Query the table.
Overall explanation
For better pruning and less storage consumption, we recommend flattening your OBJECT
and key data into separate relational columns if your semi-structured data includes:
• Dates and timestamps, especially non-ISO 8601 dates and timestamps, as string
values
• Arrays
Non-native values (such as dates and timestamps in JSON) are stored as strings when loaded
into a VARIANT column, so operations on these values could be slower and also consume
more space than when stored in a relational column with the corresponding data type.
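A minimal sketch of the recommended flattening, assuming a hypothetical raw_events table with a VARIANT column src whose events array carries name and ts keys:

```sql
CREATE OR REPLACE TABLE events_flat AS
SELECT
  e.value:name::STRING      AS name,       -- string stored in a native column
  e.value:ts::TIMESTAMP_NTZ AS event_ts    -- timestamp gets a real data type
FROM raw_events,
     LATERAL FLATTEN(input => raw_events.src:events) e;
```

Queries against events_flat can then prune on event_ts directly instead of casting strings inside a VARIANT.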
Question 72Skipped
What tasks can an account administrator perform in the Data Exchange? (Choose two.)
Correct selection
Correct selection
Overall explanation
By default, only an account administrator (a user with the ACCOUNTADMIN role) in the Data
Exchange administrator account can manage a Data Exchange, which includes the following
tasks:
• Show categories.
Question 73Skipped
A task is still being executed when its next scheduled run time arrives. What will happen
to the newly scheduled run?
Correct answer
Overall explanation
Snowflake ensures that only one instance of a task with a schedule is executed at a given
time. If a task is still running when the next scheduled execution time occurs, then that
scheduled time is skipped.
Question 74Skipped
Streams.
Correct answer
Data Exchange.
Snowpipe.
Overall explanation
Data Exchange is your own data hub for securely collaborating around data between a
selected group of members you invite. It enables providers to publish data that consumers
can then discover. You can use your Data Exchange to exchange data between business units
internal to your company and collaborate with external parties such as vendors, suppliers,
partners, and customers.
Question 75Skipped
Which of the following indicates that it may be appropriate to use a clustering key for a
table? (Choose two.)
Correct selection
Correct selection
DML statements that are being issued against the table are blocked.
Overall explanation
Queries on the table are running slower than expected or have noticeably degraded over
time.
Question 76Skipped
Correct answer
16 MB.
8 MB.
64 MB.
128 MB.
Overall explanation
The VARIANT data type imposes a 16 MB size limit on individual rows. If the data exceeds 16
MB, enable the STRIP_OUTER_ARRAY file format option for the COPY INTO <table>
command to remove the outer array structure and load the records into separate table rows.
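A hedged sketch of that option in context (table, stage, and file names are hypothetical):

```sql
-- Remove the outer array so each element loads as its own row,
-- keeping individual rows under the 16 MB VARIANT limit
COPY INTO my_table
  FROM @my_stage/big_array.json
  FILE_FORMAT = (TYPE = JSON STRIP_OUTER_ARRAY = TRUE);
```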
Question 77Skipped
Which statements are true concerning Snowflake’s underlying cloud infrastructure? (Choose
three.)
Snowflake can be deployed in a customer’s private cloud using the customer’s own
compute and storage resources for Snowflake compute and storage.
Correct selection
Snowflake data and services are deployed in at least three availability zones within a cloud
provider’s region.
Snowflake data and services are deployed in a single availability zone within a cloud
provider’s region.
Correct selection
All three layers of Snowflake’s architecture (storage, compute, and cloud services) are
deployed and managed entirely on a selected cloud platform.
Snowflake data and services are available in a single cloud provider and a single region;
the use of multiple cloud providers is not supported.
Correct selection
Snowflake uses the core compute and storage services of each cloud provider for its own
compute and storage.
Overall explanation
In addition, Snowflake’s virtual warehouses and cloud services layers are similarly deployed
across three availability zones in a region.
Question 78Skipped
Correct selection
Micro-partitions are immutable objects that support the use of Time Travel.
Correct selection
Micro-partitions can reduce the amount of I/O from object storage to virtual warehouses.
Overall explanation
As written in the documentation:
- As the name suggests, micro-partitions are small in size (50 to 500 MB, before
compression), which enables extremely efficient DML and fine-grained pruning for faster
queries.
- Micro-partitions can overlap in their range of values, which, combined with their uniformly
small size, helps prevent skew.
Question 79Skipped
GET
Correct selection
ROLLBACK
PUT
Correct selection
COMMIT
CALL
Overall explanation
A transaction is a sequence of SQL statements that are processed as an atomic unit. All
statements in the transaction are either applied (i.e. committed) or undone (i.e. rolled back)
together. Snowflake transactions guarantee ACID properties.
Question 80Skipped
Correct selection
Primary Key.
Unique Index.
Non-Unique Index.
Distribution Key.
Correct selection
Foreign Key.
Overall explanation
Constraints define integrity and consistency rules for data stored in tables. You can specify a
CONSTRAINT clause in a CREATE TABLE or ALTER TABLE statement. A table can have multiple
unique keys and foreign keys, but only one primary key. Snowflake supports defining and
maintaining constraints, but does not enforce them, except for NOT NULL constraints, which
are always enforced.
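A small sketch of the behavior described above (table and constraint names are hypothetical):

```sql
CREATE TABLE orders (
  order_id    NUMBER NOT NULL,  -- NOT NULL: the one constraint Snowflake enforces
  customer_id NUMBER,
  CONSTRAINT pk_orders PRIMARY KEY (order_id),          -- declared, not enforced
  CONSTRAINT fk_customer FOREIGN KEY (customer_id)
    REFERENCES customers (id)                           -- declared, not enforced
);
```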
Question 81Skipped
Snowflake's approach to the management of system access combines which of the following
models? (Choose two.)
Correct selection
Correct selection
Overall explanation
Snowflake’s approach to access control combines aspects from both of the following models:
Discretionary Access Control (DAC): Each object has an owner, who can in turn grant access
to that object.
Role-based Access Control (RBAC): Access privileges are assigned to roles, which are in turn
assigned to users.
Question 82Skipped
When can a newly configured virtual warehouse start running SQL queries?
Correct answer
Overall explanation
Snowflake does not begin executing SQL statements submitted to a warehouse until all of
the compute resources for the warehouse are successfully provisioned, unless any of the
resources fail to provision.
Question 83Skipped
What is the recommended file size for the best load performance and to avoid size
limitations?
Correct answer
Overall explanation
This is the best size to get the best load performance. Suppose you still have to load a big
file, for example, 100GB. In that case, you should carefully consider the ON_ERROR copy
option value, as there is a maximum allowed duration of 24 hours, and the operation could
be aborted without any portion of the file being committed.
Question 84Skipped
What privilege is needed for a Snowflake user to see the definition of a secure view?
USAGE
MODIFY
CREATE
Correct answer
OWNERSHIP
Overall explanation
The definition of a secure view is only exposed to authorized users (i.e. users who have been
granted the role that owns the view).
However, users that have been granted IMPORTED PRIVILEGES privilege on the SNOWFLAKE
database or another shared database have access to secure view definitions via the VIEWS
Account Usage view.
Question 85Skipped
For a multi-cluster virtual warehouse, which parameters are used to calculate the number of
credits billed? (Choose two.)
Correct selection
Warehouse size
Cache size
Correct selection
Number of clusters
Overall explanation
Question 86Skipped
Correct answer
Overall explanation
Snowflake shares metadata information about the usage of your account in the
ACCOUNT_USAGE share. You can also access different sample data sets for learning and
testing Snowflake’s functionalities with the SAMPLE_DATA share. You can see them in the
Snowflake UI if you have enough privileges:
Question 87Skipped
Correct answer
Overall explanation
Ability to consolidate account management and billing. The rest of the options are not
technically possible or are technically possible but do not depend on the use of
organizations.
Question 88Skipped
What are the benefits of the replication feature in Snowflake? (Choose two.)
Correct selection
Data security
Correct selection
Disaster recovery
Fail-safe
Time Travel
Question 89Skipped
You have a multi-cluster warehouse running with the Standard scaling policy. The maximum
number of clusters is set to 8. If a lot of queries are queued, and the warehouse is constantly
starting new clusters, what is the maximum time the warehouse will take to start all the clusters?
8 minutes.
Correct answer
160 seconds.
80 seconds.
Overall explanation
Each successive cluster waits 20 seconds to start after the prior one has started. For
example, if your warehouse is configured with ten max clusters, it can take 200+ seconds to
start all 10 clusters. This doesn't happen with the Economy policy, which only starts new
clusters if the system estimates there's enough query load to keep each cluster busy for at
least 6 minutes. You can take a look at the following picture to see the differences
between these scaling policies:
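The arithmetic behind the 160-second answer can be sketched as follows, assuming the 20-second stagger between cluster starts that the explanation describes:

```python
def startup_time_seconds(max_clusters, stagger=20):
    """Maximum seconds to start all clusters under the Standard scaling
    policy, assuming each successive cluster waits `stagger` seconds
    after the prior one has started (per the explanation above)."""
    return max_clusters * stagger

print(startup_time_seconds(8))   # 8 clusters -> 160 seconds
print(startup_time_seconds(10))  # 10 clusters -> 200 seconds
```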
Question 90Skipped
Any user with the appropriate privileges can view data storage for individual tables by using
which queries? (Choose two.)
Correct selection
Correct selection
Overall explanation
Any user with the appropriate privileges can view data storage for individual tables.
Snowflake provides the following methods for viewing table data storage:
Classic Console
Click on Databases
» <db_name> » Tables
SQL
or
Question 91Skipped
What is the recommended file sizing for data loading using Snowpipe?
Correct answer
Overall explanation
Loading data files roughly 100-250 MB in size or larger reduces the overhead charge relative
to the amount of total data loaded to the point where the overhead cost is immaterial.
Question 92Skipped
Which function should be used to insert JSON formatted string data into a VARIANT field?
CHECK_JSON
FLATTEN
TO_VARIANT
Correct answer
PARSE_JSON
Overall explanation
PARSE_JSON
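A minimal sketch (table and JSON values are hypothetical) — note that PARSE_JSON is used in an INSERT ... SELECT, since it produces a VARIANT:

```sql
CREATE TABLE json_demo (v VARIANT);

-- PARSE_JSON converts the JSON-formatted string into a VARIANT value
INSERT INTO json_demo
SELECT PARSE_JSON('{"name": "Alice", "age": 30}');
```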
Question 93Skipped
What activities can be monitored by a user directly from Snowsight's Activity tab without
using the Account_Usage views? (Choose two.)
Correct selection
Query history
Correct selection
Copy history
Login history
Overall explanation
It is highly recommended before the exam to browse the new Snowflake Snowsight UI and
familiarize yourself with the various capabilities it offers. Some questions like this will appear
on the exam.
Question 94Skipped
Which service or feature in Snowflake is used to improve the performance of certain types
of lookup and analytical queries that use an extensive set of WHERE conditions?
Tagging
Data classification
Correct answer
Search optimization service
Overall explanation
The search optimization service can significantly improve the performance of certain types
of lookup and analytical queries that use an extensive set of predicates for filtering.
Question 95Skipped
Which feature of Snowflake’s Continuous Data Protection (CDP) has associated costs?
End-to-end encryption
Correct answer
Fail-safe
Network policies
Overall explanation
Question 96Skipped
Enterprise edition.
Correct answer
Standard edition.
Overall explanation
There are some restrictions in the Snowflake VPS edition. You can check all of them in the
following image (via docs.snowflake.com):
Question 97Skipped
Correct answer
Overall explanation
Decreasing the precision of a number column can impact Time Travel; for example,
converting from NUMBER(20,2) to NUMBER(10,2). SET DATA TYPE is the command used to
make that change.
Question 98Skipped
Object
Correct answer
Geometry
Variant
Overall explanation
Question 99Skipped
While using a COPY command with a Validation_mode parameter, which of the following
statements will return an error?
Correct answer
Overall explanation
The VALIDATION_MODE parameter does not support COPY statements that transform data
during a load.
Question 100Skipped
A JSON document is stored in the source_column of type VARIANT. The document has an
array called elements. The array contains the name key that has a string value.
How can a Snowflake user extract the name from the first element?
Correct answer
source_column:elements[0].name
source_column:elements[1].name
source_column.elements[1]:name
source_column.elements[0]:name
Overall explanation
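The winning path can be illustrated with a plain Python analogue (the document content is hypothetical). In Snowflake's notation, the colon selects the top-level key from the VARIANT column (which is case-insensitive), while bracket indexing and dot notation traverse the array and its case-sensitive element keys — hence source_column:elements[0].name:

```python
import json

# Hypothetical document mirroring the question: an "elements" array
# whose entries carry a "name" key with a string value.
doc = json.loads('{"elements": [{"name": "Alice"}, {"name": "Bob"}]}')

# Python analogue of the Snowflake path source_column:elements[0].name
# (arrays are zero-indexed, so [0] is the first element)
first_name = doc["elements"][0]["name"]
print(first_name)  # Alice
```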
Question 101Skipped
To add or remove search optimization for a table, a user must have which of the following
privileges or roles? (Choose two.)
A SECURITYADMIN role
Correct selection
Correct selection
The ADD SEARCH OPTIMIZATION privilege on the schema that contains the table
Overall explanation
To add, configure, or remove search optimization for a table, you must have the following
privileges:
You must have ADD SEARCH OPTIMIZATION privilege on the schema that contains the table.
Question 102Skipped
When does Snowflake automatically encrypt data that is loaded into Snowflake? (Choose
two.)
Correct selection
Correct selection
Overall explanation
1. If the stage is an external stage (Image A), the user may optionally encrypt the data files
using client-side encryption (see Client-Side Encryption for more information). We
recommend client-side encryption for data files in external stages; but if the data is not
encrypted, Snowflake immediately encrypts the data when it is loaded into a table.
If the stage is an internal (i.e. Snowflake) stage (Image B) data files are automatically
encrypted by the Snowflake client on the user’s local machine prior to being transmitted to
the internal stage, in addition to being encrypted after they are loaded into the stage.
2. The user loads the data from the stage into a table.
The data is transformed into Snowflake's proprietary file format and stored in a cloud
storage container. In Snowflake, all data at rest is always encrypted, and data in transit is
protected with TLS. Snowflake also decrypts data when data is transformed or operated on in
a table, and then re-encrypts the data when the transformations and operations are complete.
Question 103Skipped
In addition to performing all the standard steps to share data, which privilege must be
granted on each database referenced by a secure view in order to be shared?
REFERENCES
Correct answer
REFERENCE_USAGE
READ
USAGE
Overall explanation
You must grant the REFERENCE_USAGE privilege separately on each database referenced in
a secure view, before granting the secure view to a share.
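A hedged sketch of the extra grant (database and share names are hypothetical):

```sql
-- Required once per database referenced by the secure view,
-- before granting the secure view to the share
GRANT REFERENCE_USAGE ON DATABASE other_db TO SHARE my_share;
```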
Question 104Skipped
Processing costs will be generated based on how long the query takes.
Correct answer
The cost for running the virtual warehouse will be charged by the second.
Overall explanation
Question 105Skipped
Unlimited.
Correct answer
One.
Two.
Overall explanation
You can enable clustering on specific tables by specifying ONE clustering key for each table.
We can only create one cluster key, but we can have several columns or expressions in that
cluster key.
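A minimal sketch, with hypothetical table and column names, of one clustering key spanning several columns:

```sql
-- One clustering key per table, but it may contain multiple columns/expressions
ALTER TABLE sales CLUSTER BY (region, sale_date);
```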
Question 106Skipped
Correct answer
Overall explanation
Although clustering can substantially improve the performance and reduce the cost of some
queries, the compute resources used to perform clustering consume credits. As such, you
should cluster only when queries will benefit substantially from the clustering in huge tables.
Question 107Skipped
Correct answer
Overall explanation
Question 108Skipped
Pre-signed
Scoped
Virtual-hosted style
Correct answer
File
Overall explanation
Conceptually, directory tables are similar to external tables in that they store file-level
metadata about the data files in a stage. Query a directory table to retrieve the Snowflake-
hosted file URL to each file in the stage. A file URL permits prolonged access to a specified
file. That is, the file URL does not expire. The same file URL is returned by calling the
BUILD_STAGE_FILE_URL function.
Question 109Skipped
Infrastructure management
Correct answer
Query execution
Metadata management
Overall explanation
A warehouse provides the required resources, such as CPU, memory, and temporary storage,
to perform the following operations in a Snowflake session:
• Executing SQL SELECT statements that require compute resources (e.g. retrieving
rows from tables and views).
Question 110Skipped
FROM USERS;
The TOP 100 grades ordered by the creation date of the data.
Correct answer
Overall explanation
We’d need the ORDER BY clause if we want to generate the other results.
Question 111Skipped
Correct answer
Globally
Overall explanation
In general, a role with any one of the following sets of privileges can grant privileges on an
object to other roles:
Only the SECURITYADMIN and ACCOUNTADMIN system roles have the MANAGE GRANTS
privilege; however, the privilege can be granted to custom roles.
Question 112Skipped
If the result has not been reused within the last 12 hours
Correct answer
Overall explanation
Query results are reused if all of the following conditions are met:
• ...
Question 113Skipped
Which command should be used to look into the validity of an XML object in Snowflake?
TO_XML
Correct answer
CHECK_XML
PARSE_XML
XMLGET
Overall explanation
CHECK_XML
Checks the validity of an XML document. If the input string is NULL or a valid XML document,
the output is NULL. In case of an XML parsing error, the output string contains the error
message.
Question 114Skipped
How can a Snowflake user access a JSON object, given the following table? (Choose two.)
Correct selection
SRC:salesPerson.name
src:salesPerson.Name
src:salesperson.name
SRC:salesPerson.Name
Correct selection
src:salesPerson.name
Overall explanation
Regardless of which notation you use, the column name is case-insensitive but element
names are case-sensitive.
For example, in the following list, the first two paths are equivalent, but the third is not:
• src:salesperson.name
• SRC:salesperson.name
• SRC:Salesperson.Name
Question 115Skipped
What will happen when you ALTER a column to set it to NOT NULL if it contains NULL values?
Correct answer
Overall explanation
When setting a column to NOT NULL, if the column contains NULL values, an error is
returned and no changes are applied to the column. This restriction prevents inconsistency
between values in rows inserted before the column was added and rows inserted after the
column was added.
Question 116Skipped
For directory tables, what stage allows for automatic refreshing of metadata?
User stage
Table stage
Correct answer
Overall explanation
You can automatically refresh the metadata for a directory table by using the following event
notification services:
Question 117Skipped
What are the two models that Snowflake combines as an approach to access control?
Correct answer
Overall explanation
Snowflake combines Discretionary Access Control (DAC) and Role-Based Access Control
(RBAC). Discretionary Access Control (DAC) remarks that "each object has an owner, who
can, in turn, grant access to that object". In contrast, the Role-based Access Control (RBAC)
remarks that "access privileges are assigned to roles, which are in turn assigned to users".
Question 118Skipped
Which of the following are best practice recommendations that should be considered when
loading data into Snowflake? (Choose two.)
Correct selection
Correct selection
Avoid using embedded characters such as commas for numeric data types.
Overall explanation
Question 119Skipped
Masking policies can be applied to which of the following Snowflake objects? (Choose two.)
Correct selection
A table
A stream
A stored procedure
A pipe
Correct selection
A materialized view
Overall explanation
In Snowflake, masking policies are schema-level objects, which means a database and
schema must exist in Snowflake before a masking policy can be applied to a column.
Currently, Snowflake supports using Dynamic Data Masking on tables and views.
Question 120Skipped
Which of the following listing types are not available in the Snowflake Data Marketplace?
Personalized Listing.
Free Listing.
Paid Listing.
Correct answer
Private Listing.
Overall explanation
Free Listing (also known as Standard Listing) is the best for providing generic, aggregated, or
non-customer-specific data. In contrast, consumers can request specific datasets from
providers using Personalized Listing. Snowflake recently added Paid Listings, where, as a
provider, you can charge consumers to access or use your listing.
There is another type, Private Listings, where you can use listings to share data and other
information directly with other Snowflake accounts. However, they are unavailable in the
Data Marketplace, as the question asks for. Here you have an example of a Personalized
Listing in the Snowflake Marketplace:
Question 121Skipped
Correct selection
Correct selection
Overall explanation
Some of the internal optimizations for views require access to the underlying data in the
base tables for the view. This access might allow data that is hidden from users of the view
to be exposed through user code, such as user-defined functions, or other programmatic
methods. Secure views do not utilize these optimizations, ensuring that users have no access
to the underlying data.
For security or privacy reasons, you might not wish to expose the underlying tables or
internal structural details for a view. With secure views, the view definition and details are
visible only to authorized users (i.e. users who are granted the role that owns the view).
Question 122Skipped
Which command will fail if you have a table created with the following DDL query?
Correct answer
Overall explanation
If you create a table name using double quotes ("), the identifier is stored case-sensitively, and you have to specify the exact name of the table.
Question 123Skipped
Which roles can create, alter or drop network policies? (Choose two.)
USERADMIN.
Correct selection
SECURITYADMIN.
ORGADMIN
SYSADMIN.
Correct selection
ACCOUNTADMIN.
Overall explanation
Network policies allow restricting access to your account based on user IP address.
Snowflake applies the blocked IP address list when a network policy includes values in both
the allowed and blocked IP address lists.
Only security administrators (i.e., users with the SECURITYADMIN role) or higher or a role
with the global CREATE NETWORK POLICY privilege can create network policies.
Only the network policy owner (i.e., role with the OWNERSHIP privilege on the network
policy) or higher can alter a network policy.
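A hedged sketch of creating such a policy (policy name and IP ranges are hypothetical):

```sql
-- Blocked list wins when an address appears in both lists
CREATE NETWORK POLICY corp_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');
```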
Question 124Skipped
What is used to limit the credit usage of a virtual warehouse within a Snowflake account?
Query Profile
Stream
Correct answer
Resource monitor
Load monitor
Overall explanation
Question 125Skipped
What is it called when a customer-managed key is combined with a Snowflake-managed key
to create a composite key for encryption?
Client-side encryption
Correct answer
Tri-Secret Secure
Overall explanation
Question 126Skipped
The Query Profile in the image is for a query executed in Snowsight. Four of the key nodes
are highlighted in yellow.
Aggregate[1]
Join[5]
TableScan[2]
Correct answer
TableScan[3]
Overall explanation
What we can see in this image is the Operator Tree of a Query Profile.
The tree provides a graphical representation of the operator nodes that comprise a query
and the links that connect each operator:
• Operators are the functional building blocks of a query. They are responsible for
different aspects of data management and processing, including data access,
transformations and updates. Each operator node in the tree includes some basic
attributes:
<Type> [#]
Operator type and ID number. ID can be used to uniquely identify an operator within a query
profile (e.g. Aggregate [1] and Join [5] in the screenshot above).
Percentage
Fraction of time that this operator consumed within the query step (e.g. 53.4% for
TableScan[3]). This information is also reflected in the bar at the bottom of the operator
node, allowing for easy visual identification of performance-critical operators.
Label
• Links represent the data flowing between each operator node. Each link provides the
number of records that were processed.
Therefore, we can conclude that it is the node TableScan[3] that has the highest percentage
of time.
Question 127Skipped
Which account usage view in Snowflake can be used to identify the most-frequently
accessed tables?
Tables
Object_Dependencies
Correct answer
Access_History
Table_Storage_Metrics
Overall explanation
The Access_History view in Snowflake can be used to identify the most frequently accessed
tables. This view contains information about the historical access patterns for tables and
views in your Snowflake account, including details on queries, users, and access frequency.
By querying this view, you can analyze which tables are being accessed most frequently in
your Snowflake environment.
Question 128Skipped
A user created a database and set the DATA_RETENTION_TIME_IN_DAYS to 30, but did not
set the DATA_RETENTION_TIME_IN_DAYS in table T1. After 5 days, the user accidentally
drops table T1.
The table cannot be recovered because the DATA_RETENTION_TIME_IN_DAYS was not set
for table T1.
The table can only be recovered by contacting Snowflake Support to recover the table
from Fail-safe.
Correct answer
The table can be recovered because the table retention period default is at the database
level.
Overall explanation
By default, the maximum retention period is 1 day (i.e. one 24 hour period). With Snowflake
Enterprise Edition (and higher), the default for your account can be set to any value up to 90
days:
• When creating a table, schema, or database, the account default can be overridden
using the DATA_RETENTION_TIME_IN_DAYS parameter in the command.
Question 129Skipped
Which of the below columns are usually a good choice for clustering keys?
Correct answer
Overall explanation
A column with very low cardinality (gender in this case) might yield minimal pruning. On the other hand, a column with very high cardinality (a UUID or timestamp in this case) is also typically not a good candidate to use directly as a clustering key, because maintaining the clustering becomes expensive when nearly every value is distinct. Store_id is the most convenient option.
Question 130Skipped
For non-materialized views, what column in Information Schema and Account Usage
identifies whether a view is secure or not?
CHECK_OPTION
Correct answer
IS_SECURE
IS_UPDATEABLE
TABLE_NAME
Overall explanation
Question 131Skipped
Correct answer
Overall explanation
Snowflake automatically and transparently maintains materialized views. To see the last time
that Snowflake refreshed a materialized view, check the REFRESHED_ON and BEHIND_BY
columns in the output of the command SHOW MATERIALIZED VIEWS.
Question 132Skipped
Which command should be used to implement a masking policy that was already created in
Snowflake?
Correct answer
Overall explanation
Example:
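A minimal sketch of applying an already-created masking policy (table, column, and policy names are hypothetical):

```sql
-- Attach an existing masking policy to a column.
ALTER TABLE employees
  MODIFY COLUMN email
  SET MASKING POLICY email_mask;
```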
Question 133Skipped
Which cache type is used to cache data output from SQL queries?
Remote cache
Correct answer
Result cache
Metadata cache
Overall explanation
Result Cache: Holds the results of every query executed in the past 24 hours. These are available across virtual warehouses, so query results returned to one user are available to any other user on the system who executes the same query, provided the underlying data has not changed.
Local Disk Cache: Which is used to cache data used by SQL queries. Whenever data is
needed for a given query it's retrieved from the Remote Disk storage, and cached in SSD and
memory.
Question 134Skipped
What happens when the values for both an ALLOWED_IP_LIST and a BLOCKED_IP_LIST are
used in a network policy?
Correct answer
Overall explanation
When a network policy includes values in both the allowed and blocked IP address lists,
Snowflake applies the blocked IP address list first.
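A minimal sketch (IP ranges are examples): because the blocked list is applied first, 192.168.1.99 would be denied even though it falls inside the allowed range.

```sql
-- Allow a subnet but block one specific address within it.
CREATE NETWORK POLICY corp_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');
```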
Question 135Skipped
When dropping an internal stage, the files are deleted with the stage and the files are
recoverable.
Correct selection
When dropping an external stage, the files are not removed and only the stage is dropped.
When dropping an internal stage, only selected files are deleted with the stage and are not
recoverable.
When dropping an external stage, both the stage and the files within the stage are
removed.
Correct selection
When dropping an internal stage, the files are deleted with the stage and the files are not
recoverable.
Overall explanation
For an internal stage, all of the files in the stage are purged from Snowflake, regardless of their load status. This prevents the files from continuing to use storage and, consequently, accruing storage charges.
However, this also means that the staged files cannot be recovered after a stage is dropped.
For an external stage, only the stage itself is dropped; any data files in the referenced
external location (Amazon S3, Google Cloud Storage, or Microsoft Azure) are not removed.
Question 136Skipped
Correct answer
Querying a materialized view is slower than executing a query against the base table of
the view.
Overall explanation
It is recommended to be familiar with the limitations of the materialized views. This is often
a common question.
Question 137Skipped
Correct answer
<seq_name>.NEXTVAL
<seq_name>.THISVAL
<seq_name>.CURRENTVAL
<seq_name>.GETVAL
Overall explanation
We use sequences to generate unique numbers across sessions and statements, including
concurrent statements. You can use them to generate values for a primary key or any
column that requires a unique value. Let’s see an example.
(PEOPLE_SEQ.nextval, 'Gonzalo'),
(PEOPLE_SEQ.nextval, 'Nacho'),
(PEOPLE_SEQ.nextval, 'Megan'),
(PEOPLE_SEQ.nextval, 'Angel')
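A complete, self-contained version of the example (object names are illustrative):

```sql
-- Create a sequence and use NEXTVAL to populate a unique id column.
CREATE OR REPLACE SEQUENCE PEOPLE_SEQ START = 1 INCREMENT = 1;

CREATE OR REPLACE TABLE PEOPLE (id INTEGER, name STRING);

INSERT INTO PEOPLE (id, name) VALUES
  (PEOPLE_SEQ.nextval, 'Gonzalo'),
  (PEOPLE_SEQ.nextval, 'Nacho'),
  (PEOPLE_SEQ.nextval, 'Megan'),
  (PEOPLE_SEQ.nextval, 'Angel');
```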
Question 138Skipped
What technique does Snowflake use to limit the number of micro-partitions scanned by
each query?
Indexing
Map reduce
Correct answer
Pruning
B-tree
Overall explanation
Question 139Skipped
Which SQL command can be used to verify the privileges that are granted to a role?
SHOW ROLES
Correct answer
Overall explanation
Question 140Skipped
Which privilege is required on a virtual warehouse to abort any existing executing queries?
MODIFY
USAGE
Correct answer
OPERATE
MONITOR
Overall explanation
OPERATE
Enables changing the state of a warehouse (stop, start, suspend, resume). In addition,
enables viewing current and past queries executed on a warehouse and aborting any
executing queries.
Question 141Skipped
Which Snowflake object enables loading data from files as soon as they are available in a
cloud storage location?
Stream
Correct answer
Pipe
Task
External stage
Overall explanation
Snowpipe enables loading data from files as soon as they’re available in a stage. This means
you can load data from files in micro-batches, making it available to users within minutes,
rather than manually executing COPY statements on a schedule to load larger batches.
Question 142Skipped
It’s the schema that even non-Snowflake users will be able to access, as it’s public.
Correct answer
Overall explanation
A schema is a logical grouping of database objects (tables, views, etc.), and each schema
belongs to a single database. The PUBLIC schema is the default schema for a database, and
all objects are, by default, created inside it if no other schema is specified.
Question 143Skipped
ACCOUNT_USAGE
Correct answer
READER_ACCOUNT_USAGE
WAREHOUSE_USAGE_SCHEMA
INFORMATION_SCHEMA
Overall explanation
Question 144Skipped
What happens to the shared objects for users in a consumer account from a share, once a
database has been created in that account?
Correct answer
Overall explanation
Once a database is created (in a consumer account) from a share, all the shared objects are
accessible to users in the consumer account.
Question 145Skipped
Which of the following SQL statements will list the version of the drivers currently being
used?
Correct answer
Overall explanation
Question 146Skipped
A Snowflake user has two tables that contain numeric values and is trying to find out which
values are present in both tables.
Correct answer
INTERSECT
MINUS
UNION
MERGE
Overall explanation
INTERSECT
Returns rows from one query’s result set which also appear in another query’s result set,
with duplicate elimination.
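A minimal sketch (table and column names are hypothetical):

```sql
-- Values present in both tables, duplicates eliminated.
SELECT num FROM table_a
INTERSECT
SELECT num FROM table_b;
```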
Question 147Skipped
Which of the following query profiler variables will indicate that a virtual warehouse is not
sized correctly for the query being executed?
Correct answer
Remote spillage
Synchronization
Initialization
Overall explanation
For some operations (e.g. duplicate elimination for a huge data set), the amount of memory
available for the compute resources used to execute the operation might not be sufficient to
hold intermediate results. As a result, the query processing engine will start spilling the data
to local disk. If the local disk space is not sufficient, the spilled data is then saved to remote
disks.
Question 148Skipped
According to Snowflake best practice recommendations, which role should be used to create
databases?
USERADMIN
SECURITYADMIN
ACCOUNTADMIN
Correct answer
SYSADMIN
Overall explanation
The system administrator (SYSADMIN) role includes the privileges to create warehouses,
databases, and all database objects (schemas, tables, etc.).
Question 149Skipped
You have the following data in a variant column called “person_data” from the table
“myTable”. How can you query the second hobby of Chris (“music”)?
{
  "name": "Chris",
  "favouriteTechnology": "Snowflake",
  "hobbies": [
    {"name": "soccer"},
    {"name": "music"},
    {"name": "hiking"}
  ]
}
SELECT person_data:hobbies[1]
SELECT person_data:hobbies(1)
Correct answer
SELECT person_data:hobbies[1].name
SELECT person_data:hobbies(1).name
Overall explanation
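Written out as a complete statement against the table from the question, the correct option is:

```sql
-- hobbies[1] is the second array element (arrays are 0-indexed);
-- .name extracts the value 'music'.
SELECT person_data:hobbies[1].name FROM myTable;
```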
Question 150Skipped
Correct answer
Overall explanation
Custom roles (i.e. any roles other than the system-defined roles) can be created by the
USERADMIN role (or a higher role) as well as by any role to which the CREATE ROLE privilege
has been granted. By default, a newly-created role is not assigned to any user, nor granted to
any other role.
When creating roles that will serve as the owners of securable objects in the system,
Snowflake recommends creating a hierarchy of custom roles, with the top-most custom role
assigned to the system role SYSADMIN.
Question 151Skipped
What do temporary and transient tables have in common in Snowflake? (Choose two.)
For both tables, the retention period ends when the tables are dropped.
Correct selection
Correct selection
For both tables, the retention period does not end when the session ends.
Overall explanation
Question 152Skipped
Correct selection
External Tokenization
Correct selection
Overall explanation
Question 153Skipped
What is the name of the SnowSQL file that can store connection information?
snowsql.pubkey
history
Correct answer
config
snowsql.cnf
Overall explanation
SnowSQL supports multiple configuration files that allow organizations to define base values
for connection parameters, default settings, and variables while allowing individual users to
customize their personal settings in their own <HOME_DIR>/.snowsql/config files.
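A minimal sketch of a named connection in that file (all values are placeholders):

```ini
; <HOME_DIR>/.snowsql/config
[connections.my_conn]
accountname = myorg-myaccount
username = jdoe
dbname = analytics
warehousename = compute_wh
```

It could then be used with `snowsql -c my_conn`.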
Question 154Skipped
What general guideline does Snowflake recommend when setting the auto-suspension time
limit?
Correct answer
Overall explanation
Question 155Skipped
What role has the privileges to create and manage data shares by default?
Correct answer
ACCOUNTADMIN
USERADMIN
SYSADMIN
SECURITYADMIN
Overall explanation
By default, the privileges required to create and manage shares are granted only to
the ACCOUNTADMIN role, ensuring that only account administrators can perform these
tasks.
SET 5
Question 1Skipped
What feature can be used to reorganize a very large table on one or more columns?
Correct answer
Clustering keys
Micro-partitions
Key partitions
Clustered partitions
Overall explanation
Question 2Skipped
When a Snowflake user loads CSV data from a stage, which COPY INTO [table] command
guideline should they follow?
The data file must have the same number of columns as the target table.
Correct answer
Overall explanation
Question 3Skipped
Account and user authentication is only available with the Snowflake Business Critical
edition.
Correct answer
Periodic rekeying of encrypted data is available with the Snowflake Enterprise edition and
higher
Support for HIPAA and GDPR compliance is available for UI Snowflake editions.
Overall explanation
Question 4Skipped
Which feature allows a user the ability to control the organization of data in a micro-
partition?
Correct answer
Automatic Clustering
Horizontal Partitioning
Range Partitioning
Overall explanation
Search Optimization Service has nothing to do with the organization of data in micro-
partitions. Its clustering only which organizes data in micro-partitions
Question 5Skipped
Which of the following are handled by the cloud services layer of the Snowflake
architecture? (Choose two.)
Query execution
Data loading
Correct selection
Security
Correct selection
Overall explanation
The cloud services layer is a collection of services that coordinate activities across Snowflake.
These services tie together all of the different components of Snowflake in order to process
user requests, from login to query dispatch. The cloud services layer also runs on compute
instances provisioned by Snowflake from the cloud provider. Services managed in this layer
include:
• Authentication
• Infrastructure management
• Metadata management
• Access control
Question 6Skipped
64 days
Correct answer
14 days
1 day
7 days
Overall explanation
COPY INTO - Stored in the metadata of the target table for 64 days
Question 7Skipped
SELECT
READ
USAGE
Correct answer
OPERATE
Overall explanation
ALTER PIPE
Modifies a limited set of properties for an existing pipe object. Also supports the following
operations:
• Refreshing a pipe (i.e. copying the specified staged data files to the Snowpipe ingest
queue for loading into the target table).
A non-owner role with the OPERATE privilege on the pipe can pause or resume a pipe (using
ALTER PIPE … SET PIPE_EXECUTION_PAUSED = TRUE | FALSE).
SQL operations on schema objects also require the USAGE privilege on the database and
schema that contain the object.
Question 8Skipped
What is the default character set used when loading CSV files into Snowflake?
ISO 8859-1
ANSI_X3.4
UTF-16
Correct answer
UTF-8
Overall explanation
For delimited files (CSV, TSV, etc.), the default character set is UTF-8.
Question 9Skipped
Which of the following is a valid source for an external stage when the Snowflake account is
located on Microsoft Azure?
Correct answer
Overall explanation
Loading data from any of the following cloud storage services is supported regardless of the
cloud platform that hosts your Snowflake account:
• Amazon S3
• Microsoft Azure
Question 10Skipped
A Snowflake user wants to share unstructured data through the use of secure views.
Correct selection
Pre-signed URL
Correct selection
Scoped URL
HTTPS URL
File URL
Overall explanation
Question 11Skipped
While attempting to avoid data duplication, which COPY INTO option should be used to load
files with expired load metadata?
Correct answer
LOAD_UNCERTAIN_FILES
VALIDATION_MODE
LAST_MODIFIED
FORCE
Overall explanation
To load files whose metadata has expired, set the LOAD_UNCERTAIN_FILES copy option to
true. The copy option references load metadata, if available, to avoid data duplication, but
also attempts to load files with expired load metadata.
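A minimal sketch (table and stage names are hypothetical):

```sql
-- Load files whose 64-day load metadata has expired, while still
-- skipping files known to have been loaded already.
COPY INTO my_table
  FROM @my_stage
  LOAD_UNCERTAIN_FILES = TRUE;
```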
Question 12Skipped
Correct selection
Correct selection
Overall explanation
Question 13Skipped
What is an advantage of using an explain plan instead of the query profiler to evaluate the
performance of a query?
An explain plan will handle queries with temporary tables and the query profiler will not.
Correct answer
An explain plan can be used to conduct performance analysis without executing a query.
An explain plan's output will display automatic data skew optimization information.
Overall explanation
EXPLAIN compiles the SQL statement, but does not execute it, so EXPLAIN does not require a
running warehouse.
Although EXPLAIN does not consume any compute credits, the compilation of the query
does consume Cloud Service credits, just as other metadata operations do.
Question 14Skipped
A data provider wants to share data with a consumer who does not have a Snowflake
account. The provider creates a reader account for the consumer following these steps:
The reader account will automatically use the Standard edition of Snowflake.
Correct answer
The reader account can create a copy of the shared data using CREATE TABLE AS...
The reader account can clone data the provider has shared, but cannot re-share it.
Overall explanation
The user does not have a Snowflake account, so the reader account's compute will be billed to the provider account.
Question 15Skipped
Which of the following can be used when unloading data from Snowflake? (Choose two.)
Correct selection
By using the SINGLE = TRUE parameter, a single file up to 5 GB in size can be exported to
the storage layer.
Use the PARSE_JSON function to ensure structured data will be unloaded into the
VARIANT data type.
Correct selection
Use the ENCODING file format option to change the encoding from the default UTF-8.
Overall explanation
Question 16Skipped
A stored procedure
Correct answer
Overall explanation
Question 17Skipped
Correct answer
Overall explanation
In this question it is important the detail of "assuming that no query has been executed
before". This means that options that could take advantage of the cache and not require
computation are wrong.
The correct option is the SELECT MIN, because Snowflake stores the MIN and MAX values of each column of each micro-partition in its cloud services layer, as well as the number of distinct values. I quote from the documentation:
Additional properties used for both optimization and efficient query processing.
Question 18Skipped
Correct answer
A multi-cluster shared data architecture using a central data repository and massively
parallel processing (MPP)
A single-cluster shared nothing architecture using a siloed data repository and symmetric
multiprocessing (SMP)
A single-cluster shared data architecture using a central data repository and massively
parallel processing (MPP)
A multi-cluster shared nothing architecture using a siloed data repository and symmetric
multiprocessing (SMP)
Overall explanation
A simple and straightforward question, but one that always appears in the exam.
Question 19Skipped
What features that are part of the Continuous Data Protection (CDP) feature set in
Snowflake do not require additional configuration? (Choose two.)
External tokenization
Correct selection
Time Travel
Correct selection
Data encryption
Overall explanation
Question 20Skipped
What are advantages clones have over tables created with CREATE TABLE AS SELECT
statement? (Choose two.)
The clone will have time travel history from the original table.
Correct selection
Correct selection
Overall explanation
Cloning is fast, but not instantaneous, particularly for large objects (e.g. tables), so ALMOST is correct.
A clone is not a physical copy of the actual data; instead, it points to the original micro-partitions, and new storage is consumed only when the cloned data is modified.
Question 21Skipped
Correct answer
A named Snowflake object that includes all the information required to share a database
Overall explanation
Question 22Skipped
Which type of join will list all rows in the specified table, even if those rows have no match in
the other table?
Correct answer
Outer join
Cross join
Natural join
Inner join
Overall explanation
Question 23Skipped
Which of the following are valid methods for authenticating users for access into Snowflake?
(Choose three.)
SCIM
Correct selection
Key-pair authentication
OCSP authentication
TLS 1.2
Correct selection
Federated authentication
Correct selection
OAuth
Overall explanation
Question 24Skipped
Correct answer
Overall explanation
If a base table is altered so that existing columns are changed or dropped, then all
materialized views on that base table are suspended.
Question 25Skipped
MONITOR
OPERATE
Correct selection
OWNERSHIP
Correct selection
USAGE
MODIFY
Overall explanation
Similar to other database objects (tables, views, UDFs, etc.), stored procedures are owned by
a role and have one or more privileges that can be granted to other roles.
• USAGE
• OWNERSHIP
Question 26Skipped
What is the default file size when unloading data from Snowflake using the COPY command?
Correct answer
16 MB
32 MB
8 GB
4 MB
Overall explanation
By default, COPY INTO location statements separate table data into a set of output files to
take advantage of parallel operations. The maximum size for each file is set using the
MAX_FILE_SIZE copy option. The default value is 16777216 (16 MB) but can be increased to
accommodate larger files.
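For example, raising the per-file limit when unloading (object names are hypothetical):

```sql
-- Unload table data into files of up to 100 MB each.
COPY INTO @my_stage/unload/
  FROM my_table
  MAX_FILE_SIZE = 104857600;  -- 100 MB, up from the 16 MB default
```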
Question 27Skipped
Which of the following Snowflake objects can be shared using a secure share? (Choose two.)
Sequences
Correct selection
Procedures
Materialized views
Correct selection
Tables
Overall explanation
• Tables
• External tables
• Secure views
• Secure UDFs
Question 28Skipped
While clustering a table, columns with which data types can be used as clustering keys?
(Choose two.)
GEOGRAPHY
VARIANT
OBJECT
Correct selection
BINARY
Correct selection
GEOMETRY
Overall explanation
Question 29Skipped
Correct answer
Overall explanation
Question 30Skipped
A materialized view should be created when which of the following occurs? (Choose two.)
The query is highly optimized and does not consume many compute resources.
Correct selection
Correct selection
The results of the query do not change often and are used frequently.
Overall explanation
Question 31Skipped
Correct answer
Parquet
JSON
TSV
Avro
Overall explanation
Parquet is a compressed, efficient columnar data representation designed for projects in the
Hadoop ecosystem.
Question 32Skipped
Which SQL command, when committed, will consume a stream and advance the stream
offset?
Correct answer
BEGIN COMMIT
Overall explanation
The stream position (i.e. offset) is advanced when the stream is used in a DML statement.
The position is updated at the end of the transaction to the beginning timestamp of the
transaction. The stream describes change records starting from the current position of the
stream and ending at the current transactional timestamp.
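For instance (object names are hypothetical), inserting the stream's change records into a table advances the offset once the transaction commits:

```sql
-- Consuming a stream in a DML statement; the offset advances at COMMIT.
BEGIN;
INSERT INTO target_table
  SELECT id, name
  FROM my_stream
  WHERE METADATA$ACTION = 'INSERT';
COMMIT;
```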
Question 33Skipped
Correct answer
To profile a particular query to understand the mechanics of the query, its behavior, and
performance.
To profile the user and/or executing role of a query and all privileges and policies applied
on the objects within the query.
To profile which queries are running in each warehouse and identify proper warehouse
utilization and sizing for better performance and cost balancing.
To profile how many times a particular query was executed and analyze its usage statistics
over time.
Overall explanation
Query Profile is a powerful tool for understanding the mechanics of queries. It can be used
whenever you want or need to know more about the performance or behavior of a
particular query.
Question 34Skipped
What are the correct parameters for time travel and fail-safe in the Snowflake Enterprise
Edition?
Default Time Travel Retention is set to 0 days. Maximum Time Travel Retention is 90 days.
Fail Safe retention time is 7 days.
Correct answer
Default Time Travel Retention is set to 1 day. Maximum Time Travel Retention is 90 days.
Fail Safe retention time is 7 days.
Default Time Travel Retention is set to 1 day. Maximum Time Travel Retention is 365 days.
Fail Safe retention time is 7 days.
Default Time Travel Retention is set to 7 days. Maximum Time Travel Retention is 1 day.
Fail Safe retention time is 90 days.
Default Time Travel Retention is set to 0 days. Maximum Time Travel Retention is 30 days.
Fail Safe retention time is 1 day.
Default Time Travel Retention is set to 90 days. Maximum Time Travel Retention is 7 days.
Fail Safe retention time is 356 days.
Overall explanation
Question 35Skipped
ALTER WAREHOUSE
Correct answer
CREATE SHARE
DROP ROLE
DESCRIBE TABLE
SHOW SCHEMAS
Overall explanation
In Snowflake, a reader account is a special type of account that has read-only access to the data shared with it. This means that reader accounts can only perform actions related to querying data, such as running SELECT statements and viewing metadata.
As a result, reader accounts cannot perform actions that modify the data or metadata stored in Snowflake, such as creating new objects, modifying existing objects, or dropping objects. This includes the CREATE SHARE command, which is used to create a new share and make it available to other users.
Question 36Skipped
Correct answer
Overall explanation
When staging regular data sets, we recommend partitioning the data into logical paths that
include identifying details such as geographical location or other source identifiers, along
with the date when the data was written.
Organizing your data files by path lets you copy any fraction of the partitioned data into
Snowflake with a single command. This allows you to execute concurrent COPY statements
that match a subset of files, taking advantage of parallel operations.
Question 37Skipped
Which URL type allows users to access unstructured data without authenticating into
Snowflake or passing an authorization token?
File URL
Scoped URL
Correct answer
Pre-signed URL
Signed URL
Overall explanation
Pre-signed URLs are used to download or access files, via a web browser for example,
without authenticating into Snowflake or passing an authorization token. These URLs are
ideal for business intelligence applications or reporting tools that need to display the
unstructured file contents.
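A minimal sketch using the built-in GET_PRESIGNED_URL function (stage and file path are hypothetical; the third argument is the expiration time in seconds):

```sql
-- Generate a pre-signed URL for a staged file, valid for 1 hour.
SELECT GET_PRESIGNED_URL(@my_stage, 'docs/report.pdf', 3600);
```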
Question 38Skipped
Correct answer
Restoring data-related objects that have been deleted within the past 90 days
Querying data-related objects that were created within the past 365 days
Overall explanation
Snowflake Time Travel enables accessing historical data (i.e. data that has been changed or
deleted) at any point within a defined period. It serves as a powerful tool for performing the
following tasks:
• Restoring data-related objects (tables, schemas, and databases) that might have
been accidentally or intentionally deleted.
Question 39Skipped
If a Snowflake user decides a table should be clustered, what should be used as the cluster
key?
Correct answer
Overall explanation
Cluster columns that are most actively used in selective filters. For many fact tables involved in date-based queries (for example "WHERE invoice_date > x AND invoice_date <= y"),
choosing the date column is a good idea. For event tables, event type might be a good
choice, if there are a large number of different event types. (If your table has only a small
number of different event types, then see the comments on cardinality below before
choosing an event column as a clustering key.)
If there is room for additional cluster keys, then consider columns frequently used in join
predicates, for example “FROM table1 JOIN table2 ON table2.column_A = table1.column_B”.
Question 40Skipped
How often are the Account and Table master keys automatically rotated by Snowflake?
Correct answer
30 Days
60 Days
365 Days
90 Days
Overall explanation
All Snowflake-managed keys are automatically rotated by Snowflake when they are more
than 30 days old.
Question 41Skipped
Which of the following are considerations when using a directory table when working with
unstructured data? (Choose two.)
Correct selection
Correct selection
Overall explanation
Question 42Skipped
Correct answer
External stage
Table stage
User stage
Database stage
Overall explanation
We cannot drop the stage associated with a table or user; only named stages (internal or
external) can be dropped.
Question 43Skipped
Correct answer
Database replication
Overall explanation
Snowflake charges a per-byte fee for data egress when users transfer data from a Snowflake
account into a different region on the same cloud platform or into a completely different
cloud platform. Data transfers within the same region are free.
Question 44Skipped
Serverless features are not billed, unless the total cost for the month exceeds 10% of the
warehouse credits, on the account
Correct answer
Per minute multiplied by an automatic sizing for the job, with a minimum of one minute
Overall explanation
Charges for serverless features are calculated based on total usage of snowflake-managed
compute resources measured in compute-hours. Compute-Hours are calculated on a per
second basis, rounded up to the nearest whole second. The number of credits consumed per
compute hour varies depending on the serverless feature. To learn how many credits are
consumed by a serverless feature, refer to the “Serverless Feature Credit Table” in the
Snowflake service consumption table.
Question 45Skipped
Which categories are included in the execution time summary in a Query Profile? (Choose
two.)
Correct selection
Initialization
Correct selection
Pruning
Spilling
Overall explanation
Execution Time
Execution time provides information about “where the time was spent” during the
processing of a query. Time spent can be broken down into the following categories,
displayed in the following order:
• Local Disk IO — time when the processing was blocked by local disk access.
• Remote Disk IO — time when the processing was blocked by remote disk access.
• Network Communication — time when the processing was waiting for the network
data transfer.
Question 46Skipped
A size 3X-Large multi-cluster warehouse runs one cluster for one full hour and then runs two
clusters for the next full hour.
Correct answer
192
128
64
149
Overall explanation
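As a worked calculation (a 3X-Large warehouse consumes 64 credits per cluster per hour):

```sql
-- Hour 1: one cluster running; hour 2: two clusters running.
SELECT (1 * 64) + (2 * 64) AS total_credits;  -- 192
```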
Question 47Skipped
A company needs to read multiple terabytes of data for an initial load as part of a Snowflake
migration. The company can control the number and size of CSV extract files.
Produce the largest files possible, reducing the overall number of files to process.
Correct answer
Produce a larger number of smaller files and process the ingestion with size Small virtual
warehouses.
Use an external tool to issue batched row-by-row inserts within BEGIN TRANSACTION and
COMMIT commands.
Overall explanation
The key is that the question mentions that we have to read 'multiple terabytes of data'. As a
good practice, Snowflake recommends that the files to be ingested should be between 100
and 250 MB compressed, so the best option is to split the source data into smaller files.
Question 48Skipped
When unloading data to an external stage, what is the MAXIMUM file size supported?
10 GB
Correct answer
5 GB
16 GB
1 GB
Overall explanation
By default, COPY INTO location statements separate table data into a set of output files to
take advantage of parallel operations. The maximum size for each file is set using the
MAX_FILE_SIZE copy option. The default value is 16777216 (16 MB) but can be increased to
accommodate larger files. The maximum file size supported is 5 GB for Amazon S3, Google
Cloud Storage, or Microsoft Azure stages.
Question 49Skipped
Which Snowflake function will interpret an input string as a JSON document, and produce a
VARIANT value?
json_extract_path_text()
flatten
object_construct()
Correct answer
parse_json()
Overall explanation
Question 50Skipped
Which snowflake objects will incur both storage and cloud compute charges? (Choose two.)
Sequence
Correct selection
Clustered table
Transient table
Correct selection
Materialized view
Secure view
Overall explanation
Clustered table needs to undergo clustering as the data changes and Materialized view also
undergoes changes every time the underlying data changes or when the view is set to
refresh.
Question 51Skipped
What information is found within the Statistic output in the Query Profile Overview?
Operator tree
Correct answer
Table pruning
Overall explanation
Question 52Skipped
A view
A file in a stage
An individual column
Correct answer
An individual row
Overall explanation
Question 53Skipped
Which activities are included in the Cloud Services layer? (Choose two.)
Correct selection
Infrastructure management
Correct selection
User authentication
Partition scanning
Data storage
Overall explanation
The Cloud Services layer in Snowflake is responsible for critical data-related activities.
Services managed in this layer include:
• Authentication
• Infrastructure management
• Metadata management
• Access control
Question 54Skipped
Correct answer
Overall explanation
Search optimization can improve the performance of queries using these kinds of predicates:
• Geospatial queries.
The search optimization service does not directly improve the performance of joins.
However, it can improve the performance of filtering rows from either table prior to the join.
Question 55Skipped
Which privilege must be granted to a share to allow secure views the ability to reference
data in multiple databases?
Correct answer
REFERENCE_USAGE on databases
Overall explanation
Question 56Skipped
How does the search optimization service help Snowflake users improve query
performance?
Correct answer
It maintains a persistent data structure that keeps track of the values of the table’s
columns in each of its micro-partitions.
It scans the micro-partitions based on the joins used in the queries and scans only join
columns.
It keeps track of running queries and their results and saves those extra scans on the table.
It scans the local disk cache to avoid scans on the tables used in the query.
Overall explanation
To improve performance of search queries, the search optimization service creates and
maintains a persistent data structure called a search access path. The search access path
keeps track of which values of the table’s columns might be found in each of its micro-
partitions, allowing some micro-partitions to be skipped when scanning the table.
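A hedged sketch of how the service is enabled (table, column, and value are illustrative):

```sql
-- Enable search optimization for the table; Snowflake then builds
-- and maintains the search access path in the background.
ALTER TABLE sales_log ADD SEARCH OPTIMIZATION;

-- Point-lookup queries like this one can then skip micro-partitions:
SELECT * FROM sales_log WHERE order_id = 123456;
```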
Question 57Skipped
2. warehouse_size = MEDIUM
3. min_cluster_count = 1
4. max_cluster_count = 1
5. auto_suspend = 60
6. auto_resume = true;
The image below is a graphical representation of the warehouse utilization across two days.
Correct answer
Overall explanation
Question 58Skipped
Which methods can be used to delete staged files from a Snowflake stage? (Choose two.)
Correct selection
Correct selection
Use the DELETE LOAD HISTORY command after the load completes.
Overall explanation
Question 59Skipped
Correct answer
Compiling an account's average cloud services cost over the previous month
Overall explanation
The METERING_HISTORY view in the ACCOUNT_USAGE schema can be used to return the
hourly credit usage for an account within the last 365 days (1 year).
Question 60Skipped
User INQUISITIVE_PERSON has been granted the role DATA_SCIENCE. The role
DATA_SCIENCE has privileges OWNERSHIP on the schema MARKETING of the database
ANALYTICS_DW.
Correct answer
Overall explanation
Question 61Skipped
In which Snowsight section can a user switch roles, modify their profile, and access
documentation?
Correct answer
Overall explanation
You can expect some questions about the operation of Snowsight in the exam. It is advisable
to navigate through the various sections to familiarize yourself with the interface.
Question 62Skipped
Query parsing and compilation occurs in which architecture layer of the Snowflake Cloud
Data Platform?
Storage layer
Correct answer
Compute layer
Overall explanation
Question 63Skipped
Correct answer
Overall explanation
Valid values
• Setting a value less than 60 is allowed, but may not result in the desired/expected
behavior because the background process that suspends a warehouse runs
approximately every 60 seconds and, therefore, is not intended for enabling exact
control over warehouse suspension.
Question 64Skipped
Standard
Enterprise
Business Critical
Correct answer
Overall explanation
A dedicated metadata store and pool of compute resources (used in virtual warehouses) are offered through Virtual Private Snowflake (VPS).
Question 65Skipped
Which of the following features are available with the Snowflake Enterprise edition? (Choose
two.)
Correct selection
Correct selection
Overall explanation
Question 66Skipped
connecting to Snowflake.
Which application will the Snowflake users need to install on their devices in order to
connect with MFA?
Okta Verify
Correct answer
Duo Mobile
Google Authenticator
Microsoft Authenticator
Overall explanation
Snowflake integrates with the Duo Mobile app to provide multi-factor authentication (MFA)
for users connecting to Snowflake. This means that in order to use MFA, Snowflake users
need to have the Duo Mobile app installed on their devices and enroll in Snowflake's MFA
service.
Okta Verify, Microsoft Authenticator, and Google Authenticator are alternative MFA apps,
but they are not directly integrated with Snowflake and may not be supported for use with
Snowflake's MFA service.
Question 67Skipped
Credit charges for Snowflake virtual warehouses are calculated based on which of the
following considerations? (Choose two.)
Correct selection
Correct selection
Overall explanation
Snowflake credits are charged based on the number of virtual warehouses you use, how
long they run, and their size.
Question 68Skipped
For how many days will the data be retained at the object level?
Correct answer
Overall explanation
Question 69Skipped
What does the “percentage scanned from cache” represent in the Query Profile?
Correct answer
Overall explanation
Question 70Skipped
Which commands can a Snowflake user execute to specify a cluster key for a table? (Choose
two.)
SHOW
Correct selection
ALTER
UPDATE
Correct selection
CREATE
SET
Overall explanation
A clustering key can be defined at table creation (using the CREATE TABLE command) or afterward (using the ALTER TABLE command).
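Both commands can be sketched as follows (table and column names are illustrative):

```sql
-- Define the clustering key at creation time ...
CREATE TABLE sales (sale_date DATE, region STRING, amount NUMBER)
  CLUSTER BY (sale_date);

-- ... or add/change it later with ALTER TABLE.
ALTER TABLE sales CLUSTER BY (sale_date, region);
```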
Question 71Skipped
Correct selection
Correct selection
Overall explanation
Question 72Skipped
What is the recommended compressed file size range for continuous data loads using
Snowpipe?
8-16 MB
Correct answer
100-250 MB
16-24 MB
10-99 MB
Overall explanation
Snowpipe is typically used to load data that is arriving continuously. File sizing plays an important role in Snowpipe's performance. The recommended file size for data loading is 100-250 MB compressed; however, if data is arriving continuously, try to stage the data within one-minute intervals.
Question 73Skipped
Which query profile statistics help determine if efficient pruning is occurring? (Choose two.)
Correct selection
Partitions total
Correct selection
Partitions scanned
Overall explanation
Question 74Skipped
Credits are consumed based on the number of credits billed for each hour that a
warehouse runs.
Correct answer
Credits are consumed based on the warehouse size and the time the warehouse is
running.
Overall explanation
A virtual warehouse is one or more clusters of compute resources that enable executing
queries, loading data, and performing other DML operations.
Snowflake credits are used to pay for the processing time used by each virtual warehouse.
Snowflake credits are charged based on the number of virtual warehouses you use, how
long they run, and their size.
Question 75Skipped
What file formats does Snowflake support for loading semi-structured data? (Choose three.)
TSV
Correct selection
Avro
JPEG
Correct selection
JSON
Correct selection
Parquet
Overall explanation
Snowflake can import semi-structured data from JSON, Avro, ORC, Parquet, and XML formats
and store it in Snowflake data types designed specifically to support semi-structured data.
Question 76Skipped
What is the minimum Snowflake edition needed for database failover and fail-back between
Snowflake accounts for business continuity and disaster recovery?
Correct answer
Business Critical
Enterprise
Standard
Overall explanation
Question 77Skipped
Which ACCOUNT_USAGE views are used to evaluate the details of dynamic data masking?
(Choose two.)
ROLES
ACCESS_HISTORY
QUERY_HISTORY
RESOURCE_MONITORS
Correct selection
POLICY_REFERENCES
Correct selection
MASKING_POLICIES
Overall explanation
Snowflake provides two Account Usage views to obtain information about masking policies:
1. The MASKING POLICIES view provides a list of all masking policies in your Snowflake
account.
2. The POLICY_REFERENCES view provides a list of all objects in which a masking policy is set.
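The two views can be queried as follows (a sketch; the POLICY_KIND filter assumes the documented column name):

```sql
-- List all masking policies in the account:
SELECT * FROM SNOWFLAKE.ACCOUNT_USAGE.MASKING_POLICIES;

-- List every object on which a masking policy is set:
SELECT * FROM SNOWFLAKE.ACCOUNT_USAGE.POLICY_REFERENCES
WHERE policy_kind = 'MASKING_POLICY';
```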
Question 78Skipped
Regular view
Regular table
Correct answer
Materialized view
Overall explanation
Question 79Skipped
get @%mytable;
Correct answer
list @my_stage;
get @my_stage;
list @~;
Overall explanation
LIST
Returns a list of files that have been staged (i.e. uploaded from a local file system or unloaded from a table) in one of the following Snowflake stages: named internal or external stages, table stages, and user stages.
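The three stage references can be sketched as (the named stage and table names are illustrative):

```sql
LIST @my_stage;   -- named stage
LIST @%mytable;   -- table stage (note the % prefix)
LIST @~;          -- current user's stage
```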
Question 80Skipped
Which SQL command can be used to see the CREATE definition of a masking policy?
Correct answer
GET_DDL
Overall explanation
DESCRIBE will show the SQL behind the policy, but not in the form of a CREATE statement.
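A minimal sketch (the policy name is illustrative):

```sql
-- Returns the full CREATE MASKING POLICY statement for the policy:
SELECT GET_DDL('POLICY', 'mydb.myschema.ssn_mask');
```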
Question 81Skipped
Correct answer
Overall explanation
Multi-cluster warehouses are best utilized for scaling resources to improve concurrency for
users/queries.
Question 82Skipped
What is the recommended way to change the existing file format type in my_format from
CSV to JSON?
Correct answer
Overall explanation
- Changing the type (CSV, JSON, etc.) for the file format.
To make any of these changes, you must recreate the file format.
Question 83Skipped
Which key governance feature in Snowflake allows users to identify automatically data
objects that contain sensitive data and their related objects?
Column-level security
Object tagging
Correct answer
Data classification
Overall explanation
Data classification in Snowflake is a feature that allows users to automatically identify and
classify columns in their tables containing personal or sensitive data.
Data classification is a multi-step process that associates Snowflake-defined tags (i.e. system
tags) to columns by analyzing the cells and metadata for personal data.
Based on the tracking information and related audit processes, the data engineer can
protect the column containing personal or sensitive data with a masking policy or the table
containing this column with a row access policy.
Question 84Skipped
Which of the following activities consume virtual warehouse credits in the Snowflake
environment? (Choose two.)
Correct selection
Cloning a database
Correct selection
Overall explanation
A warehouse provides the required resources, such as CPU, memory, and temporary storage,
to perform the following operations in a Snowflake session:
• Executing SQL SELECT statements that require compute resources (e.g. retrieving
rows from tables and views).
• Performing DML operations, such as: Updating rows in tables (DELETE , INSERT ,
UPDATE).
Note - To perform these operations, a warehouse must be running and in use for the
session. While a warehouse is running, it consumes Snowflake credits.
Question 85Skipped
Correct answer
To allow users the ability to choose the type of compute nodes that make up a virtual
warehouse cluster
Overall explanation
Question 86Skipped
Which function generates a Snowflake hosted file URL to a staged file using the stage name
and relative file path as inputs?
BUILD_SCOPED_FILE_URL
Correct answer
BUILD_STAGE_FILE_URL
GET_STAGE_LOCATION
GET_PRESIGNED_URL
Overall explanation
BUILD_STAGE_FILE_URL
Generates a Snowflake-hosted file URL to a staged file using the stage name and relative file
path as inputs. A file URL permits prolonged access to a specified file. That is, the file URL
does not expire.
This question can be tricky because there are very similar functions with small details that
differentiate them.
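A sketch of the call (stage name and relative path are illustrative):

```sql
-- Generates a permanent, Snowflake-hosted file URL for the staged file:
SELECT BUILD_STAGE_FILE_URL(@images_stage, '/us/yosemite/half_dome.jpg');
```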
Question 87Skipped
Which Snowflake data governance feature can support auditing when a user query reads
column data?
Object dependencies
Correct answer
Access History
Column-level security
Data classification
Overall explanation
Access History in Snowflake refers to when the user query reads data and when the SQL
statement performs a data write operation, such as INSERT, UPDATE, and DELETE along with
variations of the COPY command, from the source data object to the target data object. The
user access history can be found by querying the Account Usage ACCESS_HISTORY view.
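A hedged sketch of auditing reads with this view (the 7-day window is illustrative):

```sql
-- Which queries read or wrote data in the last week, and what they touched:
SELECT query_id, user_name, direct_objects_accessed, objects_modified
FROM SNOWFLAKE.ACCOUNT_USAGE.ACCESS_HISTORY
WHERE query_start_time > DATEADD(day, -7, CURRENT_TIMESTAMP());
```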
Question 88Skipped
How does Snowflake handle the data retention period for a table if a stream has not been
consumed?
Correct answer
Overall explanation
If the data retention period for a table is less than 14 days and a stream has not been consumed, Snowflake temporarily extends this period to prevent the stream from going stale. The period is extended to the stream's offset, up to a maximum of 14 days by default, regardless of the Snowflake edition for the account.
Question 89Skipped
64 days
Correct answer
14 days
Overall explanation
Load history is stored in the metadata of the pipe for 14 days. It must be requested from Snowflake via a REST endpoint, SQL table function, or ACCOUNT_USAGE view.
Question 90Skipped
A Snowflake Administrator needs to ensure that sensitive corporate data in Snowflake tables
is not visible to end users, but is partially visible to functional managers.
Correct answer
Overall explanation
Masking policy administrators can implement a masking policy such that analysts (i.e. users
with the custom ANALYST role) can only view the last four digits of a phone number and
none of the social security number, while customer support representatives (i.e. users with
the custom SUPPORT role) can view the entire phone number and social security number for
customer verification use cases.
Question 91Skipped
What is the PRIMARY factor that determines the cost of using a virtual warehouse in
Snowflake?
Correct answer
Overall explanation
Question 92Skipped
What type of query will benefit from the query acceleration service?
Correct answer
Overall explanation
Examples of the types of workloads that might benefit from the query acceleration service
include:
• Ad hoc analytics.
Question 93Skipped
Which encryption type will enable client-side encryption for a directory table?
AES
Correct answer
SNOWFLAKE_FULL
AWS_CSE
SNOWFLAKE_SSE
Overall explanation
SNOWFLAKE_FULL: Client-side and server-side encryption. The files are encrypted by a client
when it uploads them to the internal stage using PUT.
Question 94Skipped
Correct answer
It does not return from the command until the warehouse has finished changing its size.
The warehouse size does not change until the warehouse is suspended and restarted.
The warehouse size does not change until all queries currently in the warehouse queue
have completed.
The warehouse size does not change until all queries currently running in the warehouse
have completed.
Overall explanation
WAIT_FOR_COMPLETION = FALSE | TRUE When resizing a warehouse, you can use this
parameter to block the return of the ALTER WAREHOUSE command until the resize has
finished provisioning all its compute resources. Blocking the return of the command when
resizing to a larger warehouse serves to notify you that your compute resources have been
fully provisioned and the warehouse is now ready to execute queries using all the new
resources.
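The parameter can be sketched as (warehouse name and target size are illustrative):

```sql
-- Blocks until the resize has finished provisioning all compute resources:
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = XLARGE WAIT_FOR_COMPLETION = TRUE;
```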
Question 95Skipped
Correct answer
Set the minimum Clusters and maximum Clusters settings to the same value.
Set the minimum clusters and maximum clusters settings to different values.
Overall explanation
If the minimum and maximum cluster counts are set to the same value, the warehouse always runs with that fixed number of clusters, which is exactly the behavior of maximized mode. Hence both settings should use the same value.
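A sketch of configuring maximized mode (warehouse name and cluster count are illustrative):

```sql
-- Maximized mode: min and max cluster counts are equal (and greater than 1).
ALTER WAREHOUSE my_wh SET MIN_CLUSTER_COUNT = 3 MAX_CLUSTER_COUNT = 3;
```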
Question 96Skipped
Correct selection
Named stage
Schema stage
Correct selection
Table stage
Correct selection
User stage
Stream stage
Database stage
Overall explanation
• User
• Table
• Named
Question 97Skipped
What happens when a Snowflake user changes the data retention period at the schema
level?
All child objects with an explicit retention period will be overridden with the new
retention period.
Correct answer
All child objects that do not have an explicit retention period will automatically inherit the
new retention period.
All child objects will retain data for the new retention period.
Overall explanation
If you change the data retention period for a database or schema, the change only affects
active objects contained within the database or schema. Any objects that have been
dropped (for example, tables) remain unaffected.
For example, if you have a schema s1 with a 90-day retention period and table t1 is in
schema s1, table t1 inherits the 90-day retention period. If you drop table s1.t1, t1 is
retained in Time Travel for 90 days. Later, if you change the schema’s data retention period
to 1 day, the retention period for the dropped table t1 is unchanged. Table t1 will still be
retained in Time Travel for 90 days.
Question 98Skipped
A user has an application that writes a new file to a cloud storage location every 5 minutes.
What would be the MOST efficient way to get the files into Snowflake?
Create a task that runs a COPY INTO operation from an external stage every 5 minutes.
Create a task that runs a GET operation to intermittently check for new files.
Correct answer
Set up cloud provider notifications on the file location and use Snowpipe with auto-ingest.
Create a task that PUTS the files in an internal stage and automate the data loading
wizard.
Overall explanation
Pipes are highly scalable and cost-effective since they only incur charges when data is
ingested, unlike other options like copying data at regular intervals or using external tables.
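A hedged sketch of the auto-ingest setup (pipe, table, and stage names are illustrative; the cloud-side event notification must also be configured):

```sql
-- Pipe with auto-ingest; cloud storage event notifications trigger loads:
CREATE PIPE my_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_events
  FROM @my_ext_stage
  FILE_FORMAT = (TYPE = 'JSON');
```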
Question 99Skipped
Correct answer
Overall explanation
The more the value ranges of micro-partitions overlap, the greater the overlap depth. An overlap depth of 1 means there is no overlap.
Question 100Skipped
What should be used when creating a CSV file format where the columns are wrapped by
single quotes or double quotes?
SKIP_BYTE_ORDER_MARK
ESCAPE_UNENCLOSED_FIELD
Correct answer
FIELD_OPTIONALLY_ENCLOSED_BY
BINARY_FORMAT
Overall explanation
Character used to enclose strings. Value can be NONE, single quote character ('), or double
quote character ("). To use the single quote character, use the octal or hex representation
(0x27) or the double single-quoted escape ('').
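A minimal sketch of a file format using this option (the format name is illustrative):

```sql
CREATE FILE FORMAT my_csv
  TYPE = 'CSV'
  FIELD_OPTIONALLY_ENCLOSED_BY = '"';  -- or '0x27' for single quotes
```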
Question 101Skipped
When referring to User-Defined Function (UDF) names in Snowflake, what does the term
overloading mean?
There are multiple SQL UDFs with the same names and the same number of arguments.
Correct answer
There are multiple SQL UDFs with the same names but with a different number of
arguments or argument types.
There are multiple SQL UDFs with different names but the same number of arguments or
argument types.
There are multiple SQL UDFs with the same names and the same number of argument
types.
Overall explanation
Snowflake supports overloading procedures and functions. In a given schema, you can
define multiple procedures or functions that have the same name but different signatures.
The signatures must differ by the number of arguments, the types of the arguments, or
both.
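Overloading can be sketched with two SQL UDFs (names and bodies are illustrative):

```sql
-- Same name, different signatures: both definitions coexist.
CREATE FUNCTION area(r FLOAT) RETURNS FLOAT AS 'PI() * r * r';
CREATE FUNCTION area(l FLOAT, w FLOAT) RETURNS FLOAT AS 'l * w';
```

Snowflake resolves the call by the number and types of the arguments passed.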
Question 102Skipped
What happens to the underlying table data when a CLUSTER BY clause is added to a
Snowflake table?
Correct answer
Data may be colocated by the cluster key within the micro-partitions to improve pruning
performance
Larger micro-partitions are created for common data values to reduce the number of
partitions that must be scanned
Smaller micro-partitions are created for common data values to allow for more parallelism
Data is hashed by the cluster key to facilitate fast searches for common data values
Overall explanation
Question 103Skipped
ACCOUNTADMIN
Correct answer
ORGADMIN
SECURITYADMIN
SYSADMIN
Overall explanation
Usage notes: Only organization administrators (i.e. users with the ORGADMIN role) can call
this SQL function.
Question 104Skipped
Which statements reflect key functionalities of a Snowflake Data Exchange? (Choose two.)
Correct selection
A Data Exchange allows groups of accounts to share data privately among the accounts.
Data Exchange functionality is available by default in accounts using the Enterprise edition
or higher.
A Data Exchange allows accounts to share data with third, non-Snowflake parties.
If an account is enrolled with a Data Exchange, it will lose its access to the Snowflake
Marketplace.
Correct selection
The sharing of data in a Data Exchange is bidirectional. An account can be a provider for
some datasets and a consumer for others.
Overall explanation
Question 105Skipped
Which Snowflake layer is always used when accessing a query from the result cache?
Metadata
Data Storage
Compute
Correct answer
Cloud Services
Overall explanation
Question 106Skipped
What metadata does Snowflake store for rows in micro-partitions? (Choose two.)
Sorted values
Null values
Correct selection
Distinct values
Index values
Correct selection
Range of values
Overall explanation
Question 107Skipped
Which columns are part of the result set of the Snowflake LATERAL FLATTEN command?
(Choose two.)
BYTE_SIZE
CONTENT
DATATYPE
Correct selection
INDEX
Correct selection
PATH
Overall explanation
The output of FLATTEN includes the following columns:
• SEQ: A unique sequence number associated with the input record; the sequence is not guaranteed to be gap-free or ordered in any particular way.
• KEY: For maps or objects, this column contains the key to the exploded value.
• PATH: The path to the element within a data structure which needs to be flattened.
• INDEX: The index of the element, if it is an array; otherwise NULL.
• VALUE: The value of the element of the flattened array/object.
• THIS: The element being flattened (useful in recursive flattening).
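A quick sketch of FLATTEN in action (the JSON literal is illustrative):

```sql
SELECT f.seq, f.key, f.path, f.index, f.value
FROM TABLE(FLATTEN(input => PARSE_JSON('{"a": 1, "b": [10, 20]}'))) f;
```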
Question 108Skipped
They can be created as secure and hide the underlying metadata from all users.
Correct answer
Overall explanation
Question 109Skipped
Correct answer
Creating a clone of an entire table at a specific point in the past from a permanent table
Overall explanation
Using Time Travel, you can perform the following actions within a defined period of time:
• Query data in the past that has since been updated or deleted.
• Create clones of entire tables, schemas, and databases at or before specific points in
the past.
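Cloning at a point in the past can be sketched as (table names and the one-hour offset are illustrative):

```sql
-- Clone the table as it existed one hour (3600 seconds) ago:
CREATE TABLE restored_t CLONE my_table AT (OFFSET => -3600);
```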
Question 110Skipped
Which privilege is required for a role to be able to resume a suspended warehouse if auto-
resume is not enabled?
MONITOR
Correct answer
OPERATE
USAGE
MODIFY
Overall explanation
OPERATE: Enables changing the state of a warehouse (stop, start, suspend, resume). In
addition, enables viewing current and past queries executed on a warehouse and aborting
any executing queries.
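A minimal sketch (warehouse and role names are illustrative):

```sql
GRANT OPERATE ON WAREHOUSE my_wh TO ROLE analyst;

-- A user with the analyst role can now resume the suspended warehouse:
ALTER WAREHOUSE my_wh RESUME;
```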
Question 111Skipped
14 days
28 days
7 days
Correct answer
365 days
Overall explanation
Question 112Skipped
What Snowflake role must be granted for a user to create and manage accounts?
ACCOUNTADMIN
SECURITYADMIN
SYSADMIN
Correct answer
ORGADMIN
Overall explanation
An account can be created by an ORGADMIN through the web interface or using SQL
Question 113Skipped
Which of the following practices are recommended when creating a user in Snowflake?
(Choose two.)
Correct selection
Correct selection
Overall explanation
Question 114Skipped
Which command is used to unload data from a Snowflake table into a file in a stage?
Correct answer
COPY INTO
WRITE
EXTRACT INTO
GET
Overall explanation
In Snowflake, the "COPY INTO" command is used to unload data from a Snowflake table into
a file in a stage. The stage acts as an intermediate storage location for the unloaded data,
and the data can then be transferred to an external storage location such as Amazon S3 or
Microsoft Azure Blob Storage.
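An unload can be sketched as (stage, path, and table names are illustrative):

```sql
-- Unload table rows to compressed CSV files in a named stage:
COPY INTO @my_stage/unload/
FROM my_table
FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP');
```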
Question 115Skipped
When loading data into Snowflake, the COPY command supports which of the following?
Correct answer
Column reordering
Filters
Aggregates
Joins
Overall explanation
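Column reordering during a load can be sketched with the COPY transformation syntax (table, stage, and column names are illustrative); note that filters, aggregates, and joins are not supported in this inner SELECT:

```sql
-- Reorder columns during the load by selecting positional fields:
COPY INTO my_table (col_b, col_a)
FROM (SELECT t.$2, t.$1 FROM @my_stage t);
```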
Question 116Skipped
Correct answer
Overall explanation
Question 117Skipped
Correct answer
Overall explanation
Question 118Skipped
ACCESS_HISTORY
Correct selection
LISTING_TELEMETRY_DAILY
DATA_TRANSFER_HISTORY
Correct selection
MONETIZED_USAGE_DAILY
WAREHOUSE_METERING_HISTORY
Overall explanation
You can expect some unusual questions in the exam about parameters that are not the most commonly used. It is not necessary to know every parameter available in Snowflake, but you should at least be familiar with the most important ones.
Question 119Skipped
Which of the following statements describe features of Snowflake data caching? (Choose
two.)
When a virtual warehouse is suspended, the data cache is saved on the remote storage
layer.
Correct selection
The RESULT_SCAN table function can access and filter the contents of the query result
cache.
A user can only access their own queries from the query result cache.
Correct selection
When the data cache is full, the least-recently used data will be cleared to make room.
A user must set USE_METADATA_CACHE to TRUE to use the metadata cache in queries.
Overall explanation
Option When the data cache is full, the least-recently used data will be cleared to make
room. is correct because Snowflake automatically manages its data cache and evicts the
least-recently used data when the cache becomes full.
Option The RESULT_SCAN table function can access and filter the contents of the query result
cache. is correct because the RESULT_SCAN table function can be used to query and filter
the data that has been cached in the query result cache.
Option When a virtual warehouse is suspended, the data cache is saved on the remote
storage layer. is incorrect because when a virtual warehouse is suspended, the data cache is
not saved on the remote storage layer. The data cache is cleared when a virtual warehouse is
suspended and any data that needs to be cached is reloaded from the remote storage layer
when the virtual warehouse is resumed.
Option A user can only access their own queries from the query result cache. is incorrect
because the query result cache is a shared cache and all users can access the data that has
been cached. There are no restrictions based on user access.
Option A user must set USE_METADATA_CACHE to TRUE to use the metadata cache in
queries. is incorrect because the metadata cache is used by default in queries and there is no
need for a user to explicitly set USE_METADATA_CACHE to TRUE.
Question 120Skipped
Users are responsible for data storage costs until what occurs?
Correct answer
Overall explanation
Storage is calculated and charged for data regardless of whether it is in the Active, Time
Travel, or Fail-safe state. Because these life-cycle states are sequential, updated/deleted data
protected by CDP will continue to incur storage costs until the data leaves the Fail-safe state.
Question 121Skipped
How can a Snowflake user sample 10 rows from a table named SNOWPRO? (Choose two.)
Correct selection
Correct selection
Overall explanation
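The two correct forms can be sketched as follows; SAMPLE and TABLESAMPLE are synonymous:

```sql
-- Both forms return a fixed 10-row sample:
SELECT * FROM SNOWPRO SAMPLE (10 ROWS);
SELECT * FROM SNOWPRO TABLESAMPLE (10 ROWS);
```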
Question 122Skipped
Which Snowflake object contains all the information required to share a database?
Correct answer
Share
Secure view
Sequence
Private listing
Overall explanation
Shares are named Snowflake objects that encapsulate all of the information required to
share a database.
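A hedged sketch of building a share (database, schema, table, and account identifiers are illustrative):

```sql
CREATE SHARE my_share;
GRANT USAGE ON DATABASE sales_db TO SHARE my_share;
GRANT USAGE ON SCHEMA sales_db.public TO SHARE my_share;
GRANT SELECT ON TABLE sales_db.public.orders TO SHARE my_share;

-- Make the share visible to a consumer account:
ALTER SHARE my_share ADD ACCOUNTS = consumer_org.consumer_acct;
```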
Question 123Skipped
What happens when a Data Provider revokes privileges to a share on an object in their
source database?
The Data Consumers stop seeing data updates and become responsible for storage charges
for the object.
A static copy of the object at the time the privilege was revoked is created in the Data
Consumers account.
Any additional data arriving after this point in time will not be visible to Data Consumers.
Correct answer
Overall explanation
Revokes access privileges for databases and other supported database objects (schemas,
tables, and views) from a share. Revoking privileges on these objects effectively removes the
objects from the share, disabling access to the objects granted via the database role in all
consumer accounts that have created a database from the share.
Question 124Skipped
On which of the following cloud platforms can a Snowflake account be hosted? (Choose
three.)
Correct selection
Oracle Cloud
Alibaba Cloud
Correct selection
Correct selection
Overall explanation
Question 125Skipped
Which of the following objects are contained within a schema? (Choose two.)
Share
Role
User
Correct selection
Stream
Correct selection
External table
Warehouse
Overall explanation
Role (account-level)
Stream (schema-level)
Warehouse (account-level)
User (account-level)
Share (account-level)
Question 126Skipped
A marketing co-worker has requested the ability to change a warehouse size on their
medium virtual warehouse called MKTG_WH.
Correct answer
Overall explanation
Question 127Skipped
Correct answer
Overall explanation
Result Cache: Holds the results of every query executed in the past 24 hours. These are available across virtual warehouses, so query results returned to one user are available to any other user on the system who executes the same query, provided the underlying data has not changed.
Question 128Skipped
Which of the following conditions must be met in order to return results from the results
cache? (Choose two.)
The new query is run using the same virtual warehouse as the previous query.
Correct selection
The user has the appropriate privileges on the objects associated with the query.
Micro-partitions have been reclustered since the query was last run.
Correct selection
The query has been run within 24 hours of the previously-run query.
Overall explanation
If the same query is run again within 24 hours, it is not recomputed, which means no compute is charged; this behavior is not affected by warehouse suspension.
As in the previous question, the exact same query will return the pre-computed results as long as the underlying data has not changed and the results were last accessed within the previous 24-hour period.
Question 129Skipped
Role
Correct selection
User
Tables
Schema
Correct selection
Account
Database
Overall explanation
Question 130Skipped
GET
Correct answer
PUT
COPY
LOAD
Overall explanation
If the question had asked which command can be used to UNLOAD data, COPY INTO would be correct. However, loading files into a stage means uploading local data, so PUT is the correct option.
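A minimal sketch (the local path and stage name are illustrative; PUT must be run from a client such as SnowSQL, not from the web UI):

```sql
-- Upload a local file into a named internal stage:
PUT file:///tmp/data.csv @my_stage AUTO_COMPRESS = TRUE;
```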
Question 131Skipped
PARSE_JSON
CHECK_JSON
TO_JSON
Correct answer
FLATTEN
Overall explanation
Question 132Skipped
Correct selection
Python
C++
JavaScript
Correct selection
Scala
Overall explanation
• Java
• Python
• Scala
Question 133Skipped
Which of the following statements about data sharing are true? (Choose two.)
Correct selection
Correct selection
New objects created by a Data Provider are automatically shared with existing Data
Consumers and Reader Accounts.
Overall explanation
Question 134Skipped
IS_ARRAY
CHECK_JSON
IS_JSON
Correct answer
TYPEOF
Overall explanation
TYPEOF
Reports the type of a value stored in a VARIANT column. The type is returned as a string.
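A quick sketch of TYPEOF on different VARIANT values:

```sql
SELECT TYPEOF(PARSE_JSON('[1,2,3]'));   -- ARRAY
SELECT TYPEOF(PARSE_JSON('{"a": 1}'));  -- OBJECT
SELECT TYPEOF(PARSE_JSON('42'));        -- INTEGER
```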
Question 135Skipped
Assume there is a table consisting of five micro-partitions with values ranging from A to Z.
A)
B)
C)
D)
Correct answer
Overall explanation
When there is no overlap in the range of values across all micro-partitions, the micro-
partitions are considered to be in a constant state (i.e. they cannot be improved by
clustering).
Question 136Skipped
A table needs to be loaded. The input data is in JSON format and is a concatenation of
multiple JSON documents. The file size is 3 GB. A warehouse size S is being used.
Split the file into multiple files in the recommended size range (100 MB - 250 MB).
Correct answer
Overall explanation
If the data exceeds 16 MB, enable the STRIP_OUTER_ARRAY file format option for the COPY
INTO <table> command to remove the outer array structure and load the records into
separate table rows
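The option can be sketched as (table, stage, and file names are illustrative):

```sql
COPY INTO my_table
FROM @my_stage/big_file.json
FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE);
```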
Question 137Skipped
Why does Snowflake recommend file sizes of 100-250 MB compressed when loading data?
Increases the latency staging and accuracy when loading the data
Correct answer
Optimizes the virtual warehouse size and multi-cluster setting to economy mode
Overall explanation
Question 138Skipped
What are the least privileges needed to view and modify resource monitors? (Choose two.)
USAGE
OWNERSHIP
Correct selection
MODIFY
Correct selection
MONITOR
SELECT
Overall explanation
By default, resource monitors can only be created by account administrators and, therefore,
can only be viewed and maintained by them.
However, roles that have been granted the following privileges on specific resource monitors
can view and modify the resource monitor as needed using SQL:
• MONITOR
• MODIFY
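Granting those two privileges can be sketched as (monitor and role names are illustrative):

```sql
GRANT MONITOR, MODIFY ON RESOURCE MONITOR my_rm TO ROLE finance_admin;
```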
Question 139Skipped
What will happen if a Snowflake user increases the size of a suspended virtual warehouse?
The warehouse will resume immediately and start to share the compute load with other
running virtual warehouses.
The provisioning of new compute resources for the warehouse will begin immediately.
The warehouse will remain suspended but new resources will be added to the query
acceleration service.
Correct answer
The provisioning of additional compute resources will be in effect when the warehouse is
next resumed.
Overall explanation
Resizing a suspended warehouse does not provision any new compute resources for the
warehouse. It simply instructs Snowflake to provision the additional compute resources
when the warehouse is next resumed, at which time all the usage and credit rules associated
with starting a warehouse apply.
Question 140Skipped
Which of the following compute resources or features are managed by Snowflake? (Choose
two.)
Correct selection
Snowpipe
Updating data
Scaling up a warehouse
Correct selection
AUTOMATIC_CLUSTERING
Overall explanation
Question 141Skipped
OAuth
Okta
Correct answer
Duo Security
Overall explanation
MFA provides increased login security for users connecting to Snowflake. MFA support is
provided as an integrated Snowflake feature, powered by the Duo Security service, which is
managed completely by Snowflake.
Question 142Skipped
Where would a Snowflake user find information about query activity from 90 days ago?
account_usage.query_history_archive view
information_schema.query_history_by_session view
Correct answer
account_usage.query_history view
information_schema.query_history view
Overall explanation
Question 143Skipped
What is the MINIMUM Snowflake edition required to use the periodic rekeying of micro-partitions?
Business Critical
Correct answer
Enterprise
Standard
Overall explanation
Question 144Skipped
What does the Activity area of Snowsight allow users to do? (Choose two.)
Correct selection
Correct selection
Overall explanation
Question 145Skipped
FOR
REPEAT
WHILE
Correct answer
LOOP
Overall explanation
BREAK is required in a LOOP but is not necessary in WHILE, FOR, and REPEAT.
Question 146Skipped
Which data types are supported by Snowflake when using semi-structured data? (Choose
two.)
Correct selection
VARIANT
STRUCT
Correct selection
ARRAY
QUEUE
VARRAY
Overall explanation
Question 147Skipped
The database ACCOUNTADMIN must define the clustering methodology for each
Snowflake table.
The clustering key must be included in the COPY command when loading data into
Snowflake.
Correct answer
Clustering is the way data is grouped together and stored within Snowflake micro-partitions.
Overall explanation
Question 148Skipped
What are the default Time Travel and Fail-safe retention periods for transient tables?
Correct answer
Overall explanation
Transient tables can have a Time Travel retention period of either 0 or 1 day.
Temporary tables can also have a Time Travel retention period of 0 or 1 day; however, this
retention period ends as soon as the table is dropped or the session in which the table was
created ends.
Question 149Skipped
A Snowflake user executed a query and received the results. Another user executed the
same query 4 hours later. The data had not changed.
The virtual warehouse that is defined at the session level will be used to read all data.
Correct answer
No virtual warehouse will be used, data will be read from the result cache.
No virtual warehouse will be used, data will be read from the local disk cache.
Overall explanation
Question 150Skipped
Which of the following describes how multiple Snowflake accounts in a single organization
relate to various cloud providers?
Each Snowflake account can be hosted in a different cloud vendor, but must be in the
same region.
Each Snowflake account must be hosted in a different cloud vendor and region.
Correct answer
Each Snowflake account can be hosted in a different cloud vendor and region.
All Snowflake accounts must be hosted in the same cloud vendor and region.
Overall explanation
The cloud platform you choose for each Snowflake account is completely independent from
your other Snowflake accounts. In fact, you can choose to host each Snowflake account on a
different platform, although this may have some impact on data transfer billing when
loading data.
Question 151Skipped
Standard Edition
Correct answer
Enterprise Edition
Overall explanation
Question 152Skipped
Data storage
Correct answer
Cloud services
Cloud provider
Compute
Overall explanation
The cloud services layer is a collection of services that coordinate activities across Snowflake.
These services tie together all of the different components of Snowflake in order to process
user requests, from login to query dispatch. The cloud services layer also runs on compute
instances provisioned by Snowflake from the cloud provider.
• Authentication
• Infrastructure management
• Metadata management
• Access control
Question 153Skipped
What are the recommended steps to address poor SQL query performance due to data
spilling? (Choose two.)
Correct selection
Correct selection
Overall explanation
The spilling can't always be avoided, especially for large batches of data, but it can be
decreased by:
• Reducing the amount of data processed. For example, by trying to improve partition
pruning, or projecting only the columns that are needed in the output.
• Trying to split the processing into several steps (for example by replacing the CTEs
with temporary tables).
• Using a larger warehouse - this effectively means more memory and more local disk
space.
Question 154Skipped
Correct answer
Secure views are similar to materialized views in that they are the most performant.
Non-secure views are preferred over secure views when sharing data.
Overall explanation
Question 155Skipped
Correct answer
Overall explanation
Defining a File Format: a file format defines the type of data to be unloaded into the stage or S3. It is a best practice to define an individual file format for a certain type of data that is regularly unloaded, based on the characteristics of the files needed.
SET 6
Question 1Skipped
How can an administrator check for updates (for example, SCIM API requests) sent to
Snowflake by the identity provider?
QUERY_HISTORY
ACCESS_HISTORY
LOAD_HISTORY
Correct answer
REST_EVENT_HISTORY
Overall explanation
Administrators can query the rest_event_history table to determine whether the identity
provider is sending updates (i.e. SCIM API requests) to Snowflake.
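A sketch of such a query, following the documented REST_EVENT_HISTORY table function arguments (the time range and result limit here are illustrative):

```sql
-- Review SCIM requests received in the last hour
SELECT *
FROM TABLE(information_schema.rest_event_history(
    'scim',
    DATEADD('hour', -1, CURRENT_TIMESTAMP()),
    CURRENT_TIMESTAMP(),
    200))
ORDER BY event_timestamp;
```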
Question 2Skipped
What type of NULL values are supported in semi-structured data? (Choose two.)
Avro
Parquet
ORC
Correct selection
SQL
Correct selection
JSON
Overall explanation
Snowflake supports two types of NULL values in semi-structured data: SQL NULL:
• SQL NULL means the same thing for semi-structured data types as it means for
structured data types: the value is missing or unknown.
• JSON null (sometimes called “VARIANT NULL”): In a VARIANT column, JSON null
values are stored as a string containing the word “null” to distinguish them from SQL
NULL values.
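The difference can be observed directly with a minimal sketch:

```sql
-- JSON null is stored as a VARIANT null, distinct from SQL NULL
SELECT PARSE_JSON('{"a": null}'):a                  AS json_null,    -- VARIANT null
       IS_NULL_VALUE(PARSE_JSON('{"a": null}'):a)  AS is_json_null, -- TRUE
       PARSE_JSON('{"a": null}'):missing           AS sql_null;     -- SQL NULL (key absent)
```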
Question 3Skipped
Which function will return a row for each object in a VARIANT, OBJECT, or ARRAY column?
Correct answer
FLATTEN
CAST
GET
PARSE_JSON
Overall explanation
The FLATTEN function is a table function that takes a VARIANT, OBJECT, or ARRAY column and
returns a row for each element or attribute within the column.
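A minimal sketch of its typical use with LATERAL (the inline data is illustrative):

```sql
-- One output row per element of the "items" array
SELECT f.value AS item
FROM (SELECT PARSE_JSON('{"items": [1, 2, 3]}') AS v) t,
     LATERAL FLATTEN(input => t.v:items) f;
```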
Question 4Skipped
What JavaScript delimiters are available in Snowflake stored procedures? (Choose two.)
Correct selection
Correct selection
Overall explanation
The JavaScript portion of the stored procedure code must be enclosed within either single
quotes ' or double dollar signs $$.
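For example, with the double-dollar delimiters (the procedure name is illustrative):

```sql
CREATE OR REPLACE PROCEDURE say_hello()
RETURNS STRING
LANGUAGE JAVASCRIPT
AS
$$
  // JavaScript body enclosed in $$ ... $$
  return "Hello";
$$;
```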
Question 5Skipped
Which clustering indicator will show if a large table in Snowflake will benefit from explicitly
defining a clustering key?
Ratio
Correct answer
Depth
Percentage
Overall explanation
Question 6Skipped
Which of the following languages can be used to implement Snowflake User Defined
Functions (UDFs)? (Choose two.)
Correct selection
SQL
Correct selection
Javascript
Ruby
C#
PERL
Overall explanation
Question 7Skipped
Correct selection
Correct selection
Overall explanation
Transient tables are a type of table in Snowflake that persist until they are explicitly dropped.
They do not have a Fail-safe period, and they can only have a Time Travel retention period of
0 or 1 day. Transient tables cannot be cloned to permanent tables, and they cannot be
altered to make them permanent tables.
Question 8Skipped
Custom role
Account role
Correct answer
Database role
Secondary role
Overall explanation
Grant the database role to a share and grant future privileges on an object to the database
role:
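A sketch of those steps (the database, schema, database role, and share names are illustrative):

```sql
-- Create a database role and grant future privileges on objects to it
CREATE DATABASE ROLE my_db.dr1;
GRANT USAGE ON SCHEMA my_db.my_schema TO DATABASE ROLE my_db.dr1;
GRANT SELECT ON FUTURE TABLES IN SCHEMA my_db.my_schema TO DATABASE ROLE my_db.dr1;

-- Attach the database role to a share
GRANT DATABASE ROLE my_db.dr1 TO SHARE my_share;
```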
Question 9Skipped
What are valid sub-clauses to the OVER clause for a window function? (Choose two.)
UNION ALL
GROUP BY
LIMIT
Correct selection
PARTITION BY
Correct selection
ORDER BY
Overall explanation
Window Syntax
Question 10Skipped
How are URLs that access unstructured data in external stages retrieved?
Correct answer
Overall explanation
Question 11Skipped
Correct selection
Correct selection
Users can import worksheets and share them with other users.
Overall explanation
You can share worksheets and folders of worksheets with other Snowflake users in your
account. Each worksheet is a unique session and can use roles different from the role you
select. Folders can't be nested.
Question 12Skipped
How does Snowflake handle the bulk unloading of data into single or multiple files?
It uses COPY INTO to copy the data from a table into one or more files in an external stage
only.
It uses COPY INTO for bulk unloading where the default option is SINGLE = TRUE.
Correct answer
Overall explanation
Bulk Unloading into Single or Multiple Files The COPY INTO <location> command provides a
copy option (SINGLE) for unloading data into a single file or multiple files. The default is
SINGLE = FALSE (i.e. unload into multiple files).
Snowflake assigns each file a unique name. The location path specified for the command can
contain a filename prefix that is assigned to all the data files generated. If a prefix is not
specified, Snowflake prefixes the generated filenames with data_.
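A sketch of both variants (the stage and table names are illustrative):

```sql
-- Default (SINGLE = FALSE): multiple files, each up to MAX_FILE_SIZE, prefixed "data_"
COPY INTO @my_stage/unload/ FROM my_table;

-- Force a single output file instead
COPY INTO @my_stage/unload/result.csv FROM my_table SINGLE = TRUE;
```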
Question 13Skipped
What ensures that a user with the role SECURITYADMIN can activate a network policy for an
individual user?
A role that has been granted the global ATTACH POLICY privilege
Correct answer
Ownership privilege on only the role that created the network policy
Overall explanation
Only the role with the OWNERSHIP privilege on both the user and the network policy, or a
higher role, can activate a network policy for an individual user.
Question 14Skipped
365 Days
Correct answer
30 Days
60 Days
90 Days
Overall explanation
All Snowflake-managed keys are automatically rotated by Snowflake when they are more
than 30 days old
Question 15Skipped
When unloading data, which file format preserves the data values for floating-point number
columns?
JSON
CSV
Avro
Correct answer
Parquet
Overall explanation
When floating-point number columns are unloaded to CSV or JSON files, Snowflake
truncates the values to approximately (15,9).
The values are not truncated when unloading floating-point number columns to Parquet
files.
Question 16Skipped
Correct answer
Overall explanation
Question 17Skipped
What does the VARIANT data type impose a 16 MB size limit on?
All rows
Individual columns
Correct answer
Individual rows
All columns
Overall explanation
Question 18Skipped
How can a Snowflake user share data with another user who does not have a Snowflake
account?
Correct answer
Overall explanation
Question 19Skipped
A Snowflake user wants to share data with someone who does not have a Snowflake
account.
Correct answer
Overall explanation
Question 20Skipped
Which column type does the Kafka connector use to store formatted information in a single column?
ARRAY
VARCHAR
Correct answer
VARIANT
OBJECT
Overall explanation
Each Kafka message is passed to Snowflake in JSON format or Avro format. The Kafka
connector stores that formatted information in a single column of type VARIANT. The data is
not parsed, and the data is not split into multiple columns in the Snowflake table.
Question 21Skipped
What is the MINIMUM permission needed to access a file URL from an external stage?
READ
MODIFY
Correct answer
USAGE
SELECT
Overall explanation
Permissions Required
Query
Question 22Skipped
What actions does the use of the PUT command do automatically? (Choose two.)
Correct selection
Correct selection
Overall explanation
Question 23Skipped
TO_VARIANT()
Correct answer
OBJECT_CONSTRUCT()
PARSE_JSON()
BUILD_STAGE_FILE_URL()
Overall explanation
Question 24Skipped
They require the use of masking and row access policies across every table and view in the
schema.
They enforce identical privileges across all tables and views in a schema.
Correct answer
Overall explanation
With managed access schemas, object owners lose the ability to make grant decisions. Only
the schema owner (i.e. the role with the OWNERSHIP privilege on the schema) or a role with
the MANAGE GRANTS privilege can grant privileges on objects in the schema, including
future grants, centralizing privilege management.
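Creating such a schema is a one-line change (the names are illustrative):

```sql
-- Only the schema owner or a role with MANAGE GRANTS can grant privileges here
CREATE SCHEMA my_db.finance WITH MANAGED ACCESS;
```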
Question 25Skipped
Which roles can make grant decisions to objects within a managed access schema? (Choose
two.)
Correct selection
ACCOUNTADMIN
SYSADMIN
Correct selection
SECURITYADMIN
USERADMIN
ORGADMIN
Overall explanation
Question 26Skipped
GROUP BY <>;
Correct answer
Automatically
ORDER BY <>;
PARTITION BY <>;
Overall explanation
Question 27Skipped
A single reader account can consume data from multiple provider accounts.
Data consumers are responsible for reader account setup and data usage costs.
Correct selection
Reader accounts enable data consumers to access and query data shared by the provider.
Correct selection
Overall explanation
Question 28Skipped
Correct selection
A directory table can be added to a stage when the stage is created, or later.
Correct selection
Directory tables store file-level metadata about the data files in a stage.
Overall explanation
A directory table is an implicit object layered on a stage and it stores file-level metadata
about the data files in the stage.
You can add a directory table to a stage when you create a stage (using CREATE STAGE) or
later (using ALTER STAGE).
Question 29Skipped
When the caller needs to run a statement that could not execute outside of the stored
procedure
When the caller needs to be prevented from viewing the source code of the stored
procedure
When the stored procedure needs to operate on objects that the caller does not have
privileges on
Correct answer
When the stored procedure needs to run with the privileges of the role that called the
stored procedure
Overall explanation
Question 30Skipped
What statistical information in a Query Profile indicates that the query is too large to fit in
memory? (Choose two.)
Correct selection
Correct selection
Overall explanation
Question 31Skipped
Correct answer
Inserting a value into a FOREIGN KEY column that does not match a value in the referenced column
Overall explanation
Snowflake supports defining and maintaining constraints, but does not enforce them, except
for NOT NULL constraints, which are always enforced.
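A minimal sketch showing this behavior (the table names are illustrative):

```sql
CREATE TABLE parent (id INT PRIMARY KEY);
CREATE TABLE child  (parent_id INT REFERENCES parent (id));

-- Succeeds even though 99 is not in parent: the FOREIGN KEY is not enforced
INSERT INTO child VALUES (99);

-- Fails: NOT NULL is the one constraint Snowflake always enforces
CREATE TABLE t (c INT NOT NULL);
INSERT INTO t VALUES (NULL);
```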
Question 32Skipped
How long can a data consumer who has a pre-signed URL access data files using Snowflake?
Correct answer
Indefinitely
Overall explanation
Question 33Skipped
A user wants to access files stored in a stage without authenticating into Snowflake. Which
type of URL should be used?
File URL
Staged URL
Scoped URL
Correct answer
Pre-signed URL
Overall explanation
You can allow data consumers to retrieve either scoped or pre-signed URLs from the secure
view. Scoped URLs provide better security, while pre-signed URLs can be accessed without
authorization or authentication.
Question 34Skipped
Which Snowflake object can be accessed in the FROM clause of a query, returning a set of
rows having one or more columns?
Correct answer
A stored procedure
A task
Overall explanation
Question 35Skipped
Which function returns the URL of a stage using the stage name as the input?
GET_PRESIGNED_URL
Correct answer
GET_STAGE_LOCATION
BUILD_STAGE_FILE_URL
BUILD_SCOPED_FILE_URL
Overall explanation
Question 36Skipped
Which common query problems are identified by the Query Profile? (Choose two.)
Correct selection
Correct selection
Inefficient pruning
Syntax error
Overall explanation
Spilling — information about disk usage for operations where intermediate results do not fit
in memory.
Question 37Skipped
What are Snowflake best practices when assigning the ACCOUNTADMIN role to users?
(Choose two.)
Correct selection
All users assigned the ACCOUNTADMIN role should use Multi-Factor Authentication (MFA).
The ACCOUNTADMIN role should be given to any user who needs a high level of authority.
Correct selection
Overall explanation
• All users assigned the ACCOUNTADMIN role should also be required to use multi-factor authentication (MFA) for login (for details, see Configuring Access Control).
• Assign this role to at least two users. We follow strict security procedures for
resetting a forgotten or lost password for users with the ACCOUNTADMIN role. These
procedures can take up to two business days. Assigning the ACCOUNTADMIN role to
more than one user avoids having to go through these procedures because the users
can reset each other’s passwords.
Question 38Skipped
A complex SQL query involving eight tables with joins is taking a while to execute. The Query
Profile shows that all partitions are being scanned.
The columns in the micro-partitions need granular ordering based on the dataset.
Incorrect joins are being used, leading to scanning and pulling too many records.
Correct answer
Overall explanation
Key is: Query Profile shows that all partitions are being scanned.
The efficiency of pruning can be observed by comparing Partitions scanned and Partitions
total statistics in the TableScan operators. If the former is a small fraction of the latter,
pruning is efficient. If not, the pruning did not have an effect.
Question 39Skipped
Materialized views
External tables
Correct answer
Overall explanation
• External tables.
• Materialized views.
• Column concatenation.
• Analytical expressions.
Question 40Skipped
Which object-level parameters can be set to help control query processing and concurrency?
(Choose two).
Correct selection
STATEMENT_QUEUED_TIMEOUT_IN_SECONDS
Correct selection
STATEMENT_TIMEOUT_IN_SECONDS
MAX_CONCURRENCY_LEVEL
MIN_DATA_RETENTION_TIME_IN_DAYS
DATA_RETENTION_TIME_IN_DAYS
Overall explanation
The number of queries that a warehouse can concurrently process is determined by the size
and complexity of each query. As queries are submitted, the warehouse calculates and
reserves the compute resources needed to process each query. If the warehouse does not
have enough remaining resources to process a query, the query is queued, pending
resources that become available as other running queries complete.
Snowflake provides some object-level parameters that can be set to help control query
processing and concurrency:
STATEMENT_QUEUED_TIMEOUT_IN_SECONDS
STATEMENT_TIMEOUT_IN_SECONDS
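These parameters can be set at the warehouse (or session/account) level, for example (the warehouse name and values are illustrative):

```sql
-- Abort statements queued longer than 5 minutes; cap running statements at 1 hour
ALTER WAREHOUSE my_wh SET
  STATEMENT_QUEUED_TIMEOUT_IN_SECONDS = 300,
  STATEMENT_TIMEOUT_IN_SECONDS = 3600;
```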
Question 41Skipped
Which statements reflect valid commands when using secondary roles? (Choose two.)
Correct selection
Correct selection
Overall explanation
Question 42Skipped
Small
X-Small
Large
Correct answer
Medium
Overall explanation
Question 43Skipped
A user needs to MINIMIZE the cost of large tables that are used to store transitory data. The
data does not need to be protected against failures, because the data can be reconstructed
outside of Snowflake.
External
Directory
Permanent
Correct answer
Transient
Overall explanation
Transient tables are best suited in scenarios where the data in your table is not critical and
can be recovered from external means if required. Also, they have no fail-safe period, and
Time travel is also only 1 day (which can be set to 0 also).
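Creating such a table is a sketch like the following (the table name and columns are illustrative):

```sql
-- No Fail-safe; Time Travel limited to 0 or 1 day
CREATE TRANSIENT TABLE staging_events (id INT, payload VARIANT)
  DATA_RETENTION_TIME_IN_DAYS = 0;
```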
Question 44Skipped
The use of which technique or tool will improve Snowflake query performance on very large
tables?
Materialized views
Indexing
Multi-clustering
Correct answer
Clustering keys
Overall explanation
A clustering key is a subset of columns in a table (or expressions on a table) that are explicitly
designated to co-locate the data in the table in the same micro-partitions. This is useful for
very large tables where the ordering was not ideal (at the time the data was
inserted/loaded) or extensive DML has caused the table’s natural clustering to degrade.
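A sketch of defining and inspecting a clustering key (the table and column names are illustrative):

```sql
-- Co-locate rows by date and region within micro-partitions
ALTER TABLE sales CLUSTER BY (sale_date, region);

-- Check how well the table is clustered on that key
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(sale_date, region)');
```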
Question 45Skipped
Correct answer
4 hours
1 hour
2 hours
8 hours
Overall explanation
MFA token caching can help to reduce the number of prompts that must be acknowledged
while connecting and authenticating to Snowflake, especially when multiple connection
attempts are made within a relatively short time interval. A cached MFA token is valid for up
to four hours.
Question 46Skipped
What metadata does Snowflake store concerning all rows stored in a micro-partition?
(Choose two.)
Correct selection
Correct selection
Overall explanation
• Additional properties used for both optimization and efficient query processing.
Question 47Skipped
A Snowflake user wants to unload data from a relational table sized 5 GB using CSV. The
extract needs to be as performant as possible.
Correct answer
Use a regular expression in the stage specification of the COPY command to restrict
parsing time.
Use Parquet as the unload file format, using Parquet's default compression feature.
Increase the default MAX_FILE_SIZE to 5 GB and set SINGLE = true to produce a single file.
Overall explanation
By default, COPY INTO location statements separate table data into a set of output files to
take advantage of parallel operations. The maximum size for each file is set using the
MAX_FILE_SIZE copy option. The default value is 16777216 (16 MB) but can be increased to
accommodate larger files. The maximum file size supported is 5 GB for Amazon S3, Google
Cloud Storage, or Microsoft Azure stages.
Question 48Skipped
Correct answer
Overall explanation
Transient tables are specifically designed for transitory data that needs to be maintained beyond each session.
Question 49Skipped
Which kind of Snowflake table stores file-level metadata for each file in a stage?
Transient
External
Correct answer
Directory
Temporary
Overall explanation
A directory table is an implicit object layered on a stage (not a separate database object) and
is conceptually similar to an external table because it stores file-level metadata about the
data files in the stage. A directory table has no grantable privileges of its own.
Question 50Skipped
Correct answer
Overall explanation
A warehouse can be resized up or down at any time, including while it is running and
processing statements.
Question 51Skipped
User A cloned a schema and overwrote a schema that User B was working on. User B no
longer has access to their version of the tables. However, this all occurred within the Time
Travel retention period defined at the database level.
Correct answer
Overall explanation
If an object with the same name already exists, UNDROP fails. You must rename the existing
object, which then enables you to restore the previous version of the object.
Question 52Skipped
Queries that are running on the current warehouse configuration are moved to the new
configuration and finished there.
Correct answer
Queries that are running on the current warehouse configuration are not impacted.
Queries that are running on the current warehouse configuration are aborted and have to
be resubmitted by the user.
Queries that are running on the current warehouse configuration are aborted and are
automatically resubmitted.
Overall explanation
Resizing a running warehouse does not impact queries that are already being processed by
the warehouse; the additional compute resources, once fully provisioned, are only used for
queued and new queries.
Question 53Skipped
Correct answer
Overall explanation
Question 54Skipped
Which user object property requires contacting Snowflake Support in order to set a value for
it?
Correct answer
MINS_TO_BYPASS_NETWORK_POLICY
DISABLED
MINS_TO_BYPASS_MFA
MINS_TO_UNLOCK
Overall explanation
Question 55Skipped
Correct answer
Overall explanation
Question 56Skipped
A tag object has been assigned to a table (TABLE_A) in a schema within a Snowflake
database.
Which CREATE object statement will automatically assign the TABLE_A tag to a target object?
Correct answer
Overall explanation
With CREATE TABLE … LIKE, tags assigned to the source table are assigned to the target table
Question 57Skipped
Attribute
Schema
User
Correct answer
Role
Overall explanation
Question 58Skipped
OPERATE
MODIFY
USAGE
Correct answer
MONITOR
Overall explanation
Question 59Skipped
What does Snowflake recommend regarding database object ownership? (Choose two.)
Correct selection
Create objects with a custom role and grant this role to SYSADMIN.
Correct selection
Overall explanation
SYSADMIN - Role that has privileges to create warehouses and databases (and other objects)
in an account.
If, as recommended, you create a role hierarchy that ultimately assigns all custom roles to
the SYSADMIN role, this role also has the ability to grant privileges on warehouses,
databases, and other objects to other roles.
Question 60Skipped
DELETE
RENAME
Correct answer
ALTER
INSERT
Overall explanation
External tables are read-only. You cannot perform data manipulation language (DML)
operations on them. However, you can use external tables for query and join operations. You
can also create views against external tables.
Question 61Skipped
What is used to extract the content of PDF files stored in Snowflake stages?
FLATTEN function
Correct answer
Window function
Overall explanation
Question 62Skipped
What does it mean when the sample function uses the Bernoulli sampling method?
Correct answer
Overall explanation
BERNOULLI (or ROW): Includes each row with a probability of p/100. Similar to flipping a
weighted coin for each row.
Question 63Skipped
Correct answer
Activity
Data
Dashboards
Admin
Overall explanation
Question 64Skipped
What is the default period of time the Warehouse Activity section provides a graph of
Snowsight activity?
1 week
2 hours
1 month
Correct answer
2 weeks
Overall explanation
Warehouse Activity
The Warehouse Activity section provides a graph of activity over a period of time:
• Hour
• Day
• Week
• 2 Weeks (default)
Question 65Skipped
How does Snowflake define its approach to Discretionary Access Control (DAC)?
Access privileges are assigned to roles, which are in turn assigned to users.
Correct answer
Each object has an owner, who can in turn grant access to that object.
Overall explanation
Discretionary Access Control (DAC): Each object has an owner, who can in turn grant access
to that object.
Question 66Skipped
What MINIMUM privilege is required on the external stage for any role in the GET REST API
to access unstructured data files using a file URL?
Correct answer
USAGE
OWNERSHIP
WRITE
READ
Overall explanation
External: USAGE
Internal: READ
Question 67Skipped
Correct selection
Correct selection
Overall explanation
Question 68Skipped
MIN_DATA_RETENTION_TIME_IN_DAYS
Correct answer
MAX_DATA_EXTENSION_TIME_IN_DAYS
LOCK_TIMEOUT
STALE_AFTER
Overall explanation
Question 69Skipped
Which activities are managed by Snowflake’s Cloud Services layer? (Choose two.)
Data compression
Correct selection
Authentication
Access delegation
Data pruning
Correct selection
Overall explanation
• Authentication
• Infrastructure management
• Metadata management
• Access control
Question 70Skipped
What are reasons for using the VALIDATE function in Snowflake after a COPY INTO command
execution? (Choose two.)
To count the number of errors encountered during the execution of the COPY INTO
command
Correct selection
To validate the files that have been loaded earlier using the COPY INTO command
Correct selection
To return errors encountered during the execution of the COPY INTO command
To fix errors that were made during the execution of the COPY INTO command
Overall explanation
VALIDATE function validates the files loaded in a past execution of the COPY INTO <table>
command and returns all the errors encountered during the load, rather than just the first
error.
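A sketch of its use (the table name and query ID are illustrative):

```sql
-- Return all errors from a previous COPY INTO job, identified by its query ID
SELECT * FROM TABLE(VALIDATE(my_table, JOB_ID => '01a2b3c4-0000-0000-0000-000000000000'));

-- Or validate the most recent COPY INTO execution against the table
SELECT * FROM TABLE(VALIDATE(my_table, JOB_ID => '_last'));
```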
Question 71Skipped
Two users share a virtual warehouse named WH_DEV_01. When one of the users loads data,
the other one experiences performance issues while querying data.
Correct answer
Overall explanation
If the running query load is high or there’s queuing, consider starting a separate warehouse
and moving queued queries to that warehouse.
Question 72Skipped
Which function should be used to find the query ID of the second query executed in a
current session?
Correct answer
Select LAST_QUERY_ID(2)
Select LAST_QUERY_ID(1)
Select LAST_QUERY_ID(-1)
Select LAST_QUERY_ID(-2)
Overall explanation
Positive numbers start with the first query executed in the session. For example:
Etc.
Negative numbers start with the most recently-executed query in the session. For example:
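The indexing can be sketched as:

```sql
SELECT LAST_QUERY_ID(1);   -- first query executed in the session
SELECT LAST_QUERY_ID(2);   -- second query executed in the session
SELECT LAST_QUERY_ID(-1);  -- most recent query (the default)
SELECT LAST_QUERY_ID(-2);  -- second most recent query
```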
Question 73Skipped
Which command should a Snowflake user execute to load data into a table?
Correct answer
Overall explanation
Question 74Skipped
Correct answer
SNOWFLAKE.ACCOUNT_USAGE.ACCESS_HISTORY
SNOWFLAKE.ACCOUNT_USAGE.OBJECT_DEPENDENCIES
SNOWFLAKE.ACCOUNT_USAGE.COLUMNS
SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_EVENT_HISTORY
Overall explanation
Question 75Skipped
What does Snowflake attempt to do if any of the compute resources for a virtual warehouse
fail to provision during start-up?
Correct answer
Overall explanation
If any of the compute resources for the warehouse fail to provision during start-up,
Snowflake attempts to repair the failed resources.
Question 76Skipped
By default, which role has privileges to create tables and views in an account?
USERADMIN
PUBLIC
Correct answer
SYSADMIN
SECURITYADMIN
Overall explanation
The SYSADMIN role is a system-defined role that has privileges to create warehouses,
databases, and database objects in an account and grant those privileges to other roles.
Question 77Skipped
It provides detailed statistics about which queries are using the greatest number of
compute resources.
Correct answer
Overall explanation
Query Profile, available through the classic web interface, provides execution details for a
query. For the selected query, it provides a graphical representation of the main components
of the processing plan for the query, with statistics for each component, along with details
and statistics for the overall query.
Question 78Skipped
Snowflake strongly recommends that all users with what type of role be required to use
Multi-Factor Authentication (MFA)?
Correct answer
ACCOUNTADMIN
USERADMIN
SECURITYADMIN
SYSADMIN
Overall explanation
All users assigned the ACCOUNTADMIN role should also be required to use multi-factor authentication (MFA) for login.
Question 79Skipped
A JSON object is loaded into a column named data using a Snowflake variant datatype. The
root node of the object is BIKE. The child attribute for this root node is BIKEID.
select data.BIKE.BIKEID
Correct answer
select data:BIKE.BIKEID
select data:BIKE:BIKEID
select data:BIKEID
Overall explanation
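For the stated structure, the traversal looks like this (the table name is illustrative):

```sql
-- Colon after the VARIANT column name, then dot notation for child attributes
SELECT data:BIKE.BIKEID FROM bikes_table;
```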
Question 80Skipped
Which command should be used when loading many flat files into a single table?
Correct answer
COPY INTO
INSERT
MERGE
PUT
Overall explanation
Question 81Skipped
What technique does Snowflake recommend for determining which virtual warehouse size
to select?
Always start with an X-Small and increase the size if the query does not complete in 2
minutes
Correct answer
Overall explanation
Don’t focus on warehouse size. Snowflake utilizes per-second billing, so you can run larger warehouses (Large, X-Large, 2X-Large, etc.) and simply suspend them when not in use.
Question 82Skipped
Snowsight allows users to view and refresh results but not to edit shared worksheets.
Snowsight offers different sharing permissions at the worksheet, folder, and dashboard
level.
Correct answer
To run a shared worksheet, a user must be granted the role used for the worksheet session
context.
Worksheets can be shared with users that are internal or external to any organization.
Overall explanation
When you share a worksheet with someone, you can manage access to the worksheet and
its contents by choosing which permissions to grant to the other user. These permissions are
also used for sharing dashboards. Worksheet owners have the same permissions as
worksheet editors.
Each worksheet in Snowsight uses a unique session with a specific role and warehouse
assigned in the context of the worksheet. The worksheet role is the primary role last used to
run the worksheet and is required to run the worksheet. The worksheet role can change if
the worksheet owner or editor runs the worksheet using a different role.
Question 83Skipped
Correct selection
Geography
BLOB
Correct selection
Variant
CLOB
JSON
Overall explanation
Question 84Skipped
Other than ownership what privileges does a user need to view and modify resource
monitors in Snowflake? (Choose two.)
DROP
ALTER
Correct selection
MONITOR
Correct selection
MODIFY
CREATE
Overall explanation
Question 85Skipped
How can a Snowflake user post-process the result of SHOW FILE FORMATS?
Correct answer
Overall explanation
RESULT_SCAN function returns the result set of a previous command (within 24 hours of
when you executed the query) as if the result was a table.
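A minimal sketch of the pattern: run the SHOW command, then query its output through RESULT_SCAN. (Note that SHOW output column names are lowercase, so they must be double-quoted.)

```sql
SHOW FILE FORMATS;

-- Post-process the previous command's output as if it were a table.
SELECT "name", "type"
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));
```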
Question 86Skipped
In the Data Exchange, who can get or request data from the listings? (Choose two.)
Correct selection
Correct selection
Overall explanation
All users can browse listings in the Data Exchange, but only users with the ACCOUNTADMIN
role or the IMPORT SHARE privilege can get or request data.
Question 87Skipped
The user must run the CALL command to execute the block.
The SUBMIT command must run immediately after the block is defined
Correct answer
The statements that define the block must also execute the block.
Overall explanation
The BEGIN … END statement that defines the block also executes the block. (You don’t run a
separate CALL command to execute the block.)
Question 88Skipped
Several users are using the same virtual warehouse. The users report that the queries are
running slowly, and that many queries are being queued.
Correct answer
Overall explanation
This is a concurrency issue, and scaling out is the solution, so the most suitable option is to increase the maximum cluster count of the multi-cluster warehouse.
Question 89Skipped
Which features could be used to improve the performance of queries that return a small
subset of rows from a large table? (Choose two.)
Secure views
Correct selection
Correct selection
Automatic clustering
Overall explanation
The search optimization service aims to significantly improve the performance of certain
types of queries on tables, including:
• Selective point lookup queries on tables. A point lookup query returns only one or a
small number of distinct rows. (...)
Typically, queries benefit from clustering when the queries filter or sort on the clustering key
for the table. Sorting is commonly done for ORDER BY operations, for GROUP BY operations,
and for some joins.
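The two features can be sketched like this (table and column names are assumed for illustration):

```sql
-- Search optimization: speeds up selective point lookups.
ALTER TABLE large_table ADD SEARCH OPTIMIZATION;

-- Automatic clustering: helps queries that filter or sort
-- on the clustering key.
ALTER TABLE large_table CLUSTER BY (event_date);
```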
Question 90Skipped
Which role has the ability to create a share from a shared database by default?
SYSADMIN
Correct answer
ACCOUNTADMIN
ORGADMIN
SECURITYADMIN
Overall explanation
By default, the privileges required to create and manage shares are granted only to
the ACCOUNTADMIN role, ensuring that only account administrators can perform these
tasks.
Question 91Skipped
If file format options are specified in multiple locations, the load operation selects which
option FIRST to apply in order of precedence?
Correct answer
Stage definition
Table definition
Session level
Overall explanation
If file format options are specified in multiple locations, the load operation applies the options in the following order of precedence:
1. COPY INTO TABLE statement.
2. Stage definition.
3. Table definition.
Question 92Skipped
What is SnowSQL?
Snowflake's new user interface where users can visualize data into charts and dashboards.
Snowflake's proprietary extension of the ANSI SQL standard, including built-in keywords
and system functions.
Correct answer
Snowflake's command line client built on the Python connector which is used to connect
to Snowflake and execute SQL.
Overall explanation
SnowSQL is the command line client for connecting to Snowflake to execute SQL queries and
perform all DDL and DML operations, including loading data into and unloading data out of
database tables.
SnowSQL (snowsql executable) can be run as an interactive shell or in batch mode through
stdin or using the -f option.
Question 93Skipped
What is the recommended way to obtain a cloned table with the same grants as the source
table?
Correct answer
Create a script to extract grants and apply them to the cloned table.
Overall explanation
Question 94Skipped
A Query Profile shows a UnionAll operator with an extra Aggregate operator on top.
Correct answer
Inefficient pruning
Exploding joins
Overall explanation
In SQL, it is possible to combine two sets of data with either UNION or UNION ALL
constructs. The difference between them is that UNION ALL simply concatenates inputs,
while UNION does the same, but also performs duplicate elimination.
A common mistake is to use UNION when the UNION ALL semantics are sufficient. These
queries show in Query Profile as a UnionAll operator with an extra Aggregate operator on
top (which performs duplicate elimination).
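The difference can be seen side by side (table names here are placeholders):

```sql
-- UNION ALL: simple concatenation of the inputs.
SELECT id FROM t1
UNION ALL
SELECT id FROM t2;

-- UNION: same concatenation plus duplicate elimination, which shows
-- in the Query Profile as an extra Aggregate operator on top.
SELECT id FROM t1
UNION
SELECT id FROM t2;
```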
Question 95Skipped
For the ALLOWED_VALUES tag property, what is the MAXIMUM number of possible string
values for a single tag?
256
10
Correct answer
300
64
Overall explanation
The ALLOWED_VALUES tag property enables specifying the possible string values that can be
assigned to the tag when the tag is set on an object. The maximum number of possible
string values for a single tag is 300.
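A minimal sketch (tag and table names are assumed); up to 300 string values may be listed per tag:

```sql
-- Restrict the tag to a fixed set of allowed values.
CREATE TAG cost_center ALLOWED_VALUES 'finance', 'engineering', 'sales';

-- Setting the tag to a value outside the list fails.
ALTER TABLE orders SET TAG cost_center = 'finance';
```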
Question 96Skipped
What is the difference between a stored procedure and a User-Defined Function (UDF)?
Multiple stored procedures can be called as part of a single executable statement while a
single SQL statement can only call one UDF at a time.
Values returned by a stored procedure can be used directly in a SQL statement while the
values returned by a UDF cannot.
Correct answer
Overall explanation
Question 97Skipped
What are the main differences between the account usage views and the information
schema views? (Choose two.)
No active warehouse is needed to query account usage views but one is needed to query
information schema views.
Account usage views do not contain data about tables but information schema views do.
Correct selection
Account usage views contain dropped objects but information schema views do not.
Information schema views are read-only but account usage views are not.
Correct selection
Data retention for account usage views is 1 year but is 7 days to 6 months for information
schema views, depending on the view.
Overall explanation
Question 98Skipped
How long is a query visible in the Query History page in the Snowflake Web Interface (UI)?
Correct answer
14 days
24 hours
30 days
60 minutes
Overall explanation
Question 99Skipped
A schema consists of one or more databases. A database contains tables, views, and
warehouses.
A schema consists of one or more databases. A database contains tables and views.
Correct answer
A database consists of one or more schemas. A schema contains tables and views.
A database consists of one of more schemas and warehouses. A schema contains tables
and views.
Overall explanation
Question 100Skipped
As a best practice, all custom roles should be granted to which system-defined role?
ACCOUNTADMIN
SECURITYADMIN
Correct answer
SYSADMIN
ORGADMIN
Overall explanation
Question 101Skipped
How does Snowflake recommend handling the bulk loading of data batches from files
already available in cloud storage?
Use Snowpipe.
Correct answer
Overall explanation
Question 102Skipped
Which constraint type is enforced in Snowflake from the ANSI SQL standard?
UNIQUE
FOREIGN KEY
PRIMARY KEY
Correct answer
NOT NULL
Overall explanation
Snowflake supports defining and maintaining constraints, but does not enforce them, except
for NOT NULL constraints, which are always enforced.
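For example (table name assumed): both constraints below can be defined, but only the NOT NULL one is enforced.

```sql
CREATE TABLE customers (
  id   INT NOT NULL,                 -- enforced: NULLs are rejected
  name STRING,
  CONSTRAINT pk_id PRIMARY KEY (id)  -- defined and maintained, but not enforced
);
```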
Question 103Skipped
A user wants to add additional privileges to the system-defined roles for their virtual
warehouse.
Correct answer
Overall explanation
If additional privileges are needed, Snowflake recommends granting the additional privileges
to a custom role and assigning the custom role to the system-defined role.
Question 104Skipped
What is the MAXIMUM number of clusters that can be provisioned with a multi-cluster virtual
warehouse?
100
Correct answer
10
Overall explanation
Question 105Skipped
3. FILE_FORMAT = myformat;
Correct answer
A stage with a directory table that has metadata that must be manually refreshed will be
created.
The command will fail to run because the name of the directory table is not specified.
An error will be received stating that the storage location for the stage must be identified
when creating a stage with a directory table.
Overall explanation
Specifies whether to add a directory table to the stage. When the value is TRUE, a directory
table is created with the stage.
Directory tables on internal stages require manual metadata refreshes. You could also
choose to include a directory table on external stages and refresh the metadata manually.
For information about automated metadata refreshes, see automated metadata refreshes.
Question 106Skipped
When executing a COPY INTO command, performance can be negatively affected by using
which optional parameter on a large number of files?
FILES
FILE_FORMAT
Correct answer
PATTERN
VALIDATION_MODE
Overall explanation
Question 107Skipped
Which file function provides a URL with access to a file on a stage without the need for
authentication and authorization?
BUILD_SCOPED_FILE_URL
BUILD_STAGE_FILE_URL
GET_RELATIVE_PATH
Correct answer
GET_PRESIGNED_URL
Overall explanation
Generates the pre-signed URL to a staged file using the stage name and relative file path as
inputs. Access files in an external stage using the function.
Question 108Skipped
Correct answer
Overall explanation
Question 109Skipped
WAREHOUSE_LOAD_HISTORY
MATERIALIZED_VIEW_REFRESH_HISTORY
AUTOMATIC_CLUSTERING_HISTORY
Correct answer
WAREHOUSE_METERING_HISTORY
Overall explanation
Question 110Skipped
When initially creating an account in Snowflake, which settings can be specified? (Choose
two.)
Correct selection
Snowflake edition
Region
Correct selection
Account name
Account locator
Organization name
Overall explanation
Be cautious with this question: it asks you to choose 2 options, but there are actually 3 correct options: Account name, Region, Snowflake edition. In the real exam, choosing 2 of the 3 valid options will be counted as correct.
Question 111Skipped
Box plot
Bubble chart
Correct answer
Scatterplot
Pie chart
Overall explanation
• Bar charts
• Line charts
• Scatterplots
• Heat grids
• Scorecards
Question 112Skipped
If a multi-cluster warehouse is using an economy scaling policy, how long will queries wait in
the queue before another cluster is started?
2 minutes
8 minutes
Correct answer
6 minutes
1 minute
Overall explanation
Question 113Skipped
Given the statement template below, which database objects can be added to a share?
(Choose two.)
GRANT ON TO SHARE;
Streams
Correct selection
Secure functions
Stored procedures
Tasks
Correct selection
Tables
Overall explanation
Question 114Skipped
What common query issues can be identified using the Query Profile? (Choose two.)
Data classification
Correct selection
Inefficient pruning
Data masking
Unions
Correct selection
Exploding joins
Overall explanation
Question 115Skipped
How can a Snowflake user validate data that is unloaded using the COPY INTO command?
Correct answer
Overall explanation
String (constant) that instructs the COPY command to return the results of the query in the
SQL statement instead of unloading the results to the specified cloud storage location. The
only supported validation option is RETURN_ROWS. This option returns all rows produced by
the query.
When you have validated the query, you can remove the VALIDATION_MODE to perform the
unload operation.
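A sketch of the validation step (stage and table names are assumed):

```sql
-- Preview the rows the unload would produce instead of writing files.
COPY INTO @my_stage/result/
FROM (SELECT * FROM orders)
VALIDATION_MODE = 'RETURN_ROWS';
```

Once the output looks right, remove VALIDATION_MODE and rerun to perform the actual unload.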
Question 116Skipped
Why should a user select the economy scaling policy for a multi-cluster warehouse?
Correct answer
Overall explanation
Economy
Conserves credits by favoring keeping running clusters fully-loaded rather than starting
additional clusters, which may result in queries being queued and taking longer to complete.
Question 117Skipped
Which system function can be used to manage access to the data in a share and display
certain data only to paying customers?
SYSTEM$ALLOWLIST
SYSTEM$ALLOWLIST_PRIVATELINK
Correct answer
SYSTEM$IS_LISTING_PURCHASED
SYSTEM$AUTHORIZE_PRIVATELINK
Overall explanation
If you choose to limit trial consumers to specific data and functionality, create a single share
for your paid listing and use secure views and a system function provided by Snowflake,
SYSTEM$IS_LISTING_PURCHASED, to control which data is visible to trial consumers and
which data is available only to paying consumers.
Question 118Skipped
A Snowflake account administrator has set the resource monitors as shown in the diagram,
with actions defined for each resource monitor as “Notify & Suspend Immediately”.
Correct answer
5000
3500
1500
Overall explanation
Warehouse 2 is controlled by the policy at the account level, and all five warehouses' usage counts toward this limit. Having limits on other warehouses (in this example: 3, 4, 5) just means that warehouses 3, 4, and 5 can be suspended earlier, when they reach their own limits.
The question asks for the MAXIMUM, so the best-case scenario for warehouse 2 is that the other warehouses don't consume any resources, in which case warehouse 2 can burn through the whole 5000 limit.
Question 119Skipped
PARSE_JSON
Correct answer
FLATTEN
GET_PATH
GET
Overall explanation
Question 120Skipped
Which query will return a sample of a table with 1000 rows named testtable, in which each
row has a 10% probability of being included in the sample?
Correct answer
Overall explanation
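The matching query, as a sketch:

```sql
-- Each row of testtable has a 10% probability of being included
-- (row-based / Bernoulli sampling).
SELECT * FROM testtable SAMPLE (10);
```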
Question 121Skipped
Referencing SnowSQL
Correct selection
Correct selection
Worksheet sharing
Overall explanation
Question 122Skipped
What is the default compression type when unloading data from Snowflake?
Brotli
bzip2
Zstandard
Correct answer
gzip
Overall explanation
By default, all unloaded data files are compressed using gzip, unless compression is explicitly
disabled or one of the other supported compression methods is explicitly specified.
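For example (stage and table names assumed):

```sql
-- Unloaded files are gzip-compressed by default.
COPY INTO @my_stage/unload/ FROM orders;

-- Compression can be disabled explicitly.
COPY INTO @my_stage/unload/ FROM orders
FILE_FORMAT = (TYPE = CSV COMPRESSION = NONE);
```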
Question 123Skipped
Blobs
Block storage
Correct answer
Micro-partitions
JSON
Overall explanation
Question 124Skipped
Which Snowflake objects track DML changes made to tables, like inserts, updates, and
deletes?
Tasks
Correct answer
Streams
Pipes
Procedures
Overall explanation
Question 125Skipped
Which view can be used to determine if a table has frequent row updates or deletes?
TABLES
STORAGE_DAILY_HISTORY
Correct answer
TABLE_STORAGE_METRICS
STORAGE_USAGE
Overall explanation
Question 126Skipped
A Snowflake user wants to share transactional data with retail suppliers. However, some of
the suppliers do not use Snowflake.
According to best practice, what should the Snowflake user do? (Choose two.)
Correct selection
Extract the shared transactional data to an external stage and use cloud storage utilities to
reload the suppliers' regions.
Correct selection
Use a data share for suppliers in the same cloud region and a replicated proxy share for
other cloud deployments.
Create an ETL pipeline that uses select and inserts statements from the source to the
target supplier accounts.
Overall explanation
Question 127Skipped
If a virtual warehouse runs for 30 seconds after it is provisioned, how many seconds will the
customer be billed for?
121 seconds
30 seconds
Correct answer
60 seconds
1 hour
Overall explanation
Question 128Skipped
What type of function can be used to estimate the approximate number of distinct values
from a table that has trillions of rows?
Window
External
Correct answer
HyperLogLog (HLL)
MD5
Overall explanation
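As a sketch (table and column names assumed), Snowflake exposes the HyperLogLog estimator through APPROX_COUNT_DISTINCT:

```sql
-- HyperLogLog-based cardinality estimate; far cheaper than
-- COUNT(DISTINCT ...) on tables with trillions of rows.
SELECT APPROX_COUNT_DISTINCT(user_id) FROM events;
```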
Question 129Skipped
Which table type is no longer available after the close of the session and therefore has no
Fail-safe or Time Travel recovery option?
Permanent
Transient
External
Correct answer
Temporary
Overall explanation
Question 130Skipped
Correct answer
Overall explanation
Question 131Skipped
To use the OVERWRITE option on INSERT, which privilege must be granted to the role?
TRUNCATE
Correct answer
DELETE
SELECT
UPDATE
Overall explanation
To use the OVERWRITE option on INSERT, you must use a role that has DELETE privilege on
the table because OVERWRITE will delete the existing records in the table.
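For example (table names assumed):

```sql
-- Requires DELETE privilege on the target, because the existing
-- rows are removed before the new rows are inserted.
INSERT OVERWRITE INTO orders_copy
SELECT * FROM orders WHERE order_date >= '2024-01-01';
```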
Question 132Skipped
What mechanisms can be used to inform Snowpipe that there are staged files available to
load into a Snowflake table? (Choose two.)
Snowsight interactions
Correct selection
REST endpoints
Email integrations
Correct selection
Cloud messaging
Error notifications
Overall explanation
Question 133Skipped
Which role can execute the SHOW ORGANIZATION ACCOUNTS command successfully?
ACCOUNTADMIN
USERADMIN
Correct answer
ORGADMIN
SECURITYADMIN
Overall explanation
Question 134Skipped
Network policy
SCIM
Correct answer
Overall explanation
• OAuth.
Network Policy in Snowflake are used to control the network traffic allowed to access a
Snowflake account. They can restrict access based on IP address ranges or other network-
related properties.
SCIM is used for managing user identities across different systems, including provisioning
and deprovisioning users in a centralized manner.
RBAC in Snowflake is a method for controlling access to resources based on the roles
assigned to users. It defines what actions users can take and what data they can access
within Snowflake.
Question 135Skipped
Correct answer
Overall explanation
If the source object is a database or schema, the clone inherits all granted privileges on the
clones of all child objects contained in the source object:
Note that the clone of the container itself (database or schema) does not inherit the
privileges granted on the source container.
Question 136Skipped
A user wants to upload a file to an internal Snowflake stage using a PUT command.
Which tools and/or connectors could be used to execute this command? (Choose two.)
Snowsight worksheets
Correct selection
SnowSQL
Correct selection
Python connector
SnowCD
SQL API
Overall explanation
The PUT command cannot be executed from a worksheet page in either Snowflake web interface; instead, use the SnowSQL client or drivers to upload data files, or check the documentation for a specific Snowflake client to verify support for this command.
Question 137Skipped
A Snowflake user has a query that is running for a long time. When the user opens the query
profiler, it indicates that a lot of data is spilling to disk.
The result cache is almost full and is unable to hold the results.
The cloud storage staging area is not sufficient to hold the data results.
Clustering has not been applied to the table so the table is not optimized.
Correct answer
The warehouse memory is not sufficient to hold the intermediate query results.
Overall explanation
Question 138Skipped
Snowflake Partner Connect is limited to users with a verified email address and which role?
USERADMIN
SECURITYADMIN
Correct answer
ACCOUNTADMIN
SYSADMIN
Overall explanation
Question 139Skipped
A Snowflake user has been granted the CREATE DATA EXCHANGE LISTING privilege with their
role.
Which tasks can this user now perform on the Data Exchange? (Choose two.)
Rename listings
Correct selection
Correct selection
Overall explanation
Question 140Skipped
Which semi-structured data formats can be loaded into Snowflake with a COPY command?
(Choose two.)
CSV
EDI
Correct selection
ORC
Correct selection
XML
HTML
Overall explanation
Question 141Skipped
A custom role owns multiple tables. If this role is dropped from the system, who becomes
the owner of these tables?
SYSADMIN
ACCOUNTADMIN
Correct answer
Overall explanation
Question 142Skipped
How does the Snowflake search optimization service improve query performance?
Correct answer
Overall explanation
Question 143Skipped
Correct selection
Table functions for historical and usage data across the Snowflake account
Table functions for account-level objects, such as roles, virtual warehouses, and databases
Correct selection
Views for historical and usage data across the Snowflake account
Overall explanation
Each database created in your account automatically includes a built-in, read-only schema
named INFORMATION_SCHEMA.
• Views for all the objects contained in the database, as well as views for account-level
objects (i.e. non-database objects such as roles, warehouses, and databases)
• Table functions for historical and usage data across your account.
Question 144Skipped
What is the MINIMUM size of a table for which Snowflake recommends considering adding a
clustering key?
Correct answer
1 Terabyte (TB)
1 Kilobyte (KB)
1 Gigabyte (GB)
1 Megabyte (MB)
Overall explanation
Question 145Skipped
Which Snowflake function is maintained separately from the data and helps to support
features such as Time Travel, Secure Data Sharing, and pruning?
Column compression
Data clustering
Correct answer
Metadata management
Micro-partitioning
Overall explanation
Metadata management in Snowflake is maintained separately from the actual data in Cloud
Service Layer.
Question 146Skipped
Which table function is used to view all errors encountered during a previous data load?
Correct answer
VALIDATE
QUERY_HISTORY
GENERATOR
INFER_SCHEMA
Overall explanation
VALIDATE
Validates the files loaded in a past execution of the COPY INTO <table> command and
returns all the errors encountered during the load, rather than just the first error.
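A minimal usage sketch (table name assumed):

```sql
-- Return all errors from the most recent COPY INTO execution
-- against this table ('_last' refers to the last load job).
SELECT * FROM TABLE(VALIDATE(my_table, JOB_ID => '_last'));
```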
Question 147Skipped
Which Data Definition Language (DDL) commands are supported by Snowflake to manage
tags? (Choose two.)
GRANT TAG
DESCRIBE TAG
Correct selection
DROP TAG
Correct selection
ALTER TAG
Overall explanation
1. CREATE TAG
2. ALTER TAG
3. SHOW TAGS
4. DROP TAG
5. UNDROP TAG
Question 148Skipped
What is the compressed size limit for semi-structured data loaded into a VARIANT data type
using the COPY command?
32 MB
8 MB
Correct answer
16 MB
64 MB
Overall explanation
Question 149Skipped
Correct answer
Overall explanation
Question 150Skipped
Which types of URLs are provided by Snowflake to access unstructured data files? (Choose
two).
Correct selection
Scoped URL
Correct selection
File URL
Dynamic URL
Relative URL
Absolute URL
Overall explanation
Question 151Skipped
Multiple data types, with only one condition, and one or more masking functions
Multiple data types, with one or more conditions, and one or more masking functions
A single data type, with only one condition, and only one masking function
Correct answer
A single data type, with one or more conditions, and one or more masking functions
Overall explanation
A masking policy consists of a single data type, one or more conditions, and one or more
masking functions.
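As a sketch (policy, role, table, and column names are assumed): one data type (STRING), with conditions and masking functions expressed in a CASE block.

```sql
-- Single data type, one or more conditions, one or more masking functions.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('ANALYST_FULL') THEN val  -- condition 1: reveal
    ELSE '*********'                                  -- condition 2: mask
  END;

ALTER TABLE users MODIFY COLUMN email SET MASKING POLICY email_mask;
```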
Question 152Skipped
A Snowflake user is writing a User-Defined Function (UDF) with some unqualified object
names.
Correct answer
Snowflake will only check the schema the UDF belongs to.
Snowflake will first check the current schema, and then the PUBLIC schema of the current
database.
Snowflake will first check the current schema, and then the schema the previous query
used.
Overall explanation
In queries, unqualified object names are resolved through a search path. The SEARCH_PATH is not used inside views or User-Defined Functions (UDFs). All unqualified objects in a view or UDF definition will be resolved in the view's or UDF's schema only.
Question 153Skipped
How many network policies can be assigned to an account or specific user at a time?
Correct answer
One
Three
Two
Unlimited
Overall explanation
Only a single network policy can be assigned to the account or a specific user at a time.
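A sketch of creating and activating a policy (the policy name and IP ranges are placeholders):

```sql
CREATE NETWORK POLICY corp_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Activating a policy; only one can be assigned to the account
-- (or to a specific user) at a time.
ALTER ACCOUNT SET NETWORK_POLICY = corp_policy;
```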
Question 154Skipped
Zero
Eight
Unlimited
Correct answer
One
Overall explanation
A resource monitor can be set to monitor multiple warehouses but a warehouse can be
assigned only to a single resource monitor.
Question 155Skipped
Performance optimization
Correct answer
Compliance auditing
Cost monitoring
Data backups
Overall explanation
Access History in Snowflake refers to when the user query reads data and when the SQL
statement performs a data write operation, such as INSERT, UPDATE, and DELETE along with
variations of the COPY command, from the source data object to the target data object. The
user access history can be found by querying the Account Usage ACCESS_HISTORY view. The
records in this view facilitate regulatory compliance auditing and provide insights on
popular and frequently accessed tables and columns because there is a direct link between
the user (i.e. query operator), the query, the table or view, the column, and the data.