
SET 1

Question 1 (Incorrect)

A role is created and owns 2 tables. This role is then dropped. Who will now own the two
tables?

Correct answer

The assumed role that dropped the role

SYSADMIN

The user that deleted the role

Your answer is incorrect

The tables are now orphaned

Overall explanation

See this link

Question 2 (Correct)

What information is included in the display in the Query Profile? (Choose two.)

Clustering keys details

Credit usage details

Index hints used in query

Your selection is correct

Details and statistics for the overall query

Your selection is correct

Graphical representation of the query processing plan

Overall explanation

See this link.

Question 3 (Correct)

Which of the following sentences is an example of scaling up for a Snowflake virtual warehouse?


Adding a new virtual warehouse of the same size.

Change from Economy Scaling Policy to Standard.

Your answer is correct

Changing the size of a warehouse from L to XL.

Request more data storage to our Cloud Provider.

Overall explanation

Scaling up adds more compute power to a warehouse by increasing its size. At some point scaling up further becomes impossible, because warehouses have a maximum size. Scaling out adds more warehouses to work in parallel. You can see it in the following image:
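The scale-up in the correct answer corresponds to a single statement; as a sketch (the warehouse name is a placeholder, not from the source):

```sql
-- Scaling up: resize an existing warehouse from L to XL
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'XLARGE';
```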

Question 4 (Skipped)

What can the Snowflake SCIM API be used to manage? (Choose two.)

Correct selection

Users

Network policies

Correct selection

Roles

Integrations

Session policies

Overall explanation


Snowflake is compatible with SCIM 2.0, an open standard for automating user provisioning. The SCIM API allows us to programmatically manage roles and users within the Snowflake platform, making it easier to automate identity and access management tasks.

See this link.

Question 5 (Correct)

Which of the following terms best describes Snowflake's database architecture?

Shared disk

Columnar shared nothing

Cloud-native shared memory

Your answer is correct

Multi-cluster, shared data

Overall explanation

Built from the ground up for the cloud, Snowflake’s unique multi-cluster shared data
architecture delivers the performance, scale, elasticity, and concurrency today’s
organizations require.

See this link

Question 6 (Skipped)

What is a feature of column-level security in Snowflake?

Role access policies

Network policies

Correct answer

External tokenization

Internal tokenization

Overall explanation

Currently, Column-level Security includes two features:

• Dynamic Data Masking

• External Tokenization

See this link.

Question 7 (Skipped)

A developer is granted ownership of a table that has a masking policy. The developer’s role is
not able to see the masked data.

Will the developer be able to modify the table to read the masked data?

No, because masking policies must always reference specific access roles.

Yes, because masking policies only apply to cloned tables.

Yes, because a table owner has full control and can unset masking policies.

Correct answer

No, because ownership of a table does not include the ability to change masking policies.

Overall explanation

Object owners (i.e. the role that has the OWNERSHIP privilege on the object) do not have
the privilege to unset masking policies.

Object owners cannot view column data in which a masking policy applies.

See this link.

Question 8 (Incorrect)

Which statistic displayed in a Query Profile is specific to external functions?

Correct answer

Total invocations

Partitions scanned

Your answer is incorrect

Bytes sent over the network

Bytes written

Overall explanation

Total invocations — number of times that an external function was called. (This can be
different from the number of external function calls in the text of the SQL statement due to
the number of batches that rows are divided into, the number of retries (if there are
transient network problems), etc.)

See this link

Question 9 (Incorrect)


Which of the following commands are not blocking operations? (Choose two.)

Your selection is incorrect

DELETE

UPDATE

Your selection is correct

INSERT

MERGE

Correct selection

COPY

Overall explanation

The following guidelines apply in most situations:

COMMIT operations (including both AUTOCOMMIT and explicit COMMIT) lock resources,
but usually only briefly. UPDATE, DELETE, and MERGE statements hold locks that generally
prevent them from running in parallel with other UPDATE, DELETE, and MERGE statements.
Most INSERT and COPY statements write only new partitions. Those statements often can
run in parallel with other INSERT and COPY operations, and sometimes can run in parallel
with an UPDATE, DELETE, or MERGE statement.

Question 10 (Skipped)

What is the Snowflake recommended Parquet file size when querying from external tables to
optimize the number of parallel scanning operations?

Correct answer

256-512 MB

100-250 MB

16-128 MB

1-16 MB

Overall explanation

Parquet Files, Recommended Size Range: 256-512MB

See this link.


Do not confuse this question with the size recommendation for COPY operations in
Snowflake (100MB-250MB).

Question 11 (Incorrect)

What activities can a user with the ORGADMIN role perform? (Choose two.)

Correct selection

Create an account for an organization.

Delete the account data for an organization.

Your selection is correct

View usage information for all accounts in an organization.

Your selection is incorrect

Edit the account data for an organization.

Select all the data in tables for all accounts in an organization.

Overall explanation

A user with the ORGADMIN role can perform the following actions:

• Creating an Account.

• View/show all accounts in the Organization.

• Viewing a List of Regions Available for an Organization.

• View usage information for all accounts in the organization.

• Enable database replication for an account in the organization.

Note: Once an account is created, ORGADMIN can view the account properties but does not
have access to the account data.

See this link.

Question 12 (Skipped)

Which of the below APIs are NOT Snowpipe REST APIs? (Choose two.)

insertFiles

insertReport

Correct selection

loadFiles

loadHistoryScan


Correct selection

insertHistoryScan

Overall explanation

You can make calls to REST endpoints to get information. For example, by calling the
following insertReport endpoint, you can get a report of files submitted via insertFiles:

GET https://<account_id>.snowflakecomputing.com/v1/data/pipes/<pipeName>/insertReport

See this link.

Question 13 (Incorrect)

What is the MINIMUM role required to set the value for the parameter
ENABLE_ACCOUNT_DATABASE_REPLICATION?

Your answer is incorrect

ACCOUNTADMIN

SECURITYADMIN

Correct answer

ORGADMIN

SYSADMIN

Overall explanation

To enable replication for accounts, a user with the ORGADMIN role uses the
SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER function to set the
ENABLE_ACCOUNT_DATABASE_REPLICATION parameter to true.

See this link.
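The call described above looks like the following sketch (run with the ORGADMIN role active; the account name is a placeholder):

```sql
-- Enable account database replication for an account in the organization
SELECT SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER(
  'my_account',
  'ENABLE_ACCOUNT_DATABASE_REPLICATION',
  'true');
```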

Question 14 (Incorrect)

Which Snowflake feature allows a user to track sensitive data for compliance, discovery,
protection, and resource usage?

Internal tokenization

Your answer is incorrect

Row access policies

Correct answer


Tags

Comments

Overall explanation

Tags enable data stewards to monitor sensitive data for compliance, discovery, protection,
and resource usage use cases through either a centralized or decentralized data governance
management approach.

See this link.
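As a minimal sketch of tagging a sensitive column (the table, column, and tag names are illustrative, not from the source):

```sql
-- Create a tag and attach it to a column for governance tracking
CREATE TAG pii_type COMMENT = 'Sensitivity classification';
ALTER TABLE customers MODIFY COLUMN email SET TAG pii_type = 'EMAIL';
```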

Question 15 (Incorrect)

What is used to denote a pre-computed data set derived from a SELECT query specification
and stored for later use?

Correct answer

Materialized view

Secure view

Your answer is incorrect

View

External table

Overall explanation

See this link.

Question 16 (Correct)

What property from the Resource Monitors lets you specify whether you want to control the
credit usage of your entire account or a specific set of warehouses?

Your answer is correct

Monitor Level.

Credit Quota.

Notification.

Schedule.

Overall explanation

The monitor level is a property that specifies whether the resource monitor is used to
monitor the credit usage for your entire account or individual warehouses.
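Both monitor levels can be sketched as follows (quotas and object names are illustrative assumptions):

```sql
-- Account-level monitor: controls credit usage for the whole account
CREATE RESOURCE MONITOR acct_rm WITH CREDIT_QUOTA = 1000
  TRIGGERS ON 100 PERCENT DO SUSPEND;
ALTER ACCOUNT SET RESOURCE_MONITOR = acct_rm;

-- Warehouse-level monitor: controls a specific warehouse
CREATE RESOURCE MONITOR wh_rm WITH CREDIT_QUOTA = 100
  TRIGGERS ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE my_wh SET RESOURCE_MONITOR = wh_rm;
```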

Question 17 (Incorrect)

A virtual warehouse initially suffers from poor performance because queries from multiple concurrent processes are queuing. Over time, the problem resolves itself.

What action can be taken to prevent this from happening again?

Correct answer

Change the multi-cluster settings to add additional clusters.

Enable the search optimization service for the underlying tables.

Your answer is incorrect

Add a cluster key to the most used JOIN key.


Increase the size of the virtual warehouse.

Overall explanation

Multi-cluster warehouses are designed specifically for handling queuing and performance
issues related to large numbers of concurrent users and/or queries. In addition, multi-cluster
warehouses can help automate this process if your number of users/queries tend to
fluctuate

See this link.

Question 18 (Skipped)

What columns are returned when performing a FLATTEN command on semi-structured data? (Choose two.)

NODE

Correct selection

VALUE

Correct selection

KEY

LEVEL

ROOT

Overall explanation

See this link.
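A minimal sketch of FLATTEN, assuming a table with a VARIANT column named V (names are placeholders): KEY and VALUE are among the output columns (alongside SEQ, PATH, INDEX, and THIS).

```sql
-- Expand each key/value pair of a JSON object into its own row
SELECT f.key, f.value
FROM my_json_table t,
     LATERAL FLATTEN(input => t.v) f;
```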

Question 19 (Incorrect)

Which file formats support unloading semi-structured data? (Choose two.)

Your selection is correct

JSON

ORC

Your selection is incorrect

XML

Avro

Correct selection

Parquet


Overall explanation

The following file formats are supported: Semi-structured JSON, Parquet

See this link.

Not all semi-structured formats supported for data upload are supported for data unload.

Question 20 (Correct)

By default, which Snowflake role is required to create a share?

ORGADMIN

SHAREADMIN

Your answer is correct

ACCOUNTADMIN

SECURITYADMIN

Overall explanation

CREATE SHARE: Account :Only the ACCOUNTADMIN role has this privilege by default. The
privilege can be granted to additional roles as needed.

See this link.
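Creating and populating a share can be sketched like this (run as ACCOUNTADMIN, or a role granted the CREATE SHARE privilege; all object and account names are placeholders):

```sql
CREATE SHARE sales_share;
GRANT USAGE  ON DATABASE sales_db               TO SHARE sales_share;
GRANT USAGE  ON SCHEMA   sales_db.public        TO SHARE sales_share;
GRANT SELECT ON TABLE    sales_db.public.orders TO SHARE sales_share;
ALTER SHARE sales_share ADD ACCOUNTS = consumer_acct;
```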

Question 21 (Incorrect)

Which loop type iterates until a condition is true?

Correct answer

REPEAT

Your answer is incorrect

FOR

LOOP

WHILE

Overall explanation

A REPEAT loop iterates until a condition is true. In a REPEAT loop, the condition is tested
immediately after executing the body of the loop. As a result, the body of the loop always
executes at least once.


A WHILE loop iterates while a condition is true. In a WHILE loop, the condition is tested
immediately before executing the body of the loop. If the condition is false before the first
iteration, then the body of the loop does not execute even once.

See this link.
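A Snowflake Scripting sketch of the REPEAT loop described above; because the condition is tested after the body, the body runs at least once:

```sql
EXECUTE IMMEDIATE $$
DECLARE
  counter INTEGER DEFAULT 0;
BEGIN
  REPEAT
    counter := counter + 1;
  UNTIL (counter >= 3)
  END REPEAT;
  RETURN counter;  -- the body executed before the condition was first tested
END;
$$;
```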

Question 22 (Incorrect)

What types of views are available in Snowflake? (Choose three.)

Your selection is incorrect

Table View

Correct selection

Regular

Your selection is correct

Materialized View

Your selection is correct

Secure View

External View

Private View

Overall explanation

You can see the differences between them in the following image:

Question 23 (Incorrect)

Which Snowflake partner specializes in data catalog solutions?

Your answer is incorrect

Tableau

DataRobot

dbt


Correct answer

Alation

Overall explanation

See this link.

Question 24 (Correct)

Which cloud provider is not supported by Snowflake?

Your answer is correct

IBM.

Google Cloud Platform.

AWS.

Azure.

Overall explanation

A Snowflake account can only be hosted on Amazon Web Services, Google Cloud Platform, or Microsoft Azure for now.

Question 25 (Correct)

What is the storage hierarchy in Snowflake?

Account → Databases → Objects → Schemas.

Account → Databases → Warehouses → Objects.

Your answer is correct

Account → Databases → Schemas → Objects.

Account → Schemas → Databases → Objects.

Overall explanation

The top-most container is the customer organization. All databases for your Snowflake
account are contained in the account object. Securable objects such as tables, views, stages,
and UDFs are contained in a schema object, which is, in turn, contained in a database. You
can see the complete Snowflake hierarchy in the following image (via docs.snowflake.com):

Question 26 (Incorrect)

What strategies can be used to optimize the performance of a virtual warehouse? (Choose
two.)

Suspend the warehouse frequently.

Your selection is correct

Increase the warehouse size.

Your selection is incorrect


Increase the MAX_CONCURRENCY_LEVEL parameter.

Correct selection

Reduce queuing.

Allow memory spillage.

Overall explanation

Optimizing Warehouses for Performance:

• Reduce queues (if the warehouse is not multi-cluster, make it multi-cluster; otherwise add another warehouse)

• Resolve memory spillage (scale up, or convert it to a Snowpark-optimized warehouse)

• Increase warehouse size (scale up)

• Try Query Acceleration (set ENABLE_QUERY_ACCELERATION = TRUE)

• Optimize the warehouse cache (by increasing auto-suspension)

• Limit concurrently running queries (decrease MAX_CONCURRENCY_LEVEL)

See this link.

Question 27 (Correct)

How often does Snowpipe load the data?

When we manually execute the COPY procedure.

Once every 5 minutes.

Once every 1 minute.

Your answer is correct

As soon as they are available in a stage.

Overall explanation

Snowpipe enables loading data as soon as files are available in any (internal or external) stage. We use it when we have a small volume of frequent data and want to load it continuously (micro-batches).

Question 28 (Incorrect)

Account-level storage usage can be monitored via:

The Information Schema -> ACCOUNT_USAGE_HISTORY View

Correct answer

The Snowflake Web Interface (UI) in the Account -> Billing & Usage section

The Snowflake Web Interface (UI) in the Databases section

Your answer is incorrect

The Account Usage Schema -> ACCOUNT_USAGE_METRICS View

Overall explanation

See this link

Question 29 (Correct)

What is the MOST performant file format for loading data in Snowflake?

Parquet

CSV (Unzipped)

Your answer is correct

CSV (Gzipped)

ORC

Overall explanation

Loading from gzipped CSV is several times faster than loading from ORC or Parquet, at an impressive 15 TB/hour. While 5-6 TB/hour is decent if your data is originally in ORC or Parquet, don't go out of your way to create ORC or Parquet files from CSV in the hope that they will load into Snowflake faster.


Loading data into fully structured (columnarized) schema is ~10-20% faster than landing it
into a VARIANT.

See this link.

Question 30 (Incorrect)

Which Snowflake object stores a generated identity and access management (IAM) entity for
your external cloud storage, along with an optional set of allowed or blocked storage
locations (Amazon S3, Google Cloud Storage, or Microsoft Azure)?

Correct answer

Storage Integration.

Storage Schema.

Your answer is incorrect

Security Integration.

User Stage.

Overall explanation

A storage integration is a Snowflake object that stores a generated identity and access
management (IAM) entity for your external cloud storage. This option will enable users to
avoid supplying credentials when creating stages or when loading or unloading data.

Question 31 (Incorrect)

When should you consider disabling auto-suspend for a Virtual Warehouse? (Choose two.)

Correct selection

When the compute must be available with no delay or lag time

When you want to avoid queuing and handle concurrency

Correct selection

When managing a steady workload

Your selection is incorrect


When users will be using compute at different times throughout a 24/7 period

Your selection is incorrect

When you do not want to have to manually turn on the Warehouse each time a user
needs it

Overall explanation

See this link

Question 32 (Correct)

Which Snowflake feature records changes made to a table so actions can be taken using that
change data capture?

Task

Materialized View

Pipe

Your answer is correct

Stream

Overall explanation

Note that a stream itself does not contain any table data. A stream only stores an offset for
the source object and returns CDC records by leveraging the versioning history for the
source object.

See this link.
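A minimal stream sketch (table and stream names are placeholders):

```sql
CREATE STREAM orders_stream ON TABLE orders;

-- After DML on ORDERS, querying the stream returns the changed rows plus
-- CDC metadata columns (METADATA$ACTION, METADATA$ISUPDATE, METADATA$ROW_ID)
SELECT * FROM orders_stream;
```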

Question 33 (Incorrect)

How can a Snowflake user configure a virtual warehouse to support over 100 users if their
company has Enterprise Edition?

Use a larger warehouse.

Correct answer

Use a multi-cluster warehouse.

Add additional warehouses and configure them as a cluster.

Your answer is incorrect

Set the auto-scale to 100.

Overall explanation


Multi-cluster warehouse is the solution for simultaneous querying.

See this link.

Question 34 (Correct)

How can a user improve the performance of a single large complex query in Snowflake?

Scale out the virtual warehouse.

Enable economy warehouse scaling.

Enable standard warehouse scaling.

Your answer is correct

Scale up the virtual warehouse.

Overall explanation

Resizing a warehouse generally improves query performance, particularly for larger, more
complex queries. For query complexity Scale Up , for concurrency and query load tuning
scale out.

See this link.

Question 35 (Correct)

A query that ran a couple of hours ago and took more than 5 minutes to complete is executed again, and it returns the results in less than a second. What might have happened?

Your answer is correct

Snowflake used the persisted query results from the query result cache.

Snowflake used the persisted query results from the metadata cache.

A new Snowflake version has been released in the last two hours, improving the speed of
the service.

Snowflake used the persisted query results from the warehouse cache.

Overall explanation

The query result cache stores the results of our queries for 24 hours, so as long as we
perform the same query and the data hasn’t changed in the storage layer, it will return the
same result without using the warehouse and without consuming credits.

Question 36 (Incorrect)


How should a virtual warehouse be configured if a user wants to ensure that additional clusters are resumed with the shortest delay possible?

Use the economy warehouse scaling policy

Configure the warehouse to a size larger than generally required

Correct answer

Use the standard warehouse scaling policy

Your answer is incorrect

Set the minimum and maximum clusters to autoscale

Overall explanation

The Economy scaling policy does not start additional clusters immediately: a cluster starts only if the system estimates there is enough query load to keep it busy for at least 6 minutes.

See this link.

Question 37 (Correct)

If a Small Warehouse is made up of 2 servers/cluster, how many servers/cluster make up a Medium Warehouse?

Your answer is correct

16

128


32

Overall explanation

See this link

Question 38 (Correct)

Which of the following objects is not covered by Time Travel?

Tables

Schemas

Your answer is correct

Stages

Databases

Overall explanation

See this link

Question 39 (Correct)

Which of the following roles are NOT System-Defined Roles in Snowflake? (Choose two.)

SECURITYADMIN

Your selection is correct

STORAGEADMIN

Your selection is correct

VIEWER

SYSADMIN

USERADMIN

Overall explanation

The PUBLIC role is also a System-Defined role. You can see the differences between them in
the following table:

Question 40 (Incorrect)

When a Pipe is recreated using the CREATE OR REPLACE PIPE command:

Your answer is incorrect

Previously loaded files will be purged

The REFRESH parameter is set to TRUE

Previously loaded files will be ignored

Correct answer

The Pipe load history is reset to empty

Overall explanation

REFRESH is a parameter for ALTER PIPE. See this link.

It is NOT a parameter for CREATE [OR REPLACE] Pipe. See this link.

Further recreating a pipe resets history: See this link.

Load History


The load history for Snowpipe operations is stored in the metadata of the pipe object. When
a pipe is recreated, the load history is dropped. In general, this condition only affects users if
they subsequently execute an ALTER PIPE … REFRESH statement on the pipe. Doing so could
load duplicate data from staged files in the storage location for the pipe if the data was
already loaded successfully and the files were not deleted subsequently.
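The scenario above can be sketched as follows (pipe, table, and stage names are placeholders):

```sql
-- Recreating the pipe resets its load history to empty
CREATE OR REPLACE PIPE my_pipe AS
  COPY INTO my_table FROM @my_stage;

-- Refreshing the recreated pipe against already-loaded staged files
-- can therefore ingest duplicates
ALTER PIPE my_pipe REFRESH;
```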

Question 41 (Correct)

How is table data compressed in Snowflake?

The micro-partitions are stored in compressed cloud storage and the cloud storage handles
compression.

Each micro-partition is compressed as it is written into cloud storage using GZIP.

Your answer is correct

Each column is compressed as it is stored in a micro-partition.

The text data in a micro-partition is compressed with GZIP but other types are not
compressed.

Overall explanation

Snowflake automatically determines the most efficient compression algorithm for the
columns in each micro-partition.

See this link.

Question 42 (Correct)

Which of the following statements is true of zero-copy cloning?

Zero-copy clones increase storage costs as cloning the table requires storing its data twice

Your answer is correct

At the instant a clone is created, all micro-partitions in the original table and the clone are fully shared

Zero-copy cloning is licensed as an additional Snowflake feature

All zero-copy clone objects inherit the privileges of their original objects

Overall explanation

Using zero-copy cloning, you can create a snapshot of any table, schema, or database. The cloned object is independent and can be modified without modifying the original. It does NOT duplicate the data; it duplicates the metadata of the micro-partitions, so the clone initially consumes no additional storage.


See this link
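As a one-line sketch (table names are placeholders):

```sql
-- Metadata-only snapshot: both tables initially share the same micro-partitions;
-- new storage is consumed only as the tables diverge through later DML
CREATE TABLE orders_clone CLONE orders;
```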

Question 43 (Correct)

What is the recommended Snowflake data type to store semi-structured data like JSON?

LOB

Your answer is correct

VARIANT

RAW

VARCHAR

Overall explanation

Semi-structured data is saved as Variant type in Snowflake tables, with a maximum limit size
of 16MB, and it can be queried using JSON notation. You can store arrays, objects, etc.

Reference:

See this link.
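A minimal VARIANT sketch (table name and JSON payload are illustrative):

```sql
CREATE TABLE events (v VARIANT);

INSERT INTO events
  SELECT PARSE_JSON('{"user": {"name": "Ana"}, "items": [1, 2]}');

-- Traverse with :, dot, and [] notation, then cast to a target type
SELECT v:user.name::STRING AS user_name,
       v:items[0]::INT     AS first_item
FROM events;
```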

Question 44 (Correct)

Which item in the Data Warehouse migration process does not apply in Snowflake?

Build the Data Pipeline

Your answer is correct

Migrate Indexes


Migrate Users

Migrate Schemas

Overall explanation

Snowflake does not use indexes.

Question 45 (Skipped)

A Snowflake user needs to import a JSON file larger than 16 MB.

What file format option could be used?

trim_space = true

compression = auto

Correct answer

strip_outer_array = true

strip_outer_array = false

Overall explanation

strip_outer_array = true removes the outer array structure and copies the file into multiple table rows instead of a single row (a single row is limited to a maximum of 16 MB), so the load succeeds.

See this link.
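As a sketch (table, stage, and file names are placeholders):

```sql
-- Each element of the outer JSON array becomes its own row
COPY INTO my_table
FROM @my_stage/big_file.json
FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE);
```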

Question 46 (Correct)

In which layer of the Snowflake architecture is all security-related information stored?

Storage.

Your answer is correct

Cloud Services.

Compute.

All of the above.

Overall explanation

The Cloud Services layer is a collection of services coordinating activities across Snowflake.
It's in charge of Authentication, Infrastructure management, Metadata management, Query
parsing and optimization, and Access control.

Question 47 (Incorrect)

Snowflake recommends, as a minimum, that all users with the following role(s) should be
enrolled in Multi-Factor Authentication (MFA):

SECURITYADMIN, ACCOUNTADMIN

Correct answer

ACCOUNTADMIN

Your answer is incorrect

SECURITYADMIN, ACCOUNTADMIN, PUBLIC, SYSADMIN

SECURITYADMIN, ACCOUNTADMIN, SYSADMIN

Overall explanation

See this link

Question 48 (Correct)

To run a Multi-Cluster Warehouse in auto-scale mode, a user would:

Your answer is correct

Set the Minimum Clusters and Maximum Clusters settings to the different values

Set the Minimum Clusters and Maximum Clusters settings to the same value

Set the Warehouse type to Auto

Configure the Maximum Clusters setting to Auto-Scale

Overall explanation

If you set the minimum cluster count less than the maximum cluster count, then the
warehouse runs in Auto-scale mode.

See this link
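A creation sketch (the name, size, and counts are illustrative; multi-cluster warehouses require Enterprise Edition or higher):

```sql
-- MIN_CLUSTER_COUNT < MAX_CLUSTER_COUNT puts the warehouse in Auto-scale mode
CREATE WAREHOUSE my_wh WITH
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4;
```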

Question 49 (Incorrect)

What is the recommended approach for unloading data to a cloud storage location from
Snowflake?

Use a third-party tool to unload the data to cloud storage.

Correct answer

Unload the data directly to the cloud storage location.

Your answer is incorrect


Unload the data to a user stage, then upload the data to cloud storage.

Unload the data to a local file system, then upload it to cloud storage.

Overall explanation

The best approach is to use the COPY INTO <location> command to copy the data from the
Snowflake database table into one or more files in a Snowflake or external stage.

After that you will be able to download the file from the stage to a local file system (not
before)

See this link.
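The direct-unload approach can be sketched as follows (the external stage, path, and table names are placeholders):

```sql
-- Unload straight to an external stage that points at cloud storage
COPY INTO @my_ext_stage/exports/orders_
FROM orders
FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP');
```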

Question 50 (Correct)

What technique does Snowflake use to limit the number of micro-partitions retrieved as
part of a query?

Computing.

Clustering.

Your answer is correct

Pruning.

Indexing.

Overall explanation

Query pruning consists of analyzing the smallest number of micro-partitions needed to solve a query. This technique retrieves all the data necessary to produce the result without scanning every micro-partition, saving a lot of time.

Question 51 (Skipped)

What are characteristics of directory tables when used with unstructured data? (Choose
two.)

Correct selection

A directory table can be added explicitly to a stage when the stage is created.

Only cloud storage stages support directory tables.

Correct selection

Directory tables store a catalog of staged files in cloud storage.

A directory table is a separate database object that can be layered explicitly on a stage.

Each directory table has grantable privileges of its own.


Overall explanation

A directory table is an implicit object layered on a stage (not a separate database object) and
is conceptually similar to an external table because it stores file-level metadata about the
data files in the stage. A directory table has no grantable privileges of its own.

Both external (external cloud storage) and internal (Snowflake) stages support directory
tables

See this link.

Question 52 (Skipped)

A Snowflake user runs a query for 36 seconds on a size 2XL virtual warehouse.

What would be the credit consumption?

Snowflake will charge for 60 seconds at the rate of 64 credits per hour.

Correct answer

Snowflake will charge for 60 seconds at the rate of 32 credits per hour.

Snowflake will charge for 36 seconds at the rate of 32 credits per hour.

Snowflake will charge for 36 seconds at the rate of 64 credits per hour.

Overall explanation

Credits are billed per-second, with a 60-second (i.e. 1-minute) minimum.

See this link.

Question 53 (Skipped)

Which command can be used to delete staged files from a Snowflake stage when the files
are no longer needed?

Correct answer

REMOVE

TRUNCATE TABLE


DROP

DELETE

Overall explanation

REMOVE

Removes files from either an external (external cloud storage) or internal (i.e. Snowflake)
stage.

See this link.
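As a sketch (stage name and pattern are illustrative):

```sql
REMOVE @my_stage/path/;                          -- everything under the path
REMOVE @my_stage/path/ PATTERN = '.*\.csv\.gz';  -- only files matching a regex
```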

Question 54 (Skipped)

If auto-suspend is enabled for a Virtual Warehouse, the Warehouse is automatically suspended when:

There are no users logged into Snowflake.

The last query using the Warehouse completes.

Correct answer

The Warehouse is inactive for a specified period of time.

All Snowflake sessions using the Warehouse are terminated.

Overall explanation

See this link

Question 55 (Skipped)

A medium (M) warehouse has auto-suspend configured after 15 minutes. You have noticed
that all of the queries that run on this warehouse finish within a minute. What will you do to
optimize compute costs?

Disable the auto-suspend option.

Delete the warehouse after a minute.

Use another data-warehouse.

Correct answer

Reduce the auto-suspend time to 1 minute.

Overall explanation

By reducing the minutes of the "auto-suspend" option, the warehouse will automatically go
to sleep after 60 seconds of inactivity, significantly reducing credit consumption.
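As a one-line sketch (the warehouse name is a placeholder):

```sql
-- AUTO_SUSPEND is specified in seconds
ALTER WAREHOUSE my_wh SET AUTO_SUSPEND = 60;
```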

Question 56 (Skipped)

What type of function returns one value for each invocation?

Table

Aggregate

Correct answer

Scalar

Window

Overall explanation

A scalar function is a function that returns one value per invocation; in most cases, you can
think of this as returning one value per row. This contrasts with Aggregate Functions, which
return one value per group of rows.

See this link.

Question 57 (Skipped)

What is the purpose of the Snowflake SPLIT_TO_TABLE function?

To count the number of characters in a string

To split a string and flatten the results into columns

Correct answer

To split a string and flatten the results into rows

To split a string into an array of sub-strings

Overall explanation

SPLIT_TO_TABLE

This table function splits a string (based on a specified delimiter) and flattens the results into
rows.

See this link.
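A minimal sketch: this returns one row per sub-string ('a', 'b', 'c'), with SEQ, INDEX, and VALUE output columns.

```sql
-- Split a delimited string and flatten the pieces into rows
SELECT t.index, t.value
FROM TABLE(SPLIT_TO_TABLE('a,b,c', ',')) AS t;
```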

Question 58 (Skipped)

A Snowflake query took 40 minutes to run. The results indicate that ‘Bytes spilled to local
storage’ was a large number.

What is the issue and how can it be resolved?

Correct answer


The warehouse is too small. Increase the size of the warehouse to reduce the spillage.

The Snowflake console has timed-out. Contact Snowflake Support.

The warehouse is too large. Decrease the size of the warehouse to reduce the spillage.

The warehouse consists of a single cluster. Use a multi-cluster warehouse to reduce the
spillage.

Overall explanation

Warehouse size should be increased (scale up). A multi-cluster warehouse would only help with concurrency, not with spillage.

See this link.

Question 59 (Skipped)

Which statement describes how Snowflake supports reader accounts?

A consumer needs to become a licensed Snowflake customer as data sharing is only supported between Snowflake accounts.

The users in a reader account can query data that has been shared with the reader
account and can perform DML tasks.

Correct answer

The SHOW MANAGED ACCOUNTS command will view all the reader accounts that have
been created for an account.

A reader account can consume data from the provider account that created it and combine
it with its own data.

Overall explanation

SHOW MANAGED ACCOUNTS

Lists the managed accounts created for your account. Currently used by data providers to
create reader accounts for their consumers.

See this link.

To view all the reader accounts that have been created for your account, use the SHOW
MANAGED ACCOUNTS command.

See this link.

Regarding the option "A reader account can consume data from the provider account that created it and combine it with its own data":


Reader accounts (formerly known as “read-only accounts”) enable providers to share data
with consumers who are not already Snowflake customers, without requiring the consumers
to become Snowflake customers.

A reader account is intended primarily for querying data shared by the provider of the
account. You can work with data, for example, by creating materialized views.

You cannot perform the following tasks in a reader account:

• Set a data metric function on objects in the reader account.

• Upload new data.

• Modify existing data.

• Unload data using a storage integration. However, you can use the COPY INTO
<location> command with your connection credentials to unload data into a cloud
storage location.

See this link.
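A provider-side sketch (account name and credentials below are hypothetical) of creating a reader account and then listing it:

```sql
-- Create a reader account for a consumer who is not a Snowflake customer.
CREATE MANAGED ACCOUNT reader_acct1
  ADMIN_NAME = 'reader_admin',
  ADMIN_PASSWORD = 'ChangeMe123!',
  TYPE = READER;

-- List all reader (managed) accounts created for this account.
SHOW MANAGED ACCOUNTS;
```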

Question 60Skipped

When can a Virtual Warehouse start running queries?

12am-5am

After replication

Correct answer

When its provisioning is complete

Only during administrator defined time slots

Overall explanation

Virtual warehouses can be configured with AUTO_RESUME = TRUE/FALSE; once a warehouse has finished provisioning (and is resumed), it starts executing queries.

See this link
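A minimal sketch of the relevant settings at creation time (my_wh is a hypothetical name):

```sql
CREATE WAREHOUSE my_wh
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 60      -- suspend after 60 seconds of inactivity
  AUTO_RESUME = TRUE;    -- resume automatically when a query is submitted
```

With AUTO_RESUME = TRUE, the warehouse begins running queries as soon as provisioning completes.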

Question 61Skipped

Which of the following sizes is not a Warehouse Size?

XS


Correct answer

XXS

Overall explanation

The minimum configuration for a Snowflake Warehouse is X-Small (XS), which consumes one
credit/hour. You can see the different sizes in the following image:

Question 62Skipped

What authentication method does the Kafka connector use within Snowflake?

Username and password

Correct answer

Key pair authentication

Multi-Factor Authentication (MFA)


OAuth

Overall explanation

See this link.

Question 63Skipped

What consideration should be made when loading data into Snowflake?

Create small data files and stage them in cloud storage frequently.

Create small data files to optimize data loading.

The number of load operations that run in parallel can exceed the number of data files to
be loaded.

Correct answer

The number of data files that are processed in parallel is determined by the virtual
warehouse.

Overall explanation

One of the most typical practices of data ingestion.

See this link.

Question 64Skipped

Which Snowflake object can be used to record DML changes made to a table?

Task

Snowpipe

Correct answer

Stream

Stage

Overall explanation

Streams are used for CDC purposes.

See this link.
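A minimal sketch of using a stream for CDC (table and stream names are hypothetical):

```sql
-- The stream records inserts, updates, and deletes made to the table.
CREATE STREAM my_table_stream ON TABLE my_table;

-- Read the pending change records; METADATA$ACTION shows the DML type.
-- The stream offset only advances when the stream is consumed in a DML statement.
SELECT * FROM my_table_stream;
```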

Question 65Skipped

Which Snowflake edition allows only one day of Time Travel?

Enterprise.

No version allows only one day of Time Travel.


Business Critical.

Correct answer

Standard.

Overall explanation

We can increase the Time Travel functionality to 90 days if we have (at least) the Snowflake
Enterprise Edition.

Question 66Skipped

What feature of Snowflake Continuous Data Protection can be used for maintenance of
historical data?

Network policies

Access control

Fail-safe

Correct answer

Time Travel

Overall explanation

Snowflake Time Travel enables accessing historical data that has been changed or deleted at
any point within a defined period. It is a powerful CDP (Continuous Data Protection) feature
which ensures the maintenance and availability of your historical data.

See this link.

Question 67Skipped

What are common issues found by using the Query Profile? (Choose two.)

Correct selection

Data spilling to a local or remote disk


Correct selection

Identifying inefficient micro-partition pruning

Identifying logical issues with the queries

Identifying queries that will likely run very slowly before executing them

Locating queries that consume a high amount of credits

Overall explanation

See this link.

Question 68Skipped

A deterministic query is run at 8am, takes 5 minutes, and the results are cached. Which of
the following statements are true? (Choose two.)

Snowflake edition is Enterprise or higher

Correct selection

The same exact query will return the precomputed results if the underlying data hasn't
changed and the results were last accessed within previous 24 hour period

The same exact query will return the precomputed results even if the underlying data has
changed as long as the results were last accessed within the previous 24 hour period

Correct selection

The 24 hours timer on the precomputed results gets renewed every time the exact query is
executed

The exact query will ALWAYS return the precomputed result set for the
RESULT_CACHE_ACTIVE = time period

Overall explanation

See this link.

Question 69Skipped

You have two virtual warehouses in your Snowflake account. If one of them updates the data
in the storage layer, when will the other one see it?

Once all the compute resources are provisioned for the second warehouse.

Correct answer

Immediately.

After an average time of 5 seconds.


After the sync process.

Overall explanation

All the warehouses of your account share the storage layer, so if the data is updated, all the
warehouses will be able to see it. You can see this behavior in the following image:

Question 70Skipped

Which of the following Snowflake features provide continuous data protection automatically? (Choose two.)

Zero-copy clones

Internal stages

Correct selection

Fail-safe

Incremental backups

Correct selection

Time Travel

Overall explanation

See this link.

Question 71Skipped

Which pages are included in the Activity area of Snowsight? (Choose two.)


Sharing settings

Correct selection

Copy History

Automatic Clustering History

Contacts

Correct selection

Query History

Overall explanation

See this link.

Question 72Skipped

Which statement best describes Snowflake tables?

Snowflake tables are owned by a user

Correct answer

Snowflake tables are logical representations of underlying physical data

Snowflake tables are the physical instantiation of data loaded into Snowflake

Snowflake tables require that clustering keys be defined to perform optimally

Overall explanation

See this link

Question 73Skipped

Why is a federated environment used for user authentication in Snowflake?

To enable direct integration with external databases

Correct answer

To separate user authentication from user access

To provide real-time monitoring of user activities

To enhance data security and privacy

Overall explanation

In a federated environment, user authentication is separated from user access.


See this link.

Question 74Skipped

Files have been uploaded to a Snowflake internal stage. The files now need to be deleted.

Which SQL command should be used to delete the files?

PURGE

DELETE

MODIFY

Correct answer

REMOVE

Overall explanation

See this link.

Question 75Skipped

Which Snowflake object does not consume any storage costs?

Materialized view

Temporary table

Transient table

Correct answer

Secure view

Overall explanation

Transient and temporary tables contribute to the storage charges that Snowflake bills your
account until explicitly dropped. Data stored in these table types contributes to the overall
storage charges Snowflake bills your account while they exist.

Materialized views impact your costs for both storage and compute resources:

• Storage: Each materialized view stores query results, which adds to the monthly
storage usage for your account.

• Compute resources: In order to prevent materialized views from becoming out-of-date, Snowflake performs automatic background maintenance of materialized views.

Question 76Skipped


Which common query problems can the Query Profile help a user identify and troubleshoot?
(Choose two.)

Correct selection

When there is a UNION without ALL

When the SELECT DISTINCT command returns too many values

When window functions are used incorrectly

When there are Common Table Expressions (CTEs) without a final SELECT statement

Correct selection

When there are exploding joins

Overall explanation

Some of the common query problems identified by the Query Profile include:

• Exploding joins

• UNION without ALL

• Queries too large to fit in memory (most often evidenced by “spilling”)

• Inefficient pruning (most often evidenced by large table scans)

See this link.

Question 77Skipped

Who can grant object privileges in a regular schema?

SYSADMIN

Database owner

Correct answer

Object owner

Schema owner

Overall explanation

In regular (i.e. non-managed) schemas, object owners (i.e. a role with the OWNERSHIP
privilege on an object) can grant access on their objects to other roles, with the option to
further grant those roles the ability to manage object grants

With managed access schemas, object owners lose the ability to make grant decisions. Only
the schema owner (i.e. the role with the OWNERSHIP privilege on the schema) or a role with


the MANAGE GRANTS privilege can grant privileges on objects in the schema, including
future grants, centralizing privilege management.

See this link.

Question 78Skipped

A Virtual Warehouse's auto-suspend and auto-resume settings apply to:

The queries currently being run by the Virtual Warehouse

The primary cluster in the Virtual Warehouse

Correct answer

The entire Virtual Warehouse

The database the Virtual Warehouse resides in

Overall explanation

See this link

Question 79Skipped

Which Snowflake object helps evaluate virtual warehouse performance impacted by query
queuing?

Information_schema.warehouse_metering_history

Correct answer

Account_usage.query_history

Resource monitor

Information_schema.warehouse_load_history

Overall explanation

Warehouse query load measures the average number of queries that were running or
queued within a specific interval. You can customize the time period and time interval during
which to evaluate warehouse performance by querying the Account Usage QUERY_HISTORY
View.

See this link.

Question 80Skipped

Which view in SNOWFLAKE.ACCOUNT_USAGE shows from which IP address a user connected to Snowflake?

Correct answer


LOGIN_HISTORY

ACCESS_HISTORY

QUERY_HISTORY

SESSIONS

Overall explanation

See this link.

Question 81Skipped

How does Snowflake allow a data provider with an Azure account in central Canada to share
data with a data consumer on AWS in Australia?

The data provider uses the GET DATA workflow in the Snowflake Data Marketplace to
create a share between Azure Central Canada and AWS Asia Pacific.

The data consumer and data provider can form a Data Exchange within the same
organization to create a share from Azure Central Canada to AWS Asia Pacific.

Correct answer

The data provider must replicate the database to a secondary account in AWS Asia Pacific
within the same organization then create a share to the data consumer's account

The data provider in Azure Central Canada can create a direct share to AWS Asia Pacific, if
they are both in the same organization.

Overall explanation

Different regions, different Cloud providers so replication is the key.

See this link.

Question 82Skipped

Which Snowflake feature can be used to find sensitive data in a table or column?

Correct answer

Data classification

External functions

Row level policies

Masking policies

Overall explanation


Data Classification allows categorizing potentially personal and/or sensitive data to support
compliance and privacy regulations.

See this link.

Question 83Skipped

Which command is used to take away staged files from a Snowflake stage after a successful
data ingestion?

DELETE

TRUNCATE

Correct answer

REMOVE

DROP

Overall explanation

Staged files can be deleted from a Snowflake stage (user stage, table stage, or named stage)
using the following methods:

• Files that were loaded successfully can be deleted from the stage during a load by
specifying the PURGE copy option in the COPY INTO <table> command.

• After the load completes, use the REMOVE command to remove the files in the
stage.

See this link.
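Both methods from the explanation, sketched with hypothetical table and stage names:

```sql
-- Option 1: delete files from the stage during the load itself.
COPY INTO my_table FROM @my_stage PURGE = TRUE;

-- Option 2: after the load completes, remove matching staged files.
REMOVE @my_stage PATTERN = '.*\.csv\.gz';
```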

Question 84Skipped

Who can access a referenced file through a scoped URL?

Correct answer

Only the user who generates the URL

Any user specified in the GET REST API call with sufficient privileges

Only the ACCOUNTADMIN

Any role specified in the GET REST API call with sufficient privileges

Overall explanation

Scoped URL

Only the user who generated the scoped URL can use the URL to access the referenced file.


See this link.

Question 85Skipped

During periods of warehouse contention, which parameter controls the maximum length of
time a warehouse will hold a query for processing?

QUERY_TIMEOUT_IN_SECONDS

STATEMENT_TIMEOUT_IN_SECONDS

MAX_CONCURRENCY_LEVEL

Correct answer

STATEMENT_QUEUED_TIMEOUT_IN_SECONDS

Overall explanation

STATEMENT_QUEUED_TIMEOUT_IN_SECONDS

Amount of time, in seconds, a SQL statement (query, DDL, DML, etc.) remains queued for a
warehouse before it is canceled by the system.

See this link.
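The parameter can be set at the account, warehouse, or session level; a warehouse-level sketch (my_wh is hypothetical):

```sql
-- Cancel any statement that has waited in this warehouse's queue
-- for more than 5 minutes.
ALTER WAREHOUSE my_wh
  SET STATEMENT_QUEUED_TIMEOUT_IN_SECONDS = 300;
```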

Question 86Skipped

What does a Query Profile provide in Snowflake?

A multi-step query that displays each processing step in the same panel.

Correct answer

A graphical representation of the main components of the processing plan for a query.

A pre-computed data set derived from a query specification and stored for later use.

A collapsible panel in the operator tree pane that lists nodes by execution time in
descending order for a query.

Overall explanation

Query Profile, available through the classic web interface, provides execution details for a
query. For the selected query, it provides a graphical representation of the main components
of the processing plan for the query, with statistics for each component, along with details
and statistics for the overall query.

See this link.

Question 87Skipped


Snowflake provides two mechanisms to reduce data storage costs for short-lived tables.
These mechanisms are: (Choose two.)

Correct selection

Temporary Tables

Provisional Tables

Materialized views

Permanent Tables

Correct selection

Transient Tables

Overall explanation

See this link

Question 88Skipped

Which is true of Snowflake network policies? A Snowflake network policy: (Choose two.)

Correct selection

Is available to all Snowflake Editions

Is activated using an ALTER DATABASE command

Only ACCOUNTADMIN role or a role with the global CREATE NETWORK POLICY privilege
can create network policies

Is only available to customers with Business Critical Edition

Correct selection

Restricts or enables access to specific IP addresses


Overall explanation

See this link

Question 89Skipped

What object will you use to schedule a merge statement in Snowflake so that it runs every
hour?

Table.

Stream.

Correct answer

Task.

Pipe.

Overall explanation

Snowflake tasks are schedulable scripts that are run inside your Snowflake environment. No
event source can trigger a task; instead, a task runs on a schedule. In this case, it will run
every hour.
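A minimal sketch of such a task (warehouse, task, and table names are hypothetical):

```sql
-- Run the MERGE once every hour on the named warehouse.
CREATE TASK hourly_merge_task
  WAREHOUSE = my_wh
  SCHEDULE = '60 MINUTE'
AS
  MERGE INTO target_tbl t
  USING staging_tbl s ON t.id = s.id
  WHEN MATCHED THEN UPDATE SET t.val = s.val
  WHEN NOT MATCHED THEN INSERT (id, val) VALUES (s.id, s.val);

-- Tasks are created suspended; resume the task to start the schedule.
ALTER TASK hourly_merge_task RESUME;
```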

Question 90Skipped

Which function should be used to authorize users to access rows in a base table when using
secure views with Secure Data Sharing?

CURRENT_ROLE()

Correct answer

CURRENT_ACCOUNT()

CURRENT_USER()

CURRENT_SESSION()

Overall explanation

When using secure views with Secure Data Sharing, use the CURRENT_ACCOUNT function to
authorize users from a specific account to access rows in a base table.

See this link.
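A sketch of the pattern (all table and column names are hypothetical): the view joins the base table to an entitlements table keyed on the consumer's account, so each consumer sees only their rows.

```sql
CREATE SECURE VIEW shared_data AS
  SELECT d.*
  FROM base_table d
  JOIN sharing_entitlements e
    ON e.snowflake_account = CURRENT_ACCOUNT()  -- consumer's account
   AND e.segment = d.segment;
```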

Question 91Skipped

How does a scoped URL expire?


Correct answer

When the persisted query result period ends.

The length of time is specified in the expiration_time argument.

When the data cache clears.

The encoded URL access is permanent.

Overall explanation

Types of URLs Available to Access Files

The following types of URLs are available to access files in cloud storage:

Scoped URL

Encoded URL that permits temporary access to a staged file without granting privileges to
the stage. The URL expires when the persisted query result period ends (i.e. the results
cache expires), which is currently 24 hours.

File URL

URL that identifies the database, schema, stage, and file path to a set of files. A role that has
sufficient privileges on the stage can access the files.

Pre-signed URL

Simple HTTPS URL used to access a file via a web browser. A file is temporarily accessible to
users via this URL using a pre-signed access token. The expiration time for the access token is
configurable.

See this link

Question 92Skipped

Which system-defined Snowflake role has permission to rename an account and specify
whether the original URL can be used to access the renamed account?

ACCOUNTADMIN

Correct answer

ORGADMIN

SYSADMIN

SECURITYADMIN

Overall explanation


An organization administrator (i.e. a user granted the ORGADMIN role) can rename an
account.

When an account is renamed, Snowflake creates a new account URL that is used to access
the account. During the renaming, the administrator can accept the default to save the
original account URL so users can continue to use it, or they can delete the original URL to
force users to use the new URL.

See this link.

Question 93Skipped

A company’s security audit requires generating a report listing all Snowflake logins (e.g., date
and user) within the last 90 days.

Which of the following statements will return the required information?

1. SELECT LAST_SUCCESS_LOGIN, LOGIN_NAME

2. FROM ACCOUNT_USAGE.USERS;

1. SELECT EVENT_TIMESTAMP, USER_NAME

2. FROM ACCOUNT_USAGE.ACCESS_HISTORY;

1. SELECT EVENT_TIMESTAMP, USER_NAME

2. FROM table(information_schema.login_history_by_user())

Correct answer

1. SELECT EVENT_TIMESTAMP, USER_NAME

2. FROM ACCOUNT_USAGE.LOGIN_HISTORY;

Overall explanation

login_history_by_user function returns login activity within the last 7 days only.

See this link.
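The correct answer can be made explicit about the 90-day window with a filter (a sketch; ACCOUNT_USAGE views can lag by up to a couple of hours):

```sql
SELECT EVENT_TIMESTAMP, USER_NAME
FROM SNOWFLAKE.ACCOUNT_USAGE.LOGIN_HISTORY
WHERE EVENT_TIMESTAMP > DATEADD('day', -90, CURRENT_TIMESTAMP());
```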

Question 94Skipped

A Snowflake user wants to temporarily bypass a network policy by configuring the user
object property MINS_TO_BYPASS_NETWORK_POLICY.

What should they do?

Use the SECURITYADMIN role.

Use the USERADMIN role.


Correct answer

Contact Snowflake Support.

Use the SYSADMIN role.

Overall explanation

It is possible to temporarily bypass a network policy for a set number of minutes by configuring the user object property MINS_TO_BYPASS_NETWORK_POLICY, which can be viewed by executing DESCRIBE USER. Only Snowflake can set the value for this object property. Please contact Snowflake Support to set a value for this property.

See this link.

Question 95Skipped

Which function returns the name of the warehouse of the current session?

WAREHOUSE()

RUNNING_WAREHOUSE()

Correct answer

CURRENT_WAREHOUSE()

ACTIVE_WAREHOUSE()

Overall explanation

I’m not a big fan of learning commands by heart, and they are unlikely to appear on the
exam, but this one may be useful. You have other commands to show the current database
and schema, as you can see by executing the following command:

1. SELECT CURRENT_WAREHOUSE(), CURRENT_DATABASE(), CURRENT_SCHEMA();

Question 96Skipped

What is the Snowflake multi-clustering feature for virtual warehouses used for?

To improve data loading from very large data sets

To speed up slow or stalled queries

Correct answer

To improve concurrency for users and queries

To improve the data unloading process to the cloud

Overall explanation


Multi-cluster warehouses enable you to scale compute resources to manage your user and
query concurrency needs as they change, such as during peak and off hours.

See this link.

Question 97Skipped

Which command will we use to download the files from the stage/location loaded through
the COPY INTO <LOCATION> command?

PUT.

UNLOAD.

INSERT INTO.

Correct answer

GET.

Overall explanation

We will use the GET command to DOWNLOAD files from a Snowflake internal stage (named
internal stage, user stage, or table stage) into a directory/folder on a client machine. You
need to use SnowSQL to use this command.
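A sketch of the unload-then-download flow (stage, table, and local path are hypothetical; the GET runs from a SnowSQL session on the client machine):

```sql
-- Unload query results to a named internal stage.
COPY INTO @my_stage/unload/ FROM my_table;

-- Download the unloaded files to a local directory (SnowSQL only).
GET @my_stage/unload/ file:///tmp/data/;
```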

Question 98Skipped

Where can we see the amount of storage used by Snowflake's Fail-Safe functionality in the
User Interface?

In the Account & Sessions section.

Correct answer

In the Account & Usage section.

In the Account & Fail-Safe section.

In the Account & Billing section.

Overall explanation

You can see the Fail-Safe usage in the "Account Usage" section, as we can see in the
following image:


Question 99Skipped

Which file format will keep floating-point numbers from being truncated when data is
unloaded?

Correct answer

Parquet

ORC

JSON

CSV

Overall explanation

Data types such as FLOAT, DOUBLE, and REAL (all stored internally as DOUBLE) hold approximate values with a precision/scale of roughly (15,9). When floating-point columns are unloaded to CSV or JSON files, Snowflake truncates the values to approximately (15,9), so precision is not always exact; Snowflake cannot precisely represent an arbitrary value in double precision, which is consistent with the industry standard. Please refer to this Tech Note.

When floating-point number columns are unloaded to CSV or JSON files, Snowflake
truncates the values to approximately (15,9).

The values are not truncated when unloading floating-point number columns to Parquet
files.

See this link.

Question 100Skipped

Which statement best describes 'clustering'?

The database administrator must define the clustering methodology for each Snowflake
table

Correct answer


Clustering represents the way data is grouped together and stored within Snowflake's
micro-partitions

The clustering key must be included on the COPY command when loading data into
Snowflake

Clustering can be disabled within a Snowflake account

Overall explanation

See this link

Question 101Skipped

Which MINIMUM set of privileges is required to temporarily bypass an active network policy
by configuring the user object property MINS_TO_BYPASS_NETWORK_POLICY?

Only while in the ACCOUNTADMIN role

Only the role with the OWNERSHIP privilege on the network policy

Correct answer

Only Snowflake Support can set the value for this object property

Only while in the SECURITYADMIN role

Overall explanation

It is possible to temporarily bypass a network policy for a set number of minutes by configuring the user object property MINS_TO_BYPASS_NETWORK_POLICY, which can be viewed by executing DESCRIBE USER. Only Snowflake can set the value for this object property. Please contact Snowflake Support to set a value for this property.

See this link.

Question 102Skipped

Who can create network policies within Snowflake? (Choose two.)

Correct selection

SECURITYADMIN or higher roles

ORGADMIN only

Correct selection

A role with the CREATE NETWORK POLICY privilege

SYSADMIN only

A role with the CREATE SECURITY INTEGRATION privilege


Overall explanation

Create a network policy

Only security administrators (i.e. users with the SECURITYADMIN role) or higher or a role
with the global CREATE NETWORK POLICY privilege can create network policies. Ownership
of a network policy can be transferred to another role

See this link.
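A minimal sketch of creating and activating a network policy (policy name and IP ranges are hypothetical):

```sql
CREATE NETWORK POLICY corp_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Activate the policy for the whole account.
ALTER ACCOUNT SET NETWORK_POLICY = corp_policy;
```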

Question 103Skipped

Which Snowflake edition supports Protected Health Information (PHI) data (in accordance
with HIPAA and HITRUST CSF regulations), and has a dedicated metadata store and pool of
compute resources?

Correct answer

Virtual Private Snowflake (VPS)

Standard

Business Critical

Enterprise

Overall explanation

Virtual Private Snowflake offers dedicated metadata store and pool of compute resources
(used in virtual warehouses).

See this link.

Question 104Skipped

What is the default access of a securable object until other access is granted?

Read access

Correct answer

No access

Full access

Write access

Overall explanation

Securable object: An entity to which access can be granted. Unless allowed by a grant,
access is denied.

See this link.


Question 105Skipped

Which columns are available in the output of a Snowflake directory table? (Choose two.)

FILE_NAME

STAGE_NAME

CATALOG_NAME

Correct selection

RELATIVE_PATH

Correct selection

LAST_MODIFIED

Overall explanation

The output of a Snowflake directory table includes the following columns:

RELATIVE_PATH: The path of the file relative to the stage.

SIZE: The size of the file in bytes.

LAST_MODIFIED: The date and time the file was last modified.

MD5: checksum for the file.

ETag: header for the file.

FILE_URL: The Snowflake-hosted file URL to the file. The other columns listed are not
available in the output of a Snowflake directory table.

See this link.

Question 106Skipped

Why would a customer size a Virtual Warehouse from an X-Small to a Medium?

To accommodate more users

To accommodate more queries

Correct answer

To accommodate a more complex workload

To accommodate fluctuations in workload

Overall explanation

You scale up to accommodate complex queries and you scale out to accommodate
concurrent queries.


See this link

Question 107Skipped

What is one of the benefits of using a multi-cluster virtual warehouse?

It will reduce the cost of running the warehouse.

It will automatically increase the warehouse size as needed.

It will speed up data loading.

Correct answer

It will automatically start and stop additional clusters as needed.

Overall explanation

See this link.

Remember:

Question 108Skipped

What is the maximum row size in Snowflake?

8KB

500MB

Correct answer

16MB

50MB

Overall explanation

See this link


16MB per row captured in the VARIANT field. Same for VARCHAR datatype.

Question 109Skipped

Which command is used to generate a zero-copy "snapshot" of any table, schema, or database?

CREATE REPLICATION GROUP

ALTER

Correct answer

CREATE ... CLONE

COPY INTO

Overall explanation

See this link
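A sketch of zero-copy cloning at different levels (object names are hypothetical):

```sql
-- Clone a table; no data is physically copied at creation time.
CREATE TABLE orders_clone CLONE orders;

-- Cloning also works for schemas and databases, optionally
-- combined with Time Travel (here: the state one hour ago).
CREATE DATABASE dev_db CLONE prod_db AT (OFFSET => -3600);
```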

Question 110Skipped

How can network and private connectivity security be managed in Snowflake?

By manually setting up an Intrusion Prevention System (IPS) on each account

By putting the Snowflake URL on the allowed list for get method responses

By manually setting up vulnerability patch management policies

Correct answer

By setting up network policies with IPv4 IP addresses

Overall explanation

Network policies provide options for managing network configurations to the Snowflake
service.

Network policies allow restricting access to your account based on user IP address.
Effectively, a network policy enables you to create an IP allowed list, as well as an IP blocked
list, if desired.

See this link.

Using network policies is one of the best practices related to network security.

Question 111Skipped

What is the purpose of collecting statistics on data in Snowflake?

To identify data storage order correlations


Correct answer

To enable efficient pruning based on query filters

To optimize query performance by reading all data in a table

To reduce the total number of micro-partitions in a table

Overall explanation

Snowflake collects rich statistics on data, allowing it to skip reading unnecessary parts of a table based on the query filters (pruning).

See this link.

Question 112Skipped

The VALIDATE table function has which parameter as an input argument for a Snowflake
user?

CURRENT_STATEMENT

LAST_QUERY_ID

UUID_STRING

Correct answer

JOB_ID

Overall explanation

Syntax:

VALIDATE( [<namespace>.]<table_name> , JOB_ID => { '<query_id>' | '_last' } )

See this link.
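A sketch of reviewing rows rejected by the most recent COPY INTO on a hypothetical table:

```sql
-- '_last' refers to the last load operation run on my_table;
-- a specific query ID string can be passed instead.
SELECT * FROM TABLE(VALIDATE(my_table, JOB_ID => '_last'));
```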

Question 113Skipped

How many child tasks can a task have?

10

1000

Correct answer

100

Overall explanation


Snowflake tasks are schedulable scripts that are run inside your Snowflake environment. Users can define a simple tree-like structure of tasks that starts with a root task and is linked together by task dependencies. Child tasks only run after the parent task finishes. A single task can have a maximum of 100 predecessor tasks and 100 child tasks.

Question 114Skipped

What is the minimum Snowflake edition that customers planning on storing protected
information in Snowflake should consider for regulatory compliance?

Enterprise

Premier

Correct answer

Business Critical Edition

Standard

Overall explanation

PII and HIPAA compliance are only supported for Business Critical Edition or higher.

See this link.

Question 115Skipped

Which of the following roles is recommended to be used to create and manage users and
roles?

Correct answer

SECURITYADMIN

ACCOUNTADMIN

PUBLIC

SYSADMIN

Overall explanation

This link explains all the roles.


Question 116Skipped

Which Snowflake table type is only visible to the user who creates it, can have the same
name as permanent tables in the same schema, and is dropped at the end of the session?

User

Local

Correct answer

Temporary

Transient

Overall explanation

See this link.
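A minimal sketch (table name is hypothetical):

```sql
-- Visible only within the current session; dropped automatically
-- when the session ends. May shadow a permanent table of the same name.
CREATE TEMPORARY TABLE my_temp_table (id INT, val STRING);
```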

Question 117Skipped

What aspect of an executed query is represented by the remote disk I/O statistic of the
Query Profile in Snowflake?

Time spent reading and writing data from and to remote storage when the data being
accessed does not fit into the executing virtual warehouse node memory


Time spent caching the data to remote storage in order to buffer the data being extracted
and exported

Time spent scanning the table partitions to filter data based on the predicate

Correct answer

Time spent reading and writing data from and to remote storage when the data being
accessed does not fit into either the virtual warehouse memory or the local disk

Overall explanation

For some operations (e.g. duplicate elimination for a huge data set), the amount of memory
available for the compute resources used to execute the operation might not be sufficient to
hold intermediate results. As a result, the query processing engine will start spilling the data
to local disk. If the local disk space is not sufficient, the spilled data is then saved to remote
disks.

This spilling can have a profound effect on query performance (especially if remote disk is
used for spilling). Performance degrades drastically when a warehouse runs out of memory
while executing a query because memory bytes must “spill” onto local disk storage. If the
query requires even more memory, it spills onto remote cloud-provider storage, which
results in even worse performance.

Remote Disk I/O is the metric of the Query profile which can analyze the time spent
reading/writing data from/it remote storage (i.e. S3 or Azure Blob storage). This would
include things like spilling to remote disk, or reading your datasets.

See this link.

Question 118Skipped

How are privileges inherited in a role hierarchy in Snowflake?

Privileges are only inherited by the direct child role in the hierarchy.

Correct answer

Privileges are inherited by any roles above that role in the hierarchy.

Privileges are inherited by any roles at the same level in the hierarchy.

Privileges are only inherited by the direct parent role in the hierarchy.

Overall explanation

The privileges associated with a role are inherited by any roles above that role in the
hierarchy.

See this link.


Question 119Skipped

Increasing the maximum number of clusters in a Multi-Cluster Warehouse is an example of:

Scaling rhythmically

Correct answer

Scaling out

Scaling up

Scaling max

Overall explanation

See this link

Question 120Skipped

Which Snowflake feature will allow small volumes of data to continuously load into
Snowflake and will incrementally make the data available for analysis?

Correct answer

CREATE PIPE

TABLE STREAM

INSERT INTO

COPY INTO

Overall explanation

The keyword " continuously". We will need to use Snowpipe.

See this link.
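A Snowpipe sketch (pipe, table, and external stage names are hypothetical; AUTO_INGEST relies on cloud event notifications configured on the external stage):

```sql
CREATE PIPE my_pipe
  AUTO_INGEST = TRUE   -- load incrementally as new files land in the stage
AS
  COPY INTO my_table
  FROM @my_ext_stage
  FILE_FORMAT = (TYPE = 'CSV');
```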

Question 121Skipped

Which Snowflake features can be enabled by calling the SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER function by a user with the ORGADMIN role? (Choose two.)

Correct selection

Account and database replication

Fail-safe

Correct selection

Client redirect


Clustering

Search optimization service

Overall explanation

See this link.

Question 122Skipped

Which statement accurately describes how a virtual warehouse functions?

Increasing the size of a virtual warehouse will always improve data loading performance.

Each virtual warehouse is an independent compute cluster that shares compute resources
with other warehouses.

Correct answer

Each virtual warehouse is a compute cluster composed of multiple compute nodes
allocated by Snowflake from a cloud provider.

All virtual warehouses share the same compute resources so performance degradation of
one warehouse can significantly affect all the other warehouses.

Overall explanation

Query Processing

Query execution is performed in the processing layer. Snowflake processes queries using
“virtual warehouses”. Each virtual warehouse is an MPP compute cluster composed of
multiple compute nodes allocated by Snowflake from a cloud provider.

See this link.

Question 123Skipped

What are key characteristics of virtual warehouses in Snowflake? (Choose two.)

Correct selection

Warehouses can be resized at any time, even while running.

Warehouses that are multi-cluster can have nodes of different sizes.

Correct selection

Warehouses can be started and stopped at any time.

Warehouses are billed on a per-minute usage basis.

Warehouses can only be used for querying and cannot be used for data loading.

Overall explanation

Warehouses can be started and stopped at any time. They can also be resized at any time,
even while running, to accommodate the need for more or less compute resources, based
on the type of operations being performed by the warehouse.

See this link.
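The two key characteristics above can be sketched with a few commands (the warehouse name is hypothetical):

```sql
-- A running warehouse can be resized in place, at any time.
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'LARGE';

-- Warehouses can also be suspended and resumed at any time.
ALTER WAREHOUSE my_wh SUSPEND;
ALTER WAREHOUSE my_wh RESUME;
```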

You can expect a lot of questions about Virtual Warehouses in the exam.

Question 124Skipped

Snowflake will return an error when a user attempts to share which object?

Correct answer

Standard views

Secure views

Tables

Secure materialized views

Overall explanation

For data security and privacy reasons, only secure views are supported in shares at this time.
If a standard view is added to a share, Snowflake returns an error.

See this link.

Question 125Skipped

How does Snowflake store a table's underlying data? (Choose two.)

Correct selection

Columnar file format

Uncompressed

Text file format

User-defined partitions

Correct selection

Micro-partitions

Overall explanation

All data in Snowflake tables is automatically divided into micro-partitions, which are
contiguous units of storage. Each micro-partition contains between 50 MB and 500 MB of
uncompressed data (note that the actual size in Snowflake is smaller because data is always
stored compressed). Groups of rows in tables are mapped into individual micro-partitions,
organized in a columnar fashion.

See this link.

Question 126Skipped

Regardless of which notation is used, what are considerations for writing the column name
and element names when traversing semi-structured data?

Correct answer

The column name is case-insensitive but element names are case-sensitive

The column name and element names are both case-insensitive.

The column name and element names are both case-sensitive.

The column name is case-sensitive but element names are case-insensitive.

Overall explanation

Regardless of which notation you use, the column name is case-insensitive but element
names are case-sensitive.

For example (src is a column name), in the following list, the first two paths are equivalent,
but the third is not:

src:salesperson.name

SRC:salesperson.name

SRC:Salesperson.Name

See this link.
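Put as queries, the three paths above behave like this (car_sales is a hypothetical table with a VARIANT column src):

```sql
-- The column name is case-insensitive, but element names after the
-- colon are case-sensitive.
SELECT src:salesperson.name FROM car_sales;  -- works
SELECT SRC:salesperson.name FROM car_sales;  -- equivalent to the first
SELECT SRC:Salesperson.Name FROM car_sales;  -- different path; no match
```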

Question 127Skipped

What happens to historical data when the retention period for an object ends?

Time Travel on the historical data is dropped.

The object containing the historical data is dropped.

The data is cloned into a historical object.

Correct answer

The data moves to Fail-safe

Overall explanation

When the retention period ends for an object, the historical data is moved into Snowflake
Fail-safe.

See this link.

Question 128Skipped

Which Snowflake object can be created to be temporary?

Storage integration

User

Correct answer

Stage

Role

Overall explanation

See this link.

Question 129Skipped

What is the purpose of using the OBJECT_CONSTRUCT function with the COPY INTO
command?

Correct answer

Convert the rows in a relational table to a single VARIANT column and then unload the
rows into a file.

Convert the rows in a source file to a single VARIANT column and then load the rows from
the file to a variant table.

Reorder the rows in a relational table and then unload the rows into a file.

Reorder the data columns according to a target table definition and then unload the rows
into the table.

Overall explanation

An OBJECT can contain semi-structured data and can be used to create hierarchical data
structures.

OBJECT_CONSTRUCT returns a VARIANT object, essentially a JSON document, as an output,
with either the key:value pairs as inputs or an asterisk (as in SELECT *) from a relational
query.

You can use the OBJECT_CONSTRUCT function combined with the COPY command to
convert the rows in a relational table to a single VARIANT column and unload the rows into a
file.

See this example.
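A minimal sketch of that unload pattern, with hypothetical stage and table names:

```sql
-- Each row of my_table becomes one JSON document (a single VARIANT value)
-- in the unloaded file.
COPY INTO @my_stage/unload/
FROM (SELECT OBJECT_CONSTRUCT(*) FROM my_table)
FILE_FORMAT = (TYPE = 'JSON');
```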

Question 130Skipped

Which of the following are options when creating a Virtual Warehouse? (Choose two.)

Auto-disable

Auto-enable

Auto-resize

Correct selection

Auto-suspend

Correct selection

Auto-resume

Auto-drop

Overall explanation

See this link
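A sketch showing both options at creation time (the warehouse name and values are hypothetical):

```sql
CREATE WAREHOUSE my_wh WITH
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 300          -- suspend after 300 seconds of inactivity
  AUTO_RESUME = TRUE          -- resume automatically when a query arrives
  INITIALLY_SUSPENDED = TRUE; -- do not start consuming credits right away
```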

Question 131Skipped

In which layer of its architecture does Snowflake store its metadata statistics?

Compute Layer

Storage Layer

Correct answer

Cloud Services Layer

Database Layer

Overall explanation

See this link

Question 132Skipped

What is the use of the OVERWRITE=TRUE option in the PUT command?

It is not possible to use OVERWRITE=TRUE option with PUT command.

It specified whether Snowflake overwrites the encryption key you used to upload the files.

Correct answer

It specifies whether Snowflake overwrites an existing file with the same name during
upload.

It specified whether Snowflake should overwrite the gzip compress algorithm with
another one that you provide.

Overall explanation

The OVERWRITE=TRUE option in the Snowflake PUT command is used to overwrite an
existing file with the same name in the target location. Imagine you run the following
command:

PUT file:///tmp/data/mydata.csv @my_int_stage;

If there is a file called "mydata.csv" in the stage, it won't load it. However, using the
OVERWRITE option, it will load it:

PUT file:///tmp/data/mydata.csv @my_int_stage OVERWRITE=TRUE;

Question 133Skipped

What are the key characteristics of ACCOUNT_USAGE views? (Choose two.)

Correct selection

Records for dropped objects are included in each view.

The historical data can be retained from 7 days to 6 months.

Correct selection

The data latency can vary from 45 minutes to 3 hours.

There is no data latency.

The historical data is not retained.

Overall explanation

See this link.

Question 134Skipped

The three main Snowflake layers are…

Staging, Data Warehouse, Dimensional.

Extraction, Load, Transformation.

Database, Virtual Warehouse, Metadata.

Correct answer

Centralized Storage, Compute & Cloud Services.

Overall explanation

You can see them in the following image (via docs.snowflake.com):

Question 135Skipped

Query results are stored in the Result Cache for how long after they are last accessed,
assuming no data changes have occurred?

12 hours

Correct answer

24 hours

1 Hour

3 Hours

Overall explanation

See this link

Question 136Skipped

Snowflake’s hierarchical key mode includes which keys? (Choose two.)

Correct selection

File keys

Secure view keys

Database master keys

Schema master keys

Correct selection

Account master keys

Overall explanation

See this link.

Question 137Skipped

A Snowflake user wants to share data using my_share with account xy12345.

Which command should be used?

alter account xy12345 add share my_share;

grant select on share my_share to account xy12345;

grant usage on share my_share to account xy12345;

Correct answer

alter share my_share add accounts = xy12345;

Overall explanation

ALTER SHARE [ IF EXISTS ] <name> { ADD | REMOVE } ACCOUNTS = <consumer_account> [ ,
<consumer_account> , ... ] [ SHARE_RESTRICTIONS = { TRUE | FALSE } ]

ALTER SHARE [ IF EXISTS ] <name> SET [ ACCOUNTS = <consumer_account> [ ,
<consumer_account> ... ] ] ...

See this link.

Question 138Skipped

Which clients does Snowflake support Multi-Factor Authentication (MFA) token caching for?
(Choose two.)

Correct selection

Python connector

Correct selection

ODBC driver

GO driver

Spark connector

Node.js driver

Overall explanation

Snowflake supports MFA token caching with the following drivers and connectors on macOS
and Windows. This feature is not supported on Linux.

• ODBC driver version 2.23.0 (or later).

• JDBC driver version 3.12.16 (or later).

• Python Connector for Snowflake version 2.3.7 (or later).

See this link.

Question 139Skipped

What does the client redirect feature in Snowflake enable?

A redirect of client connections to Snowflake accounts in the same regions for data
replication.

A redirect of client connections to Snowflake accounts in the same regions for business
continuity.

A redirect of client connections to Snowflake accounts in different regions for data
replication.

Correct answer

A redirect of client connections to Snowflake accounts in different regions for business
continuity.

Overall explanation

Client Redirect enables redirecting your client connections to Snowflake accounts in different
regions for business continuity and disaster recovery, or when migrating your account to
another region or cloud platform.

See this link.

Question 140Skipped

Which stream type can be used for tracking the records in external tables?

Correct answer

Insert-only

Append-only

Standard

External

Overall explanation

See this link.

Question 141Skipped

What action can a user take to address query concurrency issues?

Resize the virtual warehouse to a larger instance size.

Correct answer

Add additional clusters to the virtual warehouse.

Enable the query acceleration service.

Enable the search optimization service.

Overall explanation

Multi-cluster warehouses are best utilized for scaling resources to improve concurrency for
users/queries. They are not as beneficial for improving the performance of slow-running
queries or data loading. For these types of operations, resizing the warehouse provides
more benefits.

See this link.
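Adding clusters can be sketched as follows (hypothetical warehouse name; multi-cluster warehouses require Enterprise Edition or higher):

```sql
-- Extra clusters spin up automatically under concurrent load and shut
-- down again when the queue drains (Auto-Scale mode).
ALTER WAREHOUSE my_wh SET
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD';
```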

Question 142Skipped

Which of the following is true of Snowpipe via REST API? (Choose two.)

You can only use it on Internal Stages

Correct selection

Snowflake automatically manages the compute required to execute the Pipe's COPY INTO
commands

All COPY INTO options are available during pipe creation

Snowpipe removes files after they have been loaded

Correct selection

Snowpipe keeps track of which files it has loaded

Overall explanation

See this link.

See this link.

See this link.

Question 143Skipped

What are the types of data consumer accounts available in Snowflake? (Choose two.)

Subscriber account

Shared Account

Public Account

Correct selection

Full Account

Correct selection

Reader Account

Overall explanation

There are two types of data consumers. The first is Full Accounts: consumers
with existing Snowflake accounts. In this case, the consumer account pays for the queries
they run. We also have Reader Accounts: consumers without Snowflake accounts.
In this last case, the producer account pays for all the compute credits that their warehouses
use. You can see this behavior in the following diagram:

Question 144Skipped

What is the advantage of using a reader account?

It is read-only and prevents the shared data from being updated by the provider

Correct answer

It can be used by a client that does not have a Snowflake account

It provides limited access to the data share and is therefore cheaper for the data provider

It can be connected to a Snowflake account in a different region

Overall explanation

See this link.

Question 145Skipped

The first user assigned to a new account, ACCOUNTADMIN, should create at least one
additional user with which administrative privilege?

PUBLIC

Correct answer

USERADMIN

SYSADMIN

ORGADMIN

Overall explanation

By default, when your account is provisioned, the first user is assigned the ACCOUNTADMIN
role. This user should then create one or more additional users who are assigned the
USERADMIN role. All remaining users should be created by the user(s) with the USERADMIN
role or another role that is granted the global CREATE USER privilege.

See this link.

Question 146Skipped

What value provides information about disk usage for operations where intermediate results
do not fit in memory in a Query Profile?

Network

Correct answer

Spilling

IO

Pruning

Overall explanation

Spilling — information about disk usage for operations where intermediate results do not fit
in memory

See this link.

Question 147Skipped

What is common between AWS Quicksight, PowerBI, and Tableau?

They are Snowflake Machine Learning Partners.

They are Snowflake Security & Governance Partners.

They are Snowflake Data Integration Partners.

Correct answer

They are Snowflake Business Intelligence Partners.

Overall explanation

Business intelligence (BI) tools enable analyzing, discovering, and reporting on data to help
make more informed business decisions. They use dashboards, charts, or other graphical
tools to deliver data visualization. We can see the Snowflake ecosystem in the following
image:

Question 148Skipped

Which Snowflake edition (and above) allows until 90 days of Time Travel?

Virtual Private Snowflake

Business Critical.

Standard.

Correct answer

Enterprise.

Overall explanation

By default, Time travel is enabled with a 1-day retention period. However, we can increase it
to 90 days if we have (at least) the Snowflake Enterprise Edition. It requires additional
storage, which will be reflected in your monthly storage charges.

Question 149Skipped

What type of columns does Snowflake recommend to be used as clustering keys? (Choose
two.)

A column with very high cardinality

A column with very low cardinality

Correct selection

A column that is most actively used in join predicates

Correct selection

A column that is most actively used in selective filters

A VARIANT column

Overall explanation

See this link.

Question 150Skipped

Which of the following statements are true concerning the Snowflake release process?
(Choose three.)

A customer is assigned a 30 minute window (that can be moved anytime within a week)
during which the system will be unavailable and customer is upgraded

Correct selection

Snowflake deploys new Behavior change releases every month.

Snowflake deploys patch releases every week, but new feature releases happen once a
month.

Correct selection

Snowflake deploys new feature releases and releases every week.

Correct selection

It is possible for you as a user to request 24-hour early access to the upcoming releases so
that you can do additional release testing before the release is rolled out.

There is usually some minimal downtime associated with Snowflake during the
deployments.

Overall explanation

The deployment processes happen transparently in the background; users experience no
downtime or disruption of service. You can see the different release types at the following
link.

Question 151Skipped

What should be the first option to restore data into a table?

Ask Snowflake Support

Fail-Safe.

Correct answer

Time-Travel.

Zero-Copy Cloning.

Overall explanation

Time-Travel enables accessing historical data (i.e., data that has been changed or deleted) at
any point within a defined period. If we drop a table, we can restore it with time travel. You
can use it with Databases, Schemas & Tables. The following diagram explains how
Time-Travel works:

Question 152Skipped

How many tasks can a tree of tasks have?

Correct answer

1000, including the root task.

100.

10.

1000 without including the root task.

Overall explanation

Users can define a simple tree-like structure of tasks that starts with a root task and is linked
together by task dependencies. A tree of tasks can have a maximum of 1000 tasks, including
the root one. Also, each task can have a maximum of 100 children.

Question 153Skipped

What are the available Snowflake scaling modes for configuring multi-cluster virtual
warehouses? (Choose two.)

Correct selection

Maximized

Scale-Out

Correct selection

Auto-Scale

Standard

Economy

Overall explanation

Note that there are Scaling Policies called "Economy" and "Standard" in "Auto-Scale" mode.

See this link

Question 154Skipped

A Snowflake user wants to optimize performance for a query that queries only a small
number of rows in a table. The rows require significant processing. The data in the table
does not change frequently.

What should the user do?

Enable the query acceleration service for the virtual warehouse.

Add a clustering key to the table.

Add the search optimization service to the table.

Correct answer

Create a materialized view based on the query.

Overall explanation

Materialized views are particularly useful when:

• Query results contain a small number of rows and/or columns relative to the base
table (the table on which the view is defined).

• Query results contain results that require significant processing, including:

• Analysis of semi-structured data.

• Aggregates that take a long time to calculate.

• The query is on an external table (i.e. data sets stored in files in an external stage),
which might have slower performance compared to querying native database tables.

• The view’s base table does not change frequently.

See this link.
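The criteria above can be sketched as a materialized view (hypothetical names; materialized views require Enterprise Edition or higher):

```sql
-- Precompute an expensive aggregation over a slowly changing base table,
-- so the small result set is read directly instead of reprocessing rows.
CREATE MATERIALIZED VIEW daily_totals AS
  SELECT region, SUM(amount) AS total
  FROM sales
  GROUP BY region;
```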

Question 155Skipped

Which command can be used to list all the file formats for which a user has access
privileges?

ALTER FILE FORMAT

LIST

Correct answer

SHOW FILE FORMATS

DESCRIBE FILE FORMAT

Overall explanation

SHOW FILE FORMATS

Lists the file formats for which you have access privileges. This command can be used to list
the file formats for a specified database or schema (or the current database/schema for the
session), or your entire account.

See this link.

SET 2

Question 1Incorrect

Which feature is only available in the Enterprise or higher editions of Snowflake?

Your answer is incorrect

Multi-factor Authentication (MFA)

SOC 2 type II certification

Object-level access control

Correct answer

Column-level security

Overall explanation

See this link

Question 2Incorrect

Will data cached in a warehouse be lost when the warehouse is resized?

No, because the size of the cache is independent from the warehouse size.

Correct answer

Possibly, if the warehouse is resized to a smaller size and the cache no longer fits.

Your answer is incorrect

Yes, because the compute resource is replaced in its entirety with a new compute
resource.

Yes, because the new compute resource will no longer have access to the cache encryption
key.

Overall explanation

See this link

Question 3Incorrect

Which privileges are required for a user to restore an object? (Choose two.)

UPDATE

Your selection is incorrect

UNDROP

MODIFY

Your selection is correct

OWNERSHIP

Correct selection

CREATE

Overall explanation

See this link.

Question 4Skipped

Authorization to execute CREATE [object] statements comes only from which role?

Application role

Secondary role

Database role

Correct answer

Primary role

Overall explanation

Note that authorization to execute CREATE <object> statements to create objects is provided
by the primary role.

See this link.

Question 5Skipped

Which of the following statements describe a benefit of Snowflake’s separation of compute
and storage? (Choose two.)

Correct selection

Storage expands without the requirement to add more compute.

Compute and storage can be scaled together.

Correct selection

Compute can be scaled up or down without the requirement to add more storage.

Growth of storage and compute are tightly coupled.

Use of storage avoids disk spilling.

Overall explanation

Compute and storage can be managed separately as business needs change; they are not
coupled. This is what the cloud is for.

Question 6Skipped

What information does the Query Profile provide?

Correct answer

Statistics for each component of the processing plan

Graphical representation of the data model

Real-time monitoring of the database operations

Detailed information about the database schema

Overall explanation

Query Profile provides execution details for a query. For the selected query, it provides a
graphical representation of the main components of the processing plan for the query, with
statistics for each component, along with details and statistics for the overall query.

See this link.

Question 7Skipped

How are network policies defined in Snowflake?

Correct answer

They are a set of rules that control access to Snowflake accounts by specifying the IP
addresses or ranges of IP addresses that are allowed to connect to Snowflake.

They are a set of rules that dictate how Snowflake accounts can be used between multiple
users.

They are a set of rules that define the network routes within Snowflake.

They are a set of rules that define how data can be transferred between different
Snowflake accounts within an organization.

Overall explanation

See this link.
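A minimal sketch of such a policy (the policy name and IP ranges are hypothetical):

```sql
-- Only the listed CIDR range may connect; one address inside the range
-- is explicitly blocked.
CREATE NETWORK POLICY corp_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Apply it at the account level (requires appropriate privileges).
ALTER ACCOUNT SET NETWORK_POLICY = corp_policy;
```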

Question 8Skipped

What SQL command would be used to view all roles that were granted to USER1?

show grants user USER1;

Correct answer

show grants to user USER1;

show grants on user USER1;

describe user USER1;

Overall explanation

SHOW GRANTS TO ...

ROLE role_name

Lists all privileges and roles granted to the role.

USER user_name

Lists all the roles granted to the user. Note that the PUBLIC role, which is automatically
available to every user, is not listed.

See this link.

Question 9Skipped

Which data types can be used in a Snowflake table that holds semi-structured data? (Choose
two.)

VARCHAR

Correct selection

VARIANT

BINARY

TEXT

Correct selection

ARRAY

Overall explanation

See this link.

Question 10Skipped

How does Snowflake reorganize data when it is loaded? (Choose two.)

Raw format

Correct selection

Compressed format

Correct selection

Columnar format

Zipped format

Binary format

Overall explanation

When data is loaded into Snowflake, Snowflake reorganizes that data into its internal
optimized, compressed, columnar format.

See this link.

Question 11Skipped

What statement is true about Snowflake’s unique architecture?

Correct answer

Multi-Cluster Shared Data.

Multi-Cluster Private Data.

One Node Private Data.

One Node Shared Data.

Overall explanation

Snowflake's architecture is a hybrid of traditional shared-disk and shared-nothing database
architectures. Snowflake uses a central data repository for persisted data accessible from all
compute nodes in the platform. At the same time, it processes queries using virtual
warehouses where each node in the cluster stores a portion of the entire data set locally.

Question 12Skipped

Which of the following is a data tokenization integration partner?

Tableau

DBeaver

Correct answer

Protegrity

SAP

Overall explanation

See this link

Question 13Skipped

Which system functions are available in Snowflake to view/monitor the clustering metadata
for a table? (Choose two.)

SYSTEM$CLUSTERING_STATUS

SYSTEM$CLUSTERING

Correct selection

SYSTEM$CLUSTERING_INFORMATION

Correct selection

SYSTEM$CLUSTERING_DEPTH

SYSTEM$CLUSTERING_METADATA

Overall explanation

The clustering depth measures the average depth of the overlapping micro-partitions for
specified columns in a table (1 or greater). The smaller the cluster depth is, the better
clustered the table is. You can use any previous commands to get the Cluster Depth of a
table.

Question 14Skipped

Which of the following services are NOT provided by the Cloud Services Layer? (Choose
two.)

Infrastructure Management.

Correct selection

Query Execution.

Authentication.

Metadata Management.

Correct selection

Storage.

Overall explanation

The Cloud Services layer is a collection of services coordinating activities across Snowflake.
It's in charge of Authentication, Infrastructure management, Metadata management, Query
parsing and optimization, and Access control.

Question 15Skipped

What is one of the characteristics of data shares?

Correct answer

Data shares utilize secure views for sharing view objects.

Data shares support full DML operations.

Data shares work by copying data to consumer accounts.

Data shares are cloud agnostic and can cross regions by default.

Overall explanation

• For data security and privacy reasons, only secure views are supported in shares at
this time. If a standard view is added to a share, Snowflake returns an error.

• Shared databases are read-only.

• With Secure Data Sharing, no actual data is copied or transferred between accounts.
All sharing uses Snowflake’s services layer and metadata store.

• Database replication will be needed to allow data providers to securely share data
with data consumers across different regions and cloud platforms.

See this link.

Question 16Skipped

What information is stored in the ACCESS_HISTORY view?

Correct answer

Query details such as the objects included and the user who executed the query

Names and owners of the roles that are currently enabled in the session

Details around the privileges that have been granted for all objects in an account

History of the files that have been loaded into Snowflake

Overall explanation

See this link.

Question 17Skipped

A user is loading JSON documents composed of a huge array containing multiple records into
Snowflake. The user enables the STRIP_OUTER_ARRAY file format option.

What does the STRIP_OUTER_ARRAY file format do?

It removes the NULL elements from the JSON object eliminating invalid data and enables
the ability to load the records.

It removes the last element of the outer array.

It removes the trailing spaces in the last element of the outer array and loads the records
into separate table columns.

Correct answer

It removes the outer array structure and loads the records into separate table rows.

Overall explanation

STRIP_OUTER_ARRAY removes the outer set of square brackets [ ] when loading the data,
splitting the initial array into multiple rows.
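A sketch of such a load (stage, file, and table names are hypothetical):

```sql
-- The file contains one large JSON array; each element of the array
-- becomes a separate row in the target table.
COPY INTO my_table
FROM @my_stage/data.json
FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE);
```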

Question 18Skipped

If a virtual warehouse is suspended, what happens to the warehouse cache?

The warehouse cache persists for as long as the warehouse exists, regardless of its
suspension status.

The cache is maintained for the auto_suspend duration and can be restored if the
warehouse is restarted within this limit.

The cache is maintained for up to two hours and can be restored if the warehouse is
restarted within this limit.

Correct answer

The cache is dropped when the warehouse is suspended and is no longer available upon
restart.

Overall explanation

This cache is dropped when the warehouse is suspended, which may result in slower initial
performance for some queries after the warehouse is resumed.

See this link.

Question 19Skipped

Which Snowflake URL type allows users or applications to download or access files directly
from Snowflake stage without authentication?

File

Scoped

Directory

Correct answer

Pre-signed

Overall explanation

Pre-signed URLs are used to download or access files, via a web browser for example,
without authenticating into Snowflake or passing an authorization token.

See this link.

Question 20Skipped

What is the MINIMUM configurable idle timeout value for a session policy in Snowflake?

2 minutes

15 minutes

10 minutes

Correct answer

5 minutes

Overall explanation

The timeout period begins upon a successful authentication to Snowflake. If a session policy
is not set, Snowflake uses a default value of 240 minutes (i.e. 4 hours). The minimum
configurable idle timeout value for a session policy is 5 minutes. When the session expires,
the user must authenticate to Snowflake again.

See this link.

Question 21Skipped

Which semi-structured file formats are supported when unloading data from a table?
(Choose two.)

XML

Correct selection

Parquet

Avro

Correct selection

JSON

ORC

Overall explanation

See this link

Question 22Skipped

What does the average_overlaps in the output of SYSTEM$CLUSTERING_INFORMATION refer
to?

The average number of partitions physically stored in the same location.

The average number of micro-partitions in the table associated with cloned objects.

Correct answer

The average number of micro-partitions which contain overlapping value ranges.

The average number of micro-partitions stored in Time Travel.

Overall explanation

See this link.

Question 23Skipped

A Snowflake user needs to share unstructured data from an internal stage to a reporting tool
that does not have Snowflake access.

Which file function should be used?

BUILD_STAGE_FILE_URL

GET_STAGE_LOCATION

Correct answer

GET_PRESIGNED_URL

BUILD_SCOPED_FILE_URL

Overall explanation

Pre-signed URL function is used to download or access files without authenticating into
Snowflake or passing an authorization token. Pre-signed URLs are open; any user or
application can directly access or download the files. Ideal for business intelligence
applications or reporting tools that need to display the unstructured file contents.

Query the GET_PRESIGNED_URL function.

See this link.
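A minimal sketch (the stage, file path, and expiration value are hypothetical):

```sql
-- Generate a URL valid for 3600 seconds that the reporting tool can use
-- without authenticating to Snowflake.
SELECT GET_PRESIGNED_URL(@my_stage, 'reports/q1_summary.pdf', 3600);
```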

Question 24Skipped

A Snowflake user is trying to load a 125 GB file using SnowSQL. The file continues to load for
almost an entire day.

What will happen at the 24-hour mark?

The file will stop loading and all data up to that point will be committed.

Correct answer

The file loading could be aborted without any portion of the file being committed.

The file’s number of allowable hours to load can be programmatically controlled to load
easily into Snowflake.

The file will continue to load until all contents are loaded.

Overall explanation

Note

Loading very large files (e.g. 100 GB or larger) is not recommended.

If you must load a large file, carefully consider the ON_ERROR copy option value. Aborting or
skipping a file due to a small number of errors could result in delays and wasted credits. In
addition, if a data loading operation continues beyond the maximum allowed duration of 24
hours, it could be aborted without any portion of the file being committed.

See this link.

Question 25Skipped

From what stage can a Snowflake user omit the FROM clause while loading data into a
table?

The user stage

The external named stage

Correct answer

The table stage

The internal named stage

Overall explanation

Note that when copying data from files in a table stage, the FROM clause can be omitted
because Snowflake automatically checks for files in the table stage.

See this link.
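A sketch with hypothetical names (@%my_table is the table stage that every table gets automatically):

```sql
PUT file:///tmp/data/mydata.csv @%my_table;

-- No FROM clause needed: Snowflake automatically checks the table stage.
COPY INTO my_table;
```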

Question 26Skipped

What are supported file formats for unloading data from Snowflake? (Choose three.)

AVRO

XML

ORC

Correct selection

CSV

Correct selection

JSON

Correct selection

Parquet

Overall explanation

See this link

Question 27Skipped

What storage cost is completely eliminated when a Snowflake table is defined as transient?

Correct answer

Fail-safe

Active

Staged

Time Travel

Overall explanation

Similar to permanent tables, transient tables contribute to the overall storage charges that
Snowflake bills your account; however, because transient tables do not utilize Fail-safe, there
are no Fail-safe costs (i.e. the costs associated with maintaining the data required for Fail-safe disaster recovery).

See this link.
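
A minimal sketch of creating a transient table (the table name is hypothetical); it behaves like a permanent table but carries no Fail-safe storage cost:

```sql
-- Transient tables skip Fail-safe; Time Travel is limited to 0 or 1 day
CREATE TRANSIENT TABLE etl_staging (
    id    NUMBER,
    value VARCHAR
)
DATA_RETENTION_TIME_IN_DAYS = 0;
```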

Question 28Skipped

How is enhanced authentication achieved in Snowflake? (Choose two.)

Correct selection

Multi-Factor Authentication (MFA)

Masking policies

Correct selection

Federated authentication and Single Sign-On (SSO)

Snowflake-managed keys

Object level access control

Overall explanation

See this link.

Question 29Skipped

What actions can a resource monitor associated with a warehouse take when it reaches (or is about to reach) its credit limit? (Choose three.)

Delete the Snowflake account.

Correct selection

Suspend the Warehouse.

Open a Snowflake support case

Correct selection

Send a notification alert.

Correct selection

Kill the query that is running.

Execute DROP WAREHOUSE statement.

Overall explanation

A resource monitor can Notify, Notify & Suspend, and Notify & Suspend Immediately. You
can see these three actions in the following image:

See this link.

Question 30Skipped

Which of the following are options when creating a Virtual Warehouse? (Choose two.)

Auto-enable

Local SSD size

Correct selection

Auto-suspend

Correct selection

Auto-resume

User count

Overall explanation

See this link

Question 31Skipped

What is the minimum Snowflake edition that provides data sharing?

Correct answer

Standard

Premier

Business Critical Edition

Enterprise

Overall explanation

See this link

Question 32Skipped

What can you easily check to see if a large table will benefit from explicitly defining a
clustering key?

Correct answer

Clustering depth.

Clustering status.

Values in a table.

Clustering ratio.

Overall explanation

The clustering depth measures the average depth of the overlapping micro-partitions for
specified columns in a table (1 or greater). The smaller the cluster depth is, the better
clustered the table is. You can get the clustering depth of a Snowflake table using this
command:

SELECT SYSTEM$CLUSTERING_DEPTH('MY_TABLE', '(col1, col2)');

Question 33Skipped

How many resource monitors can be assigned at the account level?

Correct answer

One

Overall explanation

A single monitor can be set at the account level to control credit usage for all warehouses in
your account.

See this link.

Question 34Skipped

Which VALIDATION_MODE value will return the errors across the files specified in a COPY
command, including files that were partially loaded during an earlier load?

RETURN_ERRORS

RETURN_n_ROWS

RETURN_-1_ROWS

Correct answer

RETURN_ALL_ERRORS

Overall explanation

RETURN_ALL_ERRORS

Returns all errors across all files specified in the COPY statement, including files with errors
that were partially loaded during an earlier load because the ON_ERROR copy option was set
to CONTINUE during the load.

See this link.
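
As a hedged example (the stage and table names are hypothetical), a validation run reports every error without loading any data:

```sql
-- Dry run: report all errors across the files, including files
-- partially loaded earlier with ON_ERROR = CONTINUE
COPY INTO my_table
  FROM @my_stage
  VALIDATION_MODE = 'RETURN_ALL_ERRORS';
```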

Question 35Skipped

How does Snowflake Fail-safe protect data in a permanent table?

Fail-safe makes data available up to 1 day, recoverable only by Snowflake Support.

Correct answer

Fail-safe makes data available for 7 days, recoverable only by Snowflake Support.

Fail-safe makes data available for 7 days, recoverable by user operations.

Fail-safe makes data available up to 1 day, recoverable by user operations.

Overall explanation

Fail-safe provides a (non-configurable) 7-day period during which historical data may be
recoverable by Snowflake. This period starts immediately after the Time Travel retention
period ends.

See this link.

Question 36Skipped

How can a user get the MOST detailed information about individual table storage details in
Snowflake?

SHOW TABLES command

TABLES view

SHOW EXTERNAL TABLES command

Correct answer

TABLE_STORAGE_METRICS view

Overall explanation

See this link.

Question 37Skipped

Which function is used to convert rows in a relational table to a single VARIANT column?

ARRAY_CONSTRUCT

ARRAY_AGG

Correct answer

OBJECT_CONSTRUCT

OBJECT_AGG

Overall explanation

You can use the OBJECT_CONSTRUCT function combined with the COPY command to
convert the rows in a relational table to a single VARIANT column and unload the rows into a
file.

See this link.
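
A minimal sketch (table and stage names are hypothetical) of unloading rows as JSON through a single VARIANT column:

```sql
-- OBJECT_CONSTRUCT(*) packs each row into one VARIANT object
COPY INTO @my_stage/unload/
FROM (SELECT OBJECT_CONSTRUCT(*) FROM my_relational_table)
FILE_FORMAT = (TYPE = 'JSON');
```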

Question 38Skipped

What is common between Fivetran, Informatica, Stitch, and Talend?

Correct answer

They are Snowflake Data Integration partners.

They are Snowflake Business Intelligence partners.

They are Snowflake Security partners.

They are Snowflake programming interfaces partners.

Overall explanation

There are two types of partners, technology partners, and solution partners. The technology
partners are the ones that integrate their solutions with Snowflake, and they can be divided
into Data Integration, ML & Data Science, Security & Governance, Business Intelligence, SQL
Editors, and Programming Interfaces. In this case, they all belong to the Data Integration
Partners. You can see the Snowflake ecosystem in the following image:

Question 39Skipped

In which editions can we configure MFA Authentication?

Standard.

Correct answer

All of them.

Enterprise.

Business Critical.

Overall explanation

MFA login is designed primarily for connecting to Snowflake through the web interface, but
it is also fully supported by SnowSQL and the Snowflake JDBC and ODBC drivers. MFA is
available in all the Snowflake editions as another security layer, and you can set it in the
settings tab:

Question 40Skipped

Which Snowflake objects are automatically created by default every time you create a
database? (Choose two.)

Correct selection

The PUBLIC schema.

The METADATA_SCHEMA.

The DEFAULT_SCHEMA.

Correct selection

The INFORMATION_SCHEMA.

The ANALYTICS_SCHEMA.

Overall explanation

The INFORMATION_SCHEMA contains views for all the objects in the database, as well as
views for account-level objects, and table functions for historical and usage data across your
account. The PUBLIC schema is the default schema for the database, and all objects are, by
default, created inside it if no other schema is specified.

Question 41Skipped

What is generally the FASTEST way to bulk load data files from a stage?

Using the Snowpipe REST API

Loading by path (internal stages) / prefix

Correct answer

Specifying a list of specific files to load

Using pattern matching to identify specific files by pattern

Overall explanation

Of the three bulk load options for identifying/specifying data files to load from a stage,
providing a discrete list of files is generally the fastest; however, the FILES parameter
supports a maximum of 1,000 files, meaning a COPY command executed with the FILES
parameter can only load up to 1,000 files.

See this link.
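
As a sketch (stage, table, and file names are hypothetical), discrete files are listed with the FILES parameter:

```sql
-- Fastest option: name the files explicitly (up to 1,000 per COPY)
COPY INTO my_table
  FROM @my_stage
  FILES = ('data_2024_01.csv.gz', 'data_2024_02.csv.gz');
```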

Question 42Skipped

What is the minimum Snowflake edition that provides multi-cluster warehouses?

Premier

Business Critical Edition

Correct answer

Enterprise

Standard

Overall explanation

See this link

Question 43Skipped

What is the SNOWFLAKE.ACCOUNT_USAGE view that contains information about which objects were read by queries within the last 365 days (1 year)?

OBJECT_HISTORY

Correct answer

ACCESS_HISTORY

LOGIN_HISTORY

VIEWS_HISTORY

Overall explanation

Querying the ACCESS_HISTORY View This Account Usage view can be used to query the
access history of Snowflake objects (e.g. table, view, column) within the last 365 days (1
year).

See this link.

Question 44Skipped

In Snowflake, the use of federated authentication enables which Single Sign-On (SSO)
workflow activities? (Choose two.)

Performing role authentication

Authorizing users

Correct selection

Logging into Snowflake

Initiating user sessions

Correct selection

Logging out of Snowflake

Overall explanation

Federated authentication enables the following SSO workflows:

• Logging into Snowflake.

• Logging out of Snowflake.

• System timeout due to inactivity.

See this link.

Question 45Skipped

Data storage for individual tables can be monitored using which commands and/or objects?
(Choose two.)

Correct selection

SHOW TABLES;

Correct selection

Information Schema -> TABLE_STORAGE_METRICS

Information Schema -> TABLE_HISTORY

SHOW STORAGE BY TABLE;

Information Schema -> TABLE_FUNCTION

Overall explanation

These two options will show bytes stored.

See this link.

And this link.

Question 46Skipped

Which additional columns do streams create? (Choose three.)

METADATA$IS_DELETED

METADATA$COLUMN_ID

Correct selection

METADATA$ACTION

Correct selection

METADATA$ROW_ID

Correct selection

METADATA$ISUPDATE

METADATA$ISREAD

Overall explanation

METADATA$ACTION Indicates the DML operation (INSERT, DELETE) recorded.

METADATA$ISUPDATE indicates whether the operation was part of an UPDATE statement.

METADATA$ROW_ID is a unique and immutable ID for the row.
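
A minimal sketch (stream and table names are hypothetical) showing the metadata columns a stream exposes:

```sql
-- A standard stream on a table; names here are hypothetical
CREATE STREAM my_stream ON TABLE my_table;

-- Each change row carries the three METADATA$ columns
SELECT metadata$action, metadata$isupdate, metadata$row_id, *
FROM my_stream;
```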

Question 47Skipped

What happens to the incoming queries when a warehouse does not have enough resources
to process them?

Snowflake adds more clusters.

Queries are aborted.

Snowflake resizes the warehouse.

Correct answer

Queries are queued and executed when the warehouse has resources.

Overall explanation

If the warehouse does not have enough remaining resources to process a query, the query is
queued, pending resources that become available as other running queries complete.

Question 48Skipped

What is the only supported character set for loading and unloading data from all supported
file formats?

ISO-8859-1

UTF-16

WINDOWS-1253

Correct answer

UTF-8

Overall explanation

See this link.

Question 49Skipped

What entity is responsible for hosting and sharing data in Snowflake?

Correct answer

Data provider

Reader account

Managed account

Data consumer

Overall explanation

A data provider is any Snowflake account that creates shares and makes them available to
other Snowflake accounts to consume.

See this link.

Question 50Skipped

Which properties of a Resource Monitor can be modified?

Schedule, Actions.

Monitor Level, Schedule, Actions.

Credit Quota, Monitor Level, Schedule.

Correct answer

Credit Quota, Monitor Level, Schedule, Actions.

Credit Quota, Schedule, Actions.

Overall explanation

The Credit Quota specifies the number of Snowflake credits allocated to the monitor for the
specified frequency interval.

The Monitor Level specifies whether the resource monitor is used to monitor the credit
usage for your entire account or individual warehouses.

The Schedule indicates when the monitor will start monitoring and when the credits will
reset to 0.

Each action specifies a threshold and the action to perform when the threshold is reached
within the specified interval.

Question 51Skipped

How can a Snowflake user traverse semi-structured data?

Correct answer

Insert a colon (:) between the VARIANT column name and any first-level element.

Insert a double colon (::) between the VARIANT column name and any second-level
element.

Insert a colon (:) between the VARIANT column name and any second-level element.

Insert a double colon (::) between the VARIANT column name and any first-level element.

Overall explanation

See this link.

It is important to be familiar with the syntax for querying semi-structured data.
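
A minimal sketch (the column and element names are hypothetical) of the colon syntax:

```sql
-- src is a VARIANT column; a colon reaches the first-level element,
-- dot notation continues into nested levels, and :: casts the result
SELECT src:salesperson.name::STRING
FROM car_sales;
```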

Question 52Skipped

Queries in Snowflake are getting queued on the warehouses and delaying the ETL processes
of the company. What are the possible solution options you can think of, considering we
have the Snowflake Enterprise edition? (Choose two.)

Contact Snowflake support to increase the size of the warehouse.

Upgrade Snowflake edition

Correct selection

Resize the warehouse.

Correct selection

Use a multi-cluster warehouse.

Set auto-resize parameter to TRUE.

Overall explanation

By resizing the warehouse, your company will scale up, reducing the time needed to execute big queries. By using multi-cluster warehouses, you will have more queries running simultaneously with higher concurrency, which is the definition of scaling out. You can see the differences between the different ways to scale in the following picture:

See this link.

Question 53Skipped

Which property needs to be added to the ALTER WAREHOUSE command to verify the
additional compute resources for a virtual warehouse have been fully provisioned?

SCALING_POLICY

RESOURCE_MONITOR

QUERY_ACCELERATION_MAX_SCALE_FACTOR

Correct answer

WAIT_FOR_COMPLETION

Overall explanation

See this link.
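
As a sketch (the warehouse name is hypothetical), the property blocks a resize until the new resources are fully provisioned:

```sql
-- Returns only after the XLARGE resources are fully provisioned
ALTER WAREHOUSE my_wh
  SET WAREHOUSE_SIZE = 'XLARGE'
      WAIT_FOR_COMPLETION = TRUE;
```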

Question 54Skipped

What transformations are supported in a CREATE PIPE ... AS COPY ... FROM (...) statement? (Choose two.)

Data can be filtered by an optional WHERE clause.

Correct selection

Columns can be omitted.

Correct selection

Columns can be reordered.

Incoming data can be joined with other tables.

Row level access can be defined.

Overall explanation

See this link
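
A hedged example (pipe, table, and stage names are hypothetical) showing the two supported transformations, omitting and reordering columns:

```sql
-- Only a subset of columns is loaded, and in a different order
CREATE PIPE my_pipe AS
  COPY INTO my_table (col2, col1)
  FROM (SELECT t.$2, t.$1 FROM @my_stage t)
  FILE_FORMAT = (TYPE = 'CSV');
```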

Question 55Skipped

How are Snowpipe charges calculated?

Correct answer

Per-second/per-core granularity

Per-second/per Warehouse size

Total storage bucket size

Number of Pipes in account

Overall explanation

See this link

Question 56Skipped

Which of these features is NOT supported by Snowflake?

Manage roles.

Correct answer

Manage the data that 3rd party applications upload to the marketplace.

Manage users.

Monitor usage and cost.

Overall explanation

You can provide and consume listings offered privately or publicly using the Snowflake
Marketplace, discovering and accessing a variety of third-party datasets. Becoming a
provider of listings in Snowflake makes it easier to manage sharing from your account to
other Snowflake accounts.

Question 57Skipped

Which Snowflake feature is used for both querying and restoring data?

Fail-safe

Correct answer

Time Travel

Cluster keys

Cloning

Overall explanation

See this link

Question 58Skipped

By default, the COPY INTO statement will separate table data into a set of output files to take
advantage of which Snowflake feature?

Correct answer

Parallel processing

Time Travel

Query plan caching

Query acceleration

Overall explanation

By default, the COPY INTO <location> statement separates table data into a set of output files to take advantage of Snowflake's parallel processing. When data is unloaded, it is split into multiple files that can be written in parallel, improving performance. This splitting can be controlled with the SINGLE and MAX_FILE_SIZE copy options of the COPY INTO statement.

Question 59Skipped

How does a Snowflake user extract the URL of a directory table on an external stage for
further transformation?

Use the DESCRIBE STAGE command.

Correct answer

Use the GET_STAGE_LOCATION function.

Use the GET_ABSOLUTE_PATH function.

Use the SHOW STAGES command.

Overall explanation

The GET_STAGE_LOCATION function returns the location of a stage, including the URL of the directory table. The syntax for the GET_STAGE_LOCATION function is:

SELECT GET_STAGE_LOCATION(@my_stage)

See this link.

Question 60Skipped

What type of query benefits the MOST from search optimization?

A query that uses only disjunction (i.e., OR) predicates

A query that includes analytical expressions

A query that filters on semi-structured data types

Correct answer

A query that uses equality predicates or predicates that use IN

Overall explanation

See this link.

And see this link

Question 61Skipped

When loading data into Snowflake via Snowpipe what is the compressed file size
recommendation?

1000-1500 MB

10-50 MB

Correct answer

100-250 MB

300-500 MB

Overall explanation

We recommend files of at least 10 MB on average, with files in the 100 to 250 MB range offering the best cost-to-performance ratio.

See this link

Question 62Skipped

What is used during the FIRST execution of SELECT COUNT(*) FROM ORDER?

Cache result

Correct answer

Metadata-based result

Remote disk cache

Virtual warehouse cache

Overall explanation

Some queries include steps that are pure metadata/catalog operations rather than data-processing operations. These steps consist of a single operator. Some examples include:

Metadata-based Result

A query whose result is computed based purely on metadata, without accessing any data.
These queries are not processed by a virtual warehouse. For example:

SELECT COUNT(*) FROM …

SELECT CURRENT_DATABASE()

See this link.

Question 63Skipped

According to Snowflake best practice recommendations, which system-defined roles should be used to create custom roles? (Choose two.)

ORGADMIN

ACCOUNTADMIN

SYSADMIN

Correct selection

USERADMIN

Correct selection

SECURITYADMIN

Overall explanation

Custom account roles can be created using the USERADMIN role (or a higher role)

See this link.

Question 64Skipped

Users with the ACCOUNTADMIN role can execute which of the following commands on
existing users?

Can SHOW users, DEFINE a given user or ALTER, DROP, or MODIFY a user

Can DEFINE users, DESCRIBE a given user, or ALTER or DELETE a user

Can SHOW users, INDEX a given user, or ALTER or DELETE a user

Correct answer

Can SHOW users DESCRIBE a given user, or ALTER or DROP a user

Overall explanation

Only these operations are allowed: CREATE USER, ALTER USER, DROP USER, DESCRIBE USER.

See this link.

Question 65Skipped

Which REST API can be used with unstructured data?

insertReport

Correct answer

GET /api/files/

insertFiles

loadHistoryScan

Overall explanation

See this link.

Question 66Skipped

What type of account can be used to share data with a consumer who does not have a
Snowflake account?

Organization

Data provider

Data consumer

Correct answer

Reader

Overall explanation

Data sharing is only supported between Snowflake accounts. As a data provider, you might
want to share data with a consumer who does not already have a Snowflake account or is
not ready to become a licensed Snowflake customer.

To facilitate sharing data with these consumers, you can create reader accounts. Reader
accounts (formerly known as “read-only accounts”) provide a quick, easy, and cost-effective
way to share data without requiring the consumer to become a Snowflake customer.

See this link.

Question 67Skipped

A user has a standard multi-cluster warehouse auto-scaling policy in place.

Which condition will trigger a cluster to shut-down?

When after 5-6 consecutive checks the system determines that the load on the most-loaded cluster could be redistributed.

Correct answer

When after 2-3 consecutive checks the system determines that the load on the least-loaded cluster could be redistributed.

When after 5-6 consecutive checks the system determines that the load on the least-loaded cluster could be redistributed.

When after 2-3 consecutive checks the system determines that the load on the most-loaded cluster could be redistributed.

Overall explanation

After 2 to 3 consecutive successful checks (performed at 1-minute intervals), which determine whether the load on the least-loaded cluster could be redistributed to the other clusters without spinning up the cluster again.

See this link.

Question 68Skipped

As a best practice, clustering keys should only be defined on tables of which minimum size?

Correct answer

Multi-Terabyte (TB) Range

Multi-Gigabyte (GB) Range

Multi-Megabyte (MB) Range

Multi-Kilobyte (KB) Range

Overall explanation

See this link

Question 69Skipped

Which roles can create shares and resource monitors?

USERADMIN

Correct answer

ACCOUNTADMIN

SYSADMIN

SECURITYADMIN

Overall explanation

ACCOUNTADMIN is the only role that is able to create Shares and Resource Monitors by
default. However, account administrators can choose to enable users with other roles to
view and modify resource monitors using SQL.

See this link.

Question 70Skipped

Which of the following are not types of streams in Snowflake? (Choose two.)

Append-only.

Correct selection

Update-only.

Insert-only.

Standard.

Correct selection

Merge-only.

Overall explanation

Standard and Append-only streams are supported on tables, directory tables, and views. The
Standard one tracks all DML changes to the source table, including inserts, updates, and
deletes, whereas the Append-only one Tracks row inserts only. The Insert-only stream also
tracks row inserts only. The difference with the previous one is that this one is only
supported on EXTERNAL TABLES.

See this link.
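
A minimal sketch (all object names are hypothetical) contrasting the stream types:

```sql
-- Standard stream: tracks inserts, updates, and deletes
CREATE STREAM std_stream ON TABLE my_table;

-- Append-only stream: tracks row inserts only
CREATE STREAM ins_stream ON TABLE my_table APPEND_ONLY = TRUE;

-- Insert-only stream: row inserts only, supported on external tables
CREATE STREAM ext_stream ON EXTERNAL TABLE my_ext_table INSERT_ONLY = TRUE;
```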

Question 71Skipped

Which of the following describes external functions in Snowflake?

They contain their own SQL code.

They call code that is stored inside of Snowflake.

They can return multiple rows for each row received.

Correct answer

They are a type of User-defined Function (UDF).

Overall explanation

An external function is a type of UDF. Unlike other UDFs, an external function does not
contain its own code; instead, the external function calls code that is stored and executed
outside Snowflake

See this link.

Question 72Skipped

Which statements are NOT correct about micro-partitions in Snowflake? (Choose two.)

50 and 500MB of uncompressed data.

Contiguous units of storage.

Correct selection

Non-contiguous units of storage.

Organized in a columnar way.

Correct selection

50 and 500MB of compressed data.

Overall explanation

This definition is a must, and we need to know it perfectly “All data in Snowflake tables are
automatically divided into micro-partitions, which are contiguous units of storage between
50 and 500MB of uncompressed data, organized in a columnar way”.

See this link.

Question 73Skipped

How would a user execute a series of SQL statements using a task?

Correct answer

Use a stored procedure executing multiple SQL statements and invoke the stored
procedure from the task. CREATE TASK mytask .... AS call
stored_proc_multiple_statements_inside();

Create a task for each SQL statement (e.g. resulting in task1, task2, etc.) and string the
series of SQL statements by having a control task calling task1, task2, etc. sequentially.

Include the SQL statements in the body of the task CREATE TASK mytask .. AS INSERT INTO
target1 SELECT .. FROM stream_s1 WHERE .. INSERT INTO target2 SELECT .. FROM
stream_s1 WHERE ..

A stored procedure can have only one DML statement per stored procedure invocation
and therefore the user should sequence stored procedure calls in the task
definition CREATE TASK mytask .... AS call stored_proc1(); call stored_proc2();

Overall explanation

See this link.

Question 74Skipped

Which of the following are valid options for the VALIDATION_MODE parameter within the Snowflake COPY INTO command? (Choose two.)

TRUE

Correct selection

RETURN_ALL_ERRORS

RETURN_FIRST_n_ERRORS

Correct selection

RETURN_n_ROWS

RETURN_ERROR_SUM

Overall explanation

VALIDATION_MODE = RETURN_n_ROWS | RETURN_ERRORS | RETURN_ALL_ERRORS

See this link.

Question 75Skipped

Which objects together comprise a namespace in Snowflake? (Choose two.)

Correct selection

Schema

Virtual warehouse

Correct selection

Database

Table

Account

Overall explanation

See this link.

Question 76Skipped

What is the most granular object that the Time Travel retention period can be defined on?

Database

Account

Correct answer

Table

Schema

Overall explanation

The time travel data retention can be overwritten at the table level "When creating a table,
schema, or database, the account default can be overridden using the
DATA_RETENTION_TIME_IN_DAYS parameter in the command."
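
A minimal sketch (the table name is hypothetical) overriding the account default at the table level:

```sql
-- 90 days of Time Travel requires Enterprise edition or above
CREATE TABLE audit_log (id NUMBER, payload VARIANT)
  DATA_RETENTION_TIME_IN_DAYS = 90;
```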

Question 77Skipped

What are the responsibilities of Snowflake's Cloud Service layer? (Choose three.)

Correct selection

Query parsing and optimization

Correct selection

Authentication

Virtual warehouse caching

Physical storage of micro-partitions

Query execution

Correct selection

Metadata management

Overall explanation

The cloud services layer is a collection of services that coordinate activities across Snowflake.
These services tie together all of the different components of Snowflake in order to process
user requests, from login to query dispatch. The cloud services layer also runs on compute
instances provisioned by Snowflake from the cloud provider.

Services managed in this layer include:

• Authentication

• Infrastructure management

• Metadata management

• Query parsing and optimization

• Access control

See this link.

Question 78Skipped

Which of the following are best practices for loading data into Snowflake? (Choose three.)

Correct selection

Enclose fields that contain delimiter characters in single or double quotes.

Correct selection

Split large files into a greater number of smaller files to distribute the load among the
compute resources in an active warehouse.

Load data from files in a cloud storage service in a different region or cloud platform from
the service or region containing the Snowflake account, to save on cost.

Partition the staged data into large folders with random paths, allowing Snowflake to
determine the best way to load each file.

Correct selection

Aim to produce data files that are between 100 MB and 250 MB in size, compressed.

When planning which warehouse(s) to use for data loading, start with the largest
warehouse possible.

Overall explanation

To optimize the number of parallel operations for a load, we recommend aiming to produce
data files roughly 100-250 MB (or larger) in size compressed.

Fields that contain delimiter characters should be enclosed in quotes (single or double). If
the data contains single or double quotes, then those quotes must be escaped.

Split larger files into a greater number of smaller files to distribute the load among the
compute resources in an active warehouse.

See this link

Question 79Skipped

Which governance feature is supported by all Snowflake editions?

Masking policies

Row access policies

Object tags

Correct answer

OBJECT_DEPENDENCIES view

Overall explanation

Object tags, masking policies, and row access policies require the Enterprise edition or above. The only remaining option is the OBJECT_DEPENDENCIES view.

See this link.

Question 80Skipped

Which of the following accurately represents how a table fits into Snowflake's logical
container hierarchy?

Database -> Table -> Schema -> Account

Database -> Schema -> Table -> Account

Account -> Schema -> Database -> Table

Correct answer

Account -> Database -> Schema -> Table

Overall explanation

See this link

Question 81Skipped

Which of the following are characteristics of Snowflake virtual warehouses? (Choose two.)

Correct selection

SnowSQL supports both a configuration file and a command line option for specifying a
default warehouse.

Correct selection

The default virtual warehouse size can be changed at any time.

A user cannot specify a default warehouse when using the ODBC driver.

Auto-resume applies only to the last warehouse that was started in a multi-cluster
warehouse.

The ability to auto-suspend a warehouse is only available in the Enterprise edition or above.

Overall explanation

See this link.

Question 82Skipped

Which are the metadata columns for staged files? (Choose two.)

Correct selection

METADATA$FILE_ROW_NUMBER

Correct selection

METADATA$FILENAME

METADATA$FILE_SIZE

METADATA$FILE_ROW_ID

METADATA$FILEFORMAT

Overall explanation

The METADATA$FILENAME column is the name of the staged data file that the current row belongs to. The METADATA$FILE_ROW_NUMBER column is the row number for each record in the staged data file. This is one way to query the stage metadata:

SELECT metadata$filename, metadata$file_row_number FROM @MY_STAGE;

You can see another example (via docs.snowflake.com) in the following image:

Question 83Skipped

What setting in Snowsight determines the databases, tables, and other objects that can be
seen and the actions that can be performed on them?

Masking policy

Correct answer

Active role

Column-level security

Multi-Factor Authentication (MFA)

Overall explanation

While using Snowsight, you can change the active role in your current session. Your active
role determines the databases, tables, and other objects you can see and the actions you
can perform on them.

See this link.

Question 84Skipped

The use of which Snowflake table type will reduce costs when working with ETL workflows?

Permanent

Correct answer

Temporary

Transient

External

Overall explanation

Snowflake supports creating temporary tables for storing non-permanent, transitory data
(e.g. ETL data, session-specific data). Temporary tables only exist within the session in which
they were created and persist only for the remainder of the session.

See this link.
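
A hedged example (the table name is hypothetical); the table disappears when the session ends, so no long-term storage is billed:

```sql
-- Exists only for the current session; no Fail-safe, at most 1 day of Time Travel
CREATE TEMPORARY TABLE etl_scratch AS
SELECT * FROM source_table WHERE load_date = CURRENT_DATE();
```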

Question 85Skipped

When should a multi-cluster virtual warehouse be used in Snowflake?

When dynamic vertical scaling is being used in the warehouse

Correct answer

When queuing is delaying query execution on the warehouse

When there are no concurrent queries running on the warehouse

When there is significant disk spilling shown on the Query Profile

Overall explanation

See this link.

Question 86Skipped

After how many days does the COPY INTO load metadata expire?

Correct answer

64 days.

1 day.

180 days.

14 days.

Overall explanation

The information about the loaded files is stored in Snowflake metadata. It means that you
cannot COPY the same file again in the next 64 days unless you specify it (with the
"FORCE=True" option in the COPY command). You can see this behavior in the following
image:

Question 87Skipped

Which SQL statement will require a virtual warehouse to run?

CREATE OR REPLACE TABLE TBL_EMPLOYEE
(
    EMP_ID NUMBER,
    EMP_NAME VARCHAR(30),
    EMP_SALARY NUMBER,
    DEPT VARCHAR(20)
);

ALTER TABLE TBL_EMPLOYEE ADD COLUMN EMP_REGION VARCHAR(20);

SELECT COUNT(*) FROM TBL_EMPLOYEE;

Correct answer

INSERT INTO TBL_EMPLOYEE(EMP_ID, EMP_NAME, EMP_SALARY, DEPT) VALUES(1, 'Adam', 20000, 'Finance');

Overall explanation

A warehouse provides the required resources, such as CPU, memory, and temporary storage,
to perform the following operations in a Snowflake session:

Executing SQL SELECT statements that require compute resources (e.g. retrieving rows from
tables and views).

Performing DML operations, such as:

• Updating rows in tables (DELETE , INSERT , UPDATE).

• Loading data into tables (COPY INTO <table>).

• Unloading data from tables (COPY INTO <location>).

Queries using Snowflake metadata do not require a Warehouse to be turned on and do not
consume credits.

See this link.

Question 88Skipped

Which character or character combination identifies a table stage?

“%”

“/@”

“@”

Correct answer

“@%”

Overall explanation

Table Stage

The following example uploads a file named data.csv in the /data directory on your local
machine to the stage for a table named mytable.

Note that the @% character combination identifies a table stage.

• Linux or macOS

PUT file:///data/data.csv @%mytable;

• Windows

PUT file://C:\data\data.csv @%mytable;

See this link.

Question 89Skipped

Which property helps us control the credits consumed by a multi-cluster warehouse?

Maximum Clusters (MAX_CLUSTERS)

Maximum Credits (MAX_CREDITS)

Auto scale (AUTO_SCALE)

Correct answer

Scaling policy (SCALING_POLICY)

Overall explanation

When you create a multi-cluster warehouse, you need to specify a scaling policy to help you control the credits consumed by the multi-cluster warehouse. There are two types of policies: the "Standard policy" prioritizes starting additional warehouses over conserving credits, while the "Economy policy" is more restrictive and prioritizes conserving credits over starting additional warehouses. You can set the scaling policy by executing the CREATE WAREHOUSE or ALTER WAREHOUSE command with the SCALING_POLICY property. For example:

ALTER WAREHOUSE mywh SET SCALING_POLICY = 'ECONOMY';

Question 90Skipped

Which cache can also be referred to as SSD or local cache?

Results cache.

Metadata cache.

Correct answer

Warehouse cache.

Standard cache.

Overall explanation

Every warehouse has an attached Warehouse cache, also known as the SSD or Local cache.
While the data warehouse runs, the table fetched in the query will remain in this cache.
When the warehouse is suspended, the information will be lost.

Question 91Skipped

What privilege does a user need in order to receive or request data from the Snowflake
Marketplace?

Correct answer

IMPORT SHARE

CREATE SHARE

CREATE DATA EXCHANGE LISTING

IMPORTED PRIVILEGES

Overall explanation

You must use the ACCOUNTADMIN role or another role with the CREATE DATABASE and
IMPORT SHARE privileges to access a listing.

See this link.

Question 92Skipped

A user has unloaded data from Snowflake to a stage.

Which SQL command should be used to validate which data was loaded into the stage?

verify @file_stage

Correct answer

list @file_stage

view @file_stage

show @file_stage

Overall explanation

See this link.
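As a sketch of how the correct command behaves (the stage name file_stage is taken from the question; the path and pattern are assumptions):

```sql
-- Lists each staged file with its name, size, MD5 hash, and last_modified timestamp
LIST @file_stage;

-- The listing can be narrowed with a path prefix and a regular-expression pattern
LIST @file_stage/unload/ PATTERN = '.*[.]csv[.]gz';
```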

Question 93Skipped

Which Snowflake object returns a set of rows instead of a single, scalar value, and can be
accessed in the FROM clause of a query?

Correct answer

UDTF

UDF

There is no object in Snowflake with the ability to return a set of rows

Stored procedure

Overall explanation

User-defined functions (UDFs) let you extend the system to perform operations that are not
available through Snowflake’s built-in, system-defined functions.

UDTFs can return multiple rows for each input row; that’s the only difference with UDFs.
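A minimal UDTF sketch (the orders table and column names are assumptions) showing the RETURNS TABLE clause and the TABLE(...) wrapper in the FROM clause:

```sql
CREATE OR REPLACE FUNCTION orders_for(cust_id NUMBER)
  RETURNS TABLE (order_id NUMBER, amount NUMBER)
  AS
  $$
    SELECT order_id, amount
    FROM orders
    WHERE customer_id = cust_id
  $$;

-- Unlike a scalar UDF, the UDTF is queried in the FROM clause
SELECT o.order_id, o.amount
FROM TABLE(orders_for(42)) AS o;
```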

Question 94Skipped

By default, how long is the standard retention period for Time Travel across all Snowflake
accounts?

0 days

90 days

Correct answer

1 day

7 days

Overall explanation

By default, Time travel is enabled with a 1-day retention period. However, we can increase it
to 90 days if we have (at least) the Snowflake Enterprise Edition. It requires additional
storage, which will be reflected in your monthly storage charges. You can see how the Time
Travel functionality works in the following image:

See this link.
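A sketch of how the retention period can be inspected and raised (the table name my_table is an assumption; 90 days requires Enterprise Edition or higher):

```sql
-- Show the current retention period (defaults to 1 day)
SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS' IN TABLE my_table;

-- Extend Time Travel to the 90-day maximum
ALTER TABLE my_table SET DATA_RETENTION_TIME_IN_DAYS = 90;
```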

Question 95Skipped

Which of the following statements is true of Snowflake?

Correct answer

It was built specifically for the cloud

It was built for Hadoop Architecture

It was designed as a hybrid database to allow customers to store data either on premises
or in the cloud

It was built as an on-premises solution and then ported to the cloud

Overall explanation

See this link

Question 96Skipped

Which functions can be used to share unstructured data through a secure view? (Choose
two.)

BUILD_STAGE_FILE_URL

GET_ABSOLUTE_PATH

Correct selection

BUILD_SCOPED_FILE_URL

Correct selection

GET_PRESIGNED_URL

GET_RELATIVE_PATH

Overall explanation

See this link.
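A sketch of the pattern (stage, table, and column names are assumptions): the secure view exposes scoped URLs so consumers can fetch the files without seeing the stage itself:

```sql
CREATE SECURE VIEW images_v AS
  SELECT id,
         BUILD_SCOPED_FILE_URL(@images_stage, relative_path) AS scoped_url
  FROM image_catalog;
```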

Question 97Skipped

Which of the following objects can be shared through secure data sharing?

Stored procedure

Masking policy

Task

Correct answer

External table

Overall explanation

The following Snowflake database objects can be shared:

• Tables

• External tables

• Secure views

• Secure materialized views

• Secure UDFs

Question 98Skipped

Why would a Snowflake user decide to use a materialized view instead of a regular view?

The results of the view change often.

The query results are not used frequently.

Correct answer

The base tables do not change frequently.

The query is not resource intensive.

Overall explanation

See this link.
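A minimal sketch (table and column names are assumptions): the aggregate is precomputed once and kept in sync by Snowflake, which pays off when the base table changes rarely but is queried often:

```sql
CREATE MATERIALIZED VIEW daily_totals AS
  SELECT sale_date, SUM(amount) AS total_amount
  FROM sales
  GROUP BY sale_date;
```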

Question 99Skipped

What is the main function of Business Intelligence tools (e.g., Tableau or Quicksight)?

Transform data from other source systems and move them into Snowflake stages.

Extract data from other source systems and move them into Snowflake stages.

Carry out Machine Learning analysis.

Correct answer

Create charts from Snowflake data to give more business insights.

Overall explanation

Business Intelligence (BI) tools are software applications that enable organizations to collect,
analyze, and present data in a user-friendly way to make better business decisions. BI tools
can help organizations identify trends, patterns, and insights in their data, which can
optimize business processes, improve performance, and gain a competitive advantage. You
can create charts like this one using Tableau from Snowflake data. You should create views to
be able to use VARIANT columns in these types of tools, apart from paying them separately.
You can see an example of a Tableau chart in the following image:

Question 100Skipped

How long is the Fail-safe period for temporary and transient tables?

90 days

31 days

7 days

Correct answer

There is no Fail-safe period for these tables.

1 day

Overall explanation

See this link

Question 101Skipped

By definition, a secure view is exposed only to users with what privilege?

IMPORT SHARE

REFERENCES

USAGE

Correct answer

OWNERSHIP

Overall explanation

OWNERSHIP.

The definition of a secure view is only exposed to authorized users (i.e. users who have been
granted the role that owns the view).

See this link.

Question 102Skipped

The Snowflake Cloud Data Platform is described as having which of the following
architectures?

Shared-nothing

Correct answer

Multi-cluster shared data

Serverless query engine

Shared-disk

Overall explanation

Snowflake's architecture is a hybrid of traditional shared-disk and shared-nothing database architectures.

See this link

Question 103Skipped

What is the lowest Snowflake edition that offers Time Travel up to 90 days?

Correct answer

Enterprise Edition

Premier Edition

Standard Edition

Business Critical Edition

Overall explanation

See this link

Question 104Skipped

Which SQL command will download all the data files from an internal table stage named
TBL_EMPLOYEE to a local window directory or folder on a client machine in a folder named
folder with space within the C drive?

GET @%TBL_EMPLOYEE 'file://C:\folder with space\';

PUT 'file://C:/folder with space/*' @%TBL_EMPLOYEE;

PUT 'file://C:\folder with space\*' @%TBL_EMPLOYEE;

Correct answer

GET @%TBL_EMPLOYEE 'file://C:/folder with space/';

Overall explanation

If the directory path includes special characters, the entire file URI must be enclosed in
single quotes. Note that the drive and path separator is a forward slash (/) in enclosed URIs
(e.g. 'file://C:/temp/load data' for a path in Windows that includes a directory named load
data).

See this link.

Question 105Skipped

The fail-safe retention period is how many days?

45 days

Correct answer

7 days

90 days

1 day

Overall explanation

See this link

Question 106Skipped

What are types of partners in the Snowflake ecosystem? (Choose two.)

Correct selection

Solution Partners.

Personalized Partners.

Private Partners.

Standard Partners.

Correct selection

Technology Partners.

Overall explanation

Technology Partners integrate their solutions with Snowflake to get data quickly into
Snowflake and offer software, driver, interfaces, etc. Solution Partners are trusted and
validated experts, like consulting partners. You can see the whole Snowflake ecosystem in
the following image:

Question 107Skipped

Which function will provide the proxy information needed to protect Snowsight?

Correct answer

SYSTEM$ALLOWLIST

SYSTEM$GET_PRIVATELINK

SYSTEM$GET_TAG

SYSTEM$AUTHORIZE_PRIVATELINK

Overall explanation

To determine the fully qualified URL and port for Snowsight, review the
SNOWSIGHT_DEPLOYMENT entry in the return value of the SYSTEM$ALLOWLIST function.

See this link.

Question 108Skipped

Which validation option is the only one that supports the COPY INTO <location> command?

RETURN_N_ROWS

RETURN_ERRORS

Correct answer

RETURN_ROWS

RETURN_ALL_ERRORS

Overall explanation

• Loading:

COPY INTO <table> VALIDATION_MODE = RETURN_N_ROWS | RETURN_ERRORS | RETURN_ALL_ERRORS

See this link.

• Unloading:

COPY INTO <location> VALIDATION_MODE = RETURN_ROWS

See this link.

Question 109Skipped

What is a characteristic of data micro-partitioning in Snowflake?

Micro-partitioning may introduce data skew

Correct answer

Micro-partitioning happens when the data is loaded

Micro-partitioning requires the definition of a partitioning schema

Micro-partitioning can be disabled within a Snowflake account

Overall explanation

Micro-partitioning is automatically performed on all Snowflake tables. Tables are transparently partitioned using the ordering of the data as it is inserted/loaded.

See this link.

Question 110Skipped

When reviewing a query profile, what is a symptom that a query is too large to fit into the
memory?

An AggregateOperator node is present

Correct answer

The query is spilling to remote storage

A single join node uses more than 50% of the query time

Partitions scanned is equal to partitions total

Overall explanation

See this link

Question 111Skipped

When unloading the data for file format type specified (TYPE = 'CSV'), SQL NULL can be
converted to string ‘null’ using which file format option?

SKIP_BYTE_ORDER_MARK

Correct answer

NULL_IF

EMPTY_FIELD_AS_NULL

ESCAPE_UNENCLOSED_FIELD

Overall explanation

NULL_IF = ( 'string1' [ , 'string2' ... ] )

When unloading data from tables: Snowflake converts SQL NULL values to the first value in
the list. Be careful to specify a value that you want interpreted as NULL. For example, if you
are unloading data to a file that will get read by another system, make sure to specify a value
that will be interpreted as NULL by that system.

See this link.
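A sketch of such an unload (stage and table names are assumptions):

```sql
-- SQL NULLs in my_table are written to the files as the literal string 'null'
COPY INTO @my_stage/out/
FROM my_table
FILE_FORMAT = (TYPE = 'CSV' NULL_IF = ('null'));
```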

Question 112Skipped

A running virtual warehouse is suspended.

What is the MINIMUM amount of time that the warehouse will incur charges for when it is
restarted?

Correct answer

60 seconds

60 minutes

5 minutes

1 second

Overall explanation

The minimum billing charge for provisioning compute resources is 1 minute (i.e. 60 seconds).

See this link.

Question 113Skipped

What privileges are required to create a task?

The GLOBAL privilege CREATE TASK is required to create a new task.

Tasks are created at the Application level and can only be created by the Account Admin
role.

Many Snowflake DDLs are metadata operations only, and CREATE TASK DDL can be
executed without virtual warehouse requirement or task specific grants.

Correct answer

The role must have access to the target schema and the CREATE TASK privilege on the
schema itself.

Overall explanation

See this link

Question 114Skipped

How many shares can be consumed by a single Data Consumer?

100, but can be increased by contacting support

10

Correct answer

Unlimited

Overall explanation

See this link

Question 115Skipped

In which different ways can we query historical data? (Choose three.)

By user.

Correct selection

By timestamp.

By backup.

By session.

Correct selection

By query statement ID.

Correct selection

By offset.

Overall explanation

Querying over historical data is one of the main functionalities of Snowflake Time Travel,
apart from restoring deleted objects. With Snowflake, you don’t need to duplicate or back
up data from key points in the past. Here you have an example of querying the historical
data by query statement ID:

SELECT *
FROM my_table
BEFORE(STATEMENT => '020e4e3f-3201-de43-0560-32fd00b1355e');
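The other two forms look like this (the table name and timestamp are placeholders):

```sql
-- By timestamp
SELECT * FROM my_table AT(TIMESTAMP => '2024-05-01 12:00:00'::TIMESTAMP_LTZ);

-- By offset, in seconds relative to now (here: five minutes ago)
SELECT * FROM my_table AT(OFFSET => -60 * 5);
```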

Question 116Skipped

What parameter controls if the Virtual Warehouse starts immediately after the CREATE
WAREHOUSE statement?

START_TIME = CURRENT_DATE()

START_THE = 60 // (seconds from now)

START_AFTER_CREATE = TRUE/FALSE

Correct answer

INITIALLY_SUSPENDED = TRUE/FALSE

Overall explanation

Syntax

CREATE [ OR REPLACE ] WAREHOUSE [ IF NOT EXISTS ] <name>

[ [ WITH ] objectProperties ]

[ objectParams ]

Where:

objectProperties ::=

WAREHOUSE_SIZE = XSMALL | SMALL | MEDIUM | LARGE | XLARGE | XXLARGE | XXXLARGE | X4LARGE | X5LARGE | X6LARGE

MAX_CLUSTER_COUNT = <num>

MIN_CLUSTER_COUNT = <num>

SCALING_POLICY = STANDARD | ECONOMY

AUTO_SUSPEND = <num> | NULL

AUTO_RESUME = TRUE | FALSE

INITIALLY_SUSPENDED = TRUE | FALSE

RESOURCE_MONITOR = <monitor_name>

COMMENT = '<string_literal>'

ENABLE_QUERY_ACCELERATION = TRUE | FALSE

QUERY_ACCELERATION_MAX_SCALE_FACTOR = <num>

Question 117Skipped

What is the default File Format used in the COPY command if one is not specified?

XML

Parquet

JSON

Correct answer

CSV

Overall explanation

See this link

Question 118Skipped

Which privilege is required to use the search optimization service in Snowflake?

GRANT SEARCH OPTIMIZATION ON SCHEMA TO ROLE

GRANT SEARCH OPTIMIZATION ON DATABASE TO ROLE

Correct answer

GRANT ADD SEARCH OPTIMIZATION ON SCHEMA TO ROLE

GRANT ADD SEARCH OPTIMIZATION ON DATABASE TO ROLE

Overall explanation

To add, configure, or remove search optimization for a table, you must have the following
privileges:

• You must have OWNERSHIP privilege on the table.

• You must have ADD SEARCH OPTIMIZATION privilege on the schema that contains
the table.

GRANT ADD SEARCH OPTIMIZATION ON SCHEMA <schema_name> TO ROLE <role>

See this link.
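Putting the two required privileges together (schema, table, and role names are assumptions):

```sql
-- Schema-level privilege needed by the role that will configure the service
GRANT ADD SEARCH OPTIMIZATION ON SCHEMA my_schema TO ROLE analyst;

-- The table owner can then enable search optimization on the table
ALTER TABLE my_schema.my_table ADD SEARCH OPTIMIZATION;
```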

Question 119Skipped

Which data formats are supported by Snowflake when unloading semi-structured data?
(Choose two.)

Plain text file containing XML elements

Comma-separated JSON

Correct selection

Newline Delimited JSON

Binary file in Avro

Correct selection

Binary file in Parquet

Overall explanation

Binary file in Parquet

Newline Delimited JSON (ndjson)

See this link.
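A sketch of both supported unload formats (stage and table names are assumptions):

```sql
-- Binary Parquet files
COPY INTO @my_stage/parquet/ FROM my_table
FILE_FORMAT = (TYPE = 'PARQUET');

-- Newline-delimited JSON (one JSON document per line)
COPY INTO @my_stage/json/ FROM my_table
FILE_FORMAT = (TYPE = 'JSON');
```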

Question 120Skipped

Which formats are supported for unloading data from Snowflake? (Choose two.)

Correct selection

JSON

Correct selection

Delimited (CSV, TSV, etc.)

XML

ORC

Avro

Overall explanation

See this link

Question 121Skipped

What is used to diagnose and troubleshoot network connections to Snowflake?

Snowpark

Snowsight

Correct answer

SnowCD

SnowSQL

Overall explanation

SnowCD (i.e. Snowflake Connectivity Diagnostic Tool) helps users to diagnose and
troubleshoot their network connection to Snowflake.

Question 122Skipped

Which data types in Snowflake are synonymous for FLOAT? (Choose two.)

Correct selection

DOUBLE

DECIMAL

NUMERIC

NUMBER

Correct selection

REAL

Overall explanation

DOUBLE, DOUBLE PRECISION, and REAL are synonymous with FLOAT.

See this link.

Question 123Skipped

Which of the following file formats is not supported by Snowflake?

Avro

XML

Correct answer

XLSX

ORC

Overall explanation

A File Format object describes and stores the format information required to load data into
Snowflake tables. CSV, JSON, Parquet, XML, Avro & ORC are the supported Snowflake file
formats. To import XLSX, you'll have to convert it to CSV first.

Question 124Skipped

How many warehouses do you need to run to have Snowpipe constantly running?

10

100

Correct answer

You don’t need any warehouse as Snowpipe is a serverless feature.

Overall explanation

Snowpipe enables loading data when the files are available in any (internal/external) stage.
You use it when you have a small volume of frequent data, and you load it continuously
(micro-batches). Snowpipe is serverless, which means that it doesn’t use Virtual
Warehouses. You can see how Snowpipe works in the following diagram:

Question 125Skipped

If you want a multi-cluster warehouse, which is the lowest Snowflake edition that you should
opt for?

Correct answer

Enterprise.

Business Critical.

Virtual Private Snowflake.

Standard.

Overall explanation

You can see some differences between the Snowflake editions in the following image:

Question 126Skipped

For which use cases is running a virtual warehouse required? (Choose two.)

Correct selection

When unloading data from a table

When creating a table

Correct selection

When loading data into a table

When executing a SHOW command

When executing a LIST command

Overall explanation

Executing SQL SELECT statements that require compute resources (e.g. retrieving rows from
tables and views).

Performing DML operations, such as:

• Updating rows in tables (DELETE , INSERT , UPDATE).

• Loading data into tables (COPY INTO <table>).

• Unloading data from tables (COPY INTO <location>)

See this link.

Question 127Skipped

Snowflake best practice recommends that which role be used to enforce a network policy on
a Snowflake account?

SYSADMIN

ACCOUNTADMIN

Correct answer

SECURITYADMIN

USERADMIN

Overall explanation

See this link.

Question 128Skipped

A tabular User-Defined Function (UDF) is defined by specifying a return clause that contains
which keyword?

ROW_NUMBER

VALUES

TABULAR

Correct answer

TABLE

Overall explanation

RETURNS TABLE(...)

Specifies that the UDF should return a table. Inside the parentheses, specify name-and-type
pairs for columns (as described below) to include in the returned table.

See this link.

Question 129Skipped

What is the default behavior of internal stages in Snowflake?

Correct answer

Each user and table are automatically allocated an internal stage.

Named internal stages are created by default.

Data files are automatically staged to a default location.

Users must manually create their own internal stages.

Overall explanation

By default, each user and table in Snowflake is automatically allocated an internal stage for
staging data files to be loaded. In addition, you can create named internal stages.

See this link.
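A sketch of the three kinds of internal stage (file paths and object names are assumptions); only the named stage needs to be created explicitly:

```sql
PUT file:///data/data.csv @~/staged/;      -- user stage, allocated automatically
PUT file:///data/data.csv @%my_table;      -- table stage, allocated automatically

CREATE STAGE my_named_stage;               -- named stage, created on demand
PUT file:///data/data.csv @my_named_stage;
```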

Question 130Skipped

Which of the following are characteristics of schemas used in Snowflake? (Choose two.)

Each schema is contained within a virtual warehouse.

Correct selection

A database may contain one or more schemas.

A schema may contain one or more databases.

Correct selection

A schema represents a logical grouping of database objects.

A table can span more than one schema.

Overall explanation

Databases and schemas are used to organize data stored in Snowflake:

• A database is a logical grouping of schemas. Each database belongs to a single Snowflake account.

• A schema is a logical grouping of database objects (tables, views, etc.). Each schema
belongs to a single database.

See this link

Question 131Skipped

How do Snowflake data providers share data that resides in different databases?

Materialized views

Correct answer

Secure views

User-Defined Functions (UDFs)

External tables

Overall explanation

Snowflake data providers can share data that resides in different databases by using secure
views. A secure view can reference objects such as schemas, tables, and other views from
one or more databases, as long as these databases belong to the same account.

See this link.

Question 132Skipped

At what level can the ALLOW_CLIENT_MFA_CACHING parameter be set?

Role

Session

Correct answer

Account

User

Overall explanation

Account — Can only be set for Account

See this link.

Question 133Skipped

Which command can we use to restore a deleted table on Snowflake?

Correct answer

UNDROP TABLE <mytable>

REPLACE TABLE <mytable>

REBUILD TABLE <mytable>

RESTORE TABLE <mytable>

Overall explanation

To restore objects, we use the command “UNDROP”. We can use it with Databases,
Schemas, or Tables. If we try to restore an object with a name that already exists, Snowflake
will give an error.
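A minimal sketch (the table name my_table is an assumption):

```sql
DROP TABLE my_table;

-- Restores the table from Time Travel, provided the retention period
-- has not elapsed and no object with the same name exists
UNDROP TABLE my_table;
```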

Question 134Skipped

Which of the following statements would be used to export/unload data from Snowflake?

INSERT INTO @stage

EXPORT TO @stage

Correct answer

COPY INTO @stage

EXPORT_TO_STAGE(stage => @stage, select => 'select * from t1');

Overall explanation

See this link

Question 135Skipped

What does Snowflake recommend a user do if they need to connect to Snowflake with a tool
or technology that is not listed in Snowflake’s partner ecosystem?

Use a custom-built connector.

Use Snowflake’s native API.

Contact Snowflake Support for a new driver.

Correct answer

Connect through Snowflake’s JDBC or ODBC drivers.

Overall explanation

See this link.

Question 136Skipped

How does the ACCESS_HISTORY view enhance overall data governance pertaining to read
and write operations? (Choose two.)

Protects sensitive data from unauthorized access while allowing authorized users to access
it at query runtime

Identifies columns with personal information and tags them so masking policies can be
applied to protect sensitive data

Correct selection

Provides a unified picture of what data was accessed and when it was accessed

Determines whether a given row in a table can be accessed by the user by filtering the
data based on a given policy

Correct selection

Shows how the accessed data was moved from the source to the target objects

Overall explanation

See this link.

Question 137Skipped

When a database is cloned, which objects in the clone inherit all granted privileges from the
source object? (Choose two.)

Database

Internal named stages

Correct selection

Schemas

Account

Correct selection

Tables

Overall explanation

Only child objects inherit the privileges. If the source object is a database or schema, the
clone inherits all granted privileges on the clones of all child objects contained in the source
object:

For databases, contained objects include schemas, tables, views, etc.

For schemas, contained objects include tables, views, etc.

See this link.

Question 138Skipped

What is the maximum total Continuous Data Protection (CDP) charges incurred for a
temporary table?

7 days

30 days

48 hours

Correct answer

24 hours

Overall explanation

Thus, the maximum total CDP charges incurred for a temporary table are 1 day (or less if the
table is explicitly dropped or dropped as a result of terminating the session). During this
period, Time Travel can be performed on the table.

See this link

Question 139Skipped

Which query contains a Snowflake hosted file URL in a directory table for a stage named
bronzestage?

select * from table(information_schema.stage_directory_file_registration_history(stage_name=>'bronzestage'));

Correct answer

select * from directory(@bronzestage);

list @bronzestage;

select metadata$filename from @bronzestage;

Overall explanation

Following parameter: FILE_URL -Snowflake-hosted file URL to the file.

See this link.

Question 140Skipped

The property MINS_TO_BYPASS_NETWORK_POLICY is set at which level?

Role

Organization

Account

Correct answer

User

Overall explanation

The user object property MINS_TO_BYPASS_NETWORK_POLICY defines the number of minutes in which a user can access Snowflake without conforming to an existing network policy.

See this link.

MINS_TO_BYPASS_NETWORK_POLICY allows the user to temporarily bypass a network policy for a set number of minutes.

Question 141Skipped

When floating-point number columns are unloaded to CSV or JSON files, Snowflake
truncates the values to approximately what?

Correct answer

(15,9)

(14,8)

(10,4)

(12,2)

Overall explanation

See this link.

Question 142Skipped

What versions of Snowflake should be used to manage compliance with Personal Identifiable
Information (PII) requirements? (Choose two.)

Correct selection

Business Critical Edition

Correct selection

Virtual Private Snowflake

Custom Edition

Enterprise Edition

Standard Edition

Overall explanation

See this link

Question 143Skipped

Which command should be used to download files from a Snowflake stage to a local folder
on a client's machine?

COPY

PUT

SELECT

Correct answer

GET

Overall explanation

See this link

Question 144Skipped

What are the primary authentication methods that Snowflake supports for securing REST API
interactions? (Choose two.)

Key pair authentication

Multi-Factor Authentication (MFA)

Federated authentication

Correct selection

OAuth

Correct selection

Username and password authentication

Overall explanation

Snowflake supports the following methods of authentication while using External API
Authentication:

• Basic authentication.

• OAuth with code grant flow.

• OAuth with client credentials flow.

See this link.

Question 145Skipped

A user has semi-structured data to load into Snowflake but is not sure what types of
operations will need to be performed on the data.

Based on this situation, what type of column does Snowflake recommend be used?

OBJECT

ARRAY

TEXT

Correct answer

VARIANT

Overall explanation

Snowflake natively supports semi-structured data, which means semi-structured data can be
loaded into relational tables without requiring the definition of a schema in advance.
Snowflake supports loading semi-structured data directly into columns of type VARIANT.

Typically, tables used to store semi-structured data consist of a single VARIANT column. Once
the data is loaded, you can query the data similar to structured data. You can also perform
other tasks, such as extracting values and objects from arrays. For more information, see the
FLATTEN table function.

See this link.

Question 146Skipped

Which certifications are compliant with Snowflake? (Choose three.)

Correct selection

HIPAA.

Correct selection

FedRAMP.

SC-900.

Correct selection

PCI-DSS.

ISO 9000.

Overall explanation

They won't ask you in-depth questions about this topic in the exam, but it's important to
remember some of the most important ones. You can see other certifications at the
following link.

Question 147Skipped

Which of the following view types are available in Snowflake? (Choose two.)

External view

Correct selection

Materialized view

Layered view

Correct selection

Secure view

Embedded view

Overall explanation

Question 148Skipped

Which command should be used to load data from a file, located in an external stage, into a
table in Snowflake?

GET

PUT

Correct answer

COPY

INSERT

Overall explanation

See this link

Question 149Skipped

How many predecessor tasks can a child task have?

1000

All the parents that we establish.

Correct answer

100

Overall explanation

A child task used to be limited to a single predecessor task. Since September 2022, Snowflake also supports DAGs of tasks: in a DAG, each non-root task can depend on multiple predecessor tasks, raising the previous limit to 100 predecessors.

Question 150Skipped

What are the correct settings for column and element names, regardless of which notation is
used while accessing elements in a JSON object?

Both the column name and the element name are case-insensitive.

Both the column name and the element name are case-sensitive.

The column name is case-sensitive and the element names are case-insensitive.

Correct answer

The column name is case-insensitive and the element name is case-sensitive.

Overall explanation

The column name is case-insensitive and the element name is case-sensitive.

Regardless of which notation you use, the column name is case-insensitive but element
names are case-sensitive. For example, in the following list, the first two paths are
equivalent, but the third is not:

src:salesperson.name

SRC:salesperson.name

SRC:Salesperson.Name

See this link.
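A sketch of the effect (the car_sales table and its layout are assumptions):

```sql
SELECT src:salesperson.name FROM car_sales;   -- matches
SELECT SRC:salesperson.name FROM car_sales;   -- matches: column names are case-insensitive
SELECT SRC:Salesperson.Name FROM car_sales;   -- element names are case-sensitive, so this
                                              -- path does not match the lowercase elements
```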

Question 151Skipped

What is a characteristic of materialized views in Snowflake?

Correct answer

Materialized views do not allow joins.

Clones of materialized views can be created directly by the user.

Aggregate functions can be used as window functions in materialized views.

Multiple tables can be joined in the underlying query of a materialized view.

Overall explanation

The following limitations apply to creating materialized views:

• A materialized view can query only a single table.

• Joins, including self-joins, are not supported.

• A materialized view cannot query:

• A materialized view.

• A non-materialized view.

• A UDTF (user-defined table function).

and more!

See this link.

Question 152Skipped

Which types of subqueries does Snowflake support? (Choose two.)

Correct selection

EXISTS, ANY / ALL, and IN subqueries in WHERE clauses: these subqueries can be
correlated or uncorrelated

Uncorrelated scalar subqueries in WHERE clauses

Correct selection

Uncorrelated scalar subqueries in any place that a value expression can be used

EXISTS, ANY / ALL, and IN subqueries in WHERE clauses: these subqueries can be
uncorrelated only

EXISTS, ANY / ALL, and IN subqueries in WHERE clauses: these subqueries can be
correlated only

Overall explanation

Snowflake currently supports the following types of subqueries:

• Uncorrelated scalar subqueries in any place that a value expression can be used.

• Correlated scalar subqueries in WHERE clauses.

• EXISTS, ANY / ALL, and IN subqueries in WHERE clauses. These subqueries can be
correlated or uncorrelated.

See this link.

Question 153Skipped

Which command line flags can be used to log into a Snowflake account using SnowSQL?
(Choose two.)

-c

-e

Correct selection

-a

-o

Correct selection

-d

Overall explanation

-a, --accountname TEXT

Your account identifier. Honors $SNOWSQL_ACCOUNT.

-d, --dbname TEXT

Database to use. Honors $SNOWSQL_DATABASE.

See this link.

Question 154Skipped

Which of the following statements are true of VALIDATION_MODE in Snowflake? (Choose two.)

Correct selection

VALIDATION_MODE=RETURN_ALL_ERRORS is a parameter of the COPY command

The VALIDATION_MODE parameter supports COPY statements that transform data during
a load

The VALIDATION_MODE option will validate data to be loaded by the COPY statement
while completing the load and will return the rows that could not be loaded without error

Correct selection

The VALIDATION_MODE option will validate data to be loaded by the COPY statement
without completing the load and will return possible errors

The VALIDATION_MODE parameter supports COPY statements that load data from
external stages only

Overall explanation

See this link

Question 155Skipped

The MAXIMUM size for a serverless task run is equivalent to what size virtual warehouse?

Large

Medium

4X-Large

Correct answer

2X-Large

Overall explanation

The maximum size for a serverless task run is equivalent to an XXLARGE warehouse.

See this link.

SET 3

Question 1Skipped

Using the COPY INTO <location> command, to which of the following locations is it not possible to unload data from a table?

Named external stage that references an external location (Amazon S3, Google Cloud
Storage, or Microsoft Azure).

Named internal stage (or table/user stage).

An external location like Amazon S3 or Azure.

Correct answer

Local Drive.

Overall explanation

Once the data is in the internal stage, you can download them into your local drive using the
GET command. You can also unload data into an external location, as we can see in the
following image:

Question 2Skipped

How does Snowflake improve the performance of queries that are designed to filter out a
significant amount of data?

By increasing the number of partitions scanned

The use of TableScan

Correct answer

The use of pruning

The use of indexing

Overall explanation

See this link.

Question 3Skipped

A team runs the same query daily, generally with a frequency of fewer than 24 hours, and it
takes around 10 minutes to execute. They realized that the underlying data changes because
of an ETL process that runs every morning.

How can they use the results cache to save the 10 minutes that the query is being executed?

Correct answer

After the ETL run, execute the identical queries so that the result remains in the cache.

After the ETL run, increase the warehouse size. Decrease it after the query runs.

After the ETL run, copy the tables to another database for the team to query.


After the ETL run, use Time-Travel feature.

Overall explanation

In this case, because the underlying data changes every morning due to the ETL process, the
results cache may not be useful for the daily query execution. However, suppose the team
executes an identical query immediately after the ETL process runs. In that case, the results
of that query will be stored in the results cache and can be retrieved for subsequent queries.
By doing so, the team can save the 10 minutes that the query is being executed by retrieving
the results from the cache.
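A minimal sketch of this pattern (the table, columns, and schedule are hypothetical):

```sql
-- Run once, right after the morning ETL finishes (e.g. from a scheduled task).
-- This execution pays the ~10 minutes of warehouse time.
SELECT region, SUM(amount) AS total_sales
FROM sales
GROUP BY region;

-- Any later run of the byte-identical query text is served from the results
-- cache at no compute cost, as long as the underlying data has not changed
-- and the cached result has not expired.
SELECT region, SUM(amount) AS total_sales
FROM sales
GROUP BY region;
```

Note that the query text must match exactly and the role must have the same access for the cached result to be reused.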

Question 4Skipped

Which COPY INTO command outputs the data into one file?

MULTIPLE=FALSE

MAX_FILE_NUMBER=1

FILE_NUMBER=1

Correct answer

SINGLE=TRUE

Overall explanation

See this link.
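A hedged example of the option in context (stage and table names are made up):

```sql
-- Unload an entire table into exactly one gzip-compressed CSV file.
COPY INTO @my_stage/unload/result.csv.gz
  FROM my_table
  FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
  SINGLE = TRUE                 -- produce one output file instead of many
  MAX_FILE_SIZE = 5368709120;   -- optionally raise the size cap (here ~5 GB)
```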

Question 5Skipped

What privileges are necessary for a consumer in the Data Exchange to make a request and
receive data? (Choose two.)

Correct selection

IMPORT SHARE

Correct selection

CREATE DATABASE

REFERENCE_USAGE

USAGE

OWNERSHIP

Overall explanation

To access a listing, you must use the ACCOUNTADMIN role or another role with the CREATE
DATABASE and IMPORT SHARE privileges.


See this link.

Question 6Skipped

While loading data through the COPY command, you can transform the data.

Which of the below transformations is not allowed?

Reorder columns.

Omit columns.

Correct answer

Filters.

Cast.

Truncate columns.

Overall explanation

You can see other transformations at the following link.

Question 7Skipped

Which of the below columns will you consider while choosing a cluster key? (Choose two.)

Correct selection

Columns that are typically used in the selective filters.

Columns with extremely high cardinality.

Columns with extremely low cardinality.

Correct selection

Columns are frequently used in join predicates.

Overall explanation

A column with very low cardinality (e.g., a column indicating only whether a person is male
or female) might yield minimal pruning. On the other hand, a column with very high
cardinality (e.g., a column containing UUID or nanosecond timestamp values) is also typically
not a good candidate to use directly as a clustering key.
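As an illustrative sketch (table and column names are hypothetical), a clustering key is declared on moderate-cardinality columns that appear in selective filters and join predicates:

```sql
CREATE TABLE events (
  event_date  DATE,
  customer_id NUMBER,
  payload     VARIANT
)
CLUSTER BY (event_date, customer_id);  -- filter/join columns, not extremes

-- For a very-high-cardinality column, clustering on an expression that lowers
-- the cardinality is usually better than clustering on the raw column, e.g.:
-- ALTER TABLE events CLUSTER BY (TO_DATE(event_timestamp));
```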

Question 8Skipped

A user created a transient table and made several changes to it over the course of several
days. Three days after the table was created, the user would like to go back to the first
version of the table.


How can this be accomplished?

Use the FAIL_SAFE parameter for Time Travel to retrieve the data from Fail-safe storage.

Contact Snowflake Support to have the data retrieved from Fail-safe storage.

Use Time Travel, as long as DATA_RETENTION_TIME_IN_DAYS was set to at least 3 days.

Correct answer

The transient table version cannot be retrieved after 24 hours.

Overall explanation

Transient tables have a maximum Time Travel retention period of 1 day and no Fail-safe
period, so a table version from 3 days ago cannot be retrieved.
Question 9Skipped

What Snowflake database object is derived from a query specification, stored for later use,
and can speed up expensive aggregation on large data sets?

Secure view

External table

Temporary table

Correct answer

Materialized view

Overall explanation

Simple but very frequent question in exams. Materialized views are an important topic.

See this link.

Question 10Skipped

What option will you specify to delete the stage files after a successful load into a Snowflake
table with the COPY INTO command?

Correct answer


PURGE = TRUE

TRUNCATE = TRUE

REMOVE = TRUE

DELETE = TRUE

Overall explanation

If the PURGE option is set to TRUE, Snowflake attempts to remove successfully loaded
data files from the stage. If the purge operation fails for any reason, no error is
currently returned.

COPY INTO mytable PURGE = TRUE;

Question 11Skipped

What COPY INTO SQL command should be used to unload data into multiple files?

SINGLE=TRUE

MULTIPLE=TRUE

Correct answer

SINGLE=FALSE

MULTIPLE=FALSE

Overall explanation

The default is SINGLE = FALSE (i.e. unload into multiple files).

See this link.

Question 12Skipped

A sales table FCT_SALES has 100 million records.

The following query was executed:

SELECT COUNT (1) FROM FCT_SALES;

How did Snowflake fulfill this query?

Query against a virtual warehouse cache

Correct answer


Query against the metadata cache

Query against the result set cache

Query against the most-recently created micro-partition

Overall explanation

The count() is one of the operations that is resolved in the Metadata Layer.

See this link for additional information about metadata operations.

Question 13Skipped

When unloading data with the COPY INTO command, what is the purpose of the PARTITION
BY parameter option?

To sort the contents of the output file by the specified expression.

Correct answer

To split the output into multiple files, one for each distinct value of the specified
expression.

To delimit the records in the output file using the specified expression.

To include a new column in the output using the specified window function expression.

Overall explanation

The PARTITION BY copy option accepts an expression by which the unload operation
partitions table rows into separate files unloaded to the specified stage.

See this link.
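A sketch of the option in use (stage, table, and column names are assumptions):

```sql
-- Unload one set of files per distinct sale date, under date=YYYY-MM-DD/ prefixes.
COPY INTO @my_stage/daily/
  FROM sales
  PARTITION BY ('date=' || TO_VARCHAR(sale_date, 'YYYY-MM-DD'))
  FILE_FORMAT = (TYPE = CSV);
```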

Question 14Skipped

Which character identifies a user stage?

Correct answer

“@~”

“@%”

“@”

“~”

Overall explanation

Each user has a Snowflake personal stage allocated to them by default for storing files, and
no one can access them except the user it belongs to. It's represented with the "@~"


character. In the following example, we are uploading the file "myfile.csv" to the stage from
the current user:

PUT file://C:\data\myfile.csv @~

Question 15Skipped

By default, which role allows a user to manage a Snowflake Data Exchange share?

USERADMIN

SYSADMIN

Correct answer

ACCOUNTADMIN

SECURITYADMIN

Overall explanation

By default, the privileges required to create and manage shares are granted only to the
ACCOUNTADMIN role, ensuring that only account administrators can perform these tasks.

See this link.

Question 16Skipped

What is the minimum Snowflake edition that you need for the Data Sharing capability?

Business Critical

Enterprise

Virtual Private Snowflake

Correct answer

Standard

Overall explanation

Secure Data Sharing lets you share selected objects in a database in your account with other
Snowflake accounts. All the data-sharing features are available for these three types of
editions.

Question 17Skipped

Based on Snowflake recommendations, when creating a hierarchy of custom roles, the top-
most custom role should be assigned to which role?


SECURITYADMIN

Correct answer

SYSADMIN

USERADMIN

ACCOUNTADMIN

Overall explanation

See this link.

Question 18Skipped

The effects of query pruning can be observed by evaluating which statistics? (Choose two.)


Bytes scanned

Bytes written

Correct selection

Partitions scanned

Correct selection

Partitions total

Bytes read from result

Overall explanation

• Pruning — information on the effects of table pruning:

• Partitions scanned — number of partitions scanned so far.

• Partitions total — total number of partitions in a given table.

See this link.

Question 19Skipped

Which command is used to unload data from a table or move a query result to a stage?

GET

PUT

MERGE

Correct answer

COPY INTO

Overall explanation

Use the COPY INTO <location> command to copy the data from the Snowflake database
table into one or more files in a Snowflake or external stage.

See this link.

Question 20Skipped

What computer language can be selected when creating User-Defined Functions (UDFs)
using the Snowpark API?

Swift

Correct answer

Python


JavaScript

SQL

Overall explanation

Snowpark API supports: Python, Scala and Java

See this link.

Question 21Skipped

Which command can be used to list all network policies available in an account?

DESCRIBE SESSION POLICY

SHOW SESSION POLICIES

Correct answer

SHOW NETWORK POLICIES

DESCRIBE NETWORK POLICY

Overall explanation

See this link.

Question 22Skipped

Which command will you run to list all users and roles to which a role has been granted?

Correct answer

SHOW GRANTS OF ROLE <ROLE>

SHOW GRANTS IN ROLE <ROLE>

SHOW USERS OF ROLE <ROLE>

SHOW GRANTS TO ROLE <ROLE>

Overall explanation

“SHOW GRANTS OF ROLE” will list the users, whereas “SHOW GRANTS TO ROLE” will list the
privileges to which this role has access.

Here you can see an example of running the command in my Snowflake account:
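In case the screenshot does not render here, a text sketch of the two commands (the ANALYST role name is hypothetical):

```sql
SHOW GRANTS OF ROLE analyst;  -- users and roles to which ANALYST has been granted
SHOW GRANTS TO ROLE analyst;  -- privileges that have been granted to ANALYST
```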


Question 23Skipped

Which Snowflake edition supports private communication between Snowflake and your
other VPCs through AWS PrivateLink?

All Snowflake editions supports private communication between Snowflake and your
other VPCs through AWS PrivateLink.

Standard.

Enterprise.

Correct answer

Business Critical.

Overall explanation

AWS PrivateLink is an AWS service for creating private VPC endpoints that allow direct,
secure connectivity between your AWS VPCs and the Snowflake VPC without traversing the
public Internet. This feature requires the Business Critical edition or higher.

You can see the differences between the Snowflake editions in the following image:


Question 24Skipped

Which of the following features, associated with Continuous Data Protection (CDP), require
additional Snowflake-provided data storage? (Choose two.)

Data encryption

Tri-Secret Secure

Correct selection

Fail-safe

Correct selection

Time Travel

External stages

Overall explanation

Both Time Travel and Fail-safe require additional data storage, which has associated fees.

See this link.

Question 25Skipped

Which statements are correct concerning the leveraging of third-party data from the
Snowflake Data Marketplace? (Choose two.)

Data transformations are required when combining Data Marketplace datasets with
existing data in Snowflake.

Correct selection

Data is live, ready-to-query, and can be personalized.

Data is not available for copying or moving to an individual Snowflake account.

Correct selection

Data is available without copying or moving.

Data needs to be loaded into a cloud provider as a consumer account.


Overall explanation

Data in the Snowflake Data Marketplace is already formatted and ready to query, and can be
personalized for specific business needs.

Data from the Snowflake Data Marketplace is accessed through Snowflake's Secure Data
Sharing technology, which allows users to access the data without copying or moving it to
their own account.

Loading data into a cloud provider as a consumer account is not required to leverage data
from the Snowflake Data Marketplace, and the data can be accessed and used in a
Snowflake account without restriction.

Data transformations may not be required when combining Data Marketplace datasets with
existing data in Snowflake, as it depends on the specific data being used and how it needs to
be combined or analyzed.

See this link.

Question 26Skipped

What are ways to create and manage data shares in Snowflake? (Choose two.)

Correct selection

Through the Snowflake web interface (UI)

Through the DATA_SHARE=TRUE parameter

Using the CREATE SHARE AS SELECT * FROM TABLE command

Correct selection

Through SQL commands

Through the ENABLE_SHARE=TRUE parameter

Overall explanation

See this link.

Question 27Skipped

Which user preferences can be set for a user profile in Snowsight? (Choose two.)

Correct selection

Multi-Factor Authentication (MFA)

Correct selection

Notifications


Default database

Default schema

Username

Overall explanation

On your profile, you can review and set the following user details:

• Profile photo

• Username (cannot be changed)

• First Name

• Last Name

• Password

• Email

You can also set the following preferences:

• Default role & warehouse

• Default experience

• Language: Select the language to use for Snowsight. Snowflake currently supports
the following languages:

• English (US)

• Japanese (日本語)

• Notifications: Select whether to send a browser notification when a query finishes
running in the background. When you set this preference for the first time, your
browser prompts you to allow notifications from Snowflake.

If your active role has access to set up resource monitor notifications, you can also select a
checkbox to set up Email notifications from resource monitors.

• Multi-factor authentication: Select whether to enroll in multi-factor authentication (MFA).

See this link.

Question 28Skipped

What can be used to view warehouse usage over time? (Choose two.)

Correct selection


The billing and usage tab in the Snowflake web UI

Correct selection

The WAREHOUSE_METERING_HISTORY view

The query history view

The SHOW WAREHOUSES command

The LOAD HISTORY view

Overall explanation

WAREHOUSE_METERING_HISTORY View

This Account Usage view can be used to return the hourly credit usage for a single
warehouse (or all the warehouses in your account) within the last 365 days (1 year). See
this link.

Snowsight can be used to view compute cost. See this link.
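A sketch of querying the view (the 7-day window is an arbitrary choice; the role needs access to the shared SNOWFLAKE database):

```sql
-- Hourly credit usage per warehouse over the last 7 days.
SELECT warehouse_name,
       start_time,
       credits_used
FROM snowflake.account_usage.warehouse_metering_history
WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
ORDER BY start_time;
```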

Question 29Skipped

How does the PARTITION BY option affect an expression for a COPY INTO command?

The unload operation partitions table rows into separate files unloaded to the specified
table.

A single file will be loaded with a Snowflake-defined partition key and Snowflake will use
this key for pruning.

Correct answer

The unload operation partitions table rows into separate files unloaded to the specified
stage.

A single file will be loaded with a user-defined partition key and the user can use this
partition key for clustering.

Overall explanation

The PARTITION BY copy option accepts an expression by which the unload operation
partitions table rows into separate files unloaded to the specified stage.

See this link.

Question 30Skipped

In how many availability zones (at least) does Snowflake replicate your data?

Two.


One.

Correct answer

Three.

It depends of the Snowflake Edition.

Overall explanation

Cloud storage synchronously and automatically replicates the stored data across multiple
devices and at least three availability zones. You can read more information at the following
link.

Question 31Skipped

Which database objects can be shared with the Snowflake secure data sharing feature?
(Choose two.)

Files

Sequences

Correct selection

External tables

Correct selection

Secure User-Defined Functions (UDFs)

Streams

Overall explanation

Secure Data Sharing lets you share selected objects in a database in your account with other
Snowflake accounts. You can share the following Snowflake database objects:

• Tables

• External tables

• Secure views

• Secure materialized views

• Secure UDFs

See this link.

Question 32Skipped


Column level security in Snowflake allows the application of a masking policy to a column
within a table or view. Which two features are related to column-level security? (Choose
two.)

Correct selection

Dynamic Data Masking.

Correct selection

External Tokenization

Data Encryption

Conditional Tokenization.

Lock Databases.

Overall explanation

Dynamic Data Masking is a security feature in Snowflake that enables you to mask sensitive
data (for example, credit card numbers or passwords) in real-time, based on the user's
permissions and role. When a user with restricted access attempts to access the masked
data, the data is replaced with a masked value or redacted to ensure that sensitive
information is not exposed. You can see how it works in the image below.

External Tokenization is a data protection method in Snowflake that allows organizations to
tokenize sensitive data before loading it into Snowflake and dynamically detokenize it at
query runtime using masking policies with External Functions.

Both features require Enterprise Edition (or higher).
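A minimal Dynamic Data Masking sketch (the policy, role, table, and column names are all made up):

```sql
-- Only SUPPORT_ADMIN sees the real value; everyone else sees a masked string.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'SUPPORT_ADMIN' THEN val
    ELSE '*** MASKED ***'
  END;

ALTER TABLE customers MODIFY COLUMN email
  SET MASKING POLICY email_mask;
```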


Question 33Skipped

Which service does Snowflake use to provide the Zero-Copy cloning functionality?

Cache.

SSD Cache of the Virtual Warehouses.

Backup management services.

Correct answer

Metadata from the service layer.

Overall explanation

Zero-Copy cloning does NOT duplicate data; it duplicates the metadata of the micro-
partitions. For this reason, Zero-Copy cloning doesn’t consume storage. When you modify
some cloned data, it will consume storage because Snowflake has to recreate the micro-
partitions.

You can see this behavior in the following image:
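In SQL, a zero-copy clone is created in a single statement (table names are hypothetical):

```sql
-- Near-instant and storage-free at creation time: only micro-partition
-- metadata is copied. Storage is consumed later, as either table diverges.
CREATE TABLE sales_dev CLONE sales;
```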


Question 34Skipped

What file format provides the fastest load performance?

Parquet.

JSON.

Avro.

Correct answer

CSV.

Overall explanation

When loading data, you get the best performance with CSV files. However, Snowflake is fast
and flexible, and you can also use other formats like JSON or Parquet. You can read an
article about this at the following link.

Question 35Skipped

What is the expiration period for a file URL used to access unstructured data in cloud
storage?

The length of time specified in the expiration_time argument

The remainder of the session

The same length of time as the expiration period for the query results cache

Correct answer

An unlimited amount of time


Overall explanation

A file URL does not expire; it remains valid permanently.

See this link.

Question 36Skipped

What is the purpose of a resource monitor in Snowflake?

To create and suspend virtual warehouses automatically

To monitor the query performance of virtual warehouses

Correct answer

To control costs and credit usage by virtual warehouses

To manage cloud services needed for virtual warehouses

Overall explanation

See this link.
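A hedged sketch of such a monitor (the quota, thresholds, and warehouse name are assumptions):

```sql
-- 100 credits per month; warn at 75% and suspend the warehouse at 100%.
CREATE RESOURCE MONITOR monthly_limit
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 75 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE my_wh SET RESOURCE_MONITOR = monthly_limit;
```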

Question 37Skipped

Which statements about Snowflake tasks are true? (Choose two.)

Correct selection

A task can execute a call to a Stored Procedure.

Correct selection

A task can execute a single SQL Statement.

A task can not execute a call to a Stored Procedure.

A task can execute a function.

A task can execute multiple SQL Statements.

Overall explanation

Only one SQL statement is allowed to be executed through a task. If you need to execute
multiple statements, build a procedure.

Question 38Skipped

Which statement is true about running tasks in Snowflake?

Correct answer

A task allows a user to execute a single SQL statement/command using a predefined schedule.


A task can be called using a CALL statement to run a set of predefined SQL commands.

A task can be executed using a SELECT statement to run a predefined SQL command.

A task allows a user to execute a set of SQL commands on a predefined schedule.

Overall explanation

A task can execute any one of the following types of SQL code:

Single SQL statement

Call to a stored procedure

Procedural logic using Snowflake Scripting Developer Guide

See this link.

Question 39Skipped

What operations can be performed while loading a simple CSV file into a Snowflake table
using the COPY INTO command? (Choose two.)

Selecting the first few rows

Correct selection

Reordering the columns

Performing aggregate calculations

Correct selection

Converting the datatypes

Grouping by operations

Overall explanation

The COPY command supports:

• Column reordering, column omission, and casts using a SELECT statement. There is
no requirement for your data files to have the same number and ordering of columns
as your target table.

• The ENFORCE_LENGTH | TRUNCATECOLUMNS option, which can truncate text strings
that exceed the target column length.

See this link.
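A sketch combining both allowed operations (stage, table, and column positions are made up):

```sql
-- Reorder columns and cast datatypes while loading from a CSV stage.
COPY INTO target_table (id, amount, load_date)
  FROM (SELECT t.$2,               -- reorder: file column 2 -> id
               t.$3::NUMBER(10,2), -- cast:    file column 3 -> amount
               t.$1::DATE          -- cast:    file column 1 -> load_date
        FROM @my_stage t)
  FILE_FORMAT = (TYPE = CSV);
```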

Question 40Skipped

In which Snowflake edition is Tri-Secret Secure option available?


Correct answer

Business Critical or higher.

Tri-Secret Secure option is not available.

Standard or higher.

Enterprise or higher.

Overall explanation

Tri-Secret Secure combines a Snowflake-maintained key and a customer-managed key in the
cloud provider platform that hosts your Snowflake account to create a composite master key
to protect your Snowflake data. Customer-managed encryption keys through Tri-Secret
Secure are available in the Business Critical and VPS editions.

Question 41Skipped

Which statement describes Snowflake tables?

Snowflake tables are owned by a user.

Correct answer

Snowflake tables are logical representations of underlying physical data.

Snowflake tables are the physical instantiation of data loaded into Snowflake.

Snowflake tables require that clustering keys be defined to perform optimally.

Overall explanation

All data in Snowflake is stored in database tables, logically structured as collections of
columns and rows. To best utilize Snowflake tables, particularly large tables, it is helpful to
have an understanding of the physical structure behind the logical structure.

See this link.

Question 42Skipped

Which command is used to start configuring Snowflake for Single Sign-On (SSO)?

CREATE SESSION POLICY

CREATE PASSWORD POLICY

CREATE NETWORK RULE

Correct answer

CREATE SECURITY INTEGRATION


Overall explanation

Snowflake uses a SAML2 security integration to integrate with the IdP you are using to
implement federated authentication. Use the CREATE SECURITY INTEGRATION command to
start configuring Snowflake for SSO.

See this link.
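A skeleton of the command (every value below is a placeholder that would come from your identity provider):

```sql
CREATE SECURITY INTEGRATION my_idp
  TYPE = SAML2
  ENABLED = TRUE
  SAML2_ISSUER = 'https://idp.example.com/issuer'
  SAML2_SSO_URL = 'https://idp.example.com/sso'
  SAML2_PROVIDER = 'CUSTOM'          -- or 'OKTA' / 'ADFS'
  SAML2_X509_CERT = 'MIIC...';       -- IdP signing certificate (truncated)
```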

Question 43Skipped

What is the minimum Snowflake edition required for row level security?

Standard

Virtual Private Snowflake

Correct answer

Enterprise

Business Critical

Overall explanation


See this link.

Question 44Skipped

You have the following data in a variant column from the table “myTable”. How can you
query the favorite technology that Chris uses?

{
  "name": "Chris Snow",
  "favouriteTechnology": "Snowflake",
  "hobbies": [
    {"name": "soccer"},
    {"name": "music"},
    {"name": "hiking"}
  ]
}

SELECT src:$favouriteTechnology FROM myTable;

SELECT favouriteTechnology FROM myTable;

SELECT CONVERT_JSON(src:favouriteTechnology) FROM myTable;

Correct answer

SELECT src:favouriteTechnology FROM myTable;

Overall explanation

See this link.
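Assuming the JSON above is stored in a VARIANT column named SRC, related lookups can be sketched as:

```sql
-- Cast the extracted value to a string to drop the surrounding quotes.
SELECT src:favouriteTechnology::STRING FROM myTable;

-- Nested array elements are reachable with FLATTEN, e.g. all hobby names:
SELECT h.value:name::STRING AS hobby
FROM myTable,
     LATERAL FLATTEN(input => src:hobbies) h;
```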

Question 45Skipped

Which command removes a role from another role or a user in Snowflake?

ALTER ROLE

Correct answer

REVOKE ROLE

USE ROLE

USE SECONDARY ROLES

Overall explanation

REVOKE ROLE

Removes a role from another role or a user.

See this link.
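Both forms of the command, with hypothetical names:

```sql
REVOKE ROLE analyst FROM USER john;       -- remove the role from a user
REVOKE ROLE analyst FROM ROLE marketing;  -- remove it from another role
```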

Question 46Skipped

A user needs to ingest 1 GB of data that is available in an external stage using a COPY INTO
command.

How can this be done with MAXIMUM performance and the LEAST cost?

Ingest the data in an uncompressed format as a single file.

Correct answer

Split the file into smaller files of 100-250 MB each, compress and ingest each of the
smaller files.

Ingest the data in a compressed format as a single file.


Split the file into smaller files of 100-250 MB each and ingest each of the smaller files in an
uncompressed format.

Overall explanation

See this link.

Question 47Skipped

What data type should be used to store JSON data natively in Snowflake?

Object

JSON

Correct answer

VARIANT

String

Overall explanation

See this link.

Question 48Skipped

What happens to the privileges granted to Snowflake system-defined roles?

Correct answer

The privileges cannot be revoked.

The privileges can be revoked by any user-defined role with appropriate privileges.

The privileges can be revoked by an ACCOUNTADMIN.

The privileges can be revoked by an ORGADMIN.

Overall explanation

System-defined roles cannot be dropped. In addition, the privileges granted to these roles by
Snowflake cannot be revoked.

See this link.

Question 49Skipped

What is the MINIMUM edition of Snowflake that is required to use a SCIM security
integration?

Business Critical Edition

Correct answer


Standard Edition

Enterprise Edition

Virtual Private Snowflake (VPS)

Overall explanation

It's available in ALL editions, so Standard Edition is the answer.

See this link.

Question 50Skipped

The Snowflake Search Optimization Services supports improved performance of which kind
of query?

Queries against a subset of columns in a table

Correct answer

Selective point lookup queries

Queries against tables larger than 1 TB

Queries against large tables where frequent DML occurs

Overall explanation

Keyword: selective

See this link.

Question 51Skipped

Which Snowflake feature allows administrators to identify unused data that may be archived
or deleted?

Object tagging

Dynamic Data Masking

Data classification

Correct answer

Access history

Overall explanation

Each row in the ACCESS_HISTORY view contains a single record per SQL statement. The
record describes the columns the query accessed directly and indirectly (i.e. the underlying
tables that the data for the query comes from). These records facilitate regulatory


compliance auditing and provide insights on popular and frequently accessed tables and
columns since there is a direct link between the user (i.e. query operator), the query, the
table or view, the column, and the data.

See this link.

Question 52Skipped

How can a producer share a table with a consumer located in a different region?

Unload all data to a stage and then deploy a pipeline to move data to the consumer's
stage in other region.

Create a script to replicate your data in the consumer account.

Correct answer

Replicate your account to another region and create a share from that region.

This is not a problem; producers and consumers can be in different regions.

Overall explanation

Data sharing works within the same region; however, you can replicate your account to
another region and then share data from that replicated account within that account’s
region. This is also true across cloud platforms. You can see this behavior in the following
image:

Question 53Skipped

Which encryption algorithm is used by Snowflake tables when we load data into them?


SHA 256.

AES 128.

Correct answer

AES 256.

SHA 128.

Overall explanation

All ingested data stored in Snowflake tables, and all files stored in internal stages for data
loading and unloading, are encrypted using AES-256 strong encryption. You can read more
information about the Snowflake security features at the following link.

Question 54Skipped

What is a limitation of a Materialized View?

A Materialized View can only reference up to two tables

A Materialized View cannot support any aggregate functions

A Materialized View cannot be joined with other tables

Correct answer

A Materialized View cannot be defined with a JOIN

Overall explanation

A materialized view can be joined with other tables, but you cannot include a JOIN in the
materialized view definition.

See this link.

Question 55Skipped

Which of the following objects can be directly restored using the UNDROP command?
(Choose two.)

User

Correct selection

Table

Role

View

Correct selection


Schema

Internal stage

Overall explanation

Account Objects:

UNDROP DATABASE

Database Objects:

UNDROP SCHEMA

UNDROP TABLE

UNDROP TAG

See this link.

Question 56Skipped

Which chart type is supported in Snowsight for Snowflake users to visualize data with
dashboards?

Correct answer

Heat grid

Pie chart

Area chart

Box plot

Overall explanation

Snowsight supports the following types of charts:

• Bar charts

• Line charts

• Scatterplots

• Heat grids

• Scorecards

See this link..


As of the documentation date (2024-08). Snowsight evolves frequently, so this question may
be updated as capabilities evolve.

Question 57Skipped

What is the default Time Travel retention period?

90 days

45 days

Correct answer

1 day

7 days

Overall explanation

Default of 1 day.

See this link.
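The retention period can be raised per object, and past versions queried, as in this sketch (the table name and offset are arbitrary):

```sql
-- Default is 1 day; Enterprise Edition and higher allow up to 90 days
-- for permanent tables.
ALTER TABLE orders SET DATA_RETENTION_TIME_IN_DAYS = 7;

-- Query the table as it was one hour (3600 seconds) ago.
SELECT * FROM orders AT(OFFSET => -3600);
```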

Question 58Skipped

What does the worksheet and database explorer feature in Snowsight allow users to do?

Tag frequently accessed worksheets for ease of access.

Combine multiple worksheets into a single worksheet.

Correct answer

Move a worksheet to a folder or a dashboard.

Add or remove users from a worksheet.

Overall explanation

See this link.

Question 59Skipped

Query compilation occurs in which architecture layer of the Snowflake Cloud Data Platform?

Storage layer

Cloud infrastructure layer

Compute layer

Correct answer

Cloud services layer


Overall explanation

Query compilation is different from query execution. Query execution is performed in the
processing layer, while the services layer authenticates user sessions, provides
management, enforces security functions, performs query compilation and optimization,
manages the results cache, and coordinates all transactions.

Question 60Skipped

How can a dropped internal stage be restored?

Execute the UNDROP command.

Correct answer

Recreate the dropped stage.

Enable Time Travel.

Clone the dropped stage.

Overall explanation

Recreate the dropped stage.

Dropped stages cannot be recovered; they must be recreated.

See this link.

Clone the dropped stage. - Incorrect, you cannot clone a previously dropped object.

Execute the UNDROP command - Incorrect, UNDROP command cannot be used with stages.

See this link.

Using Time Travel, you can perform the following actions within a defined period of time:

• Query data in the past that has since been updated or deleted.

• Create clones of entire tables, schemas, and databases at or before specific points in
the past.

• Restore tables, schemas, and databases that have been dropped.

See this link.

Question 61Skipped

What are best practice recommendations for using the ACCOUNTADMIN system-defined role
in Snowflake? (Choose two.)

The ACCOUNTADMIN role must be granted to only one user.


Correct selection

Assign the ACCOUNTADMIN role to at least two users, but as few as possible.

All users granted ACCOUNTADMIN role must also be granted SECURITYADMIN role.

All users granted ACCOUNTADMIN role must be owned by the ACCOUNTADMIN role.

Correct selection

Ensure all ACCOUNTADMIN roles use Multi-factor Authentication (MFA).

Overall explanation

See this link.

Question 62Skipped

What are benefits of using Snowpark with Snowflake? (Choose two.)

Snowpark automatically sets up Spark within Snowflake virtual warehouses.

Snowpark scale and compute management are handled by the user.

Correct selection

Snowpark executes as much work as possible by leveraging pushdown for all operations,
including user-defined functions (UDF).

Correct selection

Snowpark does not require that a separate cluster be running outside of Snowflake.

Snowpark uses a Spark engine to generate optimized SQL query plans.

Overall explanation

Benefits When Compared with the Spark Connector

In comparison to using the Snowflake Connector for Spark, developing with Snowpark
includes the following benefits:

• Support for interacting with data within Snowflake using libraries and patterns
purpose built for different languages without compromising on performance or
functionality.

• Support for authoring Snowpark code using local tools such as Jupyter, VS Code, or
IntelliJ.

• Support for pushdown for all operations, including Snowflake UDFs. This means
Snowpark pushes down all data transformation and heavy lifting to the Snowflake
data cloud, enabling you to efficiently work with data of any size.


• No requirement for a separate cluster outside of Snowflake for computations. All of
the computations are done within Snowflake. Scale and compute management are
handled by Snowflake.

See this link.

Question 63Skipped

If you want a dedicated virtual warehouse, which is the lowest Snowflake edition you should
opt for?

Virtual Private Snowflake.

Enterprise.

Business Critical.

Correct answer

Standard.

Overall explanation

In Snowflake, all virtual warehouses are dedicated to their users. If you create a virtual warehouse, you will be the only one using it.

Question 64Skipped

Which command can be added to the COPY command to make it load all files, whether or
not the load status of the files is known?

Correct answer

FORCE = TRUE

LOAD_UNCERTAIN_FILES = TRUE

LOAD_UNCERTAIN_FILES = FALSE

FORCE = FALSE

Overall explanation

To load files whose metadata has expired, set the LOAD_UNCERTAIN_FILES copy option to
true. The copy option references load metadata, if available, to avoid data duplication, but
also attempts to load files with expired load metadata.

Alternatively, set the FORCE option to load all files, ignoring load metadata if it exists. Note
that this option reloads files, potentially duplicating data in a table.

See this link.
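As a minimal sketch, assuming a target table t1 and a named stage my_stage (both hypothetical) already exist, a forced reload looks like this:

```sql
-- Reload all staged files, ignoring any existing load metadata.
-- Note: this may duplicate rows already loaded into the table.
COPY INTO t1
  FROM @my_stage
  FILE_FORMAT = (TYPE = CSV)
  FORCE = TRUE;
```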

Question 65Skipped

There are 300 concurrent users on a production Snowflake account using a single cluster
virtual warehouse. The queries are small, but the response time is very slow.

What is causing this to occur?

The queries are not taking advantage of the data cache.

The application is not using the latest native ODBC driver which is causing latency.

Correct answer

The warehouse is queuing the queries, increasing the overall query execution time.

The warehouse parameter STATEMENT_QUEUED_TIMEOUT_IN_SECONDS is set too low.

Overall explanation

A high number of concurrent users on a single-cluster virtual warehouse causes queuing. This is a good use case for a multi-cluster warehouse, which is designed to handle concurrency problems.

See this link.

Question 66Skipped

A single user of a virtual warehouse has set the warehouse to auto-resume and auto-
suspend after 10 minutes. The warehouse is currently suspended and the user performs the
following actions:

1. Runs a query that takes 3 minutes to complete

2. Leaves for 15 minutes

3. Returns and runs a query that takes 10 seconds to complete

4. Manually suspends the warehouse as soon as the last query was completed

When the user returns, how much billable compute time will have been consumed?

4 minutes

Correct answer

14 minutes

10 minutes

24 minutes

Overall explanation

The user ran a query for 3 minutes. However, since the warehouse was set to auto-suspend
after 10 minutes, the compute cost was calculated as 3 + 10 minutes. The user then came
back and ran another query for 10 seconds. Even though the query only ran for 10 seconds,
the minimum billable unit is 1 minute. Therefore, the total compute cost was 13 minutes + 1
minute = 14 minutes.

Here is a breakdown of the compute cost:

3 minutes of compute for the first query = 3 minutes

10 minutes of compute for the auto-suspension = 10 minutes

1 minute of compute for the second query = 1 minute

Total compute cost = 3 minutes + 10 minutes + 1 minute = 14 minutes

Question 67Skipped

User1, who has the SYSADMIN role, executed a query on Snowsight. User2, who is in the
same Snowflake account, wants to view the result set of the query executed by User1 using
the Snowsight query history.

What will happen if User2 tries to access the query history?

If User2 has the SECURITYADMIN role they will be able to see the results.

If User2 has the ACCOUNTADMIN role they will be able to see the results.

Correct answer

User2 will be unable to view the result set of the query executed by User1.

If User2 has the SYSADMIN role they will be able to see the results.

Overall explanation

You can view results only for queries you have executed. If you have privileges to view
queries executed by another user, the Query Detail page displays the details for the query,
but, for data privacy reasons, the page does not display the actual query result.

See this link.

Question 68Skipped

When unloading data from Snowflake to AWS, what permissions are required? (Choose
two.)

Correct selection

s3:PutObject

s3:CopyObject

Correct selection

s3:DeleteObject

s3:GetBucketAcl

s3:GetBucketLocation

Overall explanation

Snowflake requires the following permissions on an S3 bucket and folder to create new files
in the folder (and any sub-folders):

• s3:DeleteObject

• s3:PutObject

See this link.

Question 69Skipped

Snowflake's approach to access control combines aspects of two different models. One
model remarks, "each object has an owner, who can in turn grant access to that object".
What is the name of this model?

Correct answer

Discretionary Access Control (DAC)

Role-based Access Control (RBAC)

Account ownership model.

Object ownership model.

Overall explanation

Discretionary Access Control (DAC) remarks that "each object has an owner, who can, in
turn, grant access to that object". In contrast, the Role-based Access Control (RBAC) remarks
that "access privileges are assigned to roles, which are in turn assigned to users".

Question 70Skipped

What should be considered when deciding to use a Secure View? (Choose two.)

Correct selection

No details of the query execution plan will be available in the query profiler.

Once created there is no way to determine if a view is secure or not.

The view definition of a secure view is still visible to users by way of the information
schema.

It is not possible to create secure materialized views.

Correct selection

Secure views do not take advantage of the same internal optimizations as standard views.

Overall explanation

The internals of a secure view are not exposed in Query Profile (in the web interface).

Some of the internal optimizations for views require access to the underlying data in the
base tables for the view. This access might allow data that is hidden from users of the view
to be exposed through user code, such as user-defined functions, or other programmatic
methods. Secure views do not utilize these optimizations, ensuring that users have no access
to the underlying data.

See this link.

Question 71Skipped

A task went into a loop. How long will the task run before Snowflake terminates it?

4 hours.

15 minutes.

30 minutes.

Correct answer

60 minutes.

Overall explanation

Tasks have a maximum duration of 60 minutes by default. If they haven't finished by then,
they will be automatically terminated. You can configure the time limit on a single task run
before it times out with the option "USER_TASK_TIMEOUT_MS" when creating the task.
However, before significantly increasing the time limit on a task, consider whether to
refactor the SQL statement or increase the warehouse size.
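As a sketch of the timeout option, assuming a hypothetical warehouse my_wh and stored procedure my_long_running_proc:

```sql
-- Default limit is 60 minutes (3,600,000 ms); raise it to 2 hours here.
CREATE TASK my_task
  WAREHOUSE = my_wh
  SCHEDULE = '60 MINUTE'
  USER_TASK_TIMEOUT_MS = 7200000
AS
  CALL my_long_running_proc();
```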

Question 72Skipped

What is the minimum Fail-safe retention time period for transient tables?

7 days

Correct answer

0 days

12 hours

1 day

Overall explanation

Transient tables are similar to permanent tables with the key difference that they do not
have a Fail-safe period.

See this link.

Question 73Skipped

At what levels can a resource monitor be configured? (Choose two.)

Correct selection

Virtual warehouse

Correct selection

Account

Organization

Schema

Database

Overall explanation

See this link.

Question 74Skipped

In which Snowflake layer does Snowflake reorganize data into its internal optimized,
compressed, columnar format?

Correct answer

Database Storage

Query Processing

Metadata Management

Cloud Services

Overall explanation

When data is loaded into Snowflake, Snowflake reorganizes that data into its internal
optimized, compressed, columnar format. Snowflake stores this optimized data in cloud
storage.

See this link.

Question 75Skipped

Which term is used to describe information about disk usage for operations where
intermediate results cannot be accommodated in a Snowflake virtual warehouse memory?

Queue overloading

Pruning

Join explosion

Correct answer

Spilling

Overall explanation

What is disk spilling? When Snowflake warehouse cannot fit an operation in memory, it
starts spilling (storing) data first to the local disk of a warehouse node, and then to remote
storage.

In such a case, Snowflake first tries to temporarily store the data on the warehouse's local disk. As this means extra I/O operations, any query that requires spilling will take longer than a similar query running on similar data that fits the operations in memory.

See this link.

Question 76Skipped

In the Snowflake access control model, which entity owns an object by default?

The SYSADMIN role

The user who created the object

Ownership depends on the type of object

Correct answer

The role used to create the object

Overall explanation

To own an object means that a role has the OWNERSHIP privilege on the object. Each
securable object is owned by a single role, which by default is the role used to create the
object.

Question 77Skipped

How does a Snowflake user reference a directory table created on stage mystage in a SQL
query?

Correct answer

SELECT * FROM DIRECTORY (@mystage)

SELECT * FROM TABLE (@mystage DIRECTORY)

SELECT * FROM @mystage::DIRECTORY

SELECT * FROM TO_TABLE (DIRECTORY @mystage)

Overall explanation

See this link.

Question 78Skipped

Which of the following accurately describes shares?

Access to a share cannot be revoked once granted

Shares can be shared

Data consumers can clone a new table from a share

Correct answer

Tables, secure views, and secure UDFs can be shared

Overall explanation

Snowflake enables data sharing through named Snowflake objects called shares, which let a data provider share tables, secure views, and secure UDFs in a Snowflake database with other Snowflake accounts (data consumers).

See this link.

Question 79Skipped

Which of the following significantly improves the performance of selective point lookup
queries on a table?

Correct answer

Search Optimization Service

Clustering

Materialized Views

Zero-copy Cloning

Overall explanation

The search optimization service aims to significantly improve the performance of certain
types of queries on tables, including: Selective point lookup queries on tables.

A point lookup query returns only one or a small number of distinct rows.

(...)

See this link.
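As a sketch, assuming a hypothetical table sales with an order_id column, enabling the service and a point lookup that benefits from it might look like:

```sql
-- Enable the search optimization service on a table (Enterprise Edition or higher).
ALTER TABLE sales ADD SEARCH OPTIMIZATION;

-- A selective point lookup that can benefit from it:
SELECT * FROM sales WHERE order_id = 'ORD-000042';
```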

Question 80Skipped

When cloning a database containing stored procedures and regular views, that have fully
qualified table references, which of the following will occur?

An error will occur, as views with qualified references cannot be cloned.

An error will occur, as stored objects cannot be cloned.

The cloned views and the stored procedures will reference the cloned tables in the cloned
database.

Correct answer

The stored procedures and views will refer to tables in the source database.

Overall explanation

See this link.

Question 81Skipped

What step can reduce data spilling in Snowflake?

Increasing the amount of remote storage for the virtual warehouse

Using a Common Table Expression (CTE) instead of a temporary table

Increasing the virtual warehouse maximum timeout limit

Correct answer

Using a larger virtual warehouse

Overall explanation

Adjusting the available memory of a warehouse can improve performance because a query
runs substantially slower when a warehouse runs out of memory, which results in bytes
“spilling” onto storage.

See this link.
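A minimal sketch of scaling up an existing warehouse (the warehouse name my_wh is hypothetical):

```sql
-- Resize the warehouse to give each query more memory per node,
-- reducing the chance of intermediate results spilling to disk.
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'LARGE';
```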

Question 82Skipped

What happens when a virtual warehouse is resized?

Correct answer

When reducing the size of a warehouse the compute resources are removed only when
they are no longer being used to execute any current statements.

The warehouse will be suspended while the new compute resource is provisioned and will
resume automatically once provisioning is complete.

Users who are trying to use the warehouse will receive an error message until the resizing
is complete.

When increasing the size of an active warehouse the compute resource for all running and
queued queries on the warehouse are affected.

Overall explanation

See this link.

Question 83Skipped

At what level is the MIN_DATA_RETENTION_TIME_IN_DAYS parameter set?

Schema

Correct answer

Account

Database

Table

Overall explanation

MIN_DATA_RETENTION_TIME_IN_DAYS - Account level

DATA_RETENTION_TIME_IN_DAYS - Object / Account level

See this link.
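To illustrate the two parameters and the levels they apply at (my_table is a hypothetical name):

```sql
-- Account-level minimum retention (requires ACCOUNTADMIN):
ALTER ACCOUNT SET MIN_DATA_RETENTION_TIME_IN_DAYS = 5;

-- Retention can still be set per object, down to the account minimum:
ALTER TABLE my_table SET DATA_RETENTION_TIME_IN_DAYS = 30;
```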

Question 84Skipped

Why would a Snowflake user create a secure view instead of a standard view?

With a secure view, the underlying data is replicated to a separate storage layer with
enhanced encryption.

Secure views support additional functionality that is not supported for standard views,
such as column masking and row level access policies.

The secure view is only available to end users with the corresponding SECURE_ACCESS
property.

Correct answer

End users are unable to see the view definition, and internal optimizations differ with a
secure view.

Overall explanation

See this link.

Question 85Skipped

Where can a user find and review the failed logins of a specific user for the past 30 days?

The ACCESS_HISTORY view in ACCOUNT_USAGE

The SESSIONS view in ACCOUNT_USAGE

The USERS view in ACCOUNT_USAGE

Correct answer

The LOGIN_HISTORY view in ACCOUNT_USAGE

Overall explanation

See this link.
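A sketch of such a query against the shared ACCOUNT_USAGE schema (the user name USER1 is a placeholder):

```sql
-- Failed login attempts for one user over the past 30 days.
SELECT event_timestamp, user_name, error_message
FROM snowflake.account_usage.login_history
WHERE user_name = 'USER1'
  AND is_success = 'NO'
  AND event_timestamp > DATEADD('day', -30, CURRENT_TIMESTAMP())
ORDER BY event_timestamp DESC;
```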

Question 86Skipped

How can a Snowflake user load duplicate files with a COPY INTO command?

The COPY INTO options should be set to PURGE = FALSE

The COPY INTO options should be set to RETURN_FAILED_ONLY = FALSE

Correct answer

The COPY INTO options should be set to FORCE = TRUE

The COPY INTO options should be set to ON_ERROR = CONTINUE

Overall explanation

FORCE = TRUE | FALSE

Definition

Boolean that specifies to load all files, regardless of whether they’ve been loaded previously
and have not changed since they were loaded. Note that this option reloads files, potentially
duplicating data in a table.

See this link.

Question 87Skipped

How does a Snowflake stored procedure compare to a User-Defined Function (UDF)?

A single executable statement can call multiple stored procedures. In contrast, multiple
SQL statements can call the same UDFs.

A single executable statement can call only two stored procedures. In contrast, a single
SQL statement can call multiple UDFs.

Multiple executable statements can call more than one stored procedure. In contrast, a
single SQL statement can call multiple UDFs.

Correct answer

A single executable statement can call only one stored procedure. In contrast, a single SQL
statement can call multiple UDFs.

Overall explanation

Multiple UDFs may be called with one statement; a single stored procedure is called with
one statement

• A single SQL statement can call multiple UDFs.

• A single SQL statement can call only one stored procedure.

See this link.

Question 88Skipped

In which hierarchy is tag inheritance possible?

Account » User » Role

Correct answer

Schema » Table » Column

Account » User » Schema

Database » View » Column

Overall explanation

A tag is inherited based on the Snowflake securable object hierarchy.

See this link.
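As a sketch of that inheritance, assuming a hypothetical governance.tags schema and a sales table: a tag set on the table is inherited by every column in it.

```sql
-- Create a tag and set it at the table level; columns inherit it.
CREATE TAG governance.tags.cost_center;
ALTER TABLE sales SET TAG governance.tags.cost_center = 'finance';
```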

Question 89Skipped

Which feature is integrated to support Multi-Factor Authentication (MFA) at Snowflake?

Correct answer

Duo Security

RSA SecurID Access

Authy

One Login

Overall explanation

MFA support is provided as an integrated Snowflake feature, powered by the Duo Security
service, which is managed completely by Snowflake.

See this link.

Question 90Skipped

Which of the following statements apply to Snowflake in terms of security? (Choose two.)

Correct selection

All data in Snowflake is encrypted.

All data in Snowflake is compressed.

Correct selection

Snowflake leverages a Role-Based Access Control (RBAC) model.

Snowflake can run within a user's own Virtual Private Cloud (VPC).

Snowflake requires a user to configure an IAM user to connect to the database.

Overall explanation

Snowflake is a 100% native public cloud solution; you cannot host it in your own VPC. All data micro-partitions are encrypted.

Question 91Skipped

For how long are we billed if our warehouse runs for 48 seconds?

We are not going to be billed if the query scans less than 10 micro-partitions.

We are not going to be billed as the warehouse hasn’t run for 1 minute.

48 seconds.

Correct answer

1 minute.

Overall explanation

When compute resources are provisioned for a warehouse, the minimum billing charge for
provisioning compute resources is 1 minute. Even if your warehouse runs for less than a
minute, we will be billed for a minute.

Question 92Skipped

How many credits will a medium-size warehouse with 2 clusters running in auto-scale mode consume over 3 hours, considering that the first cluster runs continuously and the second one runs for 30 minutes in the second hour?

8.

10.

16.

Correct answer

14.

Overall explanation

A medium size warehouse with one cluster consumes four credits per hour. The first cluster
will run continuously for three hours, consuming 12 credits. The second one will run for only
30 minutes, consuming two credits. The total of the warehouse will be 14 credits.

Question 93Skipped

In order to access Snowflake Marketplace listings, who needs to accept the Snowflake
Consumer Terms of Service?

SECURITYADMIN

SYSADMIN

Correct answer

ORGADMIN

ACCOUNTADMIN

Overall explanation

The organization administrator only needs to accept the Snowflake Provider and Consumer
Terms once for your organization. After the terms have been accepted, anyone in your
organization that has a role with the necessary privileges can become a consumer of listings.

Note

You must be an organization administrator (i.e. a user granted the ORGADMIN role) to
accept the terms.

See this link.

Question 94Skipped

Which Snowflake view is used to support compliance auditing?

COPY_HISTORY

ROW_ACCESS_POLICIES

QUERY_HISTORY

Correct answer

ACCESS_HISTORY

Overall explanation

The ACCESS_HISTORY view in Snowflake is primarily used to support compliance auditing.


This view contains information about historical access and usage patterns related to tables
and views within your Snowflake account. It provides details on who accessed the data,
when, and from which IP addresses, among other audit-related information.

See this link.
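A sketch of an audit query over this view (the table name MYDB.PUBLIC.CUSTOMERS is a placeholder):

```sql
-- Which queries touched a given table in the last 7 days, and by whom.
SELECT query_id, user_name, query_start_time
FROM snowflake.account_usage.access_history,
     LATERAL FLATTEN(input => base_objects_accessed) obj
WHERE obj.value:"objectName"::STRING = 'MYDB.PUBLIC.CUSTOMERS'
  AND query_start_time > DATEADD('day', -7, CURRENT_TIMESTAMP());
```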

Question 95Skipped

Which commands can only be executed using SnowSQL? (Choose two.)

REMOVE

Correct selection

PUT

Correct selection

GET

LIST

COPY INTO

Overall explanation

Usage Notes for GET and PUT commands

• The command cannot be executed from the Worksheets page in either Snowflake
web interface; instead, use the SnowSQL client or Drivers to upload data files, or
check the documentation for a specific Snowflake client to verify support for this
command.

See this link.
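A sketch of both commands as run from a SnowSQL session (the file paths and stage name are hypothetical; these commands fail in the web UI worksheets):

```sql
-- Upload a local file to a named internal stage:
PUT file:///tmp/data.csv @my_stage AUTO_COMPRESS = TRUE;

-- Download the (compressed) staged file back to the local machine:
GET @my_stage/data.csv.gz file:///tmp/downloads/;
```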

Question 96Skipped

Which Query Profile result indicates that a warehouse is sized too small?

Correct answer

Bytes are spilling to external storage.

The number of partitions scanned is the same as partitions total.

There are a lot of filter nodes.

The number of processed rows is very high.

Overall explanation

See this link.

Question 97Skipped

Which statistics are displayed in a Query Profile that indicate that intermediate results do
not fit in memory? (Choose two.)

Bytes scanned

Percentage scanned from cache

Correct selection

Bytes spilled to remote storage

Partitions scanned

Correct selection

Bytes spilled to local storage

Overall explanation

• Spilling — information about disk usage for operations where intermediate results
do not fit in memory:

• Bytes spilled to local storage — volume of data spilled to local disk.

• Bytes spilled to remote storage — volume of data spilled to remote disk.

See this link.

Question 98Skipped

What does the LATERAL modifier for the FLATTEN function do?

Correct answer

Joins information outside the object with the flattened data

Casts the values of the flattened data

Retrieves a single instance of a repeating element in the flattened data

Extracts the path of the flattened data

Overall explanation

FLATTEN is a table function that produces a lateral view of a VARIANT, OBJECT, or ARRAY
column. The function returns a row for each object, and the LATERAL modifier joins the data
with any information outside of the object.

See this link.
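A sketch of the LATERAL FLATTEN pattern, assuming a hypothetical table orders with a VARIANT column payload whose items key holds a JSON array:

```sql
-- One output row per array element, joined with data from outside
-- the flattened object (the order_id at the top level).
SELECT o.payload:order_id::STRING AS order_id,
       i.value:sku::STRING        AS sku
FROM orders o,
     LATERAL FLATTEN(input => o.payload:items) i;
```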

Question 99Skipped

What Snowflake features allow virtual warehouses to handle high concurrency workloads?
(Choose two.)

Correct selection

The use of warehouse auto scaling

The ability to resize warehouses

Correct selection

Use of multi-clustered warehouses

The ability to scale up warehouses

The use of warehouse indexing

Overall explanation

See this link.

Question 100Skipped

How can the Query Profile be used to identify the costliest operator of a query?

Correct answer

Find the operator node with the highest fraction of time or percentage of total time.

Look at the number of rows between operator nodes across the operator tree.

Select any node in the operator tree and look at the number of micro-partitions scanned.

Select the TableScan operator node and look at the percentage scanned from cache.

Overall explanation

Operator Nodes by Execution Time:

A collapsible panel in the operator tree pane lists nodes by execution time in descending
order, enabling users to quickly locate the costliest operator nodes in terms of execution
time. The panel lists all nodes that lasted for 1% or longer of the total execution time of the
query (or the execution time for the displayed query step, if the query was executed in
multiple processing steps).

See this link.

Question 101Skipped

What is a machine learning and data science partner within the Snowflake Partner
Ecosystem?

Power BI

Informatica

Adobe

Correct answer

Data Robot

Overall explanation

See this link.

Question 102Skipped

Increasing the size of a virtual warehouse from an X-Small to an X-Large is an example of which of the following?

Concurrent sizing

Correct answer

Scaling up

Scaling out

Right sizing

Overall explanation

See this link.

Question 103Skipped

A company needs to allow some users to see Personally Identifiable Information (PII) while
limiting other users from seeing the full value of the PII.

Which Snowflake feature will support this?

Correct answer

Data masking policies

Role based access control

Data encryption

Row access policies

Overall explanation

If you have a table with a column containing PII, masking rows will not solve the issue. What we need is to make the data in this column visible to some users and masked for others. Thus we need to use dynamic data masking.

See this link.
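A minimal sketch of a dynamic masking policy (the policy, role, table, and column names are all hypothetical):

```sql
-- Show the full value only to a designated role; mask it for everyone else.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'PII_READER' THEN val
    ELSE '*** MASKED ***'
  END;

ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;
```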

Question 104Skipped

A user has enabled the STRIP_OUTER_ARRAY file format option for the COPY INTO {table}
command to remove the outer array structure.

What else will this format option and command do?

Correct answer

Load the records into separate table rows.

Unload the records from separate table rows.

Ensure each unique element stores values of a single native data type.

Export data files in smaller chunks.

Overall explanation

Enable the STRIP_OUTER_ARRAY file format option for the COPY INTO <table> command to
remove the outer array structure and load the records into separate table rows.

See this link.
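As a sketch, assuming a hypothetical stage json_stage and a table raw_events with a single VARIANT column, where the staged file holds one outer JSON array of records:

```sql
-- Strip the outer array so each element becomes its own table row.
COPY INTO raw_events
  FROM @json_stage/events.json
  FILE_FORMAT = (TYPE = JSON STRIP_OUTER_ARRAY = TRUE);
```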

Question 105Skipped

What is the purpose of an External Function?

To run a function in another Snowflake database

Correct answer

To call code that executes outside of Snowflake

To ingest data from on-premises data sources

To share data in Snowflake with external parties

Overall explanation

See this link.

Question 106Skipped

Which data types does Snowflake support when querying semi-structured data? (Choose
two.)

Correct selection

VARIANT

BLOB

VARCHAR

Correct selection

ARRAY

XML

Overall explanation

See this link.

Question 107Skipped

When using SnowSQL, which configuration options are required when unloading data from a
SQL query run on a local machine? (Choose two.)

Correct selection

output_format

force_put_overwrite

quiet

Correct selection

output_file

echo

Overall explanation

See this link.

The quiet option also affects unloading output, but it is optional, not a required parameter.

Question 108Skipped

Which table function should be used to view details on a Directed Acyclic Graph (DAG) run
that is presently scheduled or is executing?

TASK_HISTORY

TASK_DEPENDENTS

Correct answer

CURRENT_TASK_GRAPHS

COMPLETE_TASK_GRAPHS

Overall explanation

CURRENT_TASK_GRAPHS

Returns the status of a graph run that is currently scheduled or is executing. A graph is
currently defined as a single scheduled task or a DAG of tasks composed of a scheduled root
task and one or more child tasks (i.e. tasks that have a defined predecessor task). For the
purposes of this function, root task refers to either the single scheduled task or the root task
in a DAG.

See this link.

Question 109Skipped

What are the main differences between the Virtual Private Snowflake (VPS) and Business
Critical Editions? (Select TWO)

Correct selection

Snowflake VPS provides a dedicated metadata store and pool of computing resources,
whereas it’s not included in the Business Critical Edition.

Snowflake VPS provides a direct proxy to virtual networks / on-premises data centers using AWS PrivateLink, whereas it’s not included in the Business Critical Edition.

Correct selection

Snowflake VPS provides a completely separate Snowflake environment, isolated from all
other Snowflake accounts, whereas it’s not included in the Business Critical Edition.

Snowflake VPS provides customer-managed encryption keys through Tri-Secret Secure, whereas it’s not included in the Business Critical Edition.

Overall explanation

Tri-Secret secure and AWS PrivateLink are also provided in the Business Critical Edition. You
can see all the differences between Snowflake editions at the following link.

Question 110Skipped

A clustering key was defined on a table, but it is no longer needed.

How can the key be removed?

Correct answer

ALTER TABLE DROP CLUSTERING KEY

ALTER TABLE DELETE CLUSTERING KEY

ALTER TABLE PURGE CLUSTERING KEY

ALTER TABLE REMOVE CLUSTERING KEY

Overall explanation

ALTER TABLE <name> DROP CLUSTERING KEY

See this link.

Question 111Skipped

Which sequence (order) of object privileges should be used to grant a custom role read-only
access on a table?

Correct answer

Overall explanation

When designing a RBAC and assigning grants it is very important to follow the principle of
‘least privilege’.

See this link.

Question 112Skipped

Which function returns an integer between 0 and 100 when used to calculate the similarity
of two strings?

APPROXIMATE_SIMILARITY

APPROXIMATE_JACCARD_INDEX

Correct answer

JAROWINKLER_SIMILARITY

MINHASH_COMBINE

Overall explanation

Computes the Jaro-Winkler similarity between two input strings. The function returns an integer between 0 and 100, where 0 indicates no similarity and 100 indicates an exact match.

See this link.
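A quick illustration of the function's range:

```sql
SELECT JAROWINKLER_SIMILARITY('Snowflake', 'Snowflake'),  -- 100 (exact match)
       JAROWINKLER_SIMILARITY('Snowflake', 'Snowfake');   -- high, but below 100
```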

Question 113Skipped

Which of the following is not a valid context function in Snowflake?

SELECT CURRENT_IP_ADDRESS()

SELECT CURRENT_ACCOUNT()

Correct answer

SELECT CURRENT_PROVIDER()

SELECT CURRENT_CLIENT()

Overall explanation

You can see all the different context functions at the following link.

Question 114Skipped

What impacts the credit consumption of maintaining a materialized view? (Choose two.)

Correct selection

Whether the materialized view has a cluster key defined

Correct selection

How often the base table changes

Whether or not it is also a secure view

How often the materialized view is queried

How often the underlying base table is queried

Overall explanation

When a base table changes, all materialized views defined on the table are updated by a
background service that uses compute resources provided by Snowflake.

Maintaining clustering (of either a table or a materialized view) adds costs.

See this link.

Question 115Skipped

Which tasks are performed in the Snowflake Cloud Services layer? (Choose two.)

Correct selection

Management of metadata

Computing the data

Correct selection

Parsing and optimizing queries

Maintaining Availability Zones

Infrastructure security

Overall explanation

See this link.

Question 116Skipped

What is the MAXIMUM number of days that Snowflake resets the 24-hour retention period
for a query result every time the result is used?

1 day

10 days

Correct answer

31 days

60 days

Overall explanation

Each time the persisted result for a query is reused, Snowflake resets the 24-hour retention
period for the result, up to a maximum of 31 days from the date and time that the query was
first executed. After 31 days, the result is purged and the next time the query is submitted, a
new result is generated and persisted.

See this link.

Question 117Skipped

What happens when a network policy includes values that appear in both the allowed and
blocked IP address lists?

Those IP addresses are allowed access to the Snowflake account as Snowflake applies the
allowed IP address list first.

Snowflake issues an alert message and adds the duplicate IP address values to both the
allowed and blocked IP address lists.

Correct answer

Those IP addresses are denied access to the Snowflake account as Snowflake applies the
blocked IP address list first.

Snowflake issues an error message and adds the duplicate IP address values to both the
allowed and blocked IP address lists.

Overall explanation

When a network policy includes values in both the allowed and blocked IP address lists,
Snowflake applies the blocked IP address list first.

See this link.
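A sketch of a policy that triggers this behavior (the policy name and IP ranges are placeholders):

```sql
-- 192.168.1.99 falls inside the allowed range but also appears in the
-- blocked list; the blocked list is applied first, so it is denied.
CREATE NETWORK POLICY office_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');
```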

Question 118Skipped

Which Snowflake tool would be BEST to troubleshoot network connectivity?

SnowUI

SnowSQL

SnowCLI

Correct answer

SnowCD

Overall explanation

SnowCD (i.e. Snowflake Connectivity Diagnostic Tool) helps users to diagnose and
troubleshoot their network connection to Snowflake.

See this link.

Question 119Skipped

What criteria does Snowflake use to determine the current role when initiating a session?
(Choose two.)

If a role was specified as part of the connection and that role has not been granted to the
Snowflake user, the role is automatically granted and it becomes the current role.

Correct selection

If a role was specified as part of the connection and that role has been granted to the
Snowflake user, the specified role becomes the current role.

If a role was specified as part of the connection and that role has not been granted to the
Snowflake user, it will be ignored and the default role will become the current role.

If no role was specified as part of the connection and a default role has not been set for
the Snowflake user, the session will not be initiated and the log in will fail.

Correct selection

If no role was specified as part of the connection and a default role has been defined for
the Snowflake user, that role becomes the current role.

Overall explanation

See this link.

Question 120Skipped

The following JSON is stored in a VARIANT column called src of the CAR_SALES table:

A user needs to extract the dealership information from the JSON.

How can this be accomplished?

Correct answer

select src:dealership from car_sales;

select src.dealership from car_sales;

select src:Dealership from car_sales;

select dealership from car_sales;

Overall explanation

Insert a colon : between the VARIANT column name and any first-level element:
<column>:<level1_element>.

See this link.

Question 121Skipped

Which database objects can be shared using Snowflake Secure Data Sharing?

Tables, External tables, Secure views.

Correct answer

Tables, External tables, Secure views, Secure materialized views, Secure UDFs.

Tables, External tables, Secure views, Secure materialized views.

Tables, External tables.

Tables.

Overall explanation

Secure Data Sharing lets you share selected objects in a database in your account with other
Snowflake accounts. You can share all the previous database objects.

Question 122Skipped

A warehouse ran for 62 seconds, and it was suspended. After some time, it ran for another
20 seconds. For how many seconds will you be billed?

92 seconds.

Correct answer

122 seconds.

62 seconds.

20 seconds.

Overall explanation

You will be billed for 122 seconds (62 + 60 seconds) because warehouses are billed for a
minimum of one minute. The price would be different if the warehouse wasn't suspended
before executing the second query.

For example, if we had only run a query, and it had only run for 62 seconds, you would be
billed for these 62 seconds. If it had only run for 20 seconds, you would've been billed for 60
seconds.
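The per-resume minimum-billing rule can be sketched as a small helper (the function name is illustrative, assuming a 60-second minimum each time the warehouse resumes):

```python
def billed_seconds(run_durations):
    """Apply Snowflake's per-resume billing rule: each time a warehouse
    resumes, it is billed for at least 60 seconds of runtime."""
    return sum(max(seconds, 60) for seconds in run_durations)

# Two separate runs (warehouse suspended in between): 62s and 20s.
print(billed_seconds([62, 20]))  # -> 122
# A single continuous run of 62 seconds is billed as-is.
print(billed_seconds([62]))      # -> 62
```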

Question 123Skipped

Which Snowflake edition offers the highest level of security for organizations that have the
strictest requirements?

Standard

Business Critical

Enterprise

Correct answer

Virtual Private Snowflake (VPS)

Overall explanation

Virtual Private Snowflake offers our highest level of security for organizations that have the
strictest requirements

See this link.

Question 124Skipped

Which objects will incur storage costs associated with Fail-safe?

Data files available in internal stages

External tables

Data files available in external stages

Correct answer

Permanent tables

Overall explanation

See this link.

Question 125Skipped

Which virtual warehouse consideration can help lower compute resource credit
consumption?

Correct answer

Automating the virtual warehouse suspension and resumption settings

Setting up a multi-cluster virtual warehouse

Increasing the maximum cluster count parameter for a multi-cluster virtual warehouse

Resizing the virtual warehouse to a larger size

Overall explanation

See this link.

Question 126Skipped

What is the abbreviated form to get all the files in the stage for the current user?

1. SHOW @%;

1. LIST @~;

Correct answer

1. LS @~;

1. LS @usr;

Overall explanation

LIST (normal form)

LS (abbrev form)

See this link.

Question 127Skipped

What is true about sharing data in Snowflake? (Choose two.)

The Provider is charged for compute resources used by the Data Consumer to query the
shared data.

The shared data is copied into the Data Consumer account, so the Consumer can modify it
without impacting the base data of the Provider.

Correct selection

The Data Consumer pays only for compute resources to query the shared data.

Correct selection

A Snowflake account can both provide and consume shared data.

The Data Consumer pays for data storage as well as for data computing.

Overall explanation

With Secure Data Sharing, no actual data is copied or transferred between accounts. The
only charges to consumers are for the compute resources (i.e. virtual warehouses) used to
query the shared data.

Any full Snowflake account can both provide and consume shared data.

See this link.

Question 128Skipped

Which Snowflake role can set up a Snowflake Share by default?

SYSADMIN

Correct answer

ACCOUNTADMIN

SECURITYADMIN

PUBLIC

Overall explanation

Only the ACCOUNTADMIN role has the "CREATE SHARE" privilege by default. The privilege
can be granted to additional roles as needed.

Question 129Skipped

What is a characteristic of a tag associated with a masking policy?

Correct answer

A tag can have only one masking policy for each data type.

A tag can have multiple masking policies with varying data types.

A tag can have multiple masking policies for each data type.

A tag can be dropped after a masking policy is assigned.

Overall explanation

The tag can support one masking policy for each data type that Snowflake supports.

See this link.

Question 130Skipped

Which SQL commands should be used to write a recursive query if the number of levels is
unknown? (Choose two.)

Correct selection

CONNECT BY

LISTAGG

MATCH RECOGNIZE

QUALIFY

Oracle Cloud

Correct selection

WITH

Overall explanation

See this link.

Question 131Skipped

Which statement is true about Multi-Factor Authentication (MFA) in Snowflake?

MFA can be enforced or applied for a given role.

Snowflake users are automatically enrolled in MFA.

Users enroll in MFA by submitting a request to Snowflake Support.

Correct answer

Any Snowflake user can self-enroll in MFA through the web interface.

Overall explanation

See this link.

Question 132Skipped

How is the data storage cost computed for Snowflake?

Based on the amount of uncompressed data stored on the last day of the month.

Based on the amount of compressed data stored on the last day of the month.

Based on the average daily amount of uncompressed data stored.

Correct answer

Based on the average daily amount of compressed data stored.

Overall explanation

Storage costs benefit from the automatic compression of all data stored, and the total
compressed file size is used to calculate the storage bill for an account.

Question 133Skipped

What is a core benefit of clustering?

To improve performance by creating a separate file for point lookups

Correct answer

To increase scan efficiency in queries by improving pruning

To provide data redundancy by duplicating micro-partitions

To guarantee uniquely identifiable records in the database

Overall explanation

See this link.

Question 134Skipped

Which applications can use key pair authentication? (Choose two).

Correct selection

SnowSQL

Snowflake Marketplace

SnowCD

Correct selection

Snowflake connector for Python

Snowsight

Overall explanation

See this link.

Question 135Skipped

When can user session variables be accessed in a Snowflake scripting procedure?

When the procedure is defined to execute as OWNER.

When the procedure is defined with an argument that has the same name and type as the
session variable.

When the procedure is defined as STRICT.

Correct answer

When the procedure is defined to execute as CALLER.

Overall explanation

See this link.

Question 136Skipped

What command will you execute if you want to disable the query cache?

1. ALTER SESSION SET USE_CACHED_RESULT = OFF;

Correct answer

1. ALTER SESSION SET USE_CACHED_RESULT = FALSE;

1. ALTER SESSION SET USE_CACHED_RESULT = ON;

1. ALTER SESSION SET USE_CACHED_RESULT = TRUE;

Overall explanation

This command turns off the query result caching feature for the current session. When the caching is disabled, the results of queries are not stored in the cache, and subsequent executions of the same query will not use the cached results. This can negatively affect query performance, especially for queries executed frequently or with long execution times, but it might be useful for development purposes.

Question 137Skipped

Which of the following roles or privileges are required to view the table function
TASK_HISTORY? (Choose three.)

SECURITYADMIN.

Correct selection

Task owner (OWNERSHIP privilege)

Correct selection

ACCOUNTADMIN.

SYSADMIN.

Correct selection

MONITOR EXECUTION privilege.

Overall explanation

The function returns the history of task usage for your entire Snowflake account or a
specified task. It returns results for the ACCOUNTADMIN role, the task owner, or a role with
the global MONITOR EXECUTION privilege.

It returns task activity within the last 7 days or the next scheduled execution within the next
8 days.

Question 138Skipped

If a transaction disconnects and goes into a detached state, which cannot be committed or
rolled back, how long will Snowflake take to abort the transaction?

Correct answer

4 hours.

60 minutes.

15 minutes.

12 hours.

Overall explanation

If the transaction is left open or not aborted by the user, Snowflake automatically rolls back
the transaction after being idle for four hours.

You can still abort a running transaction with the system function: SYSTEM$ABORT_TRANSACTION.

Question 139Skipped

Which of the following commands cannot be executed from the Snowflake UI? (Choose
two.)

LIST <stages>

COPY INTO.

Correct selection

PUT.

SHOW.

Correct selection

GET.

Overall explanation

These two commands cannot be executed from the Snowflake web interface; instead, you
should use the SnowSQL client to GET or PUT data files.

Question 140Skipped

What can a Snowflake user do with the information included in the details section of a
Query Profile?

Correct answer

Determine the total duration of the query.

Determine the source system that the queried table is from.

Determine the role of the user who ran the query.

Determine if the query was on structured or semi-structured data.

Overall explanation

See this link.

Question 141Skipped

What causes objects in a data share to become unavailable to a consumer account?

The consumer account runs the GRANT IMPORTED PRIVILEGES command on the data share every 24 hours.

The consumer account acquires the data share through a private data exchange.

The DATA_RETENTION_TIME_IN_DAYS parameter in the consumer account is set to 0.

Correct answer

The objects in the data share are being deleted and the grant pattern is not re-applied
systematically.

Overall explanation

Any objects that you remove from a share are instantly unavailable to the consumers
accounts who have created databases from the share.

For example, if you remove a table from a share, users in consumer accounts can no longer
query the data in the table as soon as the table is removed from the share.

See this link.

Question 142Skipped

We need to temporarily store intermediate data, which an ETL process will only use. We
don't need the data outside the ETL process.

If you want to optimize storage cost, what type of table will you create to store this data?

External.

Correct answer

Temporary.

Transient.

Permanent.

Overall explanation

With temporary tables, you can optimize storage costs, as when the Snowflake session ends,
data stored in the table is entirely purged from the system. But they also require storage
costs while the session is active.

A temporary table is purged once the session ends, so the retention period is for 24 hours or
the remainder of the session.

Question 143Skipped

What is the minimum Snowflake edition required to use Dynamic Data Masking?

Business Critical

Correct answer

Enterprise

Virtual Private Snowflake (VPS)

Standard

Overall explanation

Dynamic Data Masking requires Enterprise Edition or higher.

See this link.

Question 144Skipped

In an auto-scaling multi-cluster virtual warehouse with the setting SCALING_POLICY = ECONOMY enabled, when is another cluster started?

When the system has enough load for 8 minutes

When the system has enough load for 2 minutes

Correct answer

When the system has enough load for 6 minutes

When the system has enough load for 10 minutes

Overall explanation

Only if the system estimates there’s enough query load to keep the cluster busy for at least 6
minutes

See this link.

Question 145Skipped

A user is unloading data to a stage using this command:

copy into @message from (select object_construct('id', 1, 'first_name', 'Snowflake', 'last_name', 'User', 'city', 'Bozeman')) file_format = (type = json)

What will the output file in the stage be?

Multiple compressed JSON files with a single VARIANT column

Correct answer

A single compressed JSON file with a single VARIANT column

A single uncompressed JSON file with multiple VARIANT columns

Multiple uncompressed JSON files with multiple VARIANT columns

Overall explanation

You can use the OBJECT_CONSTRUCT function combined with the COPY command to
convert the rows in a relational table to a single VARIANT column and unload the rows into a
file.

See this link.

Question 146Skipped

A user has unloaded data from a Snowflake table to an external stage.

Which command can be used to verify if data has been uploaded to the external stage
named my_stage?

show @my_stage

view @my_stage

display @my_stage

Correct answer

list @my_stage

Overall explanation

See this link.

Question 147Skipped

A permanent table and temporary table have the same name, TBL1, in a schema.

What will happen if a user executes select * from TBL1;?

Correct answer

The temporary table will take precedence over the permanent table.

An error will say there cannot be two tables with the same name in a schema.

The table that was created most recently will take precedence over the older table.

The permanent table will take precedence over the temporary table.

Overall explanation

All queries and other operations performed in the session on the table affect only the
temporary table.

See this link.

Question 148Skipped

What is a recommended approach for optimizing query performance in Snowflake?

Correct answer

Use a smaller number of larger tables rather than a larger number of smaller tables.

Select all columns from tables, even if they are not needed in the query.

Use a large number of joins to combine data from multiple tables.

Use subqueries whenever possible.

Overall explanation

Snowflake makes use of clustering on tables. Users can utilize cluster key to enhance query
performance (partition pruning) on large tables. So, the fewer and larger the tables, the
better the pruning and clustering will work.

The fewer the joins between tables, the better the performance will generally be.

Other recommendations like Select all columns is the complete opposite of what Snowflake
recommends. Selecting only the required columns is a common query optimization
technique in Snowflake that can improve query performance and reduce resource
consumption.

Subqueries can be useful, but they can also slow down your queries.

Question 149Skipped

What action can a Resource Monitor not take when it hits the limit?

Notify & Suspend.

Notify & Suspend Immediately.

Correct answer

Notify & Increase the limit.

Notify.

Overall explanation

- Notify --> It performs no action but sends an alert notification (email/web UI).

- Notify & Suspend --> It sends a notification and suspends all assigned warehouses after all statements being executed by the warehouse(s) have been completed.

- Notify & Suspend Immediately --> It sends a notification and suspends all assigned
warehouses immediately.

Question 150Skipped

After how many days does the load history of Snowpipe expire?

90 days.

180 days.

Correct answer

14 days.

1 day.

Overall explanation

The load history is stored in the metadata of the pipe for 14 days. It must be requested from Snowflake via a REST endpoint, a SQL table function, or the ACCOUNT_USAGE view.

Question 151Skipped

Which of the following file formats are supported by Snowflake? (Choose three.)

Correct selection

Avro.

XLSX.

TXT.

Correct selection

CSV.

Correct selection

XML.

HTML.

Overall explanation

A File Format object describes and stores the format information required to load data into
Snowflake tables. You can specify different parameters, such as the file’s delimiter, if you
want to skip the header or not, etc. Snowflake supports both Structured and Semi-Structured Data. You can see the different file formats in the following image:

Question 152Skipped

Which services does the Snowflake Cloud Services layer manage? (Choose two.)

Compute resources

Query execution

Correct selection

Authentication

Data storage

Correct selection

Metadata

Overall explanation

See this link.

Question 153Skipped

When reviewing the load for a warehouse using the load monitoring chart, the chart
indicates that a high volume of queries is always queuing in the warehouse.

According to recommended best practice, what should be done to reduce the queue
volume? (Choose two.)

Scale up the warehouse size to allow queries to execute faster.

Limit user access to the warehouse so fewer queries are run against it.

Correct selection

Use multi-clustered warehousing to scale out warehouse capacity.

Correct selection

Migrate some queries to a new warehouse to reduce load.

Stop and start the warehouse to clear the queued queries.

Overall explanation

If the running query load is high or there’s queuing, consider starting a separate warehouse
and moving queued queries to that warehouse. Alternatively, if you are using multi-cluster
warehouses, you could change your multi-cluster settings to add additional clusters to
handle higher concurrency going forward.

See this link.

Question 154Skipped

Which possibilities give us Snowflake to resize a Warehouse? (Choose two.)

Creating a new warehouse with the new size.

Correct selection

Using the Snowflake UI in the Warehouse configuration.

Correct selection

Using the SQL command ALTER WAREHOUSE <name> SET warehouse_size=<SIZE>

Using the SQL command ALTER WAREHOUSE <name> MODIFY warehouse_size=<SIZE>

Using the SQL command CHANGE WAREHOUSE <name> SET warehouse_size=<SIZE>

Overall explanation

Resizing a warehouse to a larger size is helpful to improve the performance of large, complex
queries against large data sets; and improve performance while loading and unloading
significant amounts of data.

Question 155Skipped

What would happen if we executed the following command?

1. CREATE OR REPLACE TABLE newTable

2. CLONE table1;

Correct answer

Snowflake creates a new entry in the metadata store to keep track of the new clone. The
existing micro-partitions of “table1” are mapped to the new table.

“newTable” is created, and Snowflake internally executes a batch job to copy all the data
from “table1”

“newTable” is created with all the data from “table1”

“newTable” is created, and Snowflake internally executes a pipe to copy all the data from
“table1”

Overall explanation

Zero-Copy cloning does NOT duplicate data; it duplicates the metadata of the micro-partitions. When you modify some cloned data, it will consume storage.

SET 4
Question 1Skipped

Which of the following describes how clustering keys work in Snowflake?

Clustering keys establish a hashed key on each node of a virtual warehouse to optimize
joins at run-time.

Clustering keys create a distributed, parallel data structure of pointers to a table's rows
and columns.

Clustering keys update the micro-partitions in place with a full sort, and impact the DML
operations.

Correct answer

Clustering keys sort the designated columns over time, without blocking DML operations.

Overall explanation

See this link.

Question 2Skipped

What is the maximum time that Snowflake can run a query?

1 day.

2 days.

Correct answer

7 days.

Unlimited.

Overall explanation

The default value of the parameter “STATEMENT_TIMEOUT_IN_SECONDS” is 172800 seconds, which is two days, but the maximum time we can configure is seven days.

Question 3Skipped

A user has 10 files in a stage containing new customer data. The ingest operation completes
with no errors, using the following command:

COPY INTO my_table FROM @my_stage;

The next day the user adds 10 files to the stage so that now the stage contains a mixture of
new customer data and updates to the previous data. The user did not remove the 10
original files.

If the user runs the same COPY INTO command what will happen?

Correct answer

All data from only the newly-added files will be appended to the table.

All data from all of the files on the stage will be appended to the table.

Only data about new customers from the new files will be appended to the table.

The operation will fail with the error UNCERTAIN FILES IN STAGE.

Overall explanation

The COPY command maintains historic load metadata for the target table (for 64 days), so the 10 files loaded on day 1 will not be loaded again.

See this link.

Question 4Skipped

A view is defined on a permanent table. A temporary table with the same name is created in
the same schema as the referenced table.

What will the query from the view return?

An error stating that the referenced object could not be uniquely identified.

Correct answer

The data from the temporary table.

The data from the permanent table.

An error stating that the view could not be compiled.

Overall explanation

Similar to the other table types (transient and permanent), temporary tables belong to a
specified database and schema; however, because they are session-based, they aren’t
bound by the same uniqueness requirements. This means you can create temporary and
non-temporary tables with the same name within the same schema.

However, note that the temporary table takes precedence in the session over any other
table with the same name in the same schema.

See this link.

Question 5Skipped

What tasks can be completed using the COPY command? (Choose two.)

Correct selection

Columns can be reordered.

Correct selection

Columns can be omitted.

Columns can be aggregated.

Data can be loaded without the need to spin up a virtual warehouse.

Columns can be joined with an existing table.

Overall explanation

Columns can be reordered. Columns can be omitted.

See this link.

Question 6Skipped

What does the orange bar on an operator represent when reviewing the Query Profile?

The fraction of data scanned from cache versus remote disk for the operator.

Correct answer

The fraction of time that this operator consumed within the query step.

A measure of progress of the operator's execution.

The cost of the operator in terms of the virtual warehouse CPU utilization.

Overall explanation

Fraction of time that this operator consumed within the query step (e.g. 25% for Aggregate
[5]). This information is also reflected in the orange bar at the bottom of the operator node,
allowing for easy visual identification of performance-critical operators.

See this link.

Question 7Skipped

A user unloaded a Snowflake table called mytable to an internal stage called mystage.

Which command can be used to view the list of files that has been uploaded to the stage?

Correct answer

list @mystage;

list @%mystage;

list @%mytable;

list @mytable;

Overall explanation

See this link.

Question 8Skipped

Which command is used to unload data from a Snowflake table to an external stage?

COPY INTO followed by PUT

GET

Correct answer

COPY INTO

COPY INTO followed by GET

Overall explanation

Use the COPY INTO <location> command to copy the data from the Snowflake database
table into one or more files in a Snowflake stage.

See this link.

Question 9Skipped

What action should be taken if a Snowflake user wants to share a newly created object in a
database with consumers?

Correct answer

Use the GRANT privilege ... TO SHARE command to grant the necessary privileges.

Drop the object and then re-add it to the database to trigger sharing.

Use the automatic sharing feature for seamless access.

Recreate the object with a different name in the database before sharing.

Overall explanation

GRANT <privilege> … TO SHARE

Grants access privileges for databases and other supported database objects (schemas,
UDFs, tables, and views) to a share. Granting privileges on these objects effectively adds the
objects to the share, which can then be shared with one or more consumer accounts.

See this link.

And see this link.

Question 10Skipped

How many credits will consume a medium-size warehouse with 2 clusters running in
maximized mode for 3 hours?

8.

32.

16.

Correct answer

24.

Overall explanation

A medium size warehouse with one cluster consumes four credits per hour. As we have two
clusters, it will consume eight credits per hour. In three hours, it will consume 24 credits.
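The arithmetic above can be checked with a quick sketch. The size-to-credits mapping reflects Snowflake's published per-hour rates for a single cluster; the function name is illustrative:

```python
# Credits per hour for a single cluster, by warehouse size.
CREDITS_PER_HOUR = {"XS": 1, "S": 2, "M": 4, "L": 8, "XL": 16}

def credits_consumed(size, clusters, hours):
    """Total credits for a multi-cluster warehouse running in
    maximized mode (all clusters running for the whole period)."""
    return CREDITS_PER_HOUR[size] * clusters * hours

print(credits_consumed("M", clusters=2, hours=3))  # -> 24
```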

Question 11Skipped

The COPY INTO command can unload data from a table directly into which locations?
(Choose two.)

Correct selection

A named internal stage

A local directory or folder on a client machine

A network share on a client machine

Correct selection

A named external stage that references an external cloud location

A Snowpipe REST endpoint

Overall explanation

See this link.

Question 12Skipped

To which entity do we grant privileges?

Account.

Correct answer

Roles.

Groups.

Users.

Overall explanation

Privileges are granted to roles, and roles are granted to users.

Question 13Skipped

We want to generate a JSON object with the data from a table called users_table, composed
of two columns (AGE and NAME), ordered by the name column. How can we do it?

1. SELECT to_object(*) as users_object

2. FROM users_table

3. order by users_object[‘NAME’];

1. SELECT object_deconstruct(*) as users_object

2. FROM users_table

3. order by users_object[‘NAME’];

1. SELECT to_json_object(*) as users_object

2. FROM users_table

3. order by users_object[‘NAME’];

Correct answer

1. SELECT object_construct(*) as users_object

2. FROM users_table

3. order by users_object[‘NAME’];

Overall explanation

The OBJECT_CONSTRUCT function returns an OBJECT constructed from the arguments. In the following example (left), the arguments come from the table, whereas in the second (right), we send the arguments to the function.
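As a rough analogy (not Snowflake SQL), OBJECT_CONSTRUCT pairs up alternating key/value arguments into a single object, similar to this Python sketch with a hypothetical helper of the same name:

```python
import json

def object_construct(*args):
    """Mimic Snowflake's OBJECT_CONSTRUCT: pair up alternating
    key/value arguments into a single key-value object."""
    if len(args) % 2 != 0:
        raise ValueError("expected alternating key/value arguments")
    return dict(zip(args[::2], args[1::2]))

obj = object_construct("id", 1, "first_name", "Snowflake", "last_name", "User")
print(json.dumps(obj))  # -> {"id": 1, "first_name": "Snowflake", "last_name": "User"}
```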

Question 14Skipped

There are two Snowflake accounts in the same cloud provider region: one is production and
the other is non-production.

How can data be easily transferred from the production account to the non-production
account?

Create a reader account using the production account and link the reader account to the
non-production account.

Clone the data from the production account to the non-production account.

Create a subscription in the production account and have it publish to the non-production
account.

Correct answer

Create a data share from the production account to the non-production account.

Overall explanation

Zero-Copy Clone would be the Snowflake tool to cover this case, but the question refers to transferring data between two accounts, even if they are in the same cloud provider region. It is not possible to use Zero-Copy Clone between different accounts, so we have to use Data Sharing.

Question 15Skipped

Which permission on a Snowflake virtual warehouse allows the role to resize the
warehouse?

ALTER

USAGE

Correct answer

MODIFY

MONITOR

Overall explanation

See this link.

Question 16Skipped

Which of the following roles are recommended to create and manage users and roles?
(Choose two.

SYSADMIN

Correct selection

SECURITYADMIN

Correct selection

USERADMIN

ACCOUNTADMIN

PUBLIC

Overall explanation

Correct Answer : SECURITYADMIN and USERADMIN

See this link.

Question 17Skipped

Snowflake’s access control framework combines which models for securing data? (Choose
two.)

Access Control List (ACL)

Attribute-based Access Control (ABAC)

Correct selection

Role-based Access Control (RBAC)

Rule-based Access Control (RuBAC)

Correct selection

Discretionary Access Control (DAC)

Overall explanation

Snowflake’s approach to access control combines aspects from both of the following models:

• Discretionary Access Control (DAC): Each object has an owner, who can in turn grant
access to that object.

• Role-based Access Control (RBAC): Access privileges are assigned to roles, which are
in turn assigned to users.

See this link.

Question 18Skipped

Which command can we use to query the table <my_table> as it was 15 minutes ago?

1. SELECT *

2. FROM my_table

3. AT(offset => 15);

Correct answer

1. SELECT *

2. FROM my_table

3. AT(offset => -60*15);

1. SELECT *

2. FROM my_table

3. AT(offset => -3600*15);

We should’ve created a backup of that table. Otherwise, we are not able to do it.

Overall explanation

Thanks to the Time Travel functionality, it’s possible to query a table as it was some time
ago. We need to put the time in seconds as the offset parameter, in this case, 15 minutes *
60. We’ll add the “-” symbol as we are querying in the past.
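The offset computation can be sketched as a tiny helper that builds the AT clause (the helper names are illustrative, not Snowflake API):

```python
def time_travel_offset(minutes_ago):
    """Return the negative offset in seconds used by Time Travel's
    AT(offset => ...) clause to look a given number of minutes back."""
    return -60 * minutes_ago

def at_clause(minutes_ago):
    """Build the AT clause text for a query N minutes in the past."""
    return f"AT(offset => {time_travel_offset(minutes_ago)})"

print(at_clause(15))  # -> AT(offset => -900)
```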

Question 19Skipped

The time-travel retention period of a table is configured to be ten days. You now increase the
retention period to 20 days. What will happen with the table's data after this increment?
(Choose two.)

Any data that is between 10 and 20 days older will have time-travel extended.

Correct selection

Any data which has not reached the ten days time-travel period will now have time-travel
extended for 20 days.

Correct selection

Any data that is ten days older and moved to fail-safe will not have any impact.

Changes will impact only new data.

Overall explanation

Increasing the time-travel retention period impacts the new data and the data that hasn't
reached the time-travel period. In this case, the new data and the data that hasn't reached
the ten days will be extended to 20 days.

Question 20Skipped

When creating a table using the command:

1. CREATE TABLE MY_TABLE

2. (NAME STRING(100));

What would the command "DESC TABLE MY_TABLE;" display as the column type?

Char.

String.

Correct answer

Varchar.

Text.

Overall explanation

Varchar has different synonyms, like STRING , TEXT , NVARCHAR , CHAR , CHARACTER…, but
in the end, they are all VARCHAR type when describing the table. Take a look at the following
example, where all the column types are VARCHAR:

Question 21Skipped

What is the maximum number of clusters in a multi-cluster warehouse?

64.

100.

Unlimited, there is not a maximum number of clusters.

Correct answer

10.

Overall explanation

To define a multi-cluster warehouse, the maximum number of clusters has to be greater than 1 (up to 10). Also, the minimum number of clusters must be equal to or less than the maximum.

Question 22Skipped

Which file function generates a Snowflake-hosted file URL to a staged file using the stage
name and relative file path as inputs?

GET_RELATIVE_PATH

GET_STAGE_LOCATION

GET_ABSOLUTE_PATH

Correct answer

BUILD_STAGE_FILE_URL

Overall explanation

BUILD_STAGE_FILE_URL generates a Snowflake-hosted file URL to a staged file using the stage name and relative file path as inputs. A file URL permits prolonged access to a specified file. That is, the file URL does not expire.

See this link.

Question 23Skipped

What is the impact on queries that are being executed when a resource monitor set to the
“Notify & Suspend” threshold level is exceeded?

Correct answer

All statements being executed are completed.

All statements being executed are queued.

All statements being executed are cancelled.

All statements being executed are restarted.

Overall explanation

Notify & Suspend

Send a notification (to all account administrators with notifications enabled) and suspend all
assigned warehouses after all statements being executed by the warehouse(s) have
completed.

Notify & Suspend Immediately

Send a notification (to all account administrators with notifications enabled) and suspend all
assigned warehouses immediately, which cancels any statements being executed by the
warehouses at the time.

See this link.

Question 24Skipped

What activities can a user with the ORGADMIN role perform? (Choose two.)

Create INFORMATION_SCHEMA in a database.

Enable database cloning for an account in the organization.

Correct selection

Enable database replication for an account in the organization.

Correct selection

View usage information for all accounts in the organization.

View micro-partition information for all accounts in the organization.

Overall explanation

See this link.

Question 25Skipped

Which types of charts does Snowsight support? (Choose two.)

Column charts

Correct selection

Bar charts

Correct selection

Scorecards

Area charts

Radar charts

Overall explanation

Snowsight supports the following types of charts:

• Bar charts.

• Line charts.

• Scatterplots.

• Heat grids.

• Scorecards.

See this link.

Question 26Skipped

Which parameter allows us to schedule the task_1 to run every day with a CRON expression?

Correct answer

SET SCHEDULE

SET FIXED_TIME

SET INITIALIZATION

SET CRON

Overall explanation

An example can be:

ALTER TASK TASK_1 SET SCHEDULE = 'USING CRON */3 * * * * UTC';

Question 27Skipped

When unloading data to an external stage, which compression format can be used for
Parquet files with the COPY INTO command?

BROTLI

GZIP

Correct answer

LZO

ZSTD

Overall explanation

See this link.

Question 28Skipped

How can a user change which columns are referenced in a view?

Correct answer

Recreate the view with the required changes

Materialize the view to perform the changes

Modify the columns in the underlying table

Use the ALTER VIEW command to update the view

Overall explanation

Currently the only supported operations are:

Renaming a view.

Converting to (or reverting from) a secure view.

Adding, overwriting, removing a comment for a view.

Note that you cannot use this command to change the definition for a view. To change the
view definition, you must drop the view and then recreate it.
https://docs.snowflake.com/en/sql-reference/sql/alter-view.html#alter-view

Question 29Skipped

Which parameter can be used to instruct a COPY command to verify data files instead of
loading them into a specified table?

SKIP_BYTE_ORDER_MARK

STRIP_NULL_VALUES

Correct answer

VALIDATION_MODE

REPLACE_INVALID_CHARACTERS

Overall explanation

VALIDATION_MODE: This instructs the command to validate the data files instead of loading
them into target tables and allows you to perform the dry run to ensure the fail-safe delivery
of data.
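For example, a dry run that returns any conversion errors instead of loading the data (the stage and table names are hypothetical):

```sql
-- Validate the staged files; no rows are loaded into MY_TABLE
COPY INTO MY_TABLE
  FROM @my_stage
  VALIDATION_MODE = 'RETURN_ERRORS';
```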

Question 30Skipped

Which Snowflake table is an implicit object layered on a stage, where the stage can be either
internal or external?

A table with a materialized view

Transient table

Temporary table

Correct answer

Directory table

Overall explanation

A directory table is a Snowflake table that is an implicit object layered on a stage, where the
stage can be either internal or external. A directory table allows querying the metadata and
contents of the files in the stage using standard SQL statements.

See this link.
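A directory table is enabled on a stage rather than created as a standalone object; a minimal sketch (the stage name is hypothetical):

```sql
-- Create an internal stage with a directory table enabled
CREATE STAGE my_stage DIRECTORY = (ENABLE = TRUE);

-- Query the file metadata tracked by the directory table
SELECT * FROM DIRECTORY(@my_stage);
```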

Question 31Skipped

Which command will use warehouse credits?

Correct answer

SELECT MAX(ID)
FROM MYTABLE
GROUP BY ID

SELECT COUNT(*)
FROM MYTABLE

SELECT MIN(ID)
FROM MYTABLE

SELECT MAX(ID)
FROM MYTABLE

Overall explanation

You can test it by checking the query profile of each query. The simple MIN, MAX, and COUNT queries are answered from the metadata cache, whereas the query with the GROUP BY must scan the table, so it consumes warehouse credits.

Question 32Skipped

What actions are supported by Snowflake resource monitors? (Choose two.)

Alert

Suspend immediately

Correct selection

Notify

Abort

Correct selection

Notify and suspend

Overall explanation

Resource monitors support the following actions:

Notify & Suspend Send a notification (to all account administrators with notifications
enabled) and suspend all assigned warehouses after all statements being executed by the
warehouse(s) have completed.

Notify & Suspend Immediately Send a notification (to all account administrators with
notifications enabled) and suspend all assigned warehouses immediately, which cancels any
statements being executed by the warehouses at the time.

Notify Perform no action, but send an alert notification (to all account administrators with
notifications enabled).
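These actions map to the TRIGGERS clause of CREATE RESOURCE MONITOR; a sketch with hypothetical names and thresholds:

```sql
CREATE RESOURCE MONITOR my_monitor WITH CREDIT_QUOTA = 100
  TRIGGERS ON 75  PERCENT DO NOTIFY             -- Notify
           ON 100 PERCENT DO SUSPEND            -- Notify & Suspend
           ON 110 PERCENT DO SUSPEND_IMMEDIATE; -- Notify & Suspend Immediately

-- Assign the monitor to a warehouse
ALTER WAREHOUSE my_wh SET RESOURCE_MONITOR = my_monitor;
```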

Question 33Skipped

Which of the following operations require the use of a running virtual warehouse? (Choose
two.)

Correct selection

Querying data from a materialized view

Correct selection

Executing a stored procedure

Altering a table

Downloading data from an internal stage

Listing files in a stage

Overall explanation

See this link.

Question 34Skipped

Which Snowflake object stores DML change made to tables and metadata about each
change?

Account Streams.

Pipes.

Tables.

Correct answer

Table Streams.

Overall explanation

Streams are Snowflake objects that record data manipulation language (DML) changes made
to tables and views, including INSERTS, UPDATES, and DELETES, as well as metadata about
each change. A stream can also be referred to as a “table stream”.

Question 35Skipped

Which function can be used with the COPY INTO statement to convert rows from a relational
table to a single VARIANT column, and to unload rows into a JSON file?

TO_VARIANT

OBJECT_AS

Correct answer

OBJECT_CONSTRUCT

FLATTEN

Overall explanation

You can use the OBJECT_CONSTRUCT function combined with the COPY command to
convert the rows in a relational table to a single VARIANT column and unload the rows into a
file.

See this link.
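A sketch of the pattern (the table, stage, and column names are hypothetical):

```sql
-- Each row becomes one JSON object in the unloaded file
COPY INTO @my_stage/out/
  FROM (SELECT OBJECT_CONSTRUCT('id', id, 'name', name) FROM my_table)
  FILE_FORMAT = (TYPE = JSON);
```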

Question 36Skipped

Which DDL/DML operation is allowed on an inbound data share?

INSERT INTO

ALTER TABLE

Correct answer

SELECT

MERGE

Overall explanation

Shared databases are read-only. Users in a consumer account can view/query data, but
cannot insert or update data, or create any objects in the database.

See this link.

Question 37Skipped

What is a characteristic of the maintenance of a materialized view?

An additional set of scripts is needed to refresh data in materialized views.

A materialized view can be set up with the auto-refresh feature using the SQL SET
command.

Materialized views cannot be refreshed automatically.

Correct answer

A materialized view is automatically refreshed by a Snowflake managed warehouse.

Overall explanation

Materialized views are automatically and transparently maintained by Snowflake. A background service updates the materialized view after changes are made to the base table.

This is more efficient and less error-prone than manually maintaining the equivalent of a
materialized view at the application level.

See this link.

Question 38Skipped

A PUT command can be used to stage local files from which Snowflake interface?

Snowsight

Snowflake classic web interface (UI)

Correct answer

SnowSQL

.NET driver

Overall explanation

PUT command usage: the command cannot be executed from the Worksheets page in the
Snowflake web interface; instead, use the SnowSQL client to upload data files, or check the
documentation for the specific Snowflake client to verify support for this command.
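From a SnowSQL session, the command looks like this (the local path and stage name are hypothetical):

```sql
-- Upload a local file to a named internal stage (gzip-compressed by default)
PUT file:///tmp/data.csv @my_stage AUTO_COMPRESS = TRUE;
```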

Question 39Skipped

What are potential impacts of storing non-native values like dates and timestamps in a
VARIANT column in Snowflake?

Correct answer

Slower query performance and increased storage consumption

Slower query performance and decreased storage consumption

Faster query performance and decreased storage consumption

Faster query performance and increased storage consumption

Overall explanation

For non-native data (such as dates and timestamps), the values are stored as strings when
loaded into a VARIANT column. Therefore, operations on these values could be slower and
also consume more space than when stored in a relational column with the corresponding
data type.

See this link.

Question 40Skipped

What statements are true about the Snowflake History Page? (Choose two.)

You can see the results of other users.

Correct selection

You cannot see the results of other users.

The History page allows you to view and drill into the details of all queries executed in the
last 24 days.

You cannot see the queries of other users.

Correct selection

The History page allows you to view and drill into the details of all queries executed in the
last 14 days.

Overall explanation

You can view results only for queries you have executed. You can also see other users’
queries but cannot see their results. If you have privileges to view queries executed by
another user, the Query Detail page displays the details for the query, but it won’t show the
actual query result.

However, if you have the same role, perform the same query, and the data has not changed,
you’ll use the Query Result Cache and get the same result. In this example, you can see the
execution of a query that uses the Query Results Cache, only spending 141ms to be
executed.

Question 41Skipped

Which of these commands require a running warehouse?

SELECT MAX(AGE)
FROM USERS_TABLE;

Correct answer

SELECT *
FROM USERS_TABLE
WHERE email='[email protected]';

EXPLAIN USING TABULAR
SELECT *
FROM USERS_TABLE
WHERE email='[email protected]';

SELECT COUNT(*)
FROM USERS_TABLE;

Overall explanation

The filtered SELECT * must scan table data, so it needs a running warehouse, whereas the other commands don't: the MAX and COUNT queries are answered from the metadata cache, and EXPLAIN does not execute the query at all. This is an excellent example of the EXPLAIN command, which returns the logical execution plan for the specified SQL statement. An explain plan shows the operations (for example, table scans and joins) that Snowflake would perform to execute the query.

For example, running the previous command in my Snowflake account, I generated this
result in 101ms:

Although EXPLAIN does not consume any compute credits, the compilation of the query
does consume Cloud Service credits, just as other metadata operations do. The output is the
same as the output of the command EXPLAIN_JSON.

Question 42Skipped

What compute resource is used when loading data using Snowpipe?

Snowpipe uses cloud platform compute resources provided by the user.

Correct answer

Snowpipe uses compute resources provided by Snowflake.

Snowpipe uses an Apache Kafka server for its compute resources.

Snowpipe uses virtual warehouses provided by the user.

Overall explanation

Snowpipe uses compute resources provided by Snowflake (i.e. a serverless compute model).

See this link.

Question 43Skipped

Which Snowflake partner category is represented at the top of this diagram (labeled 1)?

Machine Learning and Data Science

Business Intelligence

Correct answer

Data Integration

Security and Governance

Overall explanation

See this link.

Question 44Skipped

Which is the MINIMUM required Snowflake edition that a user must have if they want to use
AWS/Azure Privatelink or Google Cloud Private Service Connect?

Enterprise

Standard

Correct answer

Business Critical

Premium

Overall explanation

This feature requires Business Critical (or higher).

See this link.

Question 45Skipped

What is the purpose of enabling Federated Authentication on a Snowflake account?

Correct answer

Allows users to connect using secure single sign-on (SSO) through an external identity
provider

Allows dual Multi-Factor Authentication (MFA) when connecting to Snowflake

Forces users to connect through a secure network proxy

Disables the ability to use key pair and basic authentication (e.g., username/password)
when connecting

Overall explanation

In a federated environment, user authentication is separated from user access through the
use of one or more external entities that provide independent authentication of user
credentials. The authentication is then passed to one or more services, enabling users to
access the services through SSO.

See this link.

Question 46Skipped

What table functions in the Snowflake Information Schema can be queried to retrieve
information about directory tables? (Choose two.)

Correct selection

AUTO_REFRESH_REGISTRATION_HISTORY

EXTERNAL_TABLE_FILE_REGISTRATION_HISTORY

EXTERNAL_TABLE_FILES

MATERIALIZED_VIEW_REFRESH_HISTORY

Correct selection

STAGE_DIRECTORY_FILE_REGISTRATION_HISTORY

Overall explanation

AUTO_REFRESH_REGISTRATION_HISTORY: Retrieve the history of data files registered in the metadata of specified objects and the credits billed for these operations.

STAGE_DIRECTORY_FILE_REGISTRATION_HISTORY: Retrieve information about the metadata history for a directory table, including any errors found when refreshing the metadata.

See this link.

Question 47Skipped

What happens when a suspended virtual warehouse is resized in Snowflake?

It will return an error.

Correct answer

The additional compute resources are provisioned when the warehouse is resumed.

It will return a warning

The suspended warehouse is resumed and new compute resources are provisioned
immediately.

Overall explanation

Resizing a suspended warehouse does not provision any new compute resources for the
warehouse. It simply instructs Snowflake to provision the additional compute resources
when the warehouse is next resumed, at which time all the usage and credit rules associated
with starting a warehouse apply.

See this link.
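Resizing uses the same statement whether the warehouse is running or suspended; a sketch (the warehouse name is hypothetical):

```sql
-- Succeeds even while MY_WH is suspended; the new size takes
-- effect when the warehouse is next resumed
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = 'XLARGE';
```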

Question 48Skipped

When data is loaded into Snowflake, what formats does Snowflake use internally to store
the data in cloud storage? (Choose two.)

Correct selection

Compressed

Key-value

Correct selection

Columnar

Document

Graph

Overall explanation

When data is loaded into Snowflake, Snowflake reorganizes that data into its internal
optimized, compressed, columnar format.

Typical question about how Snowflake stores information (key concept).

See this link.

Question 49Skipped

Which pipes are cloned when cloning a database or schema?

Correct answer

Pipes that reference external stages.

Both.

Pipes cannot be cloned.

Pipes that reference internal stages.

Overall explanation

Internal named stages are NEVER cloned, so pipes that reference internal stages are not
cloned.

Question 50Skipped

After how many hours does Snowflake cancel our running SQL statement by default?

10 hours.

1 hour.

24 hours.

Correct answer

48 hours.

Overall explanation

The default value of the STATEMENT_TIMEOUT_IN_SECONDS parameter is 172800 seconds, which is 48 hours. A query that runs indefinitely can be dangerous, so Snowflake cancels it after 48 hours.
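The timeout can be inspected and overridden at several levels; a sketch:

```sql
-- Show the current account default (172800 seconds = 48 hours)
SHOW PARAMETERS LIKE 'STATEMENT_TIMEOUT_IN_SECONDS' IN ACCOUNT;

-- Cancel any statement in this session that runs longer than 1 hour
ALTER SESSION SET STATEMENT_TIMEOUT_IN_SECONDS = 3600;
```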

Question 51Skipped

Which functionality is not provided by the query profile?

Details and statistics for the overall query.

Correct answer

Hints for improving the query performance.

Statistics for each component of the query.

Graphical representation of the main components of the processing plan of the query.

Overall explanation

You can see the query profiler in the following picture. We can see the graphical
representation of the components and some statistics for the overall query and for each
component of the query. Still, unfortunately, there are no hints to improve it, so we need to
become good Snowflake developers to spot bottlenecks by ourselves!

Question 52Skipped

Which role in Snowflake allows a user to enable replication for multiple accounts?

SYSADMIN

Correct answer

ORGADMIN

SECURITYADMIN

ACCOUNTADMIN

Overall explanation

See this link.

Question 53Skipped

Snowflake’s hierarchical key mode includes which keys? (Choose two.)

Schema master keys

Correct selection

Account master keys

Secure view keys

Database master keys

Correct selection

File keys

Overall explanation

Snowflake’s hierarchical key model consists of four levels of keys: the root key, account
master keys, table master keys, and file keys. Each account master key corresponds to one
customer account in Snowflake. Each table master key corresponds to one database table in
a database.

See this link.

Question 54Skipped

When is the result set cache no longer available? (Choose two.)

Correct selection

When the underlying data has changed

When another user executes the query

When another warehouse is used to execute the query

When the warehouse used to execute the query is suspended

Correct selection

When it has been 24 hours since the last query

Overall explanation

See this link.

Question 55Skipped

The bulk data load history that is available upon completion of the COPY statement is stored
where and for how long?

Correct answer

In the metadata of the target table for 64 days

In the metadata of the target table for 14 days

In the metadata of the pipe for 64 days

In the metadata of the pipe for 14 days

Overall explanation

Bulk data load

Stored in the metadata of the target table for 64 days. Available upon completion of the
COPY statement as the statement output.

See this link.

Question 56Skipped

In a managed access schema, who can grant privileges on objects in the schema to other
roles? (Choose two.)

The USERADMIN system role

The ORGADMIN system role

Correct selection

The role with the MANAGE GRANTS privilege

Correct selection

The schema owner role

The role that owns the object in the schema

Overall explanation

In managed access schemas, object owners lose the ability to make grant decisions. Only the
schema owner or a role with the MANAGE GRANTS privilege can grant privileges on objects
in the schema, including future grants, centralizing privilege management.

See this link.

Question 57Skipped

What is the maximum length of time travel available in the Snowflake Standard Edition?

30 Days

7 Days

Correct answer

1 Day

90 Days

Overall explanation

Standard Edition supports a maximum of 1 day of Time Travel; up to 90 days is available in Enterprise Edition or higher.

Question 58Skipped

Which type of charts are supported by Snowsight? (Choose two.)

Gantt charts

Correct selection

Line charts

Pie charts

Flowcharts

Correct selection

Scatterplots

Overall explanation

Snowsight supports the following types of charts:

• Bar charts

• Line charts

• Scatterplots

• Heat grids

• Scorecards

See this link.

Question 59Skipped

The owner of a task (the one who has the OWNERSHIP privilege) is deleted. What will
happen to the task?

Correct answer

The task will belong to the role that dropped the owner’s role.

The task will belong to the ACCOUNTADMIN role.

The task is deleted.

The task is suspended.

Overall explanation

When the owner of a task is deleted, the task is "re-possessed" to the role that dropped the
owner's role. This ensures that ownership moves to a role closer to the role hierarchy's root.
In this case, the task will have to be resumed explicitly by the new owner, as it's
automatically paused.

Question 60Skipped

Which of the following Snowflake capabilities are available in all Snowflake editions?
(Choose two.)

Column-level security to apply data masking policies to tables and views

Correct selection

Object-level access control

Correct selection

Automatic encryption of all data

Up to 90 days of data recovery through Time Travel

Customer-managed encryption keys through Tri-Secret Secure

Overall explanation

See this link.

Question 61Skipped

Which statement describes pruning?

The return of micro-partitions values that overlap with each other to reduce a query's
runtime.

The ability to allow the result of a query to be accessed as if it were a table.

A service that is handled by the Snowflake Cloud Services layer to optimize caching.

Correct answer

The filtering or disregarding of micro-partitions that are not needed to return a query.

Overall explanation

See this link.

Question 62Skipped

If all virtual warehouse resources are maximized while processing a query workload, what
will happen to new queries that are submitted to the warehouse?

The warehouse will move to a suspended state.

Correct answer

New queries will be queued and executed when capacity is available.

The warehouse will scale out automatically

All queries will terminate when the resources are maximized.

Overall explanation

The keyword here is "maximized": a multi-cluster warehouse in Maximized mode (MIN_CLUSTER_COUNT = MAX_CLUSTER_COUNT) is already running all of its clusters, so it cannot scale out further and new queries are queued until capacity becomes available.

Question 63Skipped

Which of the following file formats is not supported by Snowflake to unload data?

Correct answer

Avro.

Parquet.

CSV.

JSON.

Overall explanation

A File Format object describes and stores the format information required to load data into
Snowflake tables. You can specify different parameters, such as the file’s delimiter, if you
want to skip the header or not, etc. You can see the different file formats in the following
image:

Question 64Skipped

Which commands should be used to grant the privilege allowing a role to select data from all
current tables and any tables that will be created later in a schema? (Choose two.)

grant SELECT on future tables in database DB1 to role MYROLE;

Correct selection

grant SELECT on all tables in schema DB1.SCHEMA to role MYROLE;

grant USAGE on future tables in schema DB1.SCHEMA to role MYROLE;

Correct selection

grant SELECT on future tables in schema DB1.SCHEMA to role MYROLE;

grant SELECT on all tables in database DB1 to role MYROLE;

grant USAGE on all tables in schema DB1.SCHEMA to role MYROLE;

Overall explanation

See this link.

Question 65Skipped

What is the COPY INTO command option default for unloading data into multiple files?

SINGLE = NULL

SINGLE = TRUE

SINGLE = 0

Correct answer

SINGLE = FALSE

Overall explanation

The COPY INTO <location> command provides a copy option (SINGLE) for unloading data into
a single file or multiple files. The default is SINGLE = FALSE (i.e. unload into multiple files).

See this link.
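A sketch of both behaviors (the stage and table names are hypothetical):

```sql
-- Default (SINGLE = FALSE): unload into multiple files,
-- optionally capping the size of each one
COPY INTO @my_stage/unload/ FROM my_table MAX_FILE_SIZE = 32000000;

-- Force a single output file instead
COPY INTO @my_stage/unload/ FROM my_table SINGLE = TRUE;
```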

Question 66Skipped

Using variables in Snowflake is denoted by using which SQL character?

&

Correct answer

$

Overall explanation

See this link.
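Session variables are set with SET and referenced with the $ prefix; a minimal sketch (the table and column names are hypothetical):

```sql
SET min_age = 21;

-- Reference the variable with the $ character
SELECT * FROM users WHERE age >= $min_age;
```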

Question 67Skipped

Which privileges does a role need to clone a table? (Choose two.)

CREATE privilege on the source table.

Correct selection

SELECT privilege on the source table.

SHARE privilege on the database of the source table.

SELECT privilege on the database of the source table.

Correct selection

USAGE privilege on the schema of the source table.

Overall explanation

Your current role must have SELECT privilege on the source table to create a clone. In
addition, to clone a schema or an object within a schema, you’d need privileges on the
container object(s) for both the source and the clone. You’d need the OWNERSHIP privilege
for pipes, streams, and tasks.

You can see the necessary privileges to create a clone in the following table:

Question 68Skipped

Who can activate a network policy for users in a Snowflake account? (Choose two.)

PUBLIC

Correct selection

Any role that has the global ATTACH POLICY privilege

Correct selection

ACCOUNTADMIN

USERADMIN

SYSADMIN

Overall explanation

Only security administrators (i.e. users with the SECURITYADMIN role) or higher or a role
with the global ATTACH POLICY privilege can activate a network policy for an account. Once
the policy is associated with your account, Snowflake restricts access to your account based
on the allowed list and blocked list.

See this link.
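A sketch of creating and activating a policy at the account level (the policy name and IP range are hypothetical):

```sql
CREATE NETWORK POLICY my_policy ALLOWED_IP_LIST = ('192.168.1.0/24');

-- Requires SECURITYADMIN (or higher) or the global ATTACH POLICY privilege
ALTER ACCOUNT SET NETWORK_POLICY = my_policy;
```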

Question 69Skipped

Which Snowflake tool is recommended for data batch processing?

The Snowflake API

Snowsight

Correct answer

SnowSQL

SnowCD

Overall explanation

SnowSQL is the command line client for connecting to Snowflake to execute SQL queries and
perform all DDL and DML operations, including loading data into and unloading data out of
database tables.

See this link.

Snowsight is the Snowflake web interface.

SnowCD (i.e. Snowflake Connectivity Diagnostic Tool) helps users to diagnose and
troubleshoot their network connection to Snowflake.

The Snowflake SQL API is a REST API that you can use to access and update data in a
Snowflake database.

Question 70Skipped

Which programming languages are supported for Snowflake User-Defined Functions (UDFs)?
(Choose two.)

Correct selection

JavaScript

TypeScript

C#

Correct selection

Python

PHP

Overall explanation

See this link.

Question 71Skipped

A JSON file that contains lots of dates and arrays needs to be processed in Snowflake. The
user wants to ensure optimal performance while querying the data.

How can this be achieved?

Store the data in a table with a VARIANT data type. Query the table.

Store the data in a table with a VARIANT data type and include STRIP_NULL_VALUES while
loading the table. Query the table.

Store the data in an external stage and create views on top of it. Query the views.

Correct answer

Flatten the data and store it in structured data types in a flattened table. Query the table.

Overall explanation

For better pruning and less storage consumption, we recommend flattening your OBJECT
and key data into separate relational columns if your semi-structured data includes:

• Dates and timestamps, especially non-ISO 8601 dates and timestamps, as string
values

• Numbers within strings

• Arrays

Non-native values (such as dates and timestamps in JSON) are stored as strings when loaded
into a VARIANT column, so operations on these values could be slower and also consume
more space than when stored in a relational column with the corresponding data type.

See this link.
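A sketch of flattening raw JSON into typed relational columns (all names here are hypothetical):

```sql
-- Cast each extracted value to its native type instead of leaving
-- dates and numbers as strings inside a VARIANT column
CREATE TABLE events_flat AS
SELECT
    raw:id::NUMBER            AS id,
    raw:created_at::TIMESTAMP AS created_at,
    f.value::STRING           AS tag
FROM events_raw,
     LATERAL FLATTEN(input => raw:tags) f;
```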

Question 72Skipped

What tasks can an account administrator perform in the Data Exchange? (Choose two.)

Transfer listing ownership.

Delete data categories.

Correct selection

Approve and deny listing approval requests.

Transfer ownership of a provider profile.

Correct selection

Add and remove members.

Overall explanation

By default, only an account administrator (a user with the ACCOUNTADMIN role) in the Data
Exchange administrator account can manage a Data Exchange, which includes the following
tasks:

• Add or remove members.

• Approve or deny listing approval requests.

• Approve or deny provider profile approval requests.

• Show categories.

See this link.

Question 73Skipped

A task is still being executed before the next scheduled task. What is going to happen with
the new scheduled task?

Snowflake will abort it.

Snowflake will execute it.

Snowflake will wait for the previous task to complete.

Correct answer

Snowflake will skip it.

Overall explanation

Snowflake ensures that only one instance of a task with a schedule is executed at a given
time. If a task is still running when the next scheduled execution time occurs, then that
scheduled time is skipped.

Question 74Skipped

Which of these Snowflake components/objects is NOT typically used in building continuous ELT pipelines?

Streams.

Snowflake Connector for Kafka.

Correct answer

Data Exchange.

Snowpipe.

Overall explanation

Data Exchange is your own data hub for securely collaborating around data between a
selected group of members you invite. It enables providers to publish data that consumers
can then discover. You can use your Data Exchange to exchange data between business units
internal to your company and collaborate with external parties such as vendors, suppliers,
partners, and customers.

Question 75Skipped

Which of the following indicates that it may be appropriate to use a clustering key for a
table? (Choose two.)

Correct selection

The clustering depth for the table is large.

Correct selection

Queries on the table are running slower than expected.

The table contains a column that has very low cardinality.

The table has a small number of micro-partitions.

DML statements that are being issued against the table are blocked.

Overall explanation

Queries on the table are running slower than expected or have noticeably degraded over
time.

The clustering depth for the table is large.

See this link.

Question 76Skipped

What is the maximum size supported by the VARIANT column?

Correct answer

16 MB.

8 MB.

64 MB.

128 MB.

Overall explanation

The VARIANT data type imposes a 16 MB size limit on individual rows. If the data exceeds 16
MB, enable the STRIP_OUTER_ARRAY file format option for the COPY INTO <table>
command to remove the outer array structure and load the records into separate table rows.
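A sketch of the workaround described above (the stage and table names are hypothetical):

```sql
-- Remove the outer [ ... ] so each array element becomes its own row,
-- keeping individual rows under the 16 MB VARIANT limit
COPY INTO my_table
  FROM @my_stage/big_file.json
  FILE_FORMAT = (TYPE = JSON, STRIP_OUTER_ARRAY = TRUE);
```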

Question 77Skipped

Which statements are true concerning Snowflake’s underlying cloud infrastructure? (Choose
three.)

Snowflake can be deployed in a customer’s private cloud using the customer’s own
compute and storage resources for Snowflake compute and storage.

Correct selection

Snowflake data and services are deployed in at least three availability zones within a cloud
provider’s region.

Snowflake data and services are deployed in a single availability zone within a cloud
provider’s region.

Correct selection

All three layers of Snowflake’s architecture (storage, compute, and cloud services) are
deployed and managed entirely on a selected cloud platform.

Snowflake data and services are available in a single cloud provider and a single region;
the use of multiple cloud providers is not supported.

Correct selection

Snowflake uses the core compute and storage services of each cloud provider for its own
compute and storage.

Overall explanation

Snowflake is provided as Software-as-a-Service (SaaS) that runs completely on cloud infrastructure. This means that all three layers of Snowflake’s architecture (storage, compute, and cloud services) are deployed and managed entirely on a selected cloud platform.

In addition, Snowflake’s virtual warehouses and cloud services layers are similarly deployed
across three availability zones in a region.

See this link.

Question 78Skipped

Which of the following are benefits of micro-partitioning? (Choose two.)

Rows are automatically stored in sorted order within micro-partitions.

Micro-partitions cannot overlap in their range of values.

Micro-partitions can be defined on a schema-by-schema basis.

Correct selection

Micro-partitions are immutable objects that support the use of Time Travel.

Correct selection

Micro-partitions can reduce the amount of I/O from object storage to virtual warehouses.

Overall explanation

As wrote here:

- In contrast to traditional static partitioning, Snowflake micro-partitions are derived automatically; they don’t need to be explicitly defined up-front or maintained by users.

- As the name suggests, micro-partitions are small in size (50 to 500 MB, before
compression), which enables extremely efficient DML and fine-grained pruning for faster
queries.

- Micro-partitions can overlap in their range of values, which, combined with their uniformly
small size, helps prevent skew.

- Columns are stored independently within micro-partitions, often referred to as columnar storage. This enables efficient scanning of individual columns; only the columns referenced by a query are scanned.

Question 79Skipped

Which commands support a multiple-statement request to access and update Snowflake data? (Choose two.)

GET

Correct selection

ROLLBACK

PUT

Correct selection

COMMIT

CALL

Overall explanation

A transaction is a sequence of SQL statements that are processed as an atomic unit. All
statements in the transaction are either applied (i.e. committed) or undone (i.e. rolled back)
together. Snowflake transactions guarantee ACID properties.

A transaction can be ended explicitly by executing COMMIT or ROLLBACK. Snowflake supports the synonym COMMIT WORK for COMMIT, and the synonym ROLLBACK WORK for ROLLBACK.

See this link.

Question 80Skipped

Which data modeling concepts can be used in Snowflake (Choose two.)

Correct selection

Primary Key.

Unique Index.

Non-Unique Index.

Distribution Key.

Correct selection

Foreign Key.

Overall explanation

See following link.

Constraints define integrity and consistency rules for data stored in tables. You can specify a
CONSTRAINT clause in a CREATE TABLE or ALTER TABLE statement. A table can have multiple
unique keys and foreign keys, but only one primary key. Snowflake supports defining and
maintaining constraints, but does not enforce them, except for NOT NULL constraints, which
are always enforced. For example:

CREATE TABLE MY_TABLE (
    col1 INTEGER NOT NULL
);

Question 81Skipped

Snowflake's approach to the management of system access combines which of the following
models? (Choose two.)

Identity Access Management (IAM)

Mandatory Access Control (MAC)

Correct selection

Discretionary Access Control (DAC)

Create, Read, Update, and Delete (CRUD)

Correct selection

Role-Based Access Control (RBAC)

Security Assertion Markup Language (SAML)

Overall explanation


Snowflake’s approach to access control combines aspects from both of the following models:

Discretionary Access Control (DAC): Each object has an owner, who can in turn grant access
to that object.

Role-based Access Control (RBAC): Access privileges are assigned to roles, which are in turn
assigned to users.

Question 82Skipped

When can a newly configured virtual warehouse start running SQL queries?

Correct answer

When the warehouse provisioning is completed

After 50% of the warehouse provisioning has completed

After the warehouse replication is completed

During the time slots defined by the ACCOUNTADMIN

Overall explanation

Snowflake does not begin executing SQL statements submitted to a warehouse until all of
the compute resources for the warehouse are successfully provisioned, unless any of the
resources fail to provision.

See this link.

Question 83Skipped

What is the recommended file size for the best load performance and to avoid size
limitations?

Files shouldn’t exceed 10-100 MB (or larger) in size uncompressed.

Correct answer

Files shouldn’t exceed 100-250 MB (or larger) in size compressed.

Files shouldn’t exceed 10-100 MB (or larger) in size compressed.

Files shouldn’t exceed 100-250 MB (or larger) in size uncompressed.

Overall explanation

This is the best size to get the best load performance. Suppose you still have to load a big
file, for example, 100GB. In that case, you should carefully consider the ON_ERROR copy
option value, as there is a maximum allowed duration of 24 hours, and the operation could
be aborted without any portion of the file being committed.


Question 84Skipped

What privilege is needed for a Snowflake user to see the definition of a secure view?

USAGE

MODIFY

CREATE

Correct answer

OWNERSHIP

Overall explanation

The definition of a secure view is only exposed to authorized users (i.e. users who have been
granted the role that owns the view).

However, users that have been granted IMPORTED PRIVILEGES privilege on the SNOWFLAKE
database or another shared database have access to secure view definitions via the VIEWS
Account Usage view.

Users granted the ACCOUNTADMIN role or the SNOWFLAKE.OBJECT_VIEWER database role
can also see secure view definitions via this view. The preferred, least-privileged means of
access is the SNOWFLAKE.OBJECT_VIEWER database role.

See this link.

Question 85Skipped

For a multi-cluster virtual warehouse, which parameters are used to calculate the number of
credits billed? (Choose two.)

Number of queries executed

Correct selection

Warehouse size

Cache size

Volume of data processed

Correct selection

Number of clusters

Overall explanation

See this link.

Question 86Skipped


By default, how many inbound shares does every Snowflake account have?

Three, the ACCOUNT_USAGE, the DATA, and the INFORMATION shares.

One, the ACCOUNT_USAGE share.

Correct answer

Two, the ACCOUNT_USAGE and the SAMPLE_DATA shares.

Two, the ACCOUNT_USAGE and the INFORMATION shares.

Overall explanation

Snowflake shares metadata information about the usage of your account in the
ACCOUNT_USAGE share. You can also access different sample data sets for learning and
testing Snowflake’s functionalities with the SAMPLE_DATA share. You can see them in the
Snowflake UI if you have enough privileges:

Question 87Skipped

What is a key benefit of using organizations in Snowflake?

Ability to use zero-copy cloning across accounts

Correct answer

Ability to consolidate account management and billing

Ability to access new releases for testing and validation purposes

Ability to use ACCOUNT_USAGE views

Overall explanation

Ability to consolidate account management and billing. The rest of the options are not
technically possible or are technically possible but do not depend on the use of
organizations.

See this link.


Question 88Skipped

What are the benefits of the replication feature in Snowflake? (Choose two.)

Correct selection

Database failover and failback

Data security

Correct selection

Disaster recovery

Fail-safe

Time Travel

Question 89Skipped

You have a multi-cluster warehouse running with the standard scaling policy. The maximum
number of clusters is set to 8. If many queries are queued and the warehouse is constantly
starting new clusters, what is the maximum time it will take for the warehouse to start all
the clusters?

They all start at the same time.

8 minutes.

Correct answer

160 seconds.

80 seconds.

Overall explanation

See this link.

Each successive cluster waits to start 20 seconds after the prior one has started. For
example, if your warehouse is configured with ten max clusters, it can take 200+ seconds to
start all 10 clusters. This doesn't happen with the Economy policy, which only starts new
clusters if the system estimates there's enough query load to keep the cluster busy for at
least 6 minutes. You can look at the following picture to see the differences between these
scaling policies:


Question 90Skipped

Any user with the appropriate privileges can view data storage for individual tables by using
which queries? (Choose two.)

Correct selection

TABLE_STORAGE_METRICS view in the INFORMATION_SCHEMA schema

STORAGE_USAGE view in the ACCOUNT_USAGE schema

Correct selection

TABLE_STORAGE_METRICS view in the ACCOUNT_USAGE schema

METERING_HISTORY view in the ACCOUNT_USAGE schema

METERING_DAILY_HISTORY view in the ORGANIZATION_USAGE schema

Overall explanation

Any user with the appropriate privileges can view data storage for individual tables.
Snowflake provides the following methods for viewing table data storage:

Classic Console

Click on Databases

» <db_name> » Tables

SQL

Execute a SHOW TABLES command.

or


Query either of the following:

• TABLE_STORAGE_METRICS view (in the Snowflake Information Schema).

• TABLE_STORAGE_METRICS View view (in Account Usage).

See this link.

Question 91Skipped

What is the recommended file sizing for data loading using Snowpipe?

A compressed file size greater than 10 MB, and up to 100 MB

Correct answer

A compressed file size greater than 100 MB, and up to 250 MB

A compressed file size greater than 1 GB, and up to 2 GB

A compressed file size greater than 100 GB, and up to 250 GB

Overall explanation

Loading data files roughly 100-250 MB in size or larger reduces the overhead charge relative
to the amount of total data loaded to the point where the overhead cost is immaterial.

Question 92Skipped

Which function should be used to insert JSON formatted string data into a VARIANT field?

CHECK_JSON

FLATTEN

TO_VARIANT

Correct answer

PARSE_JSON

Overall explanation

PARSE_JSON

Interprets an input string as a JSON document, producing a VARIANT value.

See this link.
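A minimal sketch (the table name is hypothetical) of loading a JSON string into a VARIANT column:

```sql
CREATE TABLE events (payload VARIANT);

-- PARSE_JSON interprets the string and produces a VARIANT value
INSERT INTO events
SELECT PARSE_JSON('{"user": "ana", "clicks": 3}');

-- elements can then be extracted with path notation
SELECT payload:clicks FROM events;
```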

Question 93Skipped

What activities can be monitored by a user directly from Snowsight's Activity tab without
using the Account_Usage views? (Choose two.)

Event usage history


Virtual warehouse metering history

Correct selection

Query history

Correct selection

Copy history

Login history

Overall explanation

It is highly recommended before the exam to browse the new Snowflake Snowsight UI and
familiarize yourself with the various capabilities it offers. Some questions like this will appear
on the exam.

See this link.

Question 94Skipped

Which service or feature in Snowflake is used to improve the performance of certain types
of lookup and analytical queries that use an extensive set of WHERE conditions?

Tagging

Correct answer

Search optimization service

Query acceleration service

Data classification

Overall explanation

The search optimization service can significantly improve the performance of certain types
of lookup and analytical queries that use an extensive set of predicates for filtering.

See this link.

And this link.

Question 95Skipped

Which feature of Snowflake’s Continuous Data Protection (CDP) has associated costs?

Multi-Factor Authentication (MFA)

End-to-end encryption

Correct answer


Fail-safe

Network policies

Overall explanation

See this link.

Question 96Skipped

In which Snowflake editions is the Snowflake Marketplace not available?

Snowflake Marketplace is available in all Snowflake editions.

Business Critical Edition.

Enterprise edition.

Correct answer

Virtual Private Snowflake (VPS) edition.

Standard edition.

Overall explanation

There are some restrictions in the Snowflake VPS edition. You can check all of them in the
following image (via docs.snowflake.com):

Question 97Skipped

Which ALTER commands will impact a column's availability in Time Travel?

1. ALTER TABLE … DROP COLUMN …

1. ALTER TABLE … SET NOT NULL …

1. ALTER TABLE … RENAME COLUMN …

Correct answer

1. ALTER TABLE … SET DATA TYPE …

Overall explanation


Decreasing the precision of a number column can impact Time Travel, for example,
converting from NUMBER(20,2) to NUMBER(10,2). SET DATA TYPE is the command that can
make that.

Question 98Skipped

Which data type can be used to store geospatial data in Snowflake?

Object

Correct answer

Geometry

Variant

It is not possible to store Geospatial data in Snowflake.

Overall explanation

See this link.

Question 99Skipped

While using a COPY command with a Validation_mode parameter, which of the following
statements will return an error?

Statements that have a specific data type in the source

Statements that insert a duplicate record during a load

Correct answer

Statements that transform data during a load

Statements that have duplicate file names

Overall explanation

The VALIDATION_MODE parameter does not support COPY statements that transform data
during a load.

See this link.

Question 100Skipped

A JSON document is stored in the source_column of type VARIANT. The document has an
array called elements. Each element of the array contains a name key that has a string value.

How can a Snowflake user extract the name from the first element?

Correct answer


source_column:elements[0].name

source_column:elements[1].name

source_column.elements[1]:name

source_column.elements[0]:name

Overall explanation

See this link.

It is important to be familiar with the syntax for querying semi-structured data.
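For instance, assuming the document described in the question, the first element's name could be extracted like this (the table name is hypothetical):

```sql
-- source_column holds: { "elements": [ { "name": "first" }, { "name": "second" } ] }
SELECT source_column:elements[0].name::STRING AS first_name
FROM my_table;   -- arrays are zero-indexed, so [0] is the first element
```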

Question 101Skipped

To add or remove search optimization for a table, a user must have which of the following
privileges or roles? (Choose two.)

A SECURITYADMIN role

The SELECT privilege on the table

Correct selection

The OWNERSHIP privilege on the table

The MODIFY privilege on the table

Correct selection

The ADD SEARCH OPTIMIZATION privilege on the schema that contains the table

Overall explanation

To add, configure, or remove search optimization for a table, you must have the following
privileges:

You must have OWNERSHIP privilege on the table.

You must have ADD SEARCH OPTIMIZATION privilege on the schema that contains the table.

See this link.
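A sketch of both privileges in action (object and role names are made up):

```sql
-- grant the schema-level privilege to a custom role
GRANT ADD SEARCH OPTIMIZATION ON SCHEMA my_db.my_schema TO ROLE analyst_role;

-- a role owning the table (and holding the grant above) can then enable or disable it
ALTER TABLE my_db.my_schema.big_table ADD SEARCH OPTIMIZATION;
ALTER TABLE my_db.my_schema.big_table DROP SEARCH OPTIMIZATION;
```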

Question 102Skipped

When does Snowflake automatically encrypt data that is loaded into Snowflake? (Choose
two.)

Correct selection

After loading the data into a table.

Only when using an encrypted stage.


After the data is micro-partitioned.

Correct selection

After loading the data into an internal stage.

After loading data into an external stage.

Overall explanation

1. If the stage is an external stage (Image A), the user may optionally encrypt the data files
using client-side encryption (see Client-Side Encryption for more information). We
recommend client-side encryption for data files in external stages; but if the data is not
encrypted, Snowflake immediately encrypts the data when it is loaded into a table.

If the stage is an internal (i.e. Snowflake) stage (Image B) data files are automatically
encrypted by the Snowflake client on the user’s local machine prior to being transmitted to
the internal stage, in addition to being encrypted after they are loaded into the stage.

2. The user loads the data from the stage into a table.

The data is transformed into Snowflake’s proprietary file format and stored in a cloud
storage container. In Snowflake, all data at rest is always encrypted and encrypted with TLS
in transit. Snowflake also decrypts data when data is transformed or operated on in a table,
and then re-encrypts the data when the transformations and operations are complete.

The figure illustrates the E2EE system in Snowflake:


See this link.

Question 103Skipped

In addition to performing all the standard steps to share data, which privilege must be
granted on each database referenced by a secure view in order to be shared?

REFERENCES

Correct answer

REFERENCE_USAGE

READ

USAGE

Overall explanation

You must grant the REFERENCE_USAGE privilege separately on each database referenced in
a secure view, before granting the secure view to a share.

See this link.
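For example (database, view, and share names are hypothetical), if a secure view in db1 references a table in db2:

```sql
GRANT USAGE ON DATABASE db1 TO SHARE my_share;
GRANT USAGE ON SCHEMA db1.public TO SHARE my_share;

-- required because the secure view references objects in another database
GRANT REFERENCE_USAGE ON DATABASE db2 TO SHARE my_share;

GRANT SELECT ON VIEW db1.public.secure_v TO SHARE my_share;
```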


Question 104Skipped

A user executes the following SQL query:

create table SALES_BKP like SALES;

What are the cost implications for processing this query?

Storage costs will be generated based on the size of the data.

Processing costs will be generated based on how long the query takes.

Correct answer

No costs will be incurred as the query will use metadata.

The cost for running the virtual warehouse will be charged by the second.

Overall explanation

CREATE TABLE … LIKE (creates an empty copy of an existing table)

See this link.
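To contrast the two metadata-only options (table names taken from the question):

```sql
-- empty copy: column definitions only, no rows, no storage cost
CREATE TABLE SALES_BKP LIKE SALES;

-- zero-copy clone: includes the data, but initially shares the underlying
-- micro-partitions, so no data is physically copied at creation time
CREATE TABLE SALES_CLONE CLONE SALES;
```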

Question 105Skipped

How many cluster keys can we create for a Snowflake table?

Unlimited.

Correct answer

One.

A maximum of three or four cluster keys.

Two.

Overall explanation

You can enable clustering on specific tables by specifying ONE clustering key for each table.
We can only create one cluster key, but we can have several columns or expressions in that
cluster key.
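For example (table and column names are hypothetical):

```sql
-- a single clustering key made up of two columns
ALTER TABLE sales CLUSTER BY (store_id, sale_date);
```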

Question 106Skipped

Which types of tables typically benefit from creating a cluster key?

Medium tables (around 10GB of data)

Large tables (around 100GB of data)

Correct answer


Very large tables (multi-terabytes of data)

Small tables (around 1GB of data)

Overall explanation

Although clustering can substantially improve the performance and reduce the cost of some
queries, the compute resources used to perform clustering consume credits. As such, you
should cluster only when queries will benefit substantially from the clustering in huge tables.

Question 107Skipped

At which point is data encrypted when using a PUT command?

When it gets micro-partitioned

Correct answer

Before it is sent from the user's machine

When it reaches the virtual warehouse

After it reaches the internal stage

Overall explanation

Always end-to-end encryption

See this link.

Question 108Skipped

Which Snowflake URL type is used by directory tables?

Pre-signed

Scoped

Virtual-hosted style

Correct answer

File

Overall explanation

Conceptually, directory tables are similar to external tables in that they store file-level
metadata about the data files in a stage. Query a directory table to retrieve the Snowflake-
hosted file URL to each file in the stage. A file URL permits prolonged access to a specified
file. That is, the file URL does not expire. The same file URL is returned by calling the
BUILD_STAGE_FILE_URL function.

See this link.


Question 109Skipped

What is a responsibility of Snowflake’s virtual warehouses?

Query parsing and optimization

Infrastructure management

Correct answer

Query execution

Management of the storage layer

Metadata management

Overall explanation

A warehouse provides the required resources, such as CPU, memory, and temporary storage,
to perform the following operations in a Snowflake session:

• Executing SQL SELECT statements that require compute resources (e.g. retrieving
rows from tables and views).

• Performing DML operations, such as:

• Updating rows in tables.

• Loading data into tables .

• Unloading data from tables.

Question 110Skipped

What is the output of the command?

1. SELECT TOP 100 AGE

2. FROM USERS;

The TOP 100 ages ordered by the creation date of the data.

Correct answer

A non-deterministic list of 100 ages.

The TOP 100 ages in descending order.

The TOP 100 ages in ascending order.

Overall explanation

Without an ORDER BY clause, the order of the returned rows is non-deterministic; we would need ORDER BY to generate any of the other results.
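For example:

```sql
-- non-deterministic: any 100 ages may be returned
SELECT TOP 100 AGE FROM USERS;

-- deterministic: the 100 highest ages
SELECT TOP 100 AGE FROM USERS ORDER BY AGE DESC;
```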


Question 111Skipped

How is the MANAGE GRANTS privilege applied?

Correct answer

Globally

At the schema level

At the table level

At the database level

Overall explanation

In general, a role with any one of the following sets of privileges can grant privileges on an
object to other roles:

The global MANAGE GRANTS privilege.

Only the SECURITYADMIN and ACCOUNTADMIN system roles have the MANAGE GRANTS
privilege; however, the privilege can be granted to custom roles.

See this link.

Question 112Skipped

What actions will prevent leveraging of the ResultSet cache?

If the result has not been reused within the last 12 hours

Stopping the virtual warehouse that the query is running against

Executing the RESULTS_SCAN() table function

Correct answer

Removing a column from the query SELECT list

Overall explanation

Query results are reused if all of the following conditions are met:

• The new query syntactically matches the previously-executed query.

• ...

See this link.

Question 113Skipped

Which command should be used to look into the validity of an XML object in Snowflake?

TO_XML


Correct answer

CHECK_XML

PARSE_XML

XMLGET

Overall explanation

CHECK_XML

Checks the validity of an XML document. If the input string is NULL or a valid XML document,
the output is NULL. In case of an XML parsing error, the output string contains the error
message.

See this link.
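A quick sketch of the behavior:

```sql
SELECT CHECK_XML('<root><a>1</a></root>');  -- NULL: the document is valid
SELECT CHECK_XML('<root><a>1</root>');      -- returns a parsing error message
```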

Question 114Skipped

How can a Snowflake user access a JSON object, given the following table? (Choose two.)

Correct selection

1. SRC:salesPerson.name

1. src:salesPerson.Name

1. src:salesperson.name

1. SRC:salesPerson.Name

Correct selection

1. src:salesPerson.name

Overall explanation

Regardless of which notation you use, the column name is case-insensitive but element
names are case-sensitive.

For example, in the following list, the first two paths are equivalent, but the third is not:

• src:salesperson.name


• SRC:salesperson.name

• SRC:Salesperson.Name

See this link.

Question 115Skipped

What will happen when you ALTER a column to set it to NOT NULL if it contains NULL values?

NULL values are changed to 0.

Snowflake deletes the rows with NULL values.

NULL values are changed to an empty string " "

Correct answer

Snowflake returns an error.

Overall explanation

When setting a column to NOT NULL, if the column contains NULL values, an error is
returned and no changes are applied to the column. This restriction prevents inconsistency
between values in rows inserted before the column was added and rows inserted after the
column was added.

Question 116Skipped

For directory tables, what stage allows for automatic refreshing of metadata?

User stage

Table stage

Correct answer

Named external stage

Named internal stage

Overall explanation

You can automatically refresh the metadata for a directory table by using the following event
notification services:

• Amazon S3: Amazon SQS (Simple Queue Service)

• Google Cloud Storage: Google Cloud Pub/Sub

• Microsoft Azure: Microsoft Azure Event Grid

See this link.


Question 117Skipped

What are the two models that Snowflake combines as an approach to access control?

MAC & RBAC.

Correct answer

DAC & RBAC.

DAC & ABAC.

MAC & ABAC.

Overall explanation

Snowflake combines Discretionary Access Control (DAC) and Role-Based Access Control
(RBAC). Discretionary Access Control (DAC) remarks that "each object has an owner, who
can, in turn, grant access to that object". In contrast, the Role-based Access Control (RBAC)
remarks that "access privileges are assigned to roles, which are in turn assigned to users".

Question 118Skipped

Which of the following are best practice recommendations that should be considered when
loading data into Snowflake? (Choose two.)

Load files that are approximately 25 MB or smaller.

Correct selection

Load files that are approximately 100-250 MB (or larger).

Remove all dates and timestamps.

Correct selection

Avoid using embedded characters such as commas for numeric data types.

Remove semi-structured data types.

Overall explanation

See this link.

Question 119Skipped

Masking policies can be applied to which of the following Snowflake objects? (Choose two.)

Correct selection

A table

A User-Defined Function (UDF)


A stream

A stored procedure

A pipe

Correct selection

A materialized view

Overall explanation

In Snowflake, masking policies are schema-level objects, which means a database and
schema must exist in Snowflake before a masking policy can be applied to a column.
Currently, Snowflake supports using Dynamic Data Masking on tables and views.

Question 120Skipped

Which of the following listing types are not available in the Snowflake Data Marketplace?

Personalized Listing.

Free Listing.

Paid Listing.

Correct answer

Private Listing.

Overall explanation

Free Listing (also known as Standard Listing) is the best for providing generic, aggregated, or
non-customer-specific data. In contrast, consumers can request specific datasets from
providers using Personalized Listing. Snowflake recently added Paid Listings, where, as a
provider, you can charge consumers to access or use your listing.

There is another type, Private Listings, where you can use listings to share data and other
information directly with other Snowflake accounts. However, they are unavailable in the
Data Marketplace, as the question asks for. Here you have an example of a Personalized
Listing in the Snowflake Marketplace:

Question 121Skipped


Why should a Snowflake user configure a secure view? (Choose two.)

Correct selection

To protect hidden data from other users

To improve the performance of a query

Correct selection

To hide the view definition from other users

To execute faster than a standard view

To encrypt the data in transit

Overall explanation

Some of the internal optimizations for views require access to the underlying data in the
base tables for the view. This access might allow data that is hidden from users of the view
to be exposed through user code, such as user-defined functions, or other programmatic
methods. Secure views do not utilize these optimizations, ensuring that users have no access
to the underlying data.

For security or privacy reasons, you might not wish to expose the underlying tables or
internal structural details for a view. With secure views, the view definition and details are
visible only to authorized users (i.e. users who are granted the role that owns the view).

See this link.

Question 122Skipped

Which command will fail if you have a table created with the following DDL query?

1. CREATE TABLE MYTABLE

2. (ID INTEGER, NAME VARCHAR)

1. SELECT * FROM MYTABLE

1. SELECT * FROM Mytable

1. SELECT * FROM “MYTABLE”

Correct answer

1. SELECT * FROM “Mytable”

Overall explanation

If you use double quotes (" "), the identifier becomes case-sensitive, so you have to specify the exact name of the table as it is stored (MYTABLE, in uppercase, since the unquoted name in the DDL was resolved to uppercase).
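To illustrate with the table from the question:

```sql
SELECT * FROM mytable;     -- OK: unquoted identifiers resolve case-insensitively (to MYTABLE)
SELECT * FROM "MYTABLE";   -- OK: the quoted name matches the stored identifier exactly
SELECT * FROM "Mytable";   -- fails: no table is stored with this exact mixed-case name
```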


Question 123Skipped

Which roles can create, alter or drop network policies? (Choose two.)

USERADMIN.

Correct selection

SECURITYADMIN.

ORGADMIN

SYSADMIN.

Correct selection

ACCOUNTADMIN.

Overall explanation

Network policies allow restricting access to your account based on user IP address.
Snowflake applies the blocked IP address list when a network policy includes values in both
the allowed and blocked IP address lists.

Only security administrators (i.e., users with the SECURITYADMIN role) or higher or a role
with the global CREATE NETWORK POLICY privilege can create network policies.

Only the network policy owner (i.e., role with the OWNERSHIP privilege on the network
policy) or higher can alter a network policy.

Question 124Skipped

What is used to limit the credit usage of a virtual warehouse within a Snowflake account?

Query Profile

Stream

Correct answer

Resource monitor

Load monitor

Overall explanation

See this link.
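A sketch of a monitor attached to a warehouse (the names and thresholds are made up):

```sql
CREATE RESOURCE MONITOR monthly_limit
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- limit credit usage of this warehouse via the monitor
ALTER WAREHOUSE my_wh SET RESOURCE_MONITOR = monthly_limit;
```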

Question 125Skipped


What is it called when a customer managed key is combined with a Snowflake managed key
to create a composite key for encryption?

Correct answer

Tri-secret secure encryption

Client-side encryption

Hierarchical key model

Key pair authentication

Overall explanation

Tri-Secret Secure is the combination of a Snowflake-maintained key and a customer-managed
key in the cloud provider platform that hosts your Snowflake account to create a
composite master key to protect your Snowflake data.

See this link.

Question 126Skipped

The Query Profile in the image is for a query executed in Snowsight. Four of the key nodes
are highlighted in yellow.


Which highlighted node will be the MOST expensive?

Aggregate[1]

Join[5]


TableScan[2]

Correct answer

TableScan[3]

Overall explanation

What we can see in this image is the Operator Tree of a Query Profile.

The tree provides a graphical representation of the operator nodes that comprise a query
and the links that connect each operator:

• Operators are the functional building blocks of a query. They are responsible for
different aspects of data management and processing, including data access,
transformations and updates. Each operator node in the tree includes some basic
attributes:

<Type> [#]

Operator type and ID number. ID can be used to uniquely identify an operator within a query
profile (e.g. Aggregate [1] and Join [5] in the screenshot above).

For descriptions of all the types, see Operator Types below.

Percentage

Fraction of time that this operator consumed within the query step (e.g. 53.4% for
TableScan[3]). This information is also reflected in the bar at the bottom of the operator
node, allowing for easy visual identification of performance-critical operators.

Label

Operator-specific additional information.

• Links represent the data flowing between each operator node. Each link provides the
number of records that were processed.

Therefore, we can conclude that it is the node TableScan[3] that has the highest percentage
of time.

See this link.

Question 127Skipped

Which account usage view in Snowflake can be used to identify the most-frequently
accessed tables?

Tables

Object_Dependencies


Correct answer

Access_History

Table_Storage_Metrics

Overall explanation

The Access_History view in Snowflake can be used to identify the most frequently accessed
tables. This view contains information about the historical access patterns for tables and
views in your Snowflake account, including details on queries, users, and access frequency.
By querying this view, you can analyze which tables are being accessed most frequently in
your Snowflake environment.

See this link.

Question 128Skipped

A user created a database and set the DATA_RETENTION_TIME_IN_DAYS to 30, but did not
set the DATA_RETENTION_TIME_IN_DAYS in table T1. After 5 days, the user accidentally
drops table T1.

What are the considerations for recovering table T1?

The table cannot be recovered because the DATA_RETENTION_TIME_IN_DAYS was not set
for table T1.

The user can recover the table T1 after 30 days.

The table can only be recovered by contacting Snowflake Support to recover the table
from Fail-safe.

Correct answer

The table can be recovered because the table retention period default is at the database
level.

Overall explanation

By default, the maximum retention period is 1 day (i.e. one 24 hour period). With Snowflake
Enterprise Edition (and higher), the default for your account can be set to any value up to 90
days:

• When creating a table, schema, or database, the account default can be overridden
using the DATA_RETENTION_TIME_IN_DAYS parameter in the command.

• If a retention period is specified for a database or schema, the period is inherited by
default for all objects created in the database/schema.

See this link.
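In the scenario above, the dropped table can be restored with Time Travel:

```sql
UNDROP TABLE T1;  -- works because the 30-day retention inherited from the database still applies
```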


Question 129Skipped

Which of the below columns are usually a good choice for clustering keys?

Timestamp in a 10TB table.

Correct answer

Store_id in a 2TB table.

Gender male/female in a 20TB table.

UUID column from a Customer in a 10TB table.

Overall explanation

A column with very low cardinality (gender in this case) might yield minimal pruning. On the
other hand, a column with very high cardinality (UUID or timestamp in this case) is also
typically not a good candidate to use directly as a clustering key, as there will be a lot of
values. Store_id is the most convenient option.

Question 130Skipped

For non-materialized views, what column in Information Schema and Account Usage
identifies whether a view is secure or not?

CHECK_OPTION

Correct answer

IS_SECURE

IS_UPDATEABLE

TABLE_NAME

Overall explanation

See this link.

Question 131Skipped

Which command can we use to refresh a materialized view?

1. DROP VIEW <view>;

2. CREATE MATERIALIZED VIEW <view>;

1. ALTER MATERIALIZED_VIEW <view> REFRESH=TRUE

Correct answer

Materialized views are automatically refreshed by Snowflake.


1. RESTART MATERIALIZED_VIEW <view>

Overall explanation

Snowflake automatically and transparently maintains materialized views. To see the last time
that Snowflake refreshed a materialized view, check the REFRESHED_ON and BEHIND_BY
columns in the output of the command SHOW MATERIALIZED VIEWS.

Question 132Skipped

Which command should be used to implement a masking policy that was already created in
Snowflake?

1. CREATE MASKING POLICY

1. APPLY MASKING POLICY

Correct answer

1. SET MASKING POLICY

1. ALTER MASKING POLICY

Overall explanation

Example:

1. ALTER TABLE <name> { ALTER | MODIFY } COLUMN <col1_name> SET MASKING POLICY <policy_name> [ USING ( <col1_name> , cond_col_1 , ... ) ]

See this link.

Question 133Skipped

Which cache type is used to cache data output from SQL queries?

Local file cache

Remote cache

Correct answer

Result cache

Metadata cache

Overall explanation

Result Cache: Which holds the results of every query executed in the past 24 hours. These
are available across virtual warehouses, so query results returned to one user is available to
any other user on the system who executes the same query, provided the underlying data
has not changed.


Local Disk Cache: Which is used to cache data used by SQL queries. Whenever data is
needed for a given query it's retrieved from the Remote Disk storage, and cached in SSD and
memory.

Question 134Skipped

What happens when the values for both an ALLOWED_IP_LIST and a BLOCKED_IP_LIST are
used in a network policy?

Correct answer

Snowflake applies the BLOCKED_IP_LIST first.

Snowflake ignores the BLOCKED_IP_LIST first.

Snowflake applies the ALLOWED_IP_LIST first.

Snowflake ignores the ALLOWED_IP_LIST first.

Overall explanation

When a network policy includes values in both the allowed and blocked IP address lists,
Snowflake applies the blocked IP address list first.

See this link.
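A sketch (the policy name and IP ranges are made up) showing why the blocked list wins:

```sql
CREATE NETWORK POLICY corp_only
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');
-- 192.168.1.99 is rejected even though it falls inside the allowed range,
-- because Snowflake applies the blocked list first

ALTER ACCOUNT SET NETWORK_POLICY = corp_only;
```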

Question 135Skipped

What happens when an external or an internal stage is dropped? (Choose two.)

When dropping an internal stage, the files are deleted with the stage and the files are
recoverable.

Correct selection

When dropping an external stage, the files are not removed and only the stage is dropped.

When dropping an internal stage, only selected files are deleted with the stage and are not
recoverable.

When dropping an external stage, both the stage and the files within the stage are
removed.

Correct selection

When dropping an internal stage, the files are deleted with the stage and the files are not
recoverable.

Overall explanation


For an internal stage, all of the files in the stage are purged from Snowflake, regardless of their load status. This prevents the files from continuing to use storage and, consequently, accruing storage charges.

However, this also means that the staged files cannot be recovered after a stage is dropped.

For an external stage, only the stage itself is dropped; any data files in the referenced
external location (Amazon S3, Google Cloud Storage, or Microsoft Azure) are not removed.

See this link.

Question 136Skipped

Which statement accurately describes a characteristic of a materialized view?

Data accessed through materialized views can be stale.

Correct answer

A materialized view can query only a single table.

Querying a materialized view is slower than executing a query against the base table of
the view.

Materialized view refreshes need to be maintained by the user.

Overall explanation

A materialized view can query only a single table.

See this link.

It is recommended to be familiar with the limitations of materialized views, as this is a common question.

Question 137Skipped

Which command can we use to access sequences in queries as expressions?

Correct answer

<seq_name>.NEXTVAL

<seq_name>.THISVAL

<seq_name>.CURRENTVAL

<seq_name>.GETVAL

Overall explanation


We use sequences to generate unique numbers across sessions and statements, including
concurrent statements. You can use them to generate values for a primary key or any
column that requires a unique value. Let’s see an example.

Imagine we’ve created a sequence with an initial value of 1 and an interval of 2. If we execute the following code:

1. INSERT INTO PEOPLE (ID, NAME) VALUES

2. (PEOPLE_SEQ.nextval, 'Gonzalo'),

3. (PEOPLE_SEQ.nextval, 'Nacho'),

4. (PEOPLE_SEQ.nextval, 'Megan'),

5. (PEOPLE_SEQ.nextval, 'Angel')

It will generate the following result: the four rows receive the IDs 1, 3, 5, and 7.

Question 138Skipped

What technique does Snowflake use to limit the number of micro-partitions scanned by
each query?

Indexing

Map reduce

Correct answer

Pruning

B-tree


Overall explanation

See this link.

Question 139Skipped

Which SQL command can be used to verify the privileges that are granted to a role?

SHOW ROLES

SHOW GRANTS ON ROLE

SHOW GRANTS FOR ROLE

Correct answer

SHOW GRANTS TO ROLE

Overall explanation

See this link.
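For example, assuming a hypothetical role named ANALYST:

```sql
-- Lists every privilege and role granted to the role ANALYST.
SHOW GRANTS TO ROLE ANALYST;
```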

Question 140Skipped

Which privilege is required on a virtual warehouse to abort any existing executing queries?

MODIFY

USAGE

Correct answer

OPERATE

MONITOR

Overall explanation

OPERATE

Enables changing the state of a warehouse (stop, start, suspend, resume). In addition,
enables viewing current and past queries executed on a warehouse and aborting any
executing queries.

See this link.

Question 141Skipped

Which Snowflake object enables loading data from files as soon as they are available in a
cloud storage location?

Stream

Correct answer


Pipe

Task

External stage

Overall explanation

Snowpipe enables loading data from files as soon as they’re available in a stage. This means
you can load data from files in micro-batches, making it available to users within minutes,
rather than manually executing COPY statements on a schedule to load larger batches.

See this link.

Question 142Skipped

What is the function of the PUBLIC schema in Snowflake?

It’s the schema where information about pricing will be stored.

It’s the schema where non-PII data will be stored.

It’s the schema that even non-Snowflake users will be able to access, as it’s public.

Correct answer

Default schema where objects are going to be created.

Overall explanation

A schema is a logical grouping of database objects (tables, views, etc.), and each schema
belongs to a single database. The PUBLIC schema is the default schema for a database, and
all objects are, by default, created inside it if no other schema is specified.

Question 143Skipped

Which schema has the RESOURCE_MONITORS view?

ACCOUNT_USAGE

Correct answer

READER_ACCOUNT_USAGE

WAREHOUSE_USAGE_SCHEMA

INFORMATION_SCHEMA

Overall explanation

See this link.


Question 144Skipped

What happens to the shared objects for users in a consumer account from a share, once a
database has been created in that account?

The shared objects are copied.

Correct answer

The shared objects become accessible.

The shared objects can be re-shared.

The shared objects are transferred.

Overall explanation

Once a database is created (in a consumer account) from a share, all the shared objects are
accessible to users in the consumer account.

See this link.

Question 145Skipped

Which of the following SQL statements will list the version of the drivers currently being
used?

Execute SELECT CURRENT_JDBC_VERSION(); from SnowSQL

Correct answer

Execute SELECT CURRENT_CLIENT(); from an application

Execute SELECT CURRENT_VERSION(); from the Python Connector

Execute SELECT CURRENT_ODBC_CLIENT(); from the Web UI

Overall explanation

CURRENT_CLIENT() -- name and version of the connected client
CURRENT_VERSION() -- version of Snowflake

See this link.

Question 146Skipped

A Snowflake user has two tables that contain numeric values and is trying to find out which
values are present in both tables.

Which set operator should be used?

Correct answer


INTERSECT

MINUS

UNION

MERGE

Overall explanation

INTERSECT

Returns rows from one query’s result set which also appear in another query’s result set,
with duplicate elimination.

See this link.
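A minimal sketch, assuming two hypothetical tables t1 and t2 that each have a numeric column n:

```sql
SELECT n FROM t1
INTERSECT          -- keeps only values present in both result sets,
SELECT n FROM t2;  -- with duplicates eliminated
```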

Question 147Skipped

Which of the following query profiler variables will indicate that a virtual warehouse is not
sized correctly for the query being executed?

Correct answer

Remote spillage

Synchronization

Initialization

Bytes sent over the network

Overall explanation

For some operations (e.g. duplicate elimination for a huge data set), the amount of memory
available for the compute resources used to execute the operation might not be sufficient to
hold intermediate results. As a result, the query processing engine will start spilling the data
to local disk. If the local disk space is not sufficient, the spilled data is then saved to remote
disks.

See this link.

Question 148Skipped

According to Snowflake best practice recommendations, which role should be used to create
databases?

USERADMIN

SECURITYADMIN

ACCOUNTADMIN


Correct answer

SYSADMIN

Overall explanation

The system administrator (SYSADMIN) role includes the privileges to create warehouses,
databases, and all database objects (schemas, tables, etc.).

See this link.

Question 149Skipped

You have the following data in a variant column called “person_data” from the table
“myTable”. How can you query the second hobby of Chris (“music”)?

1. {

2. "name": "Chris",

3. "favouriteTechnology": "Snowflake",

4. "hobbies":[

5. {"name": "soccer"},

6. {"name": "music"},

7. {"name": "hiking"}

8. ]}

1. SELECT person_data:hobbies[1]

1. SELECT person_data:hobbies(1)

Correct answer

1. SELECT person_data:hobbies[1].name

1. SELECT person_data:hobbies(1).name

Overall explanation

See this link.
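Put together as a full statement against the table from the question:

```sql
-- ':' enters the VARIANT, [1] selects the second array element
-- (indexes are zero-based), and .name extracts the field.
SELECT person_data:hobbies[1].name FROM myTable;  -- "music"
```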

Question 150Skipped

What is a best practice after creating a custom role?

Correct answer

Assign the custom role to the SYSADMIN role.


Assign the custom role to the PUBLIC role.

Create the custom role using the SYSADMIN role.

Add _CUSTOM to all custom role names.

Overall explanation

Custom roles (i.e. any roles other than the system-defined roles) can be created by the
USERADMIN role (or a higher role) as well as by any role to which the CREATE ROLE privilege
has been granted. By default, a newly-created role is not assigned to any user, nor granted to
any other role.

When creating roles that will serve as the owners of securable objects in the system,
Snowflake recommends creating a hierarchy of custom roles, with the top-most custom role
assigned to the system role SYSADMIN.

See this link.

Question 151Skipped

What do temporary and transient tables have in common in Snowflake? (Choose two.)

For both tables, the retention period ends when the tables are dropped.

Correct selection

Both tables have data retention period maximums of one day.

Both tables are visible only to a single user session.

Correct selection

Both tables have no Fail-safe period.

For both tables, the retention period does not end when the session ends.

Overall explanation

See this link.


Question 152Skipped

Which features make up Snowflake's column level security? (Choose two.)

Continuous Data Protection (CDP)

Correct selection

External Tokenization

Correct selection

Dynamic Data Masking

Key pair authentication

Row access policies

Overall explanation

See this link.

Question 153Skipped

What is the name of the SnowSQL file that can store connection information?

snowsql.pubkey

history

Correct answer

config

snowsql.cnf

Overall explanation

SnowSQL supports multiple configuration files that allow organizations to define base values
for connection parameters, default settings, and variables while allowing individual users to
customize their personal settings in their own <HOME_DIR>/.snowsql/config files.


See this link.

Question 154Skipped

What general guideline does Snowflake recommend when setting the auto-suspension time
limit?

Set query warehouses for suspension after 15 minutes.

Correct answer

Set tasks for suspension after 5 minutes.

Set query warehouses for suspension after 30 minutes.

Set tasks for immediate suspension.

Overall explanation

If you enable auto-suspend, we recommend setting it to a low value (e.g. 5 or 10 minutes or less) because Snowflake utilizes per-second billing. This will help keep your warehouses from running (and consuming credits) when not in use.

See this link.
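As a sketch, setting a 5-minute auto-suspend on a hypothetical warehouse my_wh (AUTO_SUSPEND is specified in seconds):

```sql
ALTER WAREHOUSE my_wh SET AUTO_SUSPEND = 300;  -- suspend after 5 minutes idle
```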

Question 155Skipped

What role has the privileges to create and manage data shares by default?

Correct answer

ACCOUNTADMIN

USERADMIN

SYSADMIN

SECURITYADMIN

Overall explanation

By default, the privileges required to create and manage shares are granted only to
the ACCOUNTADMIN role, ensuring that only account administrators can perform these
tasks.

See this link.


SET 5

Question 1Skipped

What feature can be used to reorganize a very large table on one or more columns?

Correct answer

Clustering keys

Micro-partitions

Key partitions

Clustered partitions

Overall explanation

See this link.

Question 2Skipped

When a Snowflake user loads CSV data from a stage, which COPY INTO [table] command
guideline should they follow?

The data file must have the same number of columns as the target table.

Correct answer

The number of columns in each row should be consistent.

The data file in the stage must be in a compressed format.

The CSV field delimiter must be a comma character (',').

Overall explanation

See this link.

Question 3Skipped

Which of the following are characteristics of security in Snowflake?

Account and user authentication is only available with the Snowflake Business Critical
edition.

Correct answer


Periodic rekeying of encrypted data is available with the Snowflake Enterprise edition and
higher

Support for HIPAA and GDPR compliance is available for UI Snowflake editions.

Private communication to internal stages is allowed in the Snowflake Enterprise edition and higher.

Overall explanation


See this link.

Question 4Skipped

Which feature allows a user the ability to control the organization of data in a micro-
partition?


Correct answer

Automatic Clustering

Horizontal Partitioning

Search Optimization Service

Range Partitioning

Overall explanation

Search Optimization Service has nothing to do with the organization of data in micro-
partitions. Its clustering only which organizes data in micro-partitions

See this link.

Question 5Skipped

Which of the following are handled by the cloud services layer of the Snowflake
architecture? (Choose two.)

Time Travel data

Query execution

Data loading

Correct selection

Security

Correct selection

Authentication and access control

Overall explanation

The cloud services layer is a collection of services that coordinate activities across Snowflake.
These services tie together all of the different components of Snowflake in order to process
user requests, from login to query dispatch. The cloud services layer also runs on compute
instances provisioned by Snowflake from the cloud provider. Services managed in this layer
include:

• Authentication

• Infrastructure management

• Metadata management

• Query parsing and optimization

• Access control


See this link.

Question 6Skipped

How many days is load history for Snowpipe retained?

64 days

Correct answer

14 days

1 day

7 days

Overall explanation

COPY INTO - Stored in the metadata of the target table for 64 days

SNOWPIPE - Stored in the metadata of the pipe for 14 days

See this link.

Question 7Skipped

Which Snowflake privilege is required on a pipe object to pause or resume pipes?

SELECT

READ

USAGE

Correct answer

OPERATE

Overall explanation

ALTER PIPE

Modifies a limited set of properties for an existing pipe object. Also supports the following
operations:

• Pausing the pipe.

• Refreshing a pipe (i.e. copying the specified staged data files to the Snowpipe ingest
queue for loading into the target table).

• Adding/overwriting/removing a comment for a pipe.

• Setting/unsetting a tag on a pipe.


A non-owner role with the OPERATE privilege on the pipe can pause or resume a pipe (using
ALTER PIPE … SET PIPE_EXECUTION_PAUSED = TRUE | FALSE).

SQL operations on schema objects also require the USAGE privilege on the database and
schema that contain the object.

See this link.
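For example, a role holding OPERATE on a hypothetical pipe my_pipe could pause and later resume it:

```sql
ALTER PIPE my_pipe SET PIPE_EXECUTION_PAUSED = TRUE;   -- pause
ALTER PIPE my_pipe SET PIPE_EXECUTION_PAUSED = FALSE;  -- resume
```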

Question 8Skipped

What is the default character set used when loading CSV files into Snowflake?

ISO 8859-1

ANSI_X3.4

UTF-16

Correct answer

UTF-8

Overall explanation

For delimited files (CSV, TSV, etc.), the default character set is UTF-8.

See this link.

Question 9Skipped

Which of the following is a valid source for an external stage when the Snowflake account is
located on Microsoft Azure?

An HTTPS server with WebDAV

An FTP server with TLS encryption

Correct answer

A Google Cloud storage bucket

A Windows server file share on Azure

Overall explanation

Loading data from any of the following cloud storage services is supported regardless of the
cloud platform that hosts your Snowflake account:

• Amazon S3

• Google Cloud Storage

• Microsoft Azure


See this link.

Question 10Skipped

A Snowflake user wants to share unstructured data through the use of secure views.

Which URL types can be used? (Choose two.)

Correct selection

Pre-signed URL

Cloud storage URL

Correct selection

Scoped URL

HTTPS URL

File URL

Overall explanation

See this link.

Question 11Skipped

While attempting to avoid data duplication, which COPY INTO option should be used to load
files with expired load metadata?

Correct answer

LOAD_UNCERTAIN_FILES

VALIDATION_MODE

LAST_MODIFIED

FORCE

Overall explanation

To load files whose metadata has expired, set the LOAD_UNCERTAIN_FILES copy option to
true. The copy option references load metadata, if available, to avoid data duplication, but
also attempts to load files with expired load metadata.

See this link.
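A sketch with hypothetical table and stage names:

```sql
-- Consults load metadata where it still exists, but also loads
-- files whose 64-day load metadata has expired.
COPY INTO my_table
  FROM @my_stage
  LOAD_UNCERTAIN_FILES = TRUE;
```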

Question 12Skipped

Which statements are true of micro-partitions? (Choose two.)

Correct selection


They are immutable

They are approximately 50-500MB, after compression

They are only encrypted in the Enterprise edition and above

Correct selection

They are approximately 50-500MB, before compression

They are stored compressed only if COMPRESS=TRUE on Table

Overall explanation

See this link.

Question 13Skipped

What is an advantage of using an explain plan instead of the query profiler to evaluate the
performance of a query?

The explain plan output is available graphically.

An explain plan will handle queries with temporary tables and the query profiler will not.

Correct answer

An explain plan can be used to conduct performance analysis without executing a query.

An explain plan's output will display automatic data skew optimization information.

Overall explanation

EXPLAIN compiles the SQL statement, but does not execute it, so EXPLAIN does not require a
running warehouse.

Although EXPLAIN does not consume any compute credits, the compilation of the query
does consume Cloud Service credits, just as other metadata operations do.

See this link.
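For example (my_table is a hypothetical name):

```sql
-- Returns the compiled query plan without executing the query,
-- so no running warehouse is required.
EXPLAIN SELECT * FROM my_table WHERE id = 1;
```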

Question 14Skipped

A data provider wants to share data with a consumer who does not have a Snowflake
account. The provider creates a reader account for the consumer following these steps:

1. Created a user called "CONSUMER"

2. Created a database to hold the share and an extra-small warehouse to query the data

3. Granted the role PUBLIC the following privileges: USAGE on the warehouse, database, and schema, and SELECT on all the objects in the share


Based on this configuration what is true of the reader account?

The reader account will automatically use the Standard edition of Snowflake.

Correct answer

The reader account compute will be billed to the provider account.

The reader account can create a copy of the shared data using CREATE TABLE AS...

The reader account can clone data the provider has shared, but cannot re-share it.

Overall explanation

The consumer does not have a Snowflake account, so the reader account's compute is billed to the provider account.

See this link.

Question 15Skipped

Which of the following can be used when unloading data from Snowflake? (Choose two.)

When unloading semi-structured data, it is recommended that the STRIP_OUTER_ARRAY option be used.

Correct selection

By using the SINGLE = TRUE parameter, a single file up to 5 GB in size can be exported to
the storage layer.

Use the PARSE_JSON function to ensure structured data will be unloaded into the
VARIANT data type.

Correct selection

The OBJECT_CONSTRUCT function can be used to convert relational data to semi-structured data.

Use the ENCODING file format option to change the encoding from the default UTF-8.

Overall explanation

See this link.

Question 16Skipped

Which of the following can be executed/called with Snowpipe?

A stored procedure

A User Defined Function (UDF)


A single INSERT_INTO statement

Correct answer

A single COPY_INTO statement

Overall explanation

See this link.

Question 17Skipped

Which of the following is an example of an operation that can be completed without requiring compute, assuming no queries have been executed previously?

1. SELECT ORDER_AMT * ORDER_QTY FROM SALES;

1. SELECT AVG(ORDER_QTY) FROM SALES;

1. SELECT SUM (ORDER_AMT) FROM SALES;

Correct answer

1. SELECT MIN(ORDER_AMT) FROM SALES;

Overall explanation

The key detail in this question is "assuming that no query has been executed previously": options that could take advantage of the result cache, and therefore avoid computation, are ruled out.

The correct option is SELECT MIN, because Snowflake stores the MIN and MAX values of each column of each micro-partition in its cloud services layer, along with the number of distinct values. Quoting the documentation:

Snowflake stores metadata about all rows stored in a micro-partition, including:

The range of values for each of the columns in the micro-partition.

The number of distinct values.

Additional properties used for both optimization and efficient query processing.

See this link.

Question 18Skipped

How does Snowflake describe its unique architecture?


Correct answer

A multi-cluster shared data architecture using a central data repository and massively
parallel processing (MPP)

A single-cluster shared nothing architecture using a siloed data repository and symmetric
multiprocessing (SMP)

A single-cluster shared data architecture using a central data repository and massively
parallel processing (MPP)

A multi-cluster shared nothing architecture using a siloed data repository and symmetric
multiprocessing (SMP)

Overall explanation

A simple and straightforward question, but one that always appears in the exam.

See this link.

Question 19Skipped

What features that are part of the Continuous Data Protection (CDP) feature set in
Snowflake do not require additional configuration? (Choose two.)

External tokenization

Correct selection

Time Travel

Row level access policies

Correct selection

Data encryption

Data masking policies

Overall explanation

See this link.

Question 20Skipped

What are advantages clones have over tables created with CREATE TABLE AS SELECT
statement? (Choose two.)

The clone will have time travel history from the original table.

The clone has better query performance.

The clone always stays in sync with the original table.


Correct selection

The clone saves space by not duplicating storage.

Correct selection

The clone is created almost instantly.

Overall explanation

Cloning is fast, but not instantaneous, particularly for large objects (e.g. tables), which is why "almost instantly" is correct.

A clone is not a physical copy of the actual data; it is a set of pointers to the original micro-partitions, and new storage is consumed only when the cloned data is modified.

See this link.
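A sketch of a zero-copy clone, with hypothetical table names:

```sql
-- Completes almost instantly and initially shares the source
-- table's micro-partitions, so no storage is duplicated.
CREATE TABLE sales_dev CLONE sales;
```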

Question 21Skipped

Which of the following statements describes a schema in Snowflake?

A uniquely identified Snowflake account within a business entity

Correct answer

A logical grouping of objects that belongs to a single database

A named Snowflake object that includes all the information required to share a database

A logical grouping of objects that belongs to multiple databases

Overall explanation


See this link.

Question 22Skipped

Which type of join will list all rows in the specified table, even if those rows have no match in
the other table?

Correct answer

Outer join

Cross join

Natural join

Inner join

Overall explanation

See this link.

Question 23Skipped

Which of the following are valid methods for authenticating users for access into Snowflake?
(Choose three.)

SCIM

Correct selection

Key-pair authentication

OCSP authentication


TLS 1.2

Correct selection

Federated authentication

Correct selection

OAuth

Overall explanation

See this link.

Question 24Skipped

When does a materialized view get suspended in Snowflake?

When the base table is reclustered

When a column is added to the base table

Correct answer

When a column is dropped from the base table

When a DML operation is run on the base table

Overall explanation

If a base table is altered so that existing columns are changed or dropped, then all
materialized views on that base table are suspended.

See this link.

Question 25Skipped

Which privileges apply to stored procedures? (Choose two.)

MONITOR

OPERATE

Correct selection

OWNERSHIP

Correct selection

USAGE

MODIFY

Overall explanation


Similar to other database objects (tables, views, UDFs, etc.), stored procedures are owned by
a role and have one or more privileges that can be granted to other roles.

Currently, the following privileges apply to stored procedures:

• USAGE

• OWNERSHIP

See this link.

Question 26Skipped

What is the default file size when unloading data from Snowflake using the COPY command?

Correct answer

16 MB

32 MB

8 GB

4 MB

Overall explanation

By default, COPY INTO location statements separate table data into a set of output files to
take advantage of parallel operations. The maximum size for each file is set using the
MAX_FILE_SIZE copy option. The default value is 16777216 (16 MB) but can be increased to
accommodate larger files.

See this link.
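A sketch of overriding the default, with hypothetical table and stage names:

```sql
-- Unload into files of up to ~100 MB each instead of the 16 MB default.
COPY INTO @my_stage/unload/
  FROM my_table
  MAX_FILE_SIZE = 104857600;
```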

Question 27Skipped

Which of the following Snowflake objects can be shared using a secure share? (Choose two.)

Sequences

Correct selection

Secure User Defined Functions (UDFs)

Procedures

Materialized views

Correct selection

Tables

Overall explanation


The following Snowflake database objects can be shared:

• Tables

• External tables

• Secure views

• Secure materialized views

• Secure UDFs

See this link.

Question 28Skipped

While clustering a table, columns with which data types can be used as clustering keys?
(Choose two.)

GEOGRAPHY

VARIANT

OBJECT

Correct selection

BINARY

Correct selection

GEOMETRY

Overall explanation

It can be any data type except GEOGRAPHY, VARIANT, OBJECT, or ARRAY

See this link.

Question 29Skipped

What does the TableScan operator represent in the Query Profile?

The access to data stored in stage objects

The records generated using the TABLE(GENERATOR(...)) construct

Correct answer

The access to a single table

The list of values provided with the VALUES clause

Overall explanation


TableScan: Represents access to a single table.

See this link.

Question 30Skipped

A materialized view should be created when which of the following occurs? (Choose two.)

The query is highly optimized and does not consume many compute resources.

Correct selection

The query consumes many compute resources every time it runs.

The base table gets updated frequently.

Correct selection

The results of the query do not change often and are used frequently.

There is minimal cost associated with running the query.

Overall explanation

See this link.

Question 31Skipped

Which semi-structured file format is a compressed, efficient, columnar data representation?

Correct answer

Parquet

JSON

TSV

Avro

Overall explanation

Parquet is a compressed, efficient columnar data representation designed for projects in the
Hadoop ecosystem.

See this link.

Question 32Skipped

Which SQL command, when committed, will consume a stream and advance the stream
offset?

1. ALTER TABLE AS SELECT FROM STREAM

Correct answer


1. INSERT INTO TABLE SELECT FROM STREAM

1. BEGIN COMMIT

1. SELECT FROM STREAM

Overall explanation

The stream position (i.e. offset) is advanced when the stream is used in a DML statement.
The position is updated at the end of the transaction to the beginning timestamp of the
transaction. The stream describes change records starting from the current position of the
stream and ending at the current transactional timestamp.

See this link.
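A sketch with hypothetical names; the offset advances only when the transaction containing the DML commits:

```sql
BEGIN;
INSERT INTO target_table SELECT * FROM my_stream;
COMMIT;  -- the stream offset advances here
```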

Question 33Skipped

What is the purpose of a Query Profile?

Correct answer

To profile a particular query to understand the mechanics of the query, its behavior, and
performance.

To profile the user and/or executing role of a query and all privileges and policies applied
on the objects within the query.

To profile which queries are running in each warehouse and identify proper warehouse
utilization and sizing for better performance and cost balancing.

To profile how many times a particular query was executed and analyze its usage statistics
over time.

Overall explanation

Query Profile is a powerful tool for understanding the mechanics of queries. It can be used
whenever you want or need to know more about the performance or behavior of a
particular query.

See this link.

Question 34Skipped

What are the correct parameters for time travel and fail-safe in the Snowflake Enterprise
Edition?

Default Time Travel Retention is set to 0 days. Maximum Time Travel Retention is 90 days.
Fail Safe retention time is 7 days.

Correct answer


Default Time Travel Retention is set to 1 day. Maximum Time Travel Retention is 90 days.
Fail Safe retention time is 7 days.

Default Time Travel Retention is set to 1 day. Maximum Time Travel Retention is 365 days.
Fail Safe retention time is 7 days.

Default Time Travel Retention is set to 7 days. Maximum Time Travel Retention is 1 day.
Fail Safe retention time is 90 days.

Default Time Travel Retention is set to 0 days. Maximum Time Travel Retention is 30 days.
Fail Safe retention time is 1 day.

Default Time Travel Retention is set to 90 days. Maximum Time Travel Retention is 7 days.
Fail Safe retention time is 356 days.

Overall explanation


Question 35Skipped

Which of the following commands cannot be used within a reader account?

1. ALTER WAREHOUSE

Correct answer


1. CREATE SHARE

1. DROP ROLE

1. DESCRIBE TABLE

1. SHOW SCHEMAS

Overall explanation

See this link.

In Snowflake, a reader account is a special type of account, provisioned by a provider, that has read-only access to the data shared with it. This means that reader accounts can only perform actions related to querying data, such as running SELECT statements and viewing metadata.

As a result, reader accounts cannot perform actions that modify the data or metadata stored
in Snowflake, such as creating new objects, modifying existing objects, or dropping objects.
This includes the CREATE SHARE command, which is used to create a new share and make it
available to other users.

Question 36Skipped

A user is preparing to load data from an external stage.

Which practice will provide the MOST efficient loading performance?

Load the data in one large file

Correct answer

Organize files into logical paths

Use pattern matching for regular expression execution

Store the files on the external stage to ensure caching is maintained

Overall explanation

When staging regular data sets, we recommend partitioning the data into logical paths that
include identifying details such as geographical location or other source identifiers, along
with the date when the data was written.

Organizing your data files by path lets you copy any fraction of the partitioned data into
Snowflake with a single command. This allows you to execute concurrent COPY statements
that match a subset of files, taking advantage of parallel operations.

See this link.
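For example, with a hypothetical stage laid out by region and date, a single command loads just one partition of the data:

```sql
COPY INTO daily_sales
  FROM @my_stage/sales/us/2024/01/15/;
```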

Question 37Skipped


Which URL type allows users to access unstructured data without authenticating into
Snowflake or passing an authorization token?

File URL

Scoped URL

Correct answer

Pre-signed URL

Signed URL

Overall explanation

Pre-signed URLs are used to download or access files, via a web browser for example,
without authenticating into Snowflake or passing an authorization token. These URLs are
ideal for business intelligence applications or reporting tools that need to display the
unstructured file contents.

See this link.

Question 38Skipped

What features does Snowflake Time Travel enable?

Analyzing data usage/manipulation over all periods of time

Conducting point-in-time analysis for BI reporting

Correct answer

Restoring data-related objects that have been deleted within the past 90 days

Querying data-related objects that were created within the past 365 days

Overall explanation

Snowflake Time Travel enables accessing historical data (i.e. data that has been changed or
deleted) at any point within a defined period. It serves as a powerful tool for performing the
following tasks:

• Restoring data-related objects (tables, schemas, and databases) that might have
been accidentally or intentionally deleted.

• Duplicating and backing up data from key points in the past.

• Analyzing data usage/manipulation over specified periods of time.

See this link.

Question 39Skipped


If a Snowflake user decides a table should be clustered, what should be used as the cluster
key?

The columns with very high cardinality.

The columns that are queried in the select clause.

Correct answer

The columns most actively used in the select filters.

The columns with many different values.

Overall explanation

Snowflake recommends prioritizing keys in the order below:

Cluster columns that are most actively used in selective filters. For many fact tables involved
in date-based queries (for example “WHERE invoice_date > x AND invoice_date <= y”),
choosing the date column is a good idea. For event tables, event type might be a good
choice, if there are a large number of different event types. (If your table has only a small
number of different event types, then see the comments on cardinality below before
choosing an event column as a clustering key.)

If there is room for additional cluster keys, then consider columns frequently used in join
predicates, for example “FROM table1 JOIN table2 ON table2.column_A = table1.column_B”.

See this link.
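For illustration, a cluster key on a date column used in selective filters might be defined like this (the table and column names are hypothetical):

```sql
CREATE TABLE invoices (
  invoice_date DATE,
  customer_id  NUMBER,
  amount       NUMBER(10,2)
) CLUSTER BY (invoice_date);
```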

Question 40Skipped

How often are the Account and Table master keys automatically rotated by Snowflake?

Correct answer

30 Days

60 Days

365 Days

90 Days

Overall explanation

All Snowflake-managed keys are automatically rotated by Snowflake when they are more
than 30 days old.

See this link.

Question 41Skipped

Which of the following are considerations when using a directory table when working with
unstructured data? (Choose two.)

Directory table data can not be refreshed manually.

A directory table is a separate database object.

A directory table will be automatically added to a stage.

Correct selection

Directory tables do not have their own grantable privileges.

Correct selection

Directory tables store data file metadata.

Overall explanation

See this link.

Question 42Skipped

Which stage type can be altered and dropped?

Correct answer

External stage

Table stage

User stage

Database stage

Overall explanation

We cannot drop the stage associated with a table or user; only named stages (internal or
external) can be dropped.

See this link.
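A sketch showing that only named stages support ALTER and DROP (the stage name and URL are examples):

```sql
CREATE STAGE my_ext_stage URL = 's3://mybucket/path/';
ALTER STAGE my_ext_stage SET URL = 's3://mybucket/new_path/';
DROP STAGE my_ext_stage;
-- Table stages (@%mytable) and user stages (@~) cannot be altered or dropped.
```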

Question 43Skipped

In which use case does Snowflake apply egress charges?

Query result retrieval

Correct answer

Database replication

Data sharing within a specific region

Loading data into Snowflake

Overall explanation

Snowflake charges a per-byte fee for data egress when users transfer data from a Snowflake
account into a different region on the same cloud platform or into a completely different
cloud platform. Data transfers within the same region are free.

See this link.

Question 44Skipped

How are serverless features billed?

Serverless features are not billed, unless the total cost for the month exceeds 10% of the
warehouse credits, on the account

Correct answer

Per second multiplied by an automatic sizing for the job

Per second multiplied by the size, as determined by the SERVERLESS_FEATURES_SIZE
account parameter

Per minute multiplied by an automatic sizing for the job, with a minimum of one minute

Overall explanation

Charges for serverless features are calculated based on total usage of Snowflake-managed
compute resources, measured in compute-hours. Compute-hours are calculated on a
per-second basis, rounded up to the nearest whole second. The number of credits consumed
per compute-hour varies depending on the serverless feature. To learn how many credits are
consumed by a serverless feature, refer to the “Serverless Feature Credit Table” in the
Snowflake service consumption table.

See this link.

Question 45Skipped

Which categories are included in the execution time summary in a Query Profile? (Choose
two.)

Correct selection

Initialization

Correct selection

Local Disk I/O

Percentage of data read from cache

Pruning

Spilling

Overall explanation

Execution Time

Execution time provides information about “where the time was spent” during the
processing of a query. Time spent can be broken down into the following categories,
displayed in the following order:

• Processing — time spent on data processing by the CPU.

• Local Disk IO — time when the processing was blocked by local disk access.

• Remote Disk IO — time when the processing was blocked by remote disk access.

• Network Communication — time when the processing was waiting for the network
data transfer.

• Synchronization — various synchronization activities between participating processes.

• Initialization — time spent setting up the query processing.

See this link.

Question 46Skipped

A size 3X-Large multi-cluster warehouse runs one cluster for one full hour and then runs two
clusters for the next full hour.

What would be the total number of credits billed?

Correct answer

192

128

64

149

Overall explanation

A 3X-Large warehouse consumes 64 credits per hour per cluster.

First hour, 1 cluster: 64 credits.

Second hour, 2 clusters: 64 x 2 = 128 credits.

Total: 64 + 128 = 192 credits.

See this link.

Question 47Skipped

A company needs to read multiple terabytes of data for an initial load as part of a Snowflake
migration. The company can control the number and size of CSV extract files.

How does Snowflake recommend maximizing the load performance?

Produce the largest files possible, reducing the overall number of files to process.

Correct answer

Produce a larger number of smaller files and process the ingestion with size Small virtual
warehouses.

Use an external tool to issue batched row-by-row inserts within BEGIN TRANSACTION and
COMMIT commands.

Use auto-ingest Snowpipes to load large files in a serverless model.

Overall explanation

The key is that the question mentions that we have to read 'multiple terabytes of data'. As a
good practice, Snowflake recommends that the files to be ingested should be between 100
and 250 MB compressed, so the best option is to split the source data into smaller files.

Question 48Skipped

When unloading data to an external stage, what is the MAXIMUM file size supported?

10 GB

Correct answer

5 GB

16 GB

1 GB

Overall explanation

By default, COPY INTO location statements separate table data into a set of output files to
take advantage of parallel operations. The maximum size for each file is set using the
MAX_FILE_SIZE copy option. The default value is 16777216 (16 MB) but can be increased to
accommodate larger files. The maximum file size supported is 5 GB for Amazon S3, Google
Cloud Storage, or Microsoft Azure stages.

See this link.
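A sketch of raising the file size cap when unloading (the stage and table names are hypothetical):

```sql
COPY INTO @my_ext_stage/unload/
FROM my_table
FILE_FORMAT = (TYPE = CSV)
MAX_FILE_SIZE = 5368709120;  -- 5 GB, the maximum for cloud storage stages
```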

Question 49Skipped

Which Snowflake function will interpret an input string as a JSON document, and produce a
VARIANT value?

json_extract_path_text()

flatten

object_construct()

Correct answer

parse_json()

Overall explanation

Interprets an input string as a JSON document, producing a VARIANT value.

Syntax PARSE_JSON( <expr> )

See this link.
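For example:

```sql
SELECT PARSE_JSON('{"name": "Alice", "age": 30}') AS v;
-- v is a VARIANT; its fields can then be accessed as v:name and v:age
```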

Question 50Skipped

Which snowflake objects will incur both storage and cloud compute charges? (Choose two.)

Sequence

Correct selection

Clustered table

Transient table

Correct selection

Materialized view

Secure view

Overall explanation

Clustered table needs to undergo clustering as the data changes and Materialized view also
undergoes changes every time the underlying data changes or when the view is set to
refresh.

Question 51Skipped

What information is found within the Statistic output in the Query Profile Overview?

Nodes by execution time

Operator tree

Correct answer

Table pruning

Most expensive nodes

Overall explanation

See this link.

Question 52Skipped

The Snowflake VARIANT data type imposes a 16 MB size limit on what?

A view

A file in a stage

An individual column

Correct answer

An individual row

Overall explanation

The VARIANT data type imposes a 16 MB size limit on individual rows.

See this link.

Question 53Skipped

Which activities are included in the Cloud Services layer? (Choose two.)

Correct selection

Infrastructure management

Correct selection

User authentication

Partition scanning

Dynamic data masking

Data storage

Overall explanation

The Cloud Services layer in Snowflake is responsible for critical data-related activities.
Services managed in this layer include:

• Authentication

• Infrastructure management

• Metadata management

• Query parsing and optimization

• Access control

See this link.

Question 54Skipped

Which use case does the search optimization service support?

Join predicates on VARIANT columns

Correct answer

Conjunctions (AND) of multiple equality predicates

LIKE/ILIKE/RLIKE join predicates

Disjuncts (OR) in join predicates

Overall explanation

Supported Predicate Types

Search optimization can improve the performance of queries using these kinds of predicates:

• Point lookup queries using equality and IN.

• Substring queries using wildcards and regular expressions.

• Searches in semi-structured data.

• Geospatial queries.

• Queries using conjunctions (AND) and disjunctions (OR).

See this link.

The search optimization service does not directly improve the performance of joins.
However, it can improve the performance of filtering rows from either table prior to the join.

Question 55Skipped

Which privilege must be granted to a share to allow secure views the ability to reference
data in multiple databases?

SELECT on tables used by the secure view

Correct answer

REFERENCE_USAGE on databases

SHARE on databases and schemas

CREATE_SHARE on the account

Overall explanation

See this link.

Question 56Skipped

How does the search optimization service help Snowflake users improve query
performance?

Correct answer

It maintains a persistent data structure that keeps track of the values of the table’s
columns in each of its micro-partitions.

It scans the micro-partitions based on the joins used in the queries and scans only join
columns.

It keeps track of running queries and their results and saves those extra scans on the table.

It scans the local disk cache to avoid scans on the tables used in the query.

Overall explanation

To improve performance of search queries, the search optimization service creates and
maintains a persistent data structure called a search access path. The search access path
keeps track of which values of the table’s columns might be found in each of its
micro-partitions, allowing some micro-partitions to be skipped when scanning the table.

See this link.

Question 57Skipped

A virtual warehouse is created using the following command:

CREATE WAREHOUSE my_WH WITH
  warehouse_size = MEDIUM
  min_cluster_count = 1
  max_cluster_count = 1
  auto_suspend = 60
  auto_resume = true;

The image below is a graphical representation of the warehouse utilization across two days.

What action should be taken to address this situation?

Lower the value of the parameter STATEMENT_QUEUED_TIMEOUT_IN_SECONDS.

Correct answer

Configure the warehouse to a multi-cluster warehouse.

Increase the warehouse size from Medium to 2XL.

Increase the value for the parameter MAX_CONCURRENCY_LEVEL.

Overall explanation

Multi-cluster warehouses are the solution for concurrency. The default
MAX_CONCURRENCY_LEVEL is 8, and raising it only packs more queries onto the same
compute resources, so it does not relieve queuing. To scale out for concurrent load, the
warehouse should be configured as a multi-cluster warehouse.

See this link.
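A sketch of converting the warehouse from the question into a multi-cluster warehouse running in auto-scale mode:

```sql
ALTER WAREHOUSE my_WH SET
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3;  -- extra clusters start and stop with the load
```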

Question 58Skipped

Which methods can be used to delete staged files from a Snowflake stage? (Choose two.)

Correct selection

Specify the PURGE copy option in the COPY INTO command.

Specify the TEMPORARY option when creating the file format.

Correct selection

Use the REMOVE command after the load completes.

Use the DROP command after the load completes.

Use the DELETE LOAD HISTORY command after the load completes.

Overall explanation

See this link.

And this link.
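Both methods sketched with hypothetical stage and table names:

```sql
-- Delete staged files automatically after a successful load
COPY INTO my_table FROM @my_stage PURGE = TRUE;

-- Or delete them explicitly after the load completes
REMOVE @my_stage PATTERN = '.*[.]csv[.]gz';
```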

Question 59Skipped

Which of the following is the Snowflake Account_Usage.Metering_History view used for?

Correct answer

Gathering the hourly credit usage for an account

Calculating the funds left on an account's contract

Compiling an account's average cloud services cost over the previous month

Summarizing the throughput of Snowpipe costs for an account

Overall explanation

The METERING_HISTORY view in the ACCOUNT_USAGE schema can be used to return the
hourly credit usage for an account within the last 365 days (1 year).

See this link.
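A sketch of querying the view for the last seven days of hourly credit usage:

```sql
SELECT start_time, end_time, service_type, credits_used
FROM snowflake.account_usage.metering_history
WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
ORDER BY start_time;
```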

Question 60Skipped

User INQUISITIVE_PERSON has been granted the role DATA_SCIENCE. The role
DATA_SCIENCE has privileges OWNERSHIP on the schema MARKETING of the database
ANALYTICS_DW.

Which command will show all privileges granted to that schema?

SHOW GRANTS ON ROLE DATA_SCIENCE

SHOW GRANTS TO USER INQUISITIVE_PERSON

Correct answer

SHOW GRANTS ON SCHEMA ANALYTICS_DW.MARKETING

SHOW GRANTS OF ROLE DATA_SCIENCE

Overall explanation

See this link.

Question 61Skipped

In which Snowsight section can a user switch roles, modify their profile, and access
documentation?

The content pane

The activity page

The worksheets page

Correct answer

The user menu

Overall explanation

You can expect some questions about the operation of Snowsight in the exam. It is advisable
to navigate through the various sections to familiarize yourself with the interface.

See this link.

Question 62Skipped

Query parsing and compilation occurs in which architecture layer of the Snowflake Cloud
Data Platform?

Storage layer

Correct answer

Cloud services layer

Cloud agnostic layer

Compute layer

Overall explanation

See this link.

Question 63Skipped

What is the effect of configuring a virtual warehouse auto-suspend value to ‘0’?

The warehouse will not resume automatically.

Correct answer

The warehouse will never suspend.

All clusters in the multi-cluster warehouse will resume immediately.

The warehouse will suspend immediately upon work completion.

Overall explanation

Specifies the number of seconds of inactivity after which a warehouse is automatically
suspended.

Valid values

Any integer 0 or greater, or NULL:

• Setting a value less than 60 is allowed, but may not result in the desired/expected
behavior because the background process that suspends a warehouse runs
approximately every 60 seconds and, therefore, is not intended for enabling exact
control over warehouse suspension.

• Setting a 0 or NULL value means the warehouse never suspends.

See this link.
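For example (the warehouse name is hypothetical):

```sql
ALTER WAREHOUSE my_wh SET AUTO_SUSPEND = 0;  -- the warehouse never suspends
```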

Question 64Skipped

Which minimum Snowflake edition allows for a dedicated metadata store?

Standard

Enterprise

Business Critical

Correct answer

Virtual Private Snowflake

Overall explanation

A dedicated metadata store and pool of compute resources (used in virtual warehouses) are
offered only through Virtual Private Snowflake (VPS).

See this link.

Question 65Skipped

Which of the following features are available with the Snowflake Enterprise edition? (Choose
two.)

Customer managed keys (Tri-secret secure)

Correct selection

Extended time travel

Automated index management

Correct selection

Native support for geospatial data

Database replication and failover

Overall explanation

Database replication and failover - failover/failback requires Business Critical

Automated index management - not applicable; Snowflake has no indexes

Customer managed keys (Tri-Secret Secure) - Business Critical

Extended Time Travel - Enterprise

Native support for geospatial data - Standard

See this link.

Question 66Skipped

A company strongly encourages all Snowflake users to self-enroll in Snowflake's default
Multi-Factor Authentication (MFA) service to provide increased login security for users
connecting to Snowflake.

Which application will the Snowflake users need to install on their devices in order to
connect with MFA?

Okta Verify

Correct answer

Duo Mobile

Google Authenticator

Microsoft Authenticator

Overall explanation

Snowflake integrates with the Duo Mobile app to provide multi-factor authentication (MFA)
for users connecting to Snowflake. This means that in order to use MFA, Snowflake users
need to have the Duo Mobile app installed on their devices and enroll in Snowflake's MFA
service.

Okta Verify, Microsoft Authenticator, and Google Authenticator are alternative MFA apps,
but they are not directly integrated with Snowflake and may not be supported for use with
Snowflake's MFA service.

See this link.

Question 67Skipped

Credit charges for Snowflake virtual warehouses are calculated based on which of the
following considerations? (Choose two.)

The number of active users assigned to the warehouse

Correct selection

The length of time the warehouse is running

The number of queries executed

Correct selection

The size of the virtual warehouse

The duration of the queries that are executed

Overall explanation

Snowflake credits are charged based on the number of virtual warehouses you use, how
long they run, and their size.

See this link.

Question 68Skipped

The following settings are configured:

The MIN_DATA_RETENTION_TIME_IN_DAYS parameter is set to 5 at the account level.

The DATA_RETENTION_TIME_IN_DAYS parameter is set to 2 at the object level.

For how many days will the data be retained at the object level?

Correct answer

5 days

Overall explanation

The MIN_DATA_RETENTION_TIME_IN_DAYS account parameter can be set by users with the
ACCOUNTADMIN role to set a minimum retention period for the account. This parameter
does not alter or replace the DATA_RETENTION_TIME_IN_DAYS parameter value. However, it
may change the effective data retention time. When this parameter is set at the account
level, the effective minimum data retention period for an object is determined by
MAX(DATA_RETENTION_TIME_IN_DAYS, MIN_DATA_RETENTION_TIME_IN_DAYS). In this
case, MAX(2, 5) = 5, so the data is retained at the object level for 5 days.

See this link.

Question 69Skipped

What does the “percentage scanned from cache” represent in the Query Profile?

The percentage of data scanned from the query cache

Correct answer

The percentage of data scanned from the local disk cache

The percentage of data scanned from the remote disk cache

The percentage of data scanned from the result cache

Overall explanation

See this link.

Question 70Skipped

Which commands can a Snowflake user execute to specify a cluster key for a table? (Choose
two.)

SHOW

Correct selection

ALTER

UPDATE

Correct selection

CREATE

SET

Overall explanation

A clustering key can be defined at table creation (using the CREATE TABLE command) or
afterward (using the ALTER TABLE command).

See this link.
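Both commands sketched with hypothetical names:

```sql
-- At table creation
CREATE TABLE t1 (c1 DATE, c2 STRING) CLUSTER BY (c1);

-- Afterward
ALTER TABLE t1 CLUSTER BY (c1, c2);
```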

Question 71Skipped

Why should a Snowflake user implement a secure view? (Choose two.)

To optimize query concurrency and queuing

Correct selection

To limit access to sensitive data

Correct selection

To hide view definition and details from unauthorized users

To increase query performance

To store unstructured data

Overall explanation

See this link.

Question 72Skipped

What is the recommended compressed file size range for continuous data loads using
Snowpipe?

8-16 MB

Correct answer

100-250 MB

16-24 MB

10-99 MB

Overall explanation

Snowpipe is typically used to load data that is arriving continuously. File sizing plays an
important role in Snowpipe's performance. The recommended file size for data loading is
100-250 MB compressed; however, if data is arriving continuously, then try to stage the data
within one-minute intervals.

Question 73Skipped

Which query profile statistics help determine if efficient pruning is occurring? (Choose two.)

Correct selection

Partitions total

Bytes spilled to local storage

Correct selection

Partitions scanned

Percentage scanned from cache

Bytes sent over network

Overall explanation

See this link.

Question 74Skipped

Which statement about billing applies to Snowflake credits?

Credits are consumed based on the number of credits billed for each hour that a
warehouse runs.

Credits are used to pay for cloud data storage usage.

Correct answer

Credits are consumed based on the warehouse size and the time the warehouse is
running.

Credits are billed per-minute with a 60-minute minimum.

Overall explanation

A virtual warehouse is one or more clusters of compute resources that enable executing
queries, loading data, and performing other DML operations.

Snowflake credits are used to pay for the processing time used by each virtual warehouse.
Snowflake credits are charged based on the number of virtual warehouses you use, how
long they run, and their size.

See this link.

Question 75Skipped

What file formats does Snowflake support for loading semi-structured data? (Choose three.)

TSV

Correct selection

Avro

JPEG

PDF

Correct selection

JSON

Correct selection

Parquet

Overall explanation

Snowflake can import semi-structured data from JSON, Avro, ORC, Parquet, and XML formats
and store it in Snowflake data types designed specifically to support semi-structured data.

See this link.

Question 76Skipped

What is the minimum Snowflake edition needed for database failover and fail-back between
Snowflake accounts for business continuity and disaster recovery?

Correct answer

Business Critical

Enterprise

Standard

Virtual Private Snowflake

Overall explanation

Requires Business Critical (or higher).

See this link.

Question 77Skipped

Which ACCOUNT_USAGE views are used to evaluate the details of dynamic data masking?
(Choose two.)

ROLES

ACCESS_HISTORY

QUERY_HISTORY

RESOURCE_MONITORS

Correct selection

POLICY_REFERENCES

Correct selection

MASKING_POLICIES

Overall explanation

Snowflake provides two Account Usage views to obtain information about masking policies:

1. The MASKING_POLICIES view provides a list of all masking policies in your Snowflake
account.

2. The POLICY_REFERENCES view provides a list of all objects on which a masking policy is set.

See this link.

Question 78Skipped

Which Snowflake object uses credits for maintenance?

Regular view

Regular table

Cached query result

Correct answer

Materialized view

Overall explanation

The automatic maintenance of materialized views consumes credits.

See this link.

Question 79Skipped

Which SQL command will list the files in a named stage?

get @%mytable;

Correct answer

list @my_stage;

get @my_stage;

list @~;

Overall explanation

LIST

Returns a list of files that have been staged (i.e. uploaded from a local file system or
unloaded from a table) in one of the following Snowflake stages:

• Named internal stage.

• Named external stage.

• Stage for a specified table.

• Stage for the current user.

See this link.

Question 80Skipped

Which SQL command can be used to see the CREATE definition of a masking policy?

DESCRIBE MASKING POLICY

SHOW MASKING POLICIES

LIST MASKING POLICIES

Correct answer

GET_DDL

Overall explanation

GET_DDL returns the create statement to recreate the object.

DESCRIBE will show the sql behind the policy but not in the form of a create statement.

See this link.

Question 81Skipped

What can be done to reduce queueing on a virtual warehouse?

Increase the warehouse size.

Lower the MAX_CONCURRENCY_LEVEL setting for the warehouse.

Correct answer

Change the warehouse to a multi-cluster warehouse.

Increase the AUTO_SUSPEND setting for the warehouse.

Overall explanation

Multi-cluster warehouses are best utilized for scaling resources to improve concurrency for
users/queries.

See this link.

Question 82Skipped

What is the recommended way to change the existing file format type in my_format from
CSV to JSON?

REPLACE FILE FORMAT my_format TYPE=JSON;

ALTER FILE FORMAT my_format SWAP TYPE WITH JSON;

ALTER FILE FORMAT my_format SET TYPE=JSON;

Correct answer

CREATE OR REPLACE FILE FORMAT my_format TYPE=JSON;

Overall explanation

ALTER FILE FORMAT does not support the following actions:

- Changing the type (CSV, JSON, etc.) for the file format.

To make any of these changes, you must recreate the file format.

See this link.

Question 83Skipped

Which key governance feature in Snowflake allows users to identify automatically data
objects that contain sensitive data and their related objects?

Column-level security

Object tagging

Correct answer

Data classification

Row access policy

Overall explanation

Data classification in Snowflake is a feature that allows users to automatically identify and
classify columns in their tables containing personal or sensitive data.

Data classification is a multi-step process that associates Snowflake-defined tags (i.e. system
tags) to columns by analyzing the cells and metadata for personal data.

Based on the tracking information and related audit processes, the data engineer can
protect the column containing personal or sensitive data with a masking policy or the table
containing this column with a row access policy.

See this link.

Question 84Skipped

Which of the following activities consume virtual warehouse credits in the Snowflake
environment? (Choose two.)

Correct selection

Running COPY commands

Cloning a database

Caching query results

Correct selection

Running a custom query

Running EXPLAIN and SHOW commands

Overall explanation

A warehouse provides the required resources, such as CPU, memory, and temporary storage,
to perform the following operations in a Snowflake session:

• Executing SQL SELECT statements that require compute resources (e.g. retrieving
rows from tables and views).

• Performing DML operations, such as: Updating rows in tables (DELETE , INSERT ,
UPDATE).

• Loading data into tables (COPY INTO <table>).

• Unloading data from tables (COPY INTO <location>).

Note - To perform these operations, a warehouse must be running and in use for the
session. While a warehouse is running, it consumes Snowflake credits.

See this link.

Question 85Skipped

What is the purpose of multi-cluster virtual warehouses?

Correct answer

To eliminate or reduce queuing of concurrent queries

To allow users the ability to choose the type of compute nodes that make up a virtual
warehouse cluster

To create separate data warehouses to increase query optimization

To allow the warehouse to resize automatically

Overall explanation

To enable fully automated scaling for concurrency, Snowflake recommends multi-cluster
warehouses, which provide essentially the same benefits as creating additional warehouses
and redirecting queries, but without requiring manual intervention.

See this link.

Question 86Skipped

Which function generates a Snowflake hosted file URL to a staged file using the stage name
and relative file path as inputs?

BUILD_SCOPED_FILE_URL

Correct answer

BUILD_STAGE_FILE_URL

GET_STAGE_LOCATION

GET_PRESIGNED_URL

Overall explanation

BUILD_STAGE_FILE_URL

Generates a Snowflake-hosted file URL to a staged file using the stage name and relative file
path as inputs. A file URL permits prolonged access to a specified file. That is, the file URL
does not expire.

See this link.

This question can be tricky because there are very similar functions with small details that
differentiate them.
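A sketch using a hypothetical stage and file path:

```sql
SELECT BUILD_STAGE_FILE_URL(@images_stage, 'us/yosemite/half_dome.jpg');
```

Unlike GET_PRESIGNED_URL, the file URL this returns does not expire, but it still requires authentication to access.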

Question 87Skipped

Which Snowflake data governance feature can support auditing when a user query reads
column data?

Object dependencies

Correct answer

Access History

Column-level security

Data classification

Overall explanation

Access History in Snowflake refers to when the user query reads data and when the SQL
statement performs a data write operation, such as INSERT, UPDATE, and DELETE along with
variations of the COPY command, from the source data object to the target data object. The
user access history can be found by querying the Account Usage ACCESS_HISTORY view.

See this link.

Question 88Skipped

How does Snowflake handle the data retention period for a table if a stream has not been
consumed?

The data retention period is permanently extended for the table.

Correct answer

The data retention period is temporarily extended to the stream’s offset.

The data retention period is not affected by the stream consumption.

The data retention period is reduced to a minimum of 14 days.

Overall explanation

If the data retention period for a table is less than 14 days, and a stream has not been
consumed, Snowflake temporarily extends this period to prevent it from going stale. The
period is extended to the stream’s offset, up to a maximum of 14 days by default, regardless
of the Snowflake edition for your account. The maximum number of days for which
Snowflake can extend the data retention period is determined by the
MAX_DATA_EXTENSION_TIME_IN_DAYS parameter value. When the stream is consumed,
the extended data retention period is reduced to the default period for the table.

See this link.

Question 89Skipped

How long is Snowpipe data load history retained?

64 days

Correct answer

14 days

As configured in the CREATE PIPE settings

Until the pipe is dropped

Overall explanation

Snowpipe load history is stored in the metadata of the pipe for 14 days. It must be requested
from Snowflake via a REST endpoint, SQL table function, or ACCOUNT_USAGE view.

See this link.

Question 90Skipped

A Snowflake Administrator needs to ensure that sensitive corporate data in Snowflake tables
is not visible to end users, but is partially visible to functional managers.

How can this requirement be met?

Correct answer

Use dynamic data masking.

Use data encryption.

Revoke all roles for functional managers and end users.

Use secure materialized views.

Overall explanation

Masking policy administrators can implement a masking policy such that analysts (i.e. users
with the custom ANALYST role) can only view the last four digits of a phone number and
none of the social security number, while customer support representatives (i.e. users with
the custom SUPPORT role) can view the entire phone number and social security number for
customer verification use cases.

See this link.
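A sketch of such a policy, assuming a custom MANAGER role and hypothetical table and column names:

```sql
-- Managers see a partially masked phone number; everyone else sees only asterisks
CREATE MASKING POLICY phone_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'MANAGER' THEN CONCAT('***-***-', RIGHT(val, 4))
    ELSE '**********'
  END;

ALTER TABLE employees MODIFY COLUMN phone SET MASKING POLICY phone_mask;
```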

Question 91Skipped

What is the PRIMARY factor that determines the cost of using a virtual warehouse in
Snowflake?

The type of SQL statements executed

The number of tables or databases queried

The amount of data stored in the warehouse

Correct answer

The length of time the compute resources in each cluster run

Overall explanation

See this link.

Question 92Skipped

What type of query will benefit from the query acceleration service?

Correct answer

Queries with large scans and selective filters

Queries of tables that have search optimization service enabled

Queries where the GROUP BY has high cardinality

Queries without filters or aggregation

Overall explanation

Examples of the types of workloads that might benefit from the query acceleration service
include:

• Ad hoc analytics.

• Workloads with unpredictable data volume per query.

• Queries with large scans and selective filters.

See this link.

Question 93Skipped

Which encryption type will enable client-side encryption for a directory table?

AES

Correct answer

SNOWFLAKE_FULL

AWS_CSE

SNOWFLAKE_SSE

Overall explanation

SNOWFLAKE_FULL: Client-side and server-side encryption. The files are encrypted by a client
when it uploads them to the internal stage using PUT.

See this link.

Question 94Skipped

What effect does WAIT_FOR_COMPLETION = TRUE have when running an ALTER
WAREHOUSE command and changing the warehouse size?

Correct answer

It does not return from the command until the warehouse has finished changing its size.

The warehouse size does not change until the warehouse is suspended and restarted.

The warehouse size does not change until all queries currently in the warehouse queue
have completed.

The warehouse size does not change until all queries currently running in the warehouse
have completed.

Overall explanation

WAIT_FOR_COMPLETION = FALSE | TRUE When resizing a warehouse, you can use this
parameter to block the return of the ALTER WAREHOUSE command until the resize has
finished provisioning all its compute resources. Blocking the return of the command when
resizing to a larger warehouse serves to notify you that your compute resources have been
fully provisioned and the warehouse is now ready to execute queries using all the new
resources.

See this link.
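For example (the warehouse name is hypothetical):

```sql
ALTER WAREHOUSE my_wh SET
  WAREHOUSE_SIZE = 'X-LARGE'
  WAIT_FOR_COMPLETION = TRUE;  -- blocks until the resize finishes provisioning
```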

Question 95Skipped

How would a user run a multi-cluster warehouse in maximized mode?

Configure the maximum clusters setting to "Maximum."

Turn on the additional clusters manually after starting the warehouse.

Correct answer

Set the minimum Clusters and maximum Clusters settings to the same value.

Set the minimum clusters and maximum clusters settings to different values.

Overall explanation

If the minimum and maximum cluster counts are equal, the warehouse always runs with that fixed number of clusters, leaving Snowflake no room to scale the cluster count up or down. Setting both to the same value therefore runs the warehouse in maximized mode.

See this link.
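For illustration, a warehouse created with equal minimum and maximum cluster counts runs in maximized mode (the name and counts here are made up):

```sql
-- All 3 clusters start as soon as the warehouse starts,
-- because MIN_CLUSTER_COUNT = MAX_CLUSTER_COUNT.
CREATE WAREHOUSE my_mc_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 3
  MAX_CLUSTER_COUNT = 3;
```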

Question 96Skipped

What internal stages are available in Snowflake? (Choose three.)

Correct selection

Named stage

Schema stage

Correct selection

Table stage

Correct selection

User stage

Stream stage

Database stage

Overall explanation

Snowflake supports the following types of internal stages:

• User

• Table

• Named

See this link.
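The three internal stage types are referenced with different prefixes. As an illustration (file and object names are hypothetical):

```sql
PUT file:///tmp/data.csv @~;          -- user stage
PUT file:///tmp/data.csv @%my_table;  -- table stage of MY_TABLE

CREATE STAGE my_stage;                -- named stage
PUT file:///tmp/data.csv @my_stage;
```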

Question 97Skipped

What happens when a Snowflake user changes the data retention period at the schema
level?

All child objects with an explicit retention period will be overridden with the new
retention period.


Correct answer

All child objects that do not have an explicit retention period will automatically inherit the
new retention period.

All child objects will retain data for the new retention period.

All explicit child object retention periods will remain unchanged.

Overall explanation

If you change the data retention period for a database or schema, the change only affects
active objects contained within the database or schema. Any objects that have been
dropped (for example, tables) remain unaffected.

For example, if you have a schema s1 with a 90-day retention period and table t1 is in
schema s1, table t1 inherits the 90-day retention period. If you drop table s1.t1, t1 is
retained in Time Travel for 90 days. Later, if you change the schema’s data retention period
to 1 day, the retention period for the dropped table t1 is unchanged. Table t1 will still be
retained in Time Travel for 90 days.

See this link.
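A minimal sketch of the behavior described above (object names are illustrative):

```sql
-- Tables in S1 without an explicit retention period inherit this value.
ALTER SCHEMA s1 SET DATA_RETENTION_TIME_IN_DAYS = 1;

-- A table with its own explicit setting keeps it unchanged.
ALTER TABLE s1.t2 SET DATA_RETENTION_TIME_IN_DAYS = 30;
```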

Question 98Skipped

A user has an application that writes a new file to a cloud storage location every 5 minutes.

What would be the MOST efficient way to get the files into Snowflake?

Create a task that runs a COPY INTO operation from an external stage every 5 minutes.

Create a task that runs a GET operation to intermittently check for new files.

Correct answer

Set up cloud provider notifications on the file location and use Snowpipe with auto-ingest.

Create a task that PUTS the files in an internal stage and automate the data loading
wizard.

Overall explanation

Pipes are highly scalable and cost-effective since they only incur charges when data is
ingested, unlike other options like copying data at regular intervals or using external tables.
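A minimal auto-ingest pipe might look like this (stage, table, and pipe names are illustrative; the cloud event notification is configured separately on the storage location):

```sql
CREATE PIPE my_pipe AUTO_INGEST = TRUE AS
  COPY INTO my_table
  FROM @my_ext_stage
  FILE_FORMAT = (TYPE = 'CSV');
```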

Question 99Skipped

What does a table with a clustering depth of 1 mean in Snowflake?

The table has no micro-partitions.


The table has 1 overlapping micro-partition.

Correct answer

The table has no overlapping micro-partitions.

The table has only 1 micro-partition.

Overall explanation

The more the micro-partitions overlap, the greater the clustering depth. A clustering depth of 1 means the micro-partitions do not overlap at all.

See this link.

Question 100Skipped

What should be used when creating a CSV file format where the columns are wrapped by
single quotes or double quotes?

SKIP_BYTE_ORDER_MARK

ESCAPE_UNENCLOSED_FIELD

Correct answer

FIELD_OPTIONALLY_ENCLOSED_BY

BINARY_FORMAT

Overall explanation


Character used to enclose strings. Value can be NONE, single quote character ('), or double
quote character ("). To use the single quote character, use the octal or hex representation
(0x27) or the double single-quoted escape ('').

See this link.
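For example, a CSV file format for fields wrapped in double quotes could be defined as follows (the format name is illustrative):

```sql
CREATE FILE FORMAT my_csv_format
  TYPE = 'CSV'
  FIELD_OPTIONALLY_ENCLOSED_BY = '"';
```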

Question 101Skipped

When referring to User-Defined Function (UDF) names in Snowflake, what does the term
overloading mean?

There are multiple SQL UDFs with the same names and the same number of arguments.

Correct answer

There are multiple SQL UDFs with the same names but with a different number of
arguments or argument types.

There are multiple SQL UDFs with different names but the same number of arguments or
argument types.

There are multiple SQL UDFs with the same names and the same number of argument
types.

Overall explanation

Snowflake supports overloading procedures and functions. In a given schema, you can
define multiple procedures or functions that have the same name but different signatures.
The signatures must differ by the number of arguments, the types of the arguments, or
both.

See this link.
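A sketch of overloading, with two hypothetical UDFs that share a name but differ in the number of arguments:

```sql
CREATE FUNCTION area(r FLOAT)
  RETURNS FLOAT
  AS $$ 3.141592653589793 * r * r $$;   -- circle

CREATE FUNCTION area(w FLOAT, h FLOAT)
  RETURNS FLOAT
  AS $$ w * h $$;                       -- rectangle
```

Snowflake resolves which overload to call from the number and types of the arguments passed.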

Question 102Skipped

What happens to the underlying table data when a CLUSTER BY clause is added to a
Snowflake table?

Correct answer

Data may be colocated by the cluster key within the micro-partitions to improve pruning
performance

Larger micro-partitions are created for common data values to reduce the number of
partitions that must be scanned

Smaller micro-partitions are created for common data values to allow for more parallelism

Data is hashed by the cluster key to facilitate fast searches for common data values

Overall explanation


See this link.
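Adding a clustering key is a one-line statement (table and column names are illustrative); Automatic Clustering then colocates rows with similar key values in the same micro-partitions:

```sql
ALTER TABLE sales CLUSTER BY (sale_date, region);
```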

Question 103Skipped

By default, which role has access to the SYSTEM$GLOBAL_ACCOUNT_SET_PARAMETER function?

ACCOUNTADMIN

Correct answer

ORGADMIN

SECURITYADMIN

SYSADMIN

Overall explanation

Usage notes: Only organization administrators (i.e. users with the ORGADMIN role) can call
this SQL function.

See this link.

Question 104Skipped

Which statements reflect key functionalities of a Snowflake Data Exchange? (Choose two.)

Correct selection

A Data Exchange allows groups of accounts to share data privately among the accounts.

Data Exchange functionality is available by default in accounts using the Enterprise edition
or higher.

A Data Exchange allows accounts to share data with third, non-Snowflake parties.

If an account is enrolled with a Data Exchange, it will lose its access to the Snowflake
Marketplace.

Correct selection

The sharing of data in a Data Exchange is bidirectional. An account can be a provider for
some datasets and a consumer for others.

Overall explanation

See this link.

Question 105Skipped

Which Snowflake layer is always used when accessing a query from the result cache?


Metadata

Data Storage

Compute

Correct answer

Cloud Services

Overall explanation

See this link.

Question 106Skipped

What metadata does Snowflake store for rows in micro-partitions? (Choose two.)

Sorted values


Null values

Correct selection

Distinct values

Index values

Correct selection

Range of values

Overall explanation

See this link.

Question 107Skipped

Which columns are part of the result set of the Snowflake LATERAL FLATTEN command?
(Choose two.)

BYTE_SIZE

CONTENT

DATATYPE

Correct selection

INDEX

Correct selection

PATH

Overall explanation

The returned rows consist of a fixed set of columns:

+-----+------+------+-------+-------+------+
| SEQ | KEY | PATH | INDEX | VALUE | THIS |
|-----+------+------+-------+-------+------|

SEQ

A unique sequence number associated with the input record; the sequence is not
guaranteed to be gap-free or ordered in any particular way.

KEY

For maps or objects, this column contains the key to the exploded value.


PATH

The path to the element within a data structure which needs to be flattened.

INDEX

The index of the element, if it is an array; otherwise NULL.

VALUE

The value of the element of the flattened array/object.

THIS

The element being flattened (useful in recursive flattening).

See this link.
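A sketch of selecting these columns, assuming a hypothetical table MY_JSON_TABLE with a VARIANT column V:

```sql
SELECT f.seq, f.key, f.path, f.index, f.value, f.this
FROM my_json_table t,
     LATERAL FLATTEN(input => t.v) f;
```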

Question 108Skipped

What is a feature of a stored procedure in Snowflake?

They can be created as secure and hide the underlying metadata from all users.

Correct answer

They can be created to run with a caller's rights or an owner's rights.

They can only contain a single SQL statement.

They can access tables from a single database.

Overall explanation

See this link.

Question 109Skipped

What operation can be performed using Time Travel?

Extending a permanent table’s retention duration from 90 to 100 days

Correct answer

Creating a clone of an entire table at a specific point in the past from a permanent table

Restoring tables that have been dropped from a data share

Disabling Time Travel for a specific object by setting DATA_RETENTION_TIME_IN_DAYS to NULL

Overall explanation

Using Time Travel, you can perform the following actions within a defined period of time:


• Query data in the past that has since been updated or deleted.

• Create clones of entire tables, schemas, and databases at or before specific points in
the past.

• Restore tables, schemas, and databases that have been dropped.

See this link.
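A sketch of the clone operation (table names are illustrative):

```sql
-- Clone the table as it existed one hour ago.
CREATE TABLE orders_restored CLONE orders
  AT (OFFSET => -3600);
```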

Question 110Skipped

Which privilege is required for a role to be able to resume a suspended warehouse if auto-resume is not enabled?

MONITOR

Correct answer

OPERATE

USAGE

MODIFY

Overall explanation

OPERATE: Enables changing the state of a warehouse (stop, start, suspend, resume). In
addition, enables viewing current and past queries executed on a warehouse and aborting
any executing queries.

See this link.
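For illustration, with hypothetical warehouse and role names:

```sql
GRANT OPERATE ON WAREHOUSE my_wh TO ROLE analyst_role;

-- A user with ANALYST_ROLE can now resume the warehouse.
ALTER WAREHOUSE my_wh RESUME;
```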

Question 111Skipped

How long does Snowflake retain information in the ACCESS_HISTORY view?

14 days

28 days

7 days

Correct answer

365 days

Overall explanation

See this link.

Question 112Skipped

What Snowflake role must be granted for a user to create and manage accounts?


ACCOUNTADMIN

SECURITYADMIN

SYSADMIN

Correct answer

ORGADMIN

Overall explanation

An account can be created by an ORGADMIN through the web interface or using SQL.

See this link.

Question 113Skipped

Which of the following practices are recommended when creating a user in Snowflake?
(Choose two.)

Set the number of minutes to unlock to 15 minutes.

Correct selection

Set a default role for the user.

Correct selection

Force an immediate password change.

Configure the user to be initially disabled.

Set the user's access to expire within a specified timeframe.

Overall explanation

See this link.

Question 114Skipped

Which command is used to unload data from a Snowflake table into a file in a stage?

Correct answer

COPY INTO

WRITE

EXTRACT INTO

GET

Overall explanation


In Snowflake, the "COPY INTO" command is used to unload data from a Snowflake table into
a file in a stage. The stage acts as an intermediate storage location for the unloaded data,
and the data can then be transferred to an external storage location such as Amazon S3 or
Microsoft Azure Blob Storage.

See this link.

Question 115Skipped

When loading data into Snowflake, the COPY command supports which of the following?

Correct answer

Column reordering

Filters

Aggregates

Joins

Overall explanation

See this link.

Question 116Skipped

Where is Snowflake metadata stored?

Correct answer

In the cloud services layer

In the virtual warehouse layer

Within the data files

In the remote storage layer

Overall explanation

See this link.

Question 117Skipped

A user needs to create a materialized view in the schema MYDB.MYSCHEMA.

Which statements will provide this access?

Correct answer

1. GRANT ROLE MYROLE TO USER USER1;

2. GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO ROLE MYROLE;

1. GRANT ROLE MYROLE TO USER USER1;

2. GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO USER USER1;

1. GRANT ROLE MYROLE TO USER USER1;

2. GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO MYROLE;

1. GRANT ROLE MYROLE TO USER USER1;

2. GRANT CREATE MATERIALIZED VIEW ON SCHEMA MYDB.MYSCHEMA TO USER1;

Overall explanation

See this link.

Privileges can only be granted to roles (not directly to users), and the ROLE keyword with a role name is required in the GRANT statement.

Question 118Skipped

Which views are included in the DATA_SHARING_USAGE schema? (Choose two.)

ACCESS_HISTORY

Correct selection

LISTING_TELEMETRY_DAILY

DATA_TRANSFER_HISTORY

Correct selection

MONETIZED_USAGE_DAILY

WAREHOUSE_METERING_HISTORY

Overall explanation

In the exam you can expect some unusual questions about parameters or views that are not the most commonly used. It is not necessary to know everything available in Snowflake, but you should at least be familiar with the most important items.

In the case of this question, you can approach it by elimination.

See this link.

Question 119Skipped


Which of the following statements describe features of Snowflake data caching? (Choose
two.)

When a virtual warehouse is suspended, the data cache is saved on the remote storage
layer.

Correct selection

The RESULT_SCAN table function can access and filter the contents of the query result
cache.

A user can only access their own queries from the query result cache.

Correct selection

When the data cache is full, the least-recently used data will be cleared to make room.

A user must set USE_METADATA_CACHE to TRUE to use the metadata cache in queries.

Overall explanation

Option When the data cache is full, the least-recently used data will be cleared to make
room. is correct because Snowflake automatically manages its data cache and evicts the
least-recently used data when the cache becomes full.

Option The RESULT_SCAN table function can access and filter the contents of the query result
cache. is correct because the RESULT_SCAN table function can be used to query and filter
the data that has been cached in the query result cache.

Option When a virtual warehouse is suspended, the data cache is saved on the remote
storage layer. is incorrect because when a virtual warehouse is suspended, the data cache is
not saved on the remote storage layer. The data cache is cleared when a virtual warehouse is
suspended and any data that needs to be cached is reloaded from the remote storage layer
when the virtual warehouse is resumed.

Option A user can only access their own queries from the query result cache. is incorrect
because the query result cache is a shared cache and all users can access the data that has
been cached. There are no restrictions based on user access.

Option A user must set USE_METADATA_CACHE to TRUE to use the metadata cache in
queries. is incorrect because the metadata cache is used by default in queries and there is no
need for a user to explicitly set USE_METADATA_CACHE to TRUE.

Question 120Skipped

Users are responsible for data storage costs until what occurs?

Data expires from Time Travel

Correct answer


Data expires from Fail-safe

Data is truncated from a table

Data is deleted from a table

Overall explanation

Storage is calculated and charged for data regardless of whether it is in the Active, Time
Travel, or Fail-safe state. Because these life-cycle states are sequential, updated/deleted data
protected by CDP will continue to incur storage costs until the data leaves the Fail-safe state.

See this link.

Question 121Skipped

How can a Snowflake user sample 10 rows from a table named SNOWPRO? (Choose two.)

Correct selection

1. SELECT * FROM SNOWPRO SAMPLE BERNOULLI (10 ROWS)

1. SELECT * FROM SNOWPRO TABLESAMPLE BLOCK (10)

1. SELECT * FROM SNOWPRO TABLESAMPLE BLOCK (10 ROWS)

1. SELECT * FROM SNOWPRO SAMPLE SYSTEM (10)

Correct selection

1. SELECT * FROM SNOWPRO TABLESAMPLE (10 ROWS)

Overall explanation

SELECT * FROM SNOWPRO TABLESAMPLE (10 ROWS);

SELECT * FROM SNOWPRO SAMPLE BERNOULLI (10 ROWS);

See this link.

Question 122Skipped

Which Snowflake object contains all the information required to share a database?

Correct answer

Share

Secure view

Sequence

Private listing

Overall explanation


Shares are named Snowflake objects that encapsulate all of the information required to
share a database.

See this link.

Question 123Skipped

What happens when a Data Provider revokes privileges to a share on an object in their
source database?

The Data Consumers stop seeing data updates and become responsible for storage charges
for the object.

A static copy of the object at the time the privilege was revoked is created in the Data
Consumers account.

Any additional data arriving after this point in time will not be visible to Data Consumers.

Correct answer

The object immediately becomes unavailable for all Data Consumers.

Overall explanation

Revokes access privileges for databases and other supported database objects (schemas,
tables, and views) from a share. Revoking privileges on these objects effectively removes the
objects from the share, disabling access to the objects granted via the database role in all
consumer accounts that have created a database from the share.

Question 124Skipped

On which of the following cloud platforms can a Snowflake account be hosted? (Choose
three.)

Correct selection

Microsoft Azure Cloud

Oracle Cloud

Private Virtual Cloud

Alibaba Cloud

Correct selection

Amazon Web Services

Correct selection

Google Cloud Platform


Overall explanation

See this link.

Question 125Skipped

Which of the following objects are contained within a schema? (Choose two.)

Share

Role

User

Correct selection

Stream

Correct selection

External table

Warehouse

Overall explanation

Role (ACCOUNT)

Stream (SCHEMA)

Warehouse (ACCOUNT)

External table (SCHEMA)

User (ACCOUNT)

Share (DATABASE)

Question 126Skipped

A marketing co-worker has requested the ability to change a warehouse size on their
medium virtual warehouse called MKTG_WH.

Which of the following statements will accommodate this request?

1. ALLOW RESIZE ON WAREHOUSE MKTG_WH TO USER MKTG_LEAD;

1. GRANT OPERATE ON WAREHOUSE MKTG_WH TO ROLE MARKET;

1. GRANT MODIFY ON WAREHOUSE MKTG_WH TO USER MKTG_LEAD;

Correct answer

1. GRANT MODIFY ON WAREHOUSE MKTG_WH TO ROLE MARKETING;


Overall explanation

See this link.

Question 127Skipped

What affects whether the query results cache can be used?

If the virtual warehouse has been suspended

If multiple users are using the same virtual warehouse

If the query contains a deterministic function

Correct answer

If the referenced data in the table has changed

Overall explanation

Result Cache: Which holds the results of every query executed in the past 24 hours. These
are available across virtual warehouses, so query results returned to one user is available to
any other user on the system who executes the same query, provided the underlying data
has not changed.

Question 128Skipped

Which of the following conditions must be met in order to return results from the results
cache? (Choose two.)

The new query is run using the same virtual warehouse as the previous query.

Correct selection

The user has the appropriate privileges on the objects associated with the query.

Micro-partitions have been reclustered since the query was last run.

The query includes a User Defined Function (UDF).

Correct selection

The query has been run within 24 hours of the previously-run query.

Overall explanation

If the same query is run again within 24 hours, it is not recomputed, which means no compute is charged; the result cache is also unaffected by warehouse suspension.

As in the previous question, the exact same query will return the pre-computed results if the underlying data hasn't changed and the results were last accessed within the previous 24-hour period.


See this link.

Question 129Skipped

Network policies can be set at which Snowflake levels? (Choose two.)

Role

Correct selection

User

Tables

Schema

Correct selection

Account

Database

Overall explanation

Identifying a Network Policy Activated at the Account or User Level

See this link.
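A sketch of both levels, using hypothetical names:

```sql
CREATE NETWORK POLICY corp_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24');

ALTER ACCOUNT SET NETWORK_POLICY = corp_policy;      -- account level
ALTER USER jsmith SET NETWORK_POLICY = corp_policy;  -- user level
```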

Question 130Skipped

Which command can be used to load data into an internal stage?

GET

Correct answer

PUT

COPY

LOAD

Overall explanation

PUT loads data into a stage (from a local file system).

COPY INTO <location> unloads data from Snowflake into a stage.

If the question had asked which command can be used to UNLOAD data, COPY would be correct. But loading data into a stage means bringing in external data, so PUT is the correct option.

See this link.
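For example (the local path and stage name are illustrative):

```sql
PUT file:///tmp/data.csv @my_stage AUTO_COMPRESS = TRUE;
```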

Question 131Skipped


Which TABLE function helps to convert semi-structured data to a relational representation?

PARSE_JSON

CHECK_JSON

TO_JSON

Correct answer

FLATTEN

Overall explanation

See this link.

Question 132Skipped

Snowpark provides libraries for which programming languages? (Choose two.)

Correct selection

Python

C++

JavaScript

Correct selection

Scala

Overall explanation

Snowpark provide libs for:

• Java

• Python

• Scala

See this link.

Question 133Skipped

Which of the following statements about data sharing are true? (Choose two.)

Reader Accounts are charged for warehouse usage.

Correct selection

Shared databases are read-only.


Correct selection

Reader Accounts are created by Data Providers.

All database objects can be included in a shared database.

New objects created by a Data Provider are automatically shared with existing Data
Consumers and Reader Accounts.

Overall explanation

See this link.

Question 134Skipped

Which function determines the kind of value stored in a VARIANT column?

IS_ARRAY

CHECK_JSON

IS_JSON

Correct answer

TYPEOF

Overall explanation

TYPEOF

Reports the type of a value stored in a VARIANT column. The type is returned as a string.

See this link.
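For example:

```sql
SELECT TYPEOF(PARSE_JSON('{"a": 1}'));   -- OBJECT
SELECT TYPEOF(PARSE_JSON('[1, 2, 3]'));  -- ARRAY
SELECT TYPEOF(PARSE_JSON('null'));       -- NULL_VALUE
```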

Question 135Skipped

Assume there is a table consisting of five micro-partitions with values ranging from A to Z.

Which diagram indicates a well-clustered table?

A)

B)


C)

D)

Correct answer


Overall explanation

When there is no overlap in the range of values across all micro-partitions, the micro-partitions are considered to be in a constant state (i.e. they cannot be improved by clustering).

See this link.

Question 136Skipped

A table needs to be loaded. The input data is in JSON format and is a concatenation of
multiple JSON documents. The file size is 3 GB. A warehouse size S is being used.

The following COPY INTO command was executed:


COPY INTO SAMPLE FROM @~/SAMPLE.JSON (TYPE=JSON)

The load failed with this error:


Max LOB size (16777216) exceeded, actual size of parsed column is 17894470.

How can this issue be resolved?

Split the file into multiple files in the recommended size range (100 MB - 250 MB).

Use a larger-sized warehouse.

Correct answer

Set STRIP_OUTER_ARRAY=TRUE in the COPY INTO command.

Compress the file and load the compressed file.

Overall explanation

If the data exceeds 16 MB, enable the STRIP_OUTER_ARRAY file format option for the COPY
INTO <table> command to remove the outer array structure and load the records into
separate table rows

See this link.
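Using the question's own names, the corrected command would look like this:

```sql
COPY INTO SAMPLE FROM @~/SAMPLE.JSON
  FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE);
```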

Question 137Skipped

Why does Snowflake recommend file sizes of 100-250 MB compressed when loading data?

Increases the latency staging and accuracy when loading the data

Correct answer

Allows optimization of parallel operations


Optimizes the virtual warehouse size and multi-cluster setting to economy mode

Allows a user to import the files in a sequential order

Overall explanation

See this link.

Question 138Skipped

What are the least privileges needed to view and modify resource monitors? (Choose two.)

USAGE

OWNERSHIP

Correct selection

MODIFY

Correct selection

MONITOR

SELECT

Overall explanation

By default, resource monitors can only be created by account administrators and, therefore,
can only be viewed and maintained by them.

However, roles that have been granted the following privileges on specific resource monitors
can view and modify the resource monitor as needed using SQL:

• MONITOR

• MODIFY

See this link.
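For illustration, with hypothetical monitor and role names:

```sql
GRANT MONITOR, MODIFY ON RESOURCE MONITOR my_rm TO ROLE finance_role;
```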

Question 139Skipped

What will happen if a Snowflake user increases the size of a suspended virtual warehouse?

The warehouse will resume immediately and start to share the compute load with other
running virtual warehouses.

The provisioning of new compute resources for the warehouse will begin immediately.

The warehouse will remain suspended but new resources will be added to the query
acceleration service.

Correct answer


The provisioning of additional compute resources will be in effect when the warehouse is
next resumed.

Overall explanation

Resizing a suspended warehouse does not provision any new compute resources for the
warehouse. It simply instructs Snowflake to provision the additional compute resources
when the warehouse is next resumed, at which time all the usage and credit rules associated
with starting a warehouse apply.

See this link.

Question 140Skipped

Which of the following compute resources or features are managed by Snowflake? (Choose
two.)

Correct selection

Snowpipe

Updating data

Scaling up a warehouse

Execute a COPY command

Correct selection

AUTOMATIC_CLUSTERING

Overall explanation

• Snowpipe uses Snowflake-supplied compute resources.

• Automatic Clustering is the Snowflake service that seamlessly and continually manages all reclustering, as needed, of clustered tables.

See this link.

Question 141Skipped

What service is provided as an integrated Snowflake feature to enhance Multi-Factor Authentication (MFA) support?

OAuth

Okta

Single Sign-On (SSO)

Correct answer


Duo Security

Overall explanation

MFA provides increased login security for users connecting to Snowflake. MFA support is
provided as an integrated Snowflake feature, powered by the Duo Security service, which is
managed completely by Snowflake.

See this link.

Question 142Skipped

Where would a Snowflake user find information about query activity from 90 days ago?

account_usage.query_history_archive view

information_schema.query_history_by_session view

Correct answer

account_usage.query_history view

information_schema.query_history view

Overall explanation

See this link.

Question 143Skipped

What is the MINIMUM Snowflake edition required to use the periodic rekeying of micro-partitions?

Business Critical

Correct answer

Enterprise

Virtual Private Snowflake

Standard

Overall explanation

Periodic rekeying requires Enterprise Edition (or higher).

See this link.

Question 144Skipped

What does the Activity area of Snowsight allow users to do? (Choose two.)

Create and manage user roles and permissions.


Correct selection

Explore each step of an executed query.

Schedule automated data backups.

Correct selection

Monitor queries executed by users in an account.

Access Snowflake Marketplace to find and integrate datasets.

Overall explanation

See this link.

Question 145Skipped

Which type of loop requires a BREAK statement to stop executing?

FOR

REPEAT

WHILE

Correct answer

LOOP

Overall explanation

BREAK is required in a LOOP but is not necessary in WHILE, FOR, and REPEAT.

See this link.
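A minimal Snowflake Scripting sketch of a LOOP that needs BREAK to terminate:

```sql
DECLARE
  counter INTEGER DEFAULT 0;
BEGIN
  LOOP
    counter := counter + 1;
    IF (counter >= 5) THEN
      BREAK;  -- without this, the loop would never end
    END IF;
  END LOOP;
  RETURN counter;
END;
```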

Question 146Skipped

Which data types are supported by Snowflake when using semi-structured data? (Choose
two.)

Correct selection

VARIANT

STRUCT

Correct selection

ARRAY

QUEUE

VARRAY


Overall explanation

See this link.

Question 147Skipped

Which statement MOST accurately describes clustering in Snowflake?

The database ACCOUNTADMIN must define the clustering methodology for each
Snowflake table.

Clustering can be disabled within a Snowflake account.

The clustering key must be included in the COPY command when loading data into
Snowflake.

Correct answer

Clustering is the way data is grouped together and stored within Snowflake micro-partitions.

Overall explanation

See this link.

Question 148Skipped

What are the default Time Travel and Fail-safe retention periods for transient tables?

Correct answer

Time Travel - 1 day, Failsafe - 0 days

Time Travel - 0 days, Fail-safe - 1 day


Transient tables are retained in neither Fail-safe nor Time Travel.

Time Travel - 1 day, Fail-safe - 1 day

Overall explanation

Transient tables can have a Time Travel retention period of either 0 or 1 day.

Temporary tables can also have a Time Travel retention period of 0 or 1 day; however, this
retention period ends as soon as the table is dropped or the session in which the table was
created ends.

Transient and temporary tables have no Fail-safe period.

See this link.
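For illustration (the table name is hypothetical):

```sql
-- Time Travel of 0 or 1 day is allowed; there is no Fail-safe period.
CREATE TRANSIENT TABLE staging_events (id NUMBER)
  DATA_RETENTION_TIME_IN_DAYS = 1;
```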

Question 149Skipped

A Snowflake user executed a query and received the results. Another user executed the
same query 4 hours later. The data had not changed.

What will occur?

The virtual warehouse that is defined at the session level will be used to read all data.

Correct answer

No virtual warehouse will be used, data will be read from the result cache.

The default virtual warehouse will be used to read all data.

No virtual warehouse will be used, data will be read from the local disk cache.

Overall explanation

See this link.

Question 150Skipped


Which of the following describes how multiple Snowflake accounts in a single organization
relate to various cloud providers?

Each Snowflake account can be hosted in a different cloud vendor, but must be in the
same region.

Each Snowflake account must be hosted in a different cloud vendor and region.

Correct answer

Each Snowflake account can be hosted in a different cloud vendor and region.

All Snowflake accounts must be hosted in the same cloud vendor and region.

Overall explanation

The cloud platform you choose for each Snowflake account is completely independent from
your other Snowflake accounts. In fact, you can choose to host each Snowflake account on a
different platform, although this may have some impact on data transfer billing when
loading data.

See this link.

Question 151Skipped

What is the minimum Snowflake edition required to create a materialized view?

Virtual Private Snowflake Edition

Business Critical Edition

Standard Edition

Correct answer

Enterprise Edition

Overall explanation

See this link.


Question 152Skipped

Which Snowflake architectural layer is responsible for a query execution plan?

Data storage

Correct answer

Cloud services

Cloud provider

Compute

Overall explanation

The cloud services layer is a collection of services that coordinate activities across Snowflake.
These services tie together all of the different components of Snowflake in order to process
user requests, from login to query dispatch. The cloud services layer also runs on compute
instances provisioned by Snowflake from the cloud provider.

Services managed in this layer include:

• Authentication

• Infrastructure management

• Metadata management

• Query parsing and optimization

• Access control

See this link.

Question 153Skipped

What are the recommended steps to address poor SQL query performance due to data
spilling? (Choose two.)

Run a multi-cluster warehouse with maximized mode.

Add another cluster in the virtual warehouse.

Correct selection

Use a larger virtual warehouse.

Clone the base table.

Correct selection

Fetch required attributes only.


Overall explanation

The spilling can't always be avoided, especially for large batches of data, but it can be
decreased by:

• Reviewing the query for query optimization especially if it is a new query

• Reducing the amount of data processed. For example, by trying to improve partition
pruning, or projecting only the columns that are needed in the output.

• Decreasing the number of parallel queries running in the warehouse.

• Trying to split the processing into several steps (for example by replacing the CTEs
with temporary tables).

• Using a larger warehouse - this effectively means more memory and more local disk
space.

See this link.

Question 154Skipped

How do secure views compare to non-secure views in Snowflake?

Correct answer

Secure views execute slowly compared to non-secure views.

There are no performance differences between secure and non-secure views.

Secure views are similar to materialized views in that they are the most performant.

Non-secure views are preferred over secure views when sharing data.

Overall explanation

See this link.

Question 155Skipped

When unloading to a stage, which of the following is a recommended practice or approach?

Use OBJECT_CONSTRUCT(*) when using Parquet.

Correct answer

Define an individual file format.

Avoid the use of the CAST function.

Set SINGLE = TRUE for larger files.

Overall explanation

Data Unloading Considerations:

Defining a File Format: A file format defines the type of data to be unloaded into the stage or S3. It is a best practice to define an individual file format when it is used regularly to unload a certain type of data, based on the characteristics of the files needed.

See this link.

SET 6

Question 1Skipped

How can an administrator check for updates (for example, SCIM API requests) sent to
Snowflake by the identity provider?

QUERY_HISTORY

ACCESS_HISTORY

LOAD_HISTORY

Correct answer

REST_EVENT_HISTORY

Overall explanation

Administrators can query the rest_event_history table to determine whether the identity
provider is sending updates (i.e. SCIM API requests) to Snowflake.

See this link.

Question 2Skipped

What type of NULL values are supported in semi-structured data? (Choose two.)

Avro

Parquet

ORC

Correct selection

SQL

Correct selection

JSON

Overall explanation

Snowflake supports two types of NULL values in semi-structured data:

• SQL NULL: SQL NULL means the same thing for semi-structured data types as it means for
structured data types: the value is missing or unknown.

• JSON null (sometimes called “VARIANT NULL”): In a VARIANT column, JSON null
values are stored as a string containing the word “null” to distinguish them from SQL
NULL values.

See this link.
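A minimal illustration of the two NULL types (a sketch; the aliases are hypothetical):

```sql
-- PARSE_JSON('null') produces a JSON (VARIANT) null;
-- PARSE_JSON(NULL) produces a SQL NULL.
SELECT
    PARSE_JSON('null')                 AS json_null,
    IS_NULL_VALUE(PARSE_JSON('null'))  AS is_variant_null,  -- TRUE
    PARSE_JSON(NULL)                   AS sql_null;         -- SQL NULL
```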

Question 3Skipped

Which function will return a row for each for each object in a VARIANT, OBJECT, or ARRAY
column?

Correct answer

FLATTEN

CAST

GET

PARSE_JSON

Overall explanation

The FLATTEN function is a table function that takes a VARIANT, OBJECT, or ARRAY column and
returns a row for each element or attribute within the column.

See this link.
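A minimal sketch of FLATTEN expanding an array into rows:

```sql
-- Expand a JSON array into one row per element.
SELECT f.value::STRING AS item
FROM TABLE(FLATTEN(INPUT => PARSE_JSON('["a", "b", "c"]'))) f;
-- Returns three rows: a, b, c
```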

Question 4Skipped

What JavaScript delimiters are available in Snowflake stored procedures? (Choose two.)

Double backslash (\\)

Double forward slash (//)

Correct selection

Single quote (’)

Double quotes (“)

Correct selection

Double dollar sign ($$)

Overall explanation

The JavaScript portion of the stored procedure code must be enclosed within either single
quotes ' or double dollar signs $$.

See this link.
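A minimal sketch of a JavaScript stored procedure whose body is delimited by double dollar signs (the procedure name is hypothetical):

```sql
-- The JavaScript body is enclosed in $$ ... $$ (single quotes also work).
CREATE OR REPLACE PROCEDURE hello_world()
RETURNS STRING
LANGUAGE JAVASCRIPT
AS
$$
  return "Hello, world!";
$$;
```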

Question 5Skipped

Which clustering indicator will show if a large table in Snowflake will benefit from explicitly
defining a clustering key?

Ratio

Correct answer

Depth

Total partition count

Percentage

Overall explanation

See this link.

Question 6Skipped

Which of the following languages can be used to implement Snowflake User Defined
Functions (UDFs)? (Choose two.)

Correct selection

SQL

Correct selection

Javascript

Ruby

C#

PERL

Overall explanation

See this link.

Question 7Skipped

What are characteristics of transient tables in Snowflake? (Choose two.)

Transient tables can be altered to make them permanent tables.

Correct selection

Transient tables persist until they are explicitly dropped.

Transient tables have a Fail-safe period of 7 days.

Transient tables can be cloned to permanent tables.

Correct selection

Transient tables have Time Travel retention periods of 0 or 1 day.

Overall explanation

Transient tables are a type of table in Snowflake that persist until they are explicitly dropped.
They do not have a Fail-safe period, and they can only have a Time Travel retention period of
0 or 1 day. Transient tables cannot be cloned to permanent tables, and they cannot be
altered to make them permanent tables.

Question 8Skipped

Which type of role can be granted to a share?

Custom role

Account role

Correct answer

Database role

Secondary role

Overall explanation

Grant the database role to a share and grant future privileges on an object to the database
role:

1. GRANT DATABASE ROLE dbr1 TO SHARE myshare;

2. GRANT SELECT ON FUTURE TABLES IN SCHEMA sh TO DATABASE ROLE dbr1;

See this link.

Question 9Skipped

What are valid sub-clauses to the OVER clause for a window function? (Choose two.)

UNION ALL

GROUP BY

LIMIT

Correct selection

PARTITION BY

Correct selection

ORDER BY

Overall explanation

Window Syntax

<function> ([ <arguments> ]) OVER ([ PARTITION BY <expr1> ] [ ORDER BY <expr2> ])

See this link.
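A minimal sketch of both sub-clauses in use (table and column names are hypothetical):

```sql
-- Rank employees by salary within each department.
SELECT
    department,
    employee_name,
    salary,
    RANK() OVER (PARTITION BY department ORDER BY salary DESC) AS salary_rank
FROM employees;
```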

Question 10Skipped

How are URLs that access unstructured data in external stages retrieved?

By using the INFORMATION_USAGE schema

From the Snowsight navigation menu

Correct answer

By querying a directory table

By creating an external function

Overall explanation

See this link.

Question 11Skipped

What are characteristics of Snowsight worksheets? (Choose two.)

Correct selection

Each worksheet is a unique Snowflake session.

Correct selection

Users can import worksheets and share them with other users.

Users are limited to running only one query on a worksheet.

Worksheets can be grouped under folders, and a folder of folders.

The Snowflake session ends when a user switches worksheets.

Overall explanation

You can share worksheets and folders of worksheets with other Snowflake users in your
account. Each worksheet is a unique session and can use roles different from the role you
select. Folders can't be nested.

See this link.

Question 12Skipped

How does Snowflake handle the bulk unloading of data into single or multiple files?

It uses COPY INTO to copy the data from a table into one or more files in an external stage
only.

It uses COPY INTO for bulk unloading where the default option is SINGLE = TRUE.

Correct answer

It assigns each unloaded data file a unique name.

It uses the PUT command to download the data by default.

Overall explanation

Bulk Unloading into Single or Multiple Files

The COPY INTO <location> command provides a copy option (SINGLE) for unloading data into a single file or multiple files. The default is SINGLE = FALSE (i.e. unload into multiple files).

Snowflake assigns each file a unique name. The location path specified for the command can
contain a filename prefix that is assigned to all the data files generated. If a prefix is not
specified, Snowflake prefixes the generated filenames with data_.

See this link.
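A minimal sketch of the default behavior (the stage, prefix, and table names are hypothetical):

```sql
-- SINGLE = FALSE (the default) unloads into multiple uniquely named files;
-- the optional prefix "unload/orders_" is applied to each generated file.
COPY INTO @my_stage/unload/orders_
FROM orders
FILE_FORMAT = (TYPE = CSV);
```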

Question 13Skipped

What ensures that a user with the role SECURITYADMIN can activate a network policy for an
individual user?

A role that has been granted the global ATTACH POLICY privilege

A role that has been granted the EXECUTE TASK privilege

Correct answer

Ownership privilege on both the user and the network policy

Ownership privilege on only the role that created the network policy

Overall explanation

Only the role with the OWNERSHIP privilege on both the user and the network policy, or a
higher role, can activate a network policy for an individual user.

See this link.

Question 14Skipped

How often are encryption keys automatically rotated by Snowflake?

365 Days

Correct answer

30 Days

60 Days

90 Days

Overall explanation

All Snowflake-managed keys are automatically rotated by Snowflake when they are more
than 30 days old

See this link.

Question 15Skipped

When unloading data, which file format preserves the data values for floating-point number
columns?

JSON

CSV

Avro

Correct answer

Parquet

Overall explanation

When floating-point number columns are unloaded to CSV or JSON files, Snowflake
truncates the values to approximately (15,9).

The values are not truncated when unloading floating-point number columns to Parquet
files.

See this link.

Question 16Skipped

Which result shows efficient pruning?

Correct answer

Partitions scanned is less than partitions total.

Partitions scanned is greater than partitions total.

Partitions scanned is equal to the partitions total.

Partitions scanned is greater than or equal to the partitions total.

Overall explanation

See this link.

Question 17Skipped

What does the VARIANT data type impose a 16 MB size limit on?

All rows

Individual columns

Correct answer

Individual rows

All columns

Overall explanation

The VARIANT data type imposes a 16 MB size limit on individual rows.

See this link.

Question 18Skipped

How can a Snowflake user share data with another user who does not have a Snowflake
account?

Share the data by implementing User-Defined Functions (UDFs)

Move the Snowflake account to a region where data sharing is enabled

Correct answer

Create a reader account and create a share of the data

Grant the READER privilege to the database that is going to be shared

Overall explanation

See this link.

Question 19Skipped

A Snowflake user wants to share data with someone who does not have a Snowflake
account.

How can the Snowflake user share the data?

Use a Snowflake share.

Use the Snowflake Marketplace.

Correct answer

Create a reader account.

Create a consumer account.

Overall explanation

See this link.

Question 20Skipped

In which column type does the Kafka connector store formatted information as a single column?

ARRAY

VARCHAR

Correct answer

VARIANT

OBJECT

Overall explanation

Each Kafka message is passed to Snowflake in JSON format or Avro format. The Kafka
connector stores that formatted information in a single column of type VARIANT. The data is
not parsed, and the data is not split into multiple columns in the Snowflake table.

See this link.

Question 21Skipped

What is the MINIMUM permission needed to access a file URL from an external stage?

READ

MODIFY

Correct answer

USAGE

SELECT

Overall explanation

Permissions required to query a file URL: USAGE (on an external stage) or READ (on an internal stage).

See this link.

Question 22Skipped

What actions does the use of the PUT command do automatically? (Choose two.)

It creates an empty target table.

Correct selection

It encrypts the file data in transit.

Correct selection

It compresses all files using GZIP.

It creates a file format object.

It uses the last stage created.

Overall explanation

Compression and Encryption: the PUT command automatically compresses all files using gzip (unless compression is disabled or the files are already compressed) and encrypts the file data before it is transmitted and stored.

See this link.

Question 23Skipped

What Snowflake function should be used to unload relational data to JSON?

TO_VARIANT()

Correct answer

OBJECT_CONSTRUCT()

PARSE_JSON()

BUILD_STAGE_FILE_URL()

Overall explanation

See this article.

OBJECT_CONSTRUCT Returns an OBJECT constructed from the arguments.

See this link.
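A minimal sketch of unloading relational rows as JSON objects (the stage and table names are hypothetical):

```sql
-- Unload each row as a JSON object built from all columns.
COPY INTO @my_stage/orders_json
FROM (SELECT OBJECT_CONSTRUCT(*) FROM orders)
FILE_FORMAT = (TYPE = JSON);
```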

Question 24Skipped

How do managed access schemas help with data governance?

They require the use of masking and row access policies across every table and view in the
schema.

They log all operations and enable fine-grained auditing.

They enforce identical privileges across all tables and views in a schema.

Correct answer

They provide centralized privilege management with the schema owner.

Overall explanation

With managed access schemas, object owners lose the ability to make grant decisions. Only
the schema owner (i.e. the role with the OWNERSHIP privilege on the schema) or a role with
the MANAGE GRANTS privilege can grant privileges on objects in the schema, including
future grants, centralizing privilege management.

See this link.
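A minimal sketch of a managed access schema (the schema and role names are hypothetical):

```sql
-- In a managed access schema, only the schema owner or a role with
-- MANAGE GRANTS can grant privileges on contained objects.
CREATE SCHEMA finance WITH MANAGED ACCESS;
GRANT SELECT ON FUTURE TABLES IN SCHEMA finance TO ROLE analyst;
```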

Question 25Skipped

Which roles can make grant decisions to objects within a managed access schema? (Choose
two.)

Correct selection

ACCOUNTADMIN

SYSADMIN

Correct selection

SECURITYADMIN

USERADMIN

ORGADMIN

Overall explanation

See this link.

Question 26Skipped

How are micro-partitions typically generated in Snowflake?

GROUP BY <>;

Correct answer

Automatically

ORDER BY <>;

PARTITION BY <>;

Overall explanation

See this link.

Question 27Skipped

What are characteristics of reader accounts in Snowflake? (Choose two.)

A single reader account can consume data from multiple provider accounts.

Data consumers are responsible for reader account setup and data usage costs.

Correct selection

Reader accounts enable data consumers to access and query data shared by the provider.

Reader account users can share data to other reader accounts.

Correct selection

Reader account users cannot add new data to the account.

Overall explanation

See this link.

Question 28Skipped

What are characteristics of Snowflake directory tables? (Choose two.)

Correct selection

A directory table can be added to a stage when the stage is created, or later.

Directory tables are separate database objects.

Directory tables can only be used with an external stage.

Correct selection

Directory tables store file-level metadata about the data files in a stage.

Directory tables contain copies of staged files in binary format.

Overall explanation

A directory table is an implicit object layered on a stage and it stores file-level metadata
about the data files in the stage.

You can add a directory table to a stage when you create a stage (using CREATE STAGE) or
later (using ALTER STAGE).

See this link.
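A minimal sketch of enabling and querying a directory table (the stage name is hypothetical):

```sql
-- Enable a directory table when creating a stage, then query its metadata.
CREATE STAGE my_stage DIRECTORY = (ENABLE = TRUE);

ALTER STAGE my_stage REFRESH;  -- synchronize the metadata with the staged files
SELECT relative_path, size, last_modified
FROM DIRECTORY(@my_stage);
```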

Question 29Skipped

When should a stored procedure be created with caller's rights?

When the caller needs to run a statement that could not execute outside of the stored
procedure

When the caller needs to be prevented from viewing the source code of the stored
procedure

When the stored procedure needs to operate on objects that the caller does not have
privileges on

Correct answer

When the stored procedure needs to run with the privileges of the role that called the
stored procedure

Overall explanation

See this link.

Question 30Skipped

What statistical information in a Query Profile indicates that the query is too large to fit in
memory? (Choose two.)

Bytes spilled to remote cache.

Correct selection

Bytes spilled to local storage.

Bytes spilled to local cache.

Bytes spilled to remote metastore.

Correct selection

Bytes spilled to remote storage.

Overall explanation

Bytes spilled to local storage — volume of data spilled to local disk.

Bytes spilled to remote storage — volume of data spilled to remote disk.

See this link.

Question 31Skipped

Which operation will produce an error in Snowflake?

Inserting duplicate values into a PRIMARY KEY column

Correct answer

Inserting a NULL into a column with a NOT NULL constraint

Inserting a value to FOREIGN KEY column that does not match a value in the column
referenced

Inserting duplicate values into a column with a UNIQUE constraint

Overall explanation

Snowflake supports defining and maintaining constraints, but does not enforce them, except
for NOT NULL constraints, which are always enforced.

See this link.

Question 32Skipped

How long can a data consumer who has a pre-signed URL access data files using Snowflake?

Until the result_cache expires

Correct answer

Until the expiration_time is exceeded

Indefinitely

Until the retention_time is met

Overall explanation

See this link.

Question 33Skipped

A user wants to access files stored in a stage without authenticating into Snowflake. Which
type of URL should be used?

File URL

Staged URL

Scoped URL

Correct answer

Pre-signed URL

Overall explanation

You can allow data consumers to retrieve either scoped or pre-signed URLs from the secure
view. Scoped URLs provide better security, while pre-signed URLs can be accessed without
authorization or authentication.

See this link.

Question 34Skipped

Which Snowflake object can be accessed in the FROM clause of a query, returning a set of
rows having one or more columns?

A Scalar User Defined Function (UDF)

Correct answer

A User-Defined Table Function (UDTF)

A stored procedure

A task

Overall explanation

See this link.

Question 35Skipped

Which function returns the URL of a stage using the stage name as the input?

GET_PRESIGNED_URL

Correct answer

GET_STAGE_LOCATION

BUILD_STAGE_FILE_URL

BUILD_SCOPED_FILE_URL

Overall explanation

See this link.

Question 36Skipped

Which common query problems are identified by the Query Profile? (Choose two.)

Correct selection

Queries too large to fit in memory

Object does not exist or not authorized

Ambiguous column names

Correct selection

Inefficient pruning

Syntax error

Overall explanation

Pruning - information on the effects of table pruning.

Spilling — information about disk usage for operations where intermediate results do not fit
in memory.

See this link.

Question 37Skipped

What are Snowflake best practices when assigning the ACCOUNTADMIN role to users?
(Choose two.)

Correct selection

All users assigned the ACCOUNTADMIN role should use Multi-Factor Authentication (MFA).

The ACCOUNTADMIN role should be given to any user who needs a high level of authority.

The ACCOUNTADMIN role should be used to create Snowflake objects.

The ACCOUNTADMIN role should be used for running automated scripts.

Correct selection

The ACCOUNTADMIN role should be assigned to at least two users.

Overall explanation

Snowflake strongly recommends the following precautions when assigning the ACCOUNTADMIN role to users:

• Assign this role only to a select/limited number of people in your organization.

• All users assigned the ACCOUNTADMIN role should also be required to use multi-
factor authentication (MFA) for login (for details, see Configuring Access Control).

• Assign this role to at least two users. We follow strict security procedures for
resetting a forgotten or lost password for users with the ACCOUNTADMIN role. These
procedures can take up to two business days. Assigning the ACCOUNTADMIN role to
more than one user avoids having to go through these procedures because the users
can reset each other’s passwords.

See this link.

Question 38Skipped

A complex SQL query involving eight tables with joins is taking a while to execute. The Query
Profile shows that all partitions are being scanned.

What is causing the query performance issue?

The columns in the micro-partitions need granular ordering based on the dataset.

A huge volume of data is being fetched, with many joins applied.

Incorrect joins are being used, leading to scanning and pulling too many records.

Correct answer

Pruning is not being performed efficiently.

Overall explanation

Key is: Query Profile shows that all partitions are being scanned.

The efficiency of pruning can be observed by comparing Partitions scanned and Partitions
total statistics in the TableScan operators. If the former is a small fraction of the latter,
pruning is efficient. If not, the pruning did not have an effect.

See this link.

Question 39Skipped

What does Snowflake's search optimization service support?

Materialized views

External tables

Casts on table columns (except for fixed-point numbers cast to strings)

Correct answer

Tables that use masking policies and row access policies.

Overall explanation

The search optimization service does not support the following:

• External tables.

• Materialized views.

• Columns defined with a COLLATE clause.

• Column concatenation.

• Analytical expressions.

• Casts on table columns (except for fixed-point numbers cast to strings).

See this link.

And this link.

Question 40Skipped

Which object-level parameters can be set to help control query processing and concurrency?
(Choose two).

Correct selection

STATEMENT_QUEUED_TIMEOUT_IN_SECONDS

Correct selection

STATEMENT_TIMEOUT_IN_SECONDS

MAX_CONCURRENCY_LEVEL

MIN_DATA_RETENTION_TIME_IN_DAYS

DATA_RETENTION_TIME_IN_DAYS

Overall explanation

Query Processing and Concurrency

The number of queries that a warehouse can concurrently process is determined by the size
and complexity of each query. As queries are submitted, the warehouse calculates and
reserves the compute resources needed to process each query. If the warehouse does not
have enough remaining resources to process a query, the query is queued, pending
resources that become available as other running queries complete.

Snowflake provides some object-level parameters that can be set to help control query
processing and concurrency:

STATEMENT_QUEUED_TIMEOUT_IN_SECONDS

STATEMENT_TIMEOUT_IN_SECONDS

See this link.

Question 41Skipped

Which statements reflect valid commands when using secondary roles? (Choose two.)

Correct selection

USE SECONDARY ROLES ALL

USE SECONDARY ROLES SUSPEND

USE SECONDARY ROLES RESUME

USE SECONDARY ROLES ADD

Correct selection

USE SECONDARY ROLES NONE

Overall explanation

See this link.

Question 42Skipped

What is the MINIMUM size requirement when creating a Snowpark-optimized virtual warehouse?

Small

X-Small

Large

Correct answer

Medium

Overall explanation

See this link.

Question 43Skipped

A user needs to MINIMIZE the cost of large tables that are used to store transitory data. The
data does not need to be protected against failures, because the data can be reconstructed
outside of Snowflake.

What table type should be used?

External

Directory

Permanent

Correct answer

Transient

Overall explanation

Transient tables are best suited to scenarios where the data in your table is not critical and can be recovered by external means if required. They have no Fail-safe period, and their Time Travel retention is limited to 1 day (which can also be set to 0).

See this link.

Question 44Skipped

The use of which technique or tool will improve Snowflake query performance on very large
tables?

Materialized views

Indexing

Multi-clustering

Correct answer

Clustering keys

Overall explanation

A clustering key is a subset of columns in a table (or expressions on a table) that are explicitly
designated to co-locate the data in the table in the same micro-partitions. This is useful for
very large tables where the ordering was not ideal (at the time the data was
inserted/loaded) or extensive DML has caused the table’s natural clustering to degrade.

See this link.
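A minimal sketch of defining a clustering key and then checking its effect (the table and column names are hypothetical):

```sql
-- Co-locate rows by the columns most often used in filters, then inspect
-- clustering quality; a low average depth indicates good clustering.
ALTER TABLE sales CLUSTER BY (sale_date, region);
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(sale_date, region)');
```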

Question 45Skipped

When using the ALLOW_CLIENT_MFA_CACHING parameter, how long is a cached Multi-Factor Authentication (MFA) token valid for?

Correct answer

4 hours

1 hour

2 hours

8 hours

Overall explanation

MFA token caching can help to reduce the number of prompts that must be acknowledged
while connecting and authenticating to Snowflake, especially when multiple connection
attempts are made within a relatively short time interval. A cached MFA token is valid for up
to four hours.

See this link.

Question 46Skipped

What metadata does Snowflake store concerning all rows stored in a micro-partition?
(Choose two.)

Correct selection

The number of distinct values for each column in the micro-partition

The range of values for each of the rows in the micro-partition

Correct selection

The range of values for each of the columns in the micro-partition

The range of values for each partition in the micro-partition

A count of the number of total values in the micro-partition

Overall explanation

Snowflake stores metadata about all rows stored in a micro-partition, including:

• The range of values for each of the columns in the micro-partition.

• The number of distinct values.

• Additional properties used for both optimization and efficient query processing.

See this link.

Question 47Skipped

A Snowflake user wants to unload data from a relational table sized 5 GB using CSV. The
extract needs to be as performant as possible.

What should the user do?

Correct answer

Leave the default MAX_FILE_SIZE to 16 MB to take advantage of parallel operations.

Use a regular expression in the stage specification of the COPY command to restrict
parsing time.

Use Parquet as the unload file format, using Parquet's default compression feature.

Increase the default MAX_FILE_SIZE to 5 GB and set SINGLE = true to produce a single file.

Overall explanation

By default, COPY INTO location statements separate table data into a set of output files to
take advantage of parallel operations. The maximum size for each file is set using the
MAX_FILE_SIZE copy option. The default value is 16777216 (16 MB) but can be increased to
accommodate larger files. The maximum file size supported is 5 GB for Amazon S3, Google
Cloud Storage, or Microsoft Azure stages.

See this link.

Question 48Skipped

Why would a Snowflake user choose to use a transient table?

Correct answer

To store transitory data that needs to be maintained beyond the session

To store data for long-term analysis

To create a permanent table for ongoing use in ELT

To store large data files that are used frequently

Overall explanation

Transient tables are specifically designed for transitory data that needs to be maintained beyond each session.

See this link.

Question 49Skipped

Which kind of Snowflake table stores file-level metadata for each file in a stage?

Transient

External

Correct answer

Directory

Temporary

Overall explanation

A directory table is an implicit object layered on a stage (not a separate database object) and
is conceptually similar to an external table because it stores file-level metadata about the
data files in the stage. A directory table has no grantable privileges of its own.

See this link.

Question 50Skipped

Which statement describes when a virtual warehouse can be resized?

A resize will affect running, queued, and new queries.

Correct answer

A resize can be completed at any time.

A resize can only be completed when the warehouse is in an auto-resume status.

A resize must be completed when the warehouse is suspended.

Overall explanation

A warehouse can be resized up or down at any time, including while it is running and
processing statements.

See this link.

Question 51Skipped

User A cloned a schema and overwrote a schema that User B was working on. User B no
longer has access to their version of the tables. However, this all occurred within the Time
Travel retention period defined at the database level.

How should the missing tables be restored?

Correct answer

Rename the cloned schema and use an UNDROP SCHEMA statement.

Contact Snowflake Support to retrieve the data from Fail-safe

Use an UNDROP TABLE statement.

Use a CREATE TABLE AS SELECT statement

Overall explanation

If an object with the same name already exists, UNDROP fails. You must rename the existing
object, which then enables you to restore the previous version of the object.

See this link.
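A minimal sketch of the restore sequence (the schema names are hypothetical):

```sql
-- The clone now occupies the schema name, so rename it first,
-- then restore the original schema from Time Travel.
ALTER SCHEMA analytics RENAME TO analytics_clone;
UNDROP SCHEMA analytics;
```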

Question 52Skipped

What happens when the size of a virtual warehouse is changed?

Queries that are running on the current warehouse configuration are moved to the new
configuration and finished there.

Correct answer

Queries that are running on the current warehouse configuration are not impacted.

Queries that are running on the current warehouse configuration are aborted and have to
be resubmitted by the user.

Queries that are running on the current warehouse configuration are aborted and are
automatically resubmitted.

Overall explanation

Resizing a running warehouse does not impact queries that are already being processed by
the warehouse; the additional compute resources, once fully provisioned, are only used for
queued and new queries.

See this link.

Question 53Skipped

Which use case will always cause an exploding join in Snowflake?

Correct answer

A query that has not specified join criteria for tables.

A query that has more than 10 left outer joins.

A query that is using a UNION without an ALL.

A query that has requested too many columns of data.

Overall explanation

It is also called a Cartesian product.

See this link.

In the Query Profile, an exploding join appears as a Join operator that produces far more output rows than either of its inputs.

Question 54Skipped

Which user object property requires contacting Snowflake Support in order to set a value for
it?

Correct answer

MINS_TO_BYPASS_NETWORK_POLICY

DISABLED

MINS_TO_BYPASS_MFA

MINS_TO_UNLOCK

Overall explanation

It is possible to temporarily bypass a network policy for a set number of minutes by configuring the user object property MINS_TO_BYPASS_NETWORK_POLICY, which can be viewed by executing DESCRIBE USER. Only Snowflake can set the value for this object property. Please contact Snowflake Support to set a value for this property.

See this link.

Question 55Skipped

Which command would return an empty sample?

select * from testtable sample (null);

Correct answer

select * from testtable sample (0);

select * from testtable sample (none);

select * from testtable sample ();

Overall explanation

Return an empty sample: SELECT * FROM testtable SAMPLE ROW (0);

See this link.

Question 56Skipped

A tag object has been assigned to a table (TABLE_A) in a schema within a Snowflake
database.

Which CREATE object statement will automatically assign the TABLE_A tag to a target object?

CREATE MATERIALIZED VIEW AS SELECT * FROM TABLE_A;

CREATE VIEW AS SELECT * FROM TABLE_A;

CREATE TABLE AS SELECT * FROM TABLE_A;

Correct answer

CREATE TABLE LIKE TABLE_A;

Overall explanation

With CREATE TABLE … LIKE, tags assigned to the source table are assigned to the target table

See this link.

Question 57Skipped

Which object type is granted permissions for reading a table?

Attribute

Schema

User

Correct answer

Role

Overall explanation

See this link.

Question 58Skipped

Which virtual warehouse privilege is required to view a load-monitoring chart?

OPERATE

MODIFY

USAGE

Correct answer

MONITOR

Overall explanation

See this link.

Question 59Skipped

What does Snowflake recommend regarding database object ownership? (Choose two.)

Use only managed access schemas for objects owned by ACCOUNTADMIN.

Create objects with SECURITYADMIN to ease granting of privileges later.

Create objects with ACCOUNTADMIN and do not reassign ownership.

Correct selection

Create objects with a custom role and grant this role to SYSADMIN.

Correct selection

Create objects with SYSADMIN.

Overall explanation

SYSADMIN - Role that has privileges to create warehouses and databases (and other objects)
in an account.

If, as recommended, you create a role hierarchy that ultimately assigns all custom roles to
the SYSADMIN role, this role also has the ability to grant privileges on warehouses,
databases, and other objects to other roles.

See this link.

Question 60Skipped

Which operation can be performed on Snowflake external tables?

DELETE

RENAME

Correct answer

ALTER

INSERT

Overall explanation

External tables are read-only. You cannot perform data manipulation language (DML)
operations on them. However, you can use external tables for query and join operations. You
can also create views against external tables.

ALTER EXTERNAL TABLE

Modifies the properties, columns, or constraints for an existing external table.

See this link.

Question 61Skipped

What is used to extract the content of PDF files stored in Snowflake stages?

FLATTEN function

HyperLogLog (HLL) function

Correct answer

Java User-Defined Function (UDF)

Window function

Overall explanation

See this link.

This quickstart is a good example.

Question 62Skipped

What does it mean when the sample function uses the Bernoulli sampling method?

Correct answer

The data is based on sampling every row.

The data is based on sampling 1000 rows of the source data.

The data is based on sampling blocks of the source data.

The data is based on sampling 10% of the source data.

Overall explanation

BERNOULLI (or ROW): Includes each row with a probability of p/100. Similar to flipping a
weighted coin for each row.

See this link.
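A minimal sketch contrasting the two sampling methods, using the testtable name from the question above:

```sql
-- BERNOULLI (or ROW) sampling: each row is included with probability 10%.
SELECT * FROM testtable SAMPLE BERNOULLI (10);

-- SYSTEM (or BLOCK) sampling: whole blocks of rows are included instead.
SELECT * FROM testtable SAMPLE SYSTEM (10);
```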

Question 63Skipped

Where can a Snowflake user find the query history in Snowsight?

Correct answer

Activity

Data

Dashboards

Admin

Overall explanation

See this link.

Question 64Skipped

What is the default period of time the Warehouse Activity section provides a graph of
Snowsight activity?

1 week

2 hours

1 month

Correct answer

2 weeks

Overall explanation

Warehouse Activity

The Warehouse Activity section provides a graph of activity over a period of time:

• Hour

• Day

• Week

• 2 Weeks (default)

See this link.

Question 65Skipped

How does Snowflake define its approach to Discretionary Access Control (DAC)?

A defined level of access to an object.

An entity to which access can be granted.

Access privileges are assigned to roles, which are in turn assigned to users.

Correct answer

Each object has an owner, who can in turn grant access to that object.

Overall explanation

Discretionary Access Control (DAC): Each object has an owner, who can in turn grant access
to that object.

See this link.

Question 66Skipped

What MINIMUM privilege is required on the external stage for any role in the GET REST API
to access unstructured data files using a file URL?

Correct answer

USAGE

OWNERSHIP

WRITE

READ

Overall explanation

External: USAGE

Internal: READ

See this link.

Question 67Skipped

What actions can be performed by a consumer account on a shared database? (Choose two.)

Correct selection

Joining the data from a shared table with another table

Cloning a shared table

Using Time Travel on a shared table

Modifying the data in a shared table

Correct selection

Executing the SELECT statement on a shared table

Overall explanation

See this link.

Question 68Skipped

Which parameter prevents streams on tables from becoming stale?

MIN_DATA_RETENTION_TIME_IN_DAYS

Correct answer

MAX_DATA_EXTENSION_TIME_IN_DAYS

LOCK_TIMEOUT

STALE_AFTER

Overall explanation

See this link.
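As a sketch (table and stream names are hypothetical), raising this parameter lets Snowflake extend the table's retention period so a stream's offset stays inside it:

```sql
-- Allow Snowflake to extend data retention up to 30 days so that
-- streams on this table do not become stale as quickly
ALTER TABLE mytable SET MAX_DATA_EXTENSION_TIME_IN_DAYS = 30;

-- The STALE_AFTER column shows a stream's staleness deadline
SHOW STREAMS LIKE 'my_stream';
```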

Question 69Skipped

Which activities are managed by Snowflake’s Cloud Services layer? (Choose two.)

Data compression

Correct selection

Authentication

Access delegation

Data pruning

Correct selection

Query parsing and optimization

Overall explanation

Services managed in this layer include:

• Authentication

• Infrastructure management

• Metadata management

• Query parsing and optimization

• Access control

See this link.

Question 70Skipped

What are reasons for using the VALIDATE function in Snowflake after a COPY INTO command
execution? (Choose two.)

To count the number of errors encountered during the execution of the COPY INTO
command

Correct selection

To validate the files that have been loaded earlier using the COPY INTO command

Correct selection

To return errors encountered during the execution of the COPY INTO command

To identify potential issues in the COPY INTO command before it is executed

To fix errors that were made during the execution of the COPY INTO command

Overall explanation

VALIDATE function validates the files loaded in a past execution of the COPY INTO <table>
command and returns all the errors encountered during the load, rather than just the first
error.

See this link.

Question 71Skipped

Two users share a virtual warehouse named WH_DEV_01. When one of the users loads data,
the other one experiences performance issues while querying data.

How does Snowflake recommend resolving this issue?

Stop loading and querying data at the same time

Create separate warehouses for each user

Correct answer

Create separate warehouses for each workload

Scale up the existing warehouse

Overall explanation

It is always good practice to separate the loading and transformation/analysis warehouses to adjust the size that best suits each case.

If the running query load is high or there’s queuing, consider starting a separate warehouse
and moving queued queries to that warehouse.

See this link.

Question 72Skipped

Which function should be used to find the query ID of the second query executed in a
current session?

Correct answer

1. Select LAST_QUERY_ID(2)

1. Select LAST_QUERY_ID(1)

1. Select LAST_QUERY_ID(-1)

1. Select LAST_QUERY_ID(-2)

Overall explanation

Positive numbers start with the first query executed in the session. For example:

LAST_QUERY_ID(1) returns the first query.

LAST_QUERY_ID(2) returns the second query.

LAST_QUERY_ID(6) returns the sixth query.

Etc.

Negative numbers start with the most recently-executed query in the session. For example:

LAST_QUERY_ID(-1) returns the most recently-executed query (equivalent to LAST_QUERY_ID()).

LAST_QUERY_ID(-2) returns the second most recently-executed query.

See this link.

Question 73Skipped

Which command should a Snowflake user execute to load data into a table?

1. copy into mytable file_format = (format_name);

1. copy into mytable validation = ‘RETURN_ERRORS’;

Correct answer

1. copy into mytable from @my_int_stage;

1. copy into mytable purge_mode = TRUE;

Overall explanation

See this link to check syntax.

Question 74Skipped

Which view will return users who have queried a table?

Correct answer

SNOWFLAKE.ACCOUNT_USAGE.ACCESS_HISTORY

SNOWFLAKE.ACCOUNT_USAGE.OBJECT_DEPENDENCIES

SNOWFLAKE.ACCOUNT_USAGE.COLUMNS

SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_EVENT_HISTORY

Overall explanation

See this link.
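A sketch (database, schema, and table names are hypothetical) of how ACCESS_HISTORY can be filtered down to the users who queried a given table, by flattening the accessed-objects array:

```sql
-- One row per (query, object accessed); filter on the fully-qualified name
SELECT DISTINCT user_name
FROM SNOWFLAKE.ACCOUNT_USAGE.ACCESS_HISTORY,
     LATERAL FLATTEN(input => base_objects_accessed) f
WHERE f.value:"objectName"::STRING = 'MYDB.MYSCHEMA.MYTABLE';
```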

Question 75Skipped

What does Snowflake attempt to do if any of the compute resources for a virtual warehouse
fail to provision during start-up?

Queue the failed resources.

Correct answer

Repair the failed resources.

Restart the failed resources.

Provision the failed resources.

Overall explanation

If any of the compute resources for the warehouse fail to provision during start-up,
Snowflake attempts to repair the failed resources.

See this link.

Question 76Skipped

By default, which role has privileges to create tables and views in an account?

USERADMIN

PUBLIC

Correct answer

SYSADMIN

SECURITYADMIN

Overall explanation

The SYSADMIN role is a system-defined role that has privileges to create warehouses,
databases, and database objects in an account and grant those privileges to other roles.

See this link.

Question 77Skipped

What is a characteristic of the Snowflake query profiler?

It provides detailed statistics about which queries are using the greatest number of
compute resources.

Correct answer

It provides a graphic representation of the main components of the query processing.

It can be used by third-party software using the query profiler API.

It can provide statistics on a maximum number of 100 queries per week.

Overall explanation

Query Profile, available through the classic web interface, provides execution details for a
query. For the selected query, it provides a graphical representation of the main components
of the processing plan for the query, with statistics for each component, along with details
and statistics for the overall query.

See this link.

Question 78Skipped

Snowflake strongly recommends that all users with what type of role be required to use
Multi-Factor Authentication (MFA)?

Correct answer

ACCOUNTADMIN

USERADMIN

SECURITYADMIN

SYSADMIN

Overall explanation

All users assigned the ACCOUNTADMIN role should also be required to use multi-factor
authentication (MFA) for login.

See this link.

Question 79Skipped

A JSON object is loaded into a column named data using a Snowflake variant datatype. The
root node of the object is BIKE. The child attribute for this root node is BIKEID.

Which statement will allow the user to access BIKEID?

1. select data.BIKE.BIKEID

Correct answer

1. select data:BIKE.BIKEID

1. select data:BIKE:BIKEID

1. select data:BIKEID

Overall explanation

SYNTAX --> COLUMN:FirstElement.SubsequentElement

See this link.
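A self-contained sketch (table name is hypothetical) showing the colon-then-dot traversal:

```sql
CREATE OR REPLACE TABLE rides (data VARIANT);
INSERT INTO rides
  SELECT PARSE_JSON('{"BIKE": {"BIKEID": 123}}');

-- Colon for the first-level element, dot for child attributes
SELECT data:BIKE.BIKEID FROM rides;
```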

Question 80Skipped

Which command should be used when loading many flat files into a single table?

Correct answer

COPY INTO

INSERT

MERGE

PUT

Overall explanation

See this link.

Question 81Skipped

What technique does Snowflake recommend for determining which virtual warehouse size
to select?

Always start with an X-Small and increase the size if the query does not complete in 2
minutes

Use X-Large or above for tables larger than 1 GB

Correct answer

Experiment by running the same queries against warehouses of different sizes

Use the default size Snowflake chooses

Overall explanation

The keys to using warehouses effectively and efficiently are:

1. Experiment with different types of queries and different warehouse sizes to determine the combinations that best meet your specific query needs and workload.

2. Don’t focus on warehouse size. Snowflake utilizes per-second billing, so you can run
larger warehouses (Large, X-Large, 2X-Large, etc.) and simply suspend them when not
in use.

See this link.

Question 82Skipped

What is to be expected when sharing worksheets in Snowsight?

Snowsight allows users to view and refresh results but not to edit shared worksheets.

Snowsight offers different sharing permissions at the worksheet, folder, and dashboard
level.

Correct answer

To run a shared worksheet, a user must be granted the role used for the worksheet session
context.

Worksheets can be shared with users that are internal or external to any organization.

Overall explanation

When you share a worksheet with someone, you can manage access to the worksheet and
its contents by choosing which permissions to grant to the other user. These permissions are
also used for sharing dashboards. Worksheet owners have the same permissions as
worksheet editors.

Each worksheet in Snowsight uses a unique session with a specific role and warehouse
assigned in the context of the worksheet. The worksheet role is the primary role last used to
run the worksheet and is required to run the worksheet. The worksheet role can change if
the worksheet owner or editor runs the worksheet using a different role.

See this link.

Question 83Skipped

Which data types are valid in Snowflake? (Choose two.)

Correct selection

Geography

BLOB

Correct selection

Variant

CLOB

JSON

Overall explanation

See this link.

Question 84Skipped

Other than ownership what privileges does a user need to view and modify resource
monitors in Snowflake? (Choose two.)

DROP

ALTER

Correct selection

MONITOR

Correct selection

MODIFY

CREATE

Overall explanation

See this link.

Question 85Skipped

How can a Snowflake user post-process the result of SHOW FILE FORMATS?

Assign the command to RESULTSET.

Put it in the FROM clause in brackets.

Correct answer

Use the RESULT_SCAN function.

Create a CURSOR for the command.

Overall explanation

RESULT_SCAN function returns the result set of a previous command (within 24 hours of
when you executed the query) as if the result was a table.

1. SHOW FILE FORMATS;

2. SELECT * FROM TABLE(RESULT_SCAN(LAST_QUERY_ID(-1))) ;

See this link.

Question 86Skipped

In the Data Exchange, who can get or request data from the listings? (Choose two.)

Correct selection

Users with IMPORT SHARE privilege

Users with SYSADMIN role

Correct selection

Users with ACCOUNTADMIN role

Users with MANAGE GRANTS privilege

Users with ORGADMIN role

Overall explanation

All users can browse listings in the Data Exchange, but only users with the ACCOUNTADMIN
role or the IMPORT SHARE privilege can get or request data.

See this link.

Question 87Skipped

How does a Snowflake user execute an anonymous block of code?

The user must run the CALL command to execute the block.

The block must be saved to a worksheet and executed using a connector.

The SUBMIT command must run immediately after the block is defined

Correct answer

The statements that define the block must also execute the block.

Overall explanation

The BEGIN … END statement that defines the block also executes the block. (You don’t run a
separate CALL command to execute the block.)

See this link.

Question 88Skipped

Several users are using the same virtual warehouse. The users report that the queries are
running slowly, and that many queries are being queued.

What is the recommended way to resolve this issue?

Correct answer

Increase the warehouse MAX_CLUSTER_COUNT parameter.

Reduce the warehouse STATEMENT_QUEUED_TIMEOUT_IN SECONDS parameter.

Reduce the warehouse AUTO_SUSPEND parameter.

Increase the warehouse MAX_CONCURRENCY_LIMIT parameter.

Overall explanation

We have to solve a concurrency issue here, and scaling out is the solution, so the most suitable option is to increase the cluster count.

See this link.
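As a sketch (warehouse name and counts are hypothetical; multi-cluster warehouses require Enterprise Edition or higher), the fix is to raise the cluster count so the warehouse can scale out under concurrency:

```sql
-- Extra clusters start automatically when queries begin to queue
ALTER WAREHOUSE wh_dev_01 SET
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3;
```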

Question 89Skipped

Which features could be used to improve the performance of queries that return a small
subset of rows from a large table? (Choose two.)

Secure views

Row access policies

Correct selection

Search optimization service

Multi-cluster virtual warehouses

Correct selection

Automatic clustering

Overall explanation

The search optimization service aims to significantly improve the performance of certain
types of queries on tables, including:

• Selective point lookup queries on tables. A point lookup query returns only one or a
small number of distinct rows. (...)

See this link.

Typically, queries benefit from clustering when the queries filter or sort on the clustering key
for the table. Sorting is commonly done for ORDER BY operations, for GROUP BY operations,
and for some joins.

See this link.

Question 90Skipped

Which role has the ability to create a share from a shared database by default?

SYSADMIN

Correct answer

ACCOUNTADMIN

ORGADMIN

SECURITYADMIN

Overall explanation

By default, the privileges required to create and manage shares are granted only to
the ACCOUNTADMIN role, ensuring that only account administrators can perform these
tasks.

See this link.

Question 91Skipped

If file format options are specified in multiple locations, the load operation selects which
option FIRST to apply in order of precedence?

Correct answer

COPY INTO TABLE statement

Stage definition

Table definition

Session level

Overall explanation

If file format options are specified in multiple locations, the load operation applies the
options in the following order of precedence:

1. COPY INTO TABLE statement.

2. Stage definition.

3. Table definition.

See this link.
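A sketch (table, stage, and format names are hypothetical) of the highest-precedence location — a format named directly in the statement overrides any format set on the stage or table:

```sql
-- This FILE_FORMAT wins over the stage definition and the table definition
COPY INTO mytable
FROM @mystage
FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');
```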

Question 92Skipped

What is SnowSQL?

Snowflake's new user interface where users can visualize data into charts and dashboards.

Snowflake's proprietary extension of the ANSI SQL standard, including built-in keywords
and system functions.

Snowflake's library that provides a programming interface for processing data on Snowflake without moving it to the system where the application code runs.

Correct answer

Snowflake's command line client built on the Python connector which is used to connect
to Snowflake and execute SQL.

Overall explanation

SnowSQL is the command line client for connecting to Snowflake to execute SQL queries and
perform all DDL and DML operations, including loading data into and unloading data out of
database tables.

SnowSQL (snowsql executable) can be run as an interactive shell or in batch mode through
stdin or using the -f option.

SnowSQL is an example of an application developed using the Snowflake Connector for Python; however, the connector is not a prerequisite for installing SnowSQL. All required software for installing SnowSQL is bundled in the installers.

See this link.

Question 93Skipped

What is the recommended way to obtain a cloned table with the same grants as the source
table?

Correct answer

Clone the table with the COPY GRANTS command.

Create a script to extract grants and apply them to the cloned table.

Use an ALTER TABLE command to copy the grants.

Clone the schema then drop the unwanted tables.

Overall explanation

See this link.
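A minimal sketch (table names are hypothetical) of cloning with the grants carried over in one statement:

```sql
-- The clone inherits the privileges granted on the source table
CREATE TABLE orders_clone CLONE orders COPY GRANTS;
```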

Question 94Skipped

A Query Profile shows a UnionAll operator with an extra Aggregate operator on top.

What does this signify?

Correct answer

UNION without ALL

Inefficient pruning

Queries that are too large to fit in memory

Exploding joins

Overall explanation

UNION Without ALL

In SQL, it is possible to combine two sets of data with either UNION or UNION ALL
constructs. The difference between them is that UNION ALL simply concatenates inputs,
while UNION does the same, but also performs duplicate elimination.

A common mistake is to use UNION when the UNION ALL semantics are sufficient. These
queries show in Query Profile as a UnionAll operator with an extra Aggregate operator on
top (which performs duplicate elimination).

See this link.

Question 95Skipped

For the ALLOWED_VALUES tag property, what is the MAXIMUM number of possible string
values for a single tag?

256

10

Correct answer

300

64

Overall explanation

The ALLOWED_VALUES tag property enables specifying the possible string values that can be
assigned to the tag when the tag is set on an object. The maximum number of possible
string values for a single tag is 300.

See this link.
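As a sketch (tag, values, and warehouse name are hypothetical), ALLOWED_VALUES restricts what can later be assigned to the tag:

```sql
CREATE TAG cost_center ALLOWED_VALUES 'finance', 'engineering', 'sales';

-- Only one of the allowed strings can be assigned when setting the tag
ALTER WAREHOUSE wh_dev_01 SET TAG cost_center = 'finance';
```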

Question 96Skipped

What is the difference between a stored procedure and a User-Defined Function (UDF)?

Multiple stored procedures can be called as part of a single executable statement while a
single SQL statement can only call one UDF at a time.

Returning a value is required in a stored procedure while returning values in a UDF is optional.

Values returned by a stored procedure can be used directly in a SQL statement while the
values returned by a UDF cannot.

Correct answer

Stored procedures can execute database operations while UDFs cannot.

Overall explanation

See this link.

Question 97Skipped

What are the main differences between the account usage views and the information
schema views? (Choose two.)

No active warehouse is needed to query account usage views but one is needed to query
information schema views.

Account usage views do not contain data about tables but information schema views do.

Correct selection

Account usage views contain dropped objects but information schema views do not.

Information schema views are read-only but account usage views are not.

Correct selection

Data retention for account usage views is 1 year but is 7 days to 6 months for information
schema views, depending on the view.

Overall explanation

Other difference between ACCOUNT USAGE and INFORMATION SCHEMA is latency.

See this link.

Question 98Skipped

How long is a query visible in the Query History page in the Snowflake Web Interface (UI)?

Correct answer

14 days

24 hours

30 days

60 minutes

Overall explanation

See this link.

Question 99Skipped

How is the hierarchy of database objects organized in Snowflake?

A schema consists of one or more databases. A database contains tables, views, and
warehouses.

A schema consists of one or more databases. A database contains tables and views.

Correct answer

A database consists of one or more schemas. A schema contains tables and views.

A database consists of one of more schemas and warehouses. A schema contains tables
and views.

Overall explanation

See this link.

Question 100Skipped

As a best practice, all custom roles should be granted to which system-defined role?

ACCOUNTADMIN

SECURITYADMIN

Correct answer

SYSADMIN

ORGADMIN

Overall explanation

See this link.

Question 101Skipped

How does Snowflake recommend handling the bulk loading of data batches from files
already available in cloud storage?

Use the INSERT command.

Use an external table.

Use Snowpipe.

Correct answer

Use the COPY command.

Overall explanation

Key concept is Bulk.

See this link.

Question 102Skipped

Which constraint type is enforced in Snowflake from the ANSI SQL standard?

UNIQUE

FOREIGN KEY

PRIMARY KEY

Correct answer

NOT NULL

Overall explanation

Snowflake supports defining and maintaining constraints, but does not enforce them, except
for NOT NULL constraints, which are always enforced.

See this link.

Question 103Skipped

A user wants to add additional privileges to the system-defined roles for their virtual
warehouse.

How does Snowflake recommend they accomplish this?

Grant the additional privileges to the SYSADMIN role.

Correct answer

Grant the additional privileges to a custom role.

Grant the additional privileges to the ORGADMIN role.

Grant the additional privileges to the ACCOUNTADMIN role.

Overall explanation

Although additional privileges can be granted to the system-defined roles, it is not recommended.

If additional privileges are needed, Snowflake recommends granting the additional privileges
to a custom role and assigning the custom role to the system-defined role.

See this link.

Question 104Skipped

What is the MAXIMUM number of clusters that can be provisioned with a multi-cluster virtual warehouse?

100

Correct answer

10

Overall explanation

See this link.

Question 105Skipped

A user creates a stage using the following command:

1. CREATE STAGE mystage

2. DIRECTORY = (ENABLE = TRUE)

3. FILE_FORMAT = myformat;

What will be the outcome?

Correct answer

A stage with a directory table that has metadata that must be manually refreshed will be
created.

The command will fail to run because the name of the directory table is not specified.

An error will be received stating that the storage location for the stage must be identified
when creating a stage with a directory table.

A stage with a directory table set to automatically refresh will be created.

Overall explanation

Check the statement, it is creating an internal stage.

ENABLE = TRUE | FALSE

Specifies whether to add a directory table to the stage. When the value is TRUE, a directory
table is created with the stage.

See this link.

Directory tables on internal stages require manual metadata refreshes. You could also
choose to include a directory table on external stages and refresh the metadata manually.
For information about automated metadata refreshes, see automated metadata refreshes.
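Continuing the example from the question (mystage is the internal stage created above), the manual refresh and a query against the directory table look like:

```sql
-- Directory tables on internal stages require a manual metadata refresh
ALTER STAGE mystage REFRESH;

-- After the refresh, the directory table can be queried
SELECT * FROM DIRECTORY(@mystage);
```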

Question 106Skipped

When executing a COPY INTO command, performance can be negatively affected by using
which optional parameter on a large number of files?

FILES

FILE_FORMAT

Correct answer

PATTERN

VALIDATION_MODE

Overall explanation

See this link.

Question 107Skipped

Which file function provides a URL with access to a file on a stage without the need for
authentication and authorization?

BUILD_SCOPED_FILE_URL

BUILD_STAGE_FILE_URL

GET_RELATIVE_PATH

Correct answer

GET_PRESIGNED_URL

Overall explanation

Generates the pre-signed URL to a staged file using the stage name and relative file path as
inputs. Access files in an external stage using the function.

See this link.
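A sketch (stage name and file path are hypothetical) of generating a pre-signed URL for a file on an external stage:

```sql
-- Third argument is the URL validity in seconds (optional, defaults to 3600)
SELECT GET_PRESIGNED_URL(@my_ext_stage, 'docs/report.pdf', 3600);
```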

Question 108Skipped

Why do Snowflake’s virtual warehouses have scaling policies?

To help increase the performance of serverless computing features

Correct answer

To help control the credits consumed by a multi-cluster warehouse running in auto-scale mode

To help control the credits consumed by a multi-cluster warehouse running in maximized mode

To help save extra storage costs

Overall explanation

To help control the credits consumed by a multi-cluster warehouse running in Auto-scale mode, Snowflake provides scaling policies, which are used to determine when to start or shut down a cluster.

See this link.

Question 109Skipped

Which function is used to profile warehouse credit usage?

WAREHOUSE_LOAD_HISTORY

MATERIALIZED_VIEW_REFRESH_HISTORY

AUTOMATIC_CLUSTERING_HISTORY

Correct answer

WAREHOUSE_METERING_HISTORY

Overall explanation

See this link.
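A sketch of the table function form of this view (the 7-day window is an arbitrary example):

```sql
-- Credits consumed per warehouse over the last 7 days
SELECT *
FROM TABLE(INFORMATION_SCHEMA.WAREHOUSE_METERING_HISTORY(
  DATEADD('day', -7, CURRENT_DATE())));
```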

Question 110Skipped

When initially creating an account in Snowflake, which settings can be specified? (Choose
two.)

Correct selection

Snowflake edition

Region

Correct selection

Account name

Account locator

Organization name

Overall explanation

Be cautious with this question: it asks you to choose 2 options, but there are actually 3 correct options: Account name, Region, and Snowflake edition. In the real exam, choosing 2 of the 3 valid options will be counted as a correct answer.

See this link.

Question 111Skipped

Which chart type does Snowsight support to visualize worksheet data?

Box plot

Bubble chart

Correct answer

Scatterplot

Pie chart

Overall explanation

Snowsight supports the following types of charts:

• Bar charts

• Line charts

• Scatterplots

• Heat grids

• Scorecards

See this link.

Question 112Skipped

If a multi-cluster warehouse is using an economy scaling policy, how long will queries wait in
the queue before another cluster is started?

2 minutes

8 minutes

Correct answer

6 minutes

1 minute

Overall explanation

See this link.

Question 113Skipped

Given the statement template below, which database objects can be added to a share?
(Choose two.)

GRANT <privilege> ON <object_type> <object_name> TO SHARE <share_name>;

Streams

Correct selection

Secure functions

Stored procedures

Tasks

Correct selection

Tables

Overall explanation

See this link.
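A sketch (database, schema, object, and share names are hypothetical) of granting shareable objects — tables and secure functions — to a share, along with the container grants they require:

```sql
GRANT USAGE ON DATABASE mydb TO SHARE my_share;
GRANT USAGE ON SCHEMA mydb.public TO SHARE my_share;

-- Tables and secure functions can be added to a share
GRANT SELECT ON TABLE mydb.public.orders TO SHARE my_share;
GRANT USAGE ON FUNCTION mydb.public.my_secure_fn(NUMBER) TO SHARE my_share;
```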

Question 114Skipped

What common query issues can be identified using the Query Profile? (Choose two.)

Data classification

Correct selection

Inefficient pruning

Data masking

Unions

Correct selection

Exploding joins

Overall explanation

See this link.

Question 115Skipped

How can a Snowflake user validate data that is unloaded using the COPY INTO command?

Correct answer

Use the VALIDATION_MODE = RETURN_ROWS statement.

Load the data into a CSV file.

Use the VALIDATION_MODE = SQL statement.

Load the data into a relational table.

Overall explanation

String (constant) that instructs the COPY command to return the results of the query in the
SQL statement instead of unloading the results to the specified cloud storage location. The
only supported validation option is RETURN_ROWS. This option returns all rows produced by
the query.

When you have validated the query, you can remove the VALIDATION_MODE to perform the
unload operation.

See this link.
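A sketch (stage path and table name are hypothetical) of validating an unload before running it for real:

```sql
-- Returns the query result instead of writing files to the stage;
-- remove VALIDATION_MODE to perform the actual unload
COPY INTO @my_stage/unload/
FROM (SELECT * FROM mytable)
VALIDATION_MODE = 'RETURN_ROWS';
```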

Question 116Skipped

Why should a user select the economy scaling policy for a multi-cluster warehouse?

Correct answer

To conserve credits by keeping running clusters fully loaded

To increase performance of the clusters

To prevent/minimize query queuing

To reduce queuing concurrent user queries

Overall explanation

Economy

Conserves credits by favoring keeping running clusters fully-loaded rather than starting
additional clusters, which may result in queries being queued and taking longer to complete.

See this link.

Question 117Skipped

Which system function can be used to manage access to the data in a share and display
certain data only to paying customers?

SYSTEM$ALLOWLIST

SYSTEM$ALLOWLIST_PRIVATELINK

Correct answer

SYSTEM$IS_LISTING_PURCHASED

SYSTEM$AUTHORIZE_PRIVATELINK

Overall explanation

If you choose to limit trial consumers to specific data and functionality, create a single share
for your paid listing and use secure views and a system function provided by Snowflake,
SYSTEM$IS_LISTING_PURCHASED, to control which data is visible to trial consumers and
which data is available only to paying consumers.

See this link.

Question 118Skipped

A Snowflake account administrator has set the resource monitors as shown in the diagram,
with actions defined for each resource monitor as “Notify & Suspend Immediately”.

What is the MAXIMUM limit of credits that Warehouse 2 can consume?

Correct answer

5000

3500

1500

Overall explanation

Warehouse 2 is controlled by the policy at the account level, and all five warehouses' usage counts toward this limit. Having limits on other warehouses (in this example: 3, 4, 5) just means that warehouses 3, 4, and 5 can be suspended earlier when reaching their limits.

The question asks for the MAXIMUM, so the best-case scenario for Warehouse 2 is that the other warehouses don't consume ANY resources; in that case Warehouse 2 can burn the whole 5000-credit limit.

See this link.

Question 119Skipped

Which function produces a lateral view of a VARIANT column?

PARSE_JSON

Correct answer

FLATTEN

GET_PATH

GET

Overall explanation

See this link.
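A self-contained sketch (table and column names are hypothetical) of FLATTEN producing a lateral view with one row per array element:

```sql
CREATE OR REPLACE TABLE t (v VARIANT);
INSERT INTO t SELECT PARSE_JSON('{"ids": [1, 2, 3]}');

-- One output row per element of the ids array
SELECT f.value
FROM t, LATERAL FLATTEN(input => v:ids) f;
```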

Question 120Skipped

Which query will return a sample of a table with 1000 rows named testtable, in which each
row has a 10% probability of being included in the sample?

1. select * from testtable sample (0.1 rows);

Correct answer

1. select * from testtable sample (10);

1. select * from testtable sample (0.1);

1. select * from testtable sample (10 rows);

Overall explanation

See this link.

Question 121Skipped

Which features are included in Snowsight? (Choose two.)

Referencing SnowSQL

Downloading query result data larger than 100 MB

Changing the Snowflake account cloud provider

Correct selection

Exploring the Snowflake Marketplace

Correct selection

Worksheet sharing

Overall explanation

See this link.

Question 122Skipped

What is the default compression type when unloading data from Snowflake?

Brotli

bzip2

Zstandard

Correct answer

gzip

Overall explanation

By default, all unloaded data files are compressed using gzip, unless compression is explicitly
disabled or one of the other supported compression methods is explicitly specified.

See this link.

Question 123Skipped

What unit of storage supports efficient query processing in Snowflake?

Blobs

Block storage

Correct answer

Micro-partitions

JSON

Overall explanation

See this link.

Question 124Skipped

Which Snowflake objects track DML changes made to tables, like inserts, updates, and
deletes?

Tasks

Correct answer

Streams

Pipes

Procedures

Overall explanation

See this link.
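A minimal sketch (table and stream names are hypothetical) of creating a stream that captures DML changes:

```sql
CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

-- Each captured change carries METADATA$ACTION and METADATA$ISUPDATE
-- columns describing the insert/update/delete
SELECT * FROM orders_stream;
```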

Question 125Skipped

Which view can be used to determine if a table has frequent row updates or deletes?

TABLES

STORAGE_DAILY_HISTORY

Correct answer

TABLE_STORAGE_METRICS

STORAGE_USAGE

Overall explanation

See this link.

Question 126Skipped

A Snowflake user wants to share transactional data with retail suppliers. However, some of
the suppliers do not use Snowflake.

According to best practice, what should the Snowflake user do? (Choose two.)

Correct selection

Provide each non-Snowflake supplier with their own reader account.

Extract the shared transactional data to an external stage and use cloud storage utilities to
reload the suppliers' regions.

Correct selection

Use a data share for suppliers in the same cloud region and a replicated proxy share for
other cloud deployments.

Create an ETL pipeline that uses select and inserts statements from the source to the
target supplier accounts.

Deploy a single reader account to be shared by all of the non-Snowflake suppliers.

Overall explanation

See this link.

Question 127Skipped

If a virtual warehouse runs for 30 seconds after it is provisioned, how many seconds will the
customer be billed for?

121 seconds

30 seconds

Correct answer

60 seconds

1 hour

Overall explanation

See this link.

Question 128Skipped

What type of function can be used to estimate the approximate number of distinct values
from a table that has trillions of rows?

Window

External

Correct answer

HyperLogLog (HLL)

MD5

Overall explanation

Snowflake uses HyperLogLog to estimate the approximate number of distinct values in a data set. HyperLogLog is a state-of-the-art cardinality estimation algorithm, capable of estimating distinct cardinalities of trillions of rows with an average relative error of a few percent.

HyperLogLog can be used in place of COUNT(DISTINCT …) in situations where estimating cardinality is acceptable.
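A sketch (table and column names are hypothetical) of the HyperLogLog-based estimate:

```sql
-- APPROX_COUNT_DISTINCT trades exactness for speed and memory
SELECT APPROX_COUNT_DISTINCT(user_id) FROM events;

-- HLL is an alias for the same function
SELECT HLL(user_id) FROM events;
```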

Question 129Skipped

Which table type is no longer available after the close of the session and therefore has no
Fail-safe or Time Travel recovery option?

Permanent

Transient

External

Correct answer

Temporary

Overall explanation

See this link.

Question 130Skipped

What is a characteristic of a role in Snowflake?

Privileges granted to system roles by Snowflake can be revoked.

System-defined roles can be dropped.

Correct answer

Privileges on securable objects can be granted and revoked to a role.

Roles cannot be granted to other roles.

Overall explanation

See this link.

Question 131Skipped

To use the OVERWRITE option on INSERT, which privilege must be granted to the role?

TRUNCATE

Correct answer

DELETE

SELECT

UPDATE

Overall explanation

To use the OVERWRITE option on INSERT, you must use a role that has DELETE privilege on
the table because OVERWRITE will delete the existing records in the table.

See this link.
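A sketch of why the privilege is needed (table and role names are illustrative):

```sql
-- OVERWRITE deletes the existing rows first, so the role needs
-- DELETE (in addition to INSERT) on the target table
GRANT INSERT, DELETE ON TABLE sales.public.orders TO ROLE etl_role;

INSERT OVERWRITE INTO sales.public.orders
  SELECT * FROM sales.staging.orders_new;
```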

Question 132Skipped

What mechanisms can be used to inform Snowpipe that there are staged files available to
load into a Snowflake table? (Choose two.)

Snowsight interactions

Correct selection

REST endpoints

Email integrations

Correct selection

Cloud messaging

Error notifications

Overall explanation

See this link.
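A minimal sketch of the cloud-messaging option (all object names are illustrative):

```sql
-- Cloud messaging: AUTO_INGEST = TRUE tells the pipe to load files
-- when event notifications arrive from the cloud provider
CREATE PIPE my_db.public.my_pipe
  AUTO_INGEST = TRUE
  AS
  COPY INTO my_db.public.my_table
  FROM @my_db.public.my_stage
  FILE_FORMAT = (TYPE = 'JSON');
```

With AUTO_INGEST = FALSE, the Snowpipe REST API `insertFiles` endpoint can be called instead to name the staged files explicitly.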

Question 133Skipped

Which role can execute the SHOW ORGANIZATION ACCOUNTS command successfully?

ACCOUNTADMIN

USERADMIN

Correct answer

ORGADMIN

SECURITYADMIN

Overall explanation

See this link.

Question 134Skipped

Which security feature is used to connect or log in to a Snowflake account?

Network policy

SCIM

Role-Based Access Control (RBAC)

Correct answer

Key pair authentication

Overall explanation

To authenticate to Snowflake, you can use one of the following options:

• Password-based authentication. To use this, set the password option when establishing the connection.

• Single sign-on (SSO) through a web browser.

• Native SSO through Okta.

• Key pair authentication.

• OAuth.

See this link.

Network policies in Snowflake are used to control the network traffic allowed to access a Snowflake account. They can restrict access based on IP address ranges or other network-related properties.

SCIM is used for managing user identities across different systems, including provisioning
and deprovisioning users in a centralized manner.

RBAC in Snowflake is a method for controlling access to resources based on the roles
assigned to users. It defines what actions users can take and what data they can access
within Snowflake.

Question 135Skipped

What happens when a database is cloned?

It replicates all granted privileges on the corresponding source objects.

It does not retain any privileges granted on the source object.

It replicates all granted privileges on the corresponding child schema objects.

Correct answer

It replicates all granted privileges on the corresponding child objects.

Overall explanation

If the source object is a database or schema, the clone inherits all granted privileges on the
clones of all child objects contained in the source object:

For databases, contained objects include schemas, tables, views, etc.

For schemas, contained objects include tables, views, etc.

Note that the clone of the container itself (database or schema) does not inherit the
privileges granted on the source container.

See this link.
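For example (database and role names are illustrative):

```sql
-- Child objects (schemas, tables, views) of the clone inherit the
-- privileges granted on their source counterparts...
CREATE DATABASE sales_dev CLONE sales_prod;

-- ...but the cloned database itself does not, so grant those explicitly
GRANT USAGE ON DATABASE sales_dev TO ROLE dev_role;
```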

Question 136Skipped

A user wants to upload a file to an internal Snowflake stage using a PUT command.

Which tools and/or connectors could be used to execute this command? (Choose two.)

Snowsight worksheets

Correct selection

SnowSQL

Correct selection

Python connector

SnowCD

SQL API

Overall explanation

PUT command is not supported by SQL API.

See this link.

The PUT command cannot be executed from the Worksheets page in either Snowflake web interface; instead, use the SnowSQL client or drivers to upload data files, or check the documentation for a specific Snowflake client to verify support for this command.

See this link.

SnowCD (Snowflake Connectivity Diagnostic Tool) is used to diagnose and troubleshoot network connectivity to Snowflake.
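A sketch of the upload as run from SnowSQL (the local file path is illustrative):

```sql
-- Run from the SnowSQL client, not a Snowsight worksheet
PUT file:///tmp/data/orders.csv @~/staged AUTO_COMPRESS = TRUE;

-- Verify the upload
LIST @~/staged;
```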

Question 137Skipped

A Snowflake user has a query that is running for a long time. When the user opens the query
profiler, it indicates that a lot of data is spilling to disk.

What is causing this to happen?

The result cache is almost full and is unable to hold the results.

The cloud storage staging area is not sufficient to hold the data results.

Clustering has not been applied to the table so the table is not optimized.

Correct answer

The warehouse memory is not sufficient to hold the intermediate query results.

Overall explanation

See this link.

Question 138Skipped

Snowflake Partner Connect is limited to users with a verified email address and which role?

USERADMIN

SECURITYADMIN

Correct answer

ACCOUNTADMIN

SYSADMIN

Overall explanation

See this link.

Question 139Skipped

A Snowflake user has been granted the CREATE DATA EXCHANGE LISTING privilege with their
role.

Which tasks can this user now perform on the Data Exchange? (Choose two.)

Rename listings

Delete provider profiles

Modify incoming listing access requests

Correct selection

Modify listings properties

Correct selection

Submit listings for approval/publishing

Overall explanation

See this link.

Question 140Skipped

Which semi-structured data formats can be loaded into Snowflake with a COPY command?
(Choose two.)

CSV

EDI

Correct selection

ORC

Correct selection

XML

HTML

Overall explanation

See this link.
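A sketch of loading ORC data (stage and table names are illustrative):

```sql
-- Semi-structured ORC data is typically loaded into a VARIANT column
CREATE TABLE raw_orc (v VARIANT);

COPY INTO raw_orc
  FROM @my_stage/data/
  FILE_FORMAT = (TYPE = 'ORC');
```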

Question 141Skipped

A custom role owns multiple tables. If this role is dropped from the system, who becomes
the owner of these tables?

SYSADMIN

Tables will be standalone or orphaned.

ACCOUNTADMIN

Correct answer

The role that dropped the custom role.

Overall explanation

See this link.

Question 142Skipped

How does the Snowflake search optimization service improve query performance?

It improves the performance of all queries running against a given table.

Correct answer

It improves the performance of equality searches.

It defines different clustering keys on the same source table.

It improves the performance of range searches.

Overall explanation

Keywords are "equality searches".

See this link.
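A minimal sketch (table and column names are illustrative):

```sql
-- Enable the search optimization service on a table
ALTER TABLE my_db.public.events ADD SEARCH OPTIMIZATION;

-- Selective equality searches like this can then use the search
-- access path instead of scanning all micro-partitions
SELECT * FROM my_db.public.events WHERE session_id = 'a1b2c3';
```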

Question 143Skipped

The INFORMATION_SCHEMA included in each database contains which objects? (Choose two.)

Correct selection

Table functions for historical and usage data across the Snowflake account

Views for all the objects contained in the Snowflake account

Table functions for account-level objects, such as roles, virtual warehouses, and databases

Correct selection

Views for all the objects contained in the database

Views for historical and usage data across the Snowflake account

Overall explanation

Each database created in your account automatically includes a built-in, read-only schema
named INFORMATION_SCHEMA.

The schema contains the following objects:

• Views for all the objects contained in the database, as well as views for account-level
objects (i.e. non-database objects such as roles, warehouses, and databases)

• Table functions for historical and usage data across your account.

See this link.
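For example (the database name `my_db` is illustrative):

```sql
-- Views for objects contained in this database
SELECT table_name, row_count
FROM my_db.information_schema.tables
WHERE table_schema = 'PUBLIC';

-- Table function for historical and usage data across the account
SELECT *
FROM TABLE(my_db.information_schema.query_history(result_limit => 10));
```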

Question 144Skipped

What is the MINIMUM size of a table for which Snowflake recommends considering adding a
clustering key?

Correct answer

1 Terabyte (TB)

1 Kilobyte (KB)

1 Gigabyte (GB)

1 Megabyte (MB)

Overall explanation

See this link.
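A sketch of adding and checking a clustering key (table and column names are illustrative):

```sql
-- Clustering keys are generally worth considering only for
-- tables of 1 TB or more
ALTER TABLE my_db.public.sales CLUSTER BY (sale_date, region);

-- Check how well the table is clustered on those columns
SELECT SYSTEM$CLUSTERING_INFORMATION('my_db.public.sales',
                                     '(sale_date, region)');
```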

Question 145Skipped

Which Snowflake function is maintained separately from the data and helps to support
features such as Time Travel, Secure Data Sharing, and pruning?

Column compression

Data clustering

Correct answer

Metadata management

Micro-partitioning

Overall explanation

Metadata management in Snowflake is maintained separately from the actual data in Cloud
Service Layer.

Metadata management in Snowflake plays a crucial role in supporting advanced features such as Time Travel, Secure Data Sharing, and pruning.

Question 146Skipped

Which table function is used to view all errors encountered during a previous data load?

Correct answer

VALIDATE

QUERY_HISTORY

GENERATOR

INFER_SCHEMA

Overall explanation

VALIDATE

Validates the files loaded in a past execution of the COPY INTO <table> command and
returns all the errors encountered during the load, rather than just the first error.

See this link.
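A minimal sketch (the table name is illustrative):

```sql
-- Return every error from the most recent COPY INTO execution on
-- the table; a specific query ID can be passed instead of '_last'
SELECT * FROM TABLE(VALIDATE(my_table, JOB_ID => '_last'));
```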

Question 147Skipped

Which Data Definition Language (DDL) commands are supported by Snowflake to manage
tags? (Choose two.)

GRANT ... TO TAG

GRANT TAG

DESCRIBE TAG

Correct selection

DROP TAG

Correct selection

ALTER TAG

Overall explanation

Snowflake supports the following DDL to create and manage tags:

1. CREATE TAG

2. ALTER TAG ALTER <object> (to set a tag on a Snowflake object)

3. SHOW TAGS

4. DROP TAG

5. UNDROP TAG

See this link.
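A sketch of the supported tag DDL (tag, warehouse, and value names are illustrative):

```sql
CREATE TAG cost_center COMMENT = 'Cost attribution tag';
ALTER TAG cost_center SET COMMENT = 'Cost attribution for chargeback';

-- Assigning a tag to an object uses ALTER <object>, not ALTER TAG
ALTER WAREHOUSE analytics_wh SET TAG cost_center = 'finance';

DROP TAG cost_center;    -- reversible with UNDROP TAG
```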

Question 148Skipped

What is the compressed size limit for semi-structured data loaded into a VARIANT data type
using the COPY command?

32 MB

8 MB

Correct answer

16 MB

64 MB

Overall explanation

See this link.

Question 149Skipped

The following SQL statements have been executed:

What will the output be of the last select statement?

Correct answer

Overall explanation

See this link.

Question 150Skipped

Which types of URLs are provided by Snowflake to access unstructured data files? (Choose
two).

Correct selection

Scoped URL

Correct selection

File URL

Dynamic URL

Relative URL

Absolute URL

Overall explanation

See this link.
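A sketch of generating each URL type (stage and file names are illustrative):

```sql
-- Scoped URL: temporary, encoded access for the caller
SELECT BUILD_SCOPED_FILE_URL(@my_stage, 'reports/summary.pdf');

-- File URL: permanent URL that requires privileges on the stage
SELECT BUILD_STAGE_FILE_URL(@my_stage, 'reports/summary.pdf');
```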

Question 151Skipped

What does a masking policy consist of in Snowflake?

Multiple data types, with only one condition, and one or more masking functions

Multiple data types, with one or more conditions, and one or more masking functions

A single data type, with only one condition, and only one masking function

Correct answer

A single data type, with one or more conditions, and one or more masking functions

Overall explanation

A masking policy consists of a single data type, one or more conditions, and one or more
masking functions.

See this link.
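A sketch of those three parts (policy, role, and column names are illustrative):

```sql
-- One input data type (STRING), conditions expressed via CASE,
-- and a masking function/expression per condition
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'ANALYST_FULL' THEN val
    ELSE '*** MASKED ***'
  END;

ALTER TABLE users MODIFY COLUMN email SET MASKING POLICY email_mask;
```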

Question 152Skipped

A Snowflake user is writing a User-Defined Function (UDF) with some unqualified object
names.

How will those object names be resolved during execution?

Correct answer

Snowflake will only check the schema the UDF belongs to.

Snowflake will first check the current schema, and then the PUBLIC schema of the current
database.

Snowflake will first check the current schema, and then the schema the previous query
used.

Snowflake will resolve them according to the SEARCH_PATH parameter.

Overall explanation

In queries, unqualified object names are resolved through a search path. However, the SEARCH_PATH is not used inside views or User-Defined Functions (UDFs); all unqualified objects in a view or UDF definition are resolved in the view's or UDF's schema only.

See this link.
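A sketch of the consequence (all names are illustrative):

```sql
-- Unqualified names in a UDF body resolve only in the UDF's own
-- schema, so objects elsewhere should be fully qualified
CREATE FUNCTION my_db.util.order_count()
  RETURNS NUMBER
  AS 'SELECT COUNT(*) FROM my_db.sales.orders';
```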

Question 153Skipped

How many network policies can be assigned to an account or specific user at a time?

Correct answer

One

Three

Two

Unlimited

Overall explanation

Only a single network policy can be assigned to the account or a specific user at a time.

See this link.
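For example (policy, IP ranges, and user name are illustrative):

```sql
CREATE NETWORK POLICY corp_policy
  ALLOWED_IP_LIST = ('192.168.1.0/24')
  BLOCKED_IP_LIST = ('192.168.1.99');

-- Only one policy can be active at a time per account or per user;
-- setting a new one replaces the previous assignment
ALTER ACCOUNT SET NETWORK_POLICY = corp_policy;
ALTER USER jsmith SET NETWORK_POLICY = corp_policy;
```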

Question 154Skipped

How many resource monitors can be applied to a single virtual warehouse?

Zero

Eight

Unlimited

Correct answer

One

Overall explanation

A resource monitor can be set to monitor multiple warehouses but a warehouse can be
assigned only to a single resource monitor.

See this link.
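A sketch (monitor and warehouse names, quota, and thresholds are illustrative):

```sql
CREATE RESOURCE MONITOR monthly_limit
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- A warehouse can be assigned to at most one resource monitor
ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = monthly_limit;
```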

Question 155Skipped

Which task is supported by the use of Access History in Snowflake?

Performance optimization

Correct answer

Compliance auditing

Cost monitoring

Data backups

Overall explanation

Access History in Snowflake records when a user query reads data and when a SQL statement performs a data write operation, such as INSERT, UPDATE, and DELETE (along with variations of the COPY command), from the source data object to the target data object. The user access history can be found by querying the Account Usage ACCESS_HISTORY view. The records in this view facilitate regulatory compliance auditing and provide insights into popular and frequently accessed tables and columns, because there is a direct link between the user (i.e. the query operator), the query, the table or view, the column, and the data.

See this link.
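A sketch of an auditing query against that view (the time window and selected columns are illustrative):

```sql
-- Audit reads and writes over the last 7 days
SELECT query_id,
       user_name,
       query_start_time,
       direct_objects_accessed
FROM snowflake.account_usage.access_history
WHERE query_start_time > DATEADD(day, -7, CURRENT_TIMESTAMP())
ORDER BY query_start_time DESC;
```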
