1- How do you create a file format for the CSV type?
2- Write the mandatory parameters needed to load a simple CSV file.
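As context for questions 1 and 2, a minimal sketch of a CSV file format (the format name and option values are assumptions; only TYPE is strictly required, the rest are commonly used options):

```sql
-- Create a reusable file format for CSV loading.
-- Only TYPE is mandatory; the other parameters are common options.
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = 'CSV'
  FIELD_DELIMITER = ','
  SKIP_HEADER = 1                       -- skip the header row
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'    -- handle quoted fields
  NULL_IF = ('', 'NULL');               -- treat these values as NULL
```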
3- What is the file size limit when loading data from the UI?
4- Can we load an Excel file from the UI?
5- What are the various methods to load data?
6- Which cache is available irrespective of whether the table is queried?
7- Explain the execution plan in simple words.
8- What are the types of caching techniques available in Snowflake?
9- Load this Spotify.csv file into Snowflake.
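One possible way to stage and load a local Spotify.csv (the stage name, table name, and local path are assumptions):

```sql
-- Upload the local file to a named internal stage (run from SnowSQL).
PUT file:///tmp/Spotify.csv @my_stage AUTO_COMPRESS = TRUE;

-- Load the staged (compressed) file into a target table.
COPY INTO spotify
  FROM @my_stage/Spotify.csv.gz
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```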
10- What are the data types available for storing semi-structured data?
11- Which semi-structured file formats can be loaded into Snowflake?
12- What are the different types of tables we have in Snowflake?
13- How does Snowflake handle file formats in stages during data loading?
14- Why should we create named internal stages when we already have table and user stages?
15- How can we list all the files in a stage?
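Staged files can be listed with the LIST command (the stage and table names below are assumptions):

```sql
LIST @my_stage;                             -- named internal/external stage
LIST @%customer;                            -- table stage of table CUSTOMER
LIST @~;                                    -- current user's stage
LIST @my_stage PATTERN = '.*[.]csv[.]gz';   -- filter by regular expression
```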
16- When would you use a transient table over a permanent table, and why?
17- How would you set up SnowSQL?
18- Which commands run only in SnowSQL?
19- On which cloud platform does Snowflake always overwrite files when we load them into stages?
20- Describe the security considerations associated with Snowflake stages.
21- What will happen if I try to upload a file from my local machine to a stage when the same file is already in the stage?
22- Perform this activity to analyze warehouse performance: which operation will be faster?
a- First, load a single 1 GB file into a table using a Small warehouse.
b- Now split that file into 20 files and load them using a Small warehouse.
23- Load JSON, AVRO, ORC, and PARQUET files into a single stage and fetch the data.
24- What is the usage of the PARALLEL parameter in the PUT command?
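For reference, PARALLEL controls how many threads PUT uses to upload files (default 4, range 1-99); the path and stage name here are assumptions:

```sql
-- Upload many files concurrently using 8 parallel threads.
PUT file:///data/exports/*.csv @my_stage PARALLEL = 8;
```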
25- Unload data from SNOWFLAKE_SAMPLE_DATA.TPCH_SF1.CUSTOMER into one single file, then copy it to your local machine with the SFCUSTOMER prefix.
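A sketch of one way to do this unload (the stage name and local path are assumptions; MAX_FILE_SIZE may need adjusting for larger results):

```sql
-- Unload the table into a single staged file with the SFCUSTOMER prefix.
COPY INTO @my_stage/SFCUSTOMER
  FROM SNOWFLAKE_SAMPLE_DATA.TPCH_SF1.CUSTOMER
  FILE_FORMAT = (TYPE = 'CSV')
  SINGLE = TRUE                    -- force one output file
  MAX_FILE_SIZE = 5368709120;      -- allow up to 5 GB in that file

-- Then download it to the local machine (run from SnowSQL).
GET @my_stage/SFCUSTOMER file:///tmp/;
```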
26- Suppose I have 30 files in a local folder and I want to load all of those files into my internal stage in a single shot. How would you do that?
27- What are the file staging commands we have in Snowflake?
28- If I have created a table named CUSTOMER, what are the possible reasons that the table is not visible in Snowsight for another user?
29- Which types of transformations are supported in the COPY INTO command?
30- Write the queries for the scenarios below.
a- Dump 5 files with the same structure to an internal stage and copy them from the stage into a table. Make sure we have audit columns: who dumped the files, when the records were inserted, the LAST_MODIFIED timestamp, and the filenames.
b- We need a list of all failed files along with their errors.
c- If there are 1M records and 100 of them are incorrect, load the rest and create a separate file for all the errors.
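As a starting point for parts (a) and (c), stage metadata columns and ON_ERROR can be combined like this. Table and stage names are assumptions, and function support inside COPY transformations is limited, so verify the context functions work in your account:

```sql
-- (a) Load with audit columns captured from stage metadata.
COPY INTO customer_tgt (c1, c2, c3, file_name, file_last_modified, loaded_by, loaded_at)
  FROM (
    SELECT t.$1, t.$2, t.$3,
           METADATA$FILENAME,             -- which file each row came from
           METADATA$FILE_LAST_MODIFIED,   -- file's last-modified timestamp
           CURRENT_USER(),                -- who ran the load
           CURRENT_TIMESTAMP()            -- when the row was inserted
    FROM @my_stage t
  )
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
  ON_ERROR = 'CONTINUE';                  -- (c) load good rows, skip bad ones

-- (b)/(c) Inspect the rejected rows from the last load.
SELECT * FROM TABLE(VALIDATE(customer_tgt, JOB_ID => '_last'));
```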
31- Which metadata fields are available for staged files?
32- I have 12 columns in my files but only 9 columns in my table. The client asked me to load this data into my table, but I need to skip the starting 2 columns and the last column. How would you achieve this in Snowflake?
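One way to skip columns is a transformation inside the COPY, selecting only positional columns $3 through $11 (table and stage names are assumptions):

```sql
-- File has 12 columns; skip $1, $2 (first two) and $12 (the last one).
COPY INTO target_table
  FROM (
    SELECT t.$3, t.$4, t.$5, t.$6, t.$7, t.$8, t.$9, t.$10, t.$11
    FROM @my_stage t
  )
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```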
33- A- Take the ORDERS table from SNOWFLAKE_SAMPLE_DATA.TPCH_SF1 and copy it into a stage called ORDER_STAGE. The staged files should start with the ORDER prefix.
B- The same ORDERS table should be copied into a single file, but make sure the output file's extension is TXT.
34- Suppose there is one common stage used by 5 users on a single team. The manager tried to find out who loaded the files but could not figure it out. As a developer, how can you help him?
35- I have specified a FILE FORMAT on a stage and then specified file format parameters in the COPY INTO command. Which file format parameters will Snowflake use in this case?
36- I have one stage named STG_CSV and a second named STAGING_CSV. Now I want to transfer files from STG_CSV to the other stage. How would you achieve it?
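Newer Snowflake releases support copying staged files directly between stages with COPY FILES (assuming both stages already exist):

```sql
-- Copy all files from STG_CSV to STAGING_CSV without re-downloading them.
COPY FILES INTO @STAGING_CSV FROM @STG_CSV;

-- Or copy only files matching a pattern.
COPY FILES INTO @STAGING_CSV FROM @STG_CSV PATTERN = '.*[.]csv';
```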
37- The client is uploading files dynamically to an S3 bucket, and you have created a Snowpipe on top of it. Now the client wants you to set up email notifications for all files that fail processing. How would you do it?
38- What steps would you follow to debug a Snowpipe that ran without errors but whose data did not get ingested?
39- Consider a scenario where the client is uploading files to an S3 bucket called snowflake_bucket/sourcefolder, but their automated process creates new folders daily, date-wise (01, 02, 03, etc.), and you need to pick up files from these dynamically created folders and copy them into tables.
Design this entire pipeline.
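A skeleton for such a pipeline: an auto-ingest pipe over the parent folder picks up files from any date-wise subfolder, since S3 event notifications cover the whole prefix (the integration, stage, and table names are assumptions):

```sql
-- External stage pointing at the parent folder; daily subfolders live under it.
CREATE OR REPLACE STAGE src_stage
  URL = 's3://snowflake_bucket/sourcefolder/'
  STORAGE_INTEGRATION = s3_int;

-- Auto-ingest pipe: S3 event notifications cover every subfolder (01/, 02/, ...),
-- so new date-wise folders are picked up without code changes.
CREATE OR REPLACE PIPE daily_pipe AUTO_INGEST = TRUE AS
  COPY INTO target_table
  FROM @src_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```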
40- How will you make sure that your pipeline does not fail while copying data from stages to tables?