Guide To Connect To AWS S3 And Snowpipe Setup For AWS S3
Here is a step-by-step guide for beginners to connect Snowflake to an AWS S3 bucket using a storage integration and to set up Snowpipe on top of it. The Snowflake documentation is not yet updated with the new AWS console screens, so if you got confused there, you are in the right place.
1) Log in to the AWS account

Navigate to the S3 service and create a bucket with a unique name.

Note: AWS will not allow the bucket to be created if the name already exists, since bucket names are globally unique. You can leave the rest of the options at their defaults.

Once you finish the creation form, the new bucket appears in the bucket list.
2) Click on the bucket and create a folder with any name

Within the folder, upload the files you want to process into Snowflake. Once the upload completes, you can see the files in the bucket folder.

Make a note of the S3 URL; here it is s3://snowflake-datademo/load/
Create an IAM policy that grants Snowflake the required permissions on the bucket and prefix:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:GetObjectVersion",
        "s3:DeleteObject",
        "s3:DeleteObjectVersion"
      ],
      "Resource": "arn:aws:s3:::<bucket>/<prefix>/*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::<bucket>",
      "Condition": {
        "StringLike": {
          "s3:prefix": [
            "<prefix>/*"
          ]
        }
      }
    }
  ]
}
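The storage integration referenced later as step 5) can be created with a statement along these lines; the integration name, role ARN, and allowed location are placeholders for your own values, not from the original guide:

```sql
-- Sketch of the storage integration (step 5); name and ARN are placeholders.
create or replace storage integration s3_int
  type = external_stage
  storage_provider = 'S3'
  enabled = true
  storage_aws_role_arn = 'arn:aws:iam::<account_id>:role/<role_name>'
  storage_allowed_locations = ('s3://snowflake-datademo/load/');

-- Note STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID from this output;
-- they are the two properties needed for the role's trust policy.
desc integration s3_int;
```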
Back in AWS, open the role you created, edit its trust policy with the two properties noted in step 5), and update the policy.
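For reference, the trust policy takes roughly this shape, with the two properties from DESC INTEGRATION (STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID) shown as placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "<STORAGE_AWS_IAM_USER_ARN>"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "<STORAGE_AWS_EXTERNAL_ID>"
        }
      }
    }
  ]
}
```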
7) Create a stage in Snowflake

Create an external stage in Snowflake using the storage integration name created in step 5) and the S3 bucket URL noted in step 2).

List the stage to see whether the files are present, completing the integration between Snowflake and AWS.
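As a sketch, the stage creation and listing from step 7) look like this; the stage and integration names are placeholder assumptions:

```sql
-- External stage over the bucket URL from step 2), using the
-- integration from step 5); names here are placeholders.
create or replace stage my_s3_stage
  storage_integration = s3_int
  url = 's3://snowflake-datademo/load/';

-- List the stage to confirm the uploaded files are visible.
list @my_s3_stage;
```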
References: https://docs.snowflake.com/en/user-guide/data-load-s3-config-storage-integration.html
2) Create a pipe in Snowflake with the name mypipe, and make a note of the notification_channel value populated in the output of DESC PIPE.
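A minimal pipe definition might look like the following; the target table, stage, and file format are placeholder assumptions, not from the original guide:

```sql
-- Auto-ingest pipe; table, stage, and file format are placeholders.
create or replace pipe mypipe
  auto_ingest = true
as
  copy into mytable
  from @my_s3_stage
  file_format = (type = 'CSV');

-- The ARN in the notification_channel column is what the S3 event
-- notification needs.
desc pipe mypipe;
```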
3) In the bucket's Properties tab, go to the Event notifications section and create an event notification named snowpipe_ingest. Select the event type for all object create events, choose SQS queue as the destination, fill in the notification_channel value noted in step 2), and save the changes.
4) Verify the pipe and check that data is loading through it
select system$pipe_status('mypipe');
4b) Upload a new file to the S3 bucket and validate that it is being processed by the pipe
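One way to validate the loads is the COPY_HISTORY table function; the table name here is a placeholder for the pipe's target table:

```sql
-- Recent load activity for the pipe's target table (placeholder name).
select *
from table(information_schema.copy_history(
    table_name => 'MYTABLE',
    start_time => dateadd(hours, -1, current_timestamp())));
```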