Lab Assignment 2-CSET 463

This document provides instructions for an AWS Cloud Support Associate lab assignment on Amazon S3. The lab includes 8 tasks to provide hands-on experience with S3: 1) Creating an S3 bucket and uploading an object; 2) Enabling versioning; 3) Creating a lifecycle policy; 4) Enabling CORS; 5) Comparing transfer speeds with acceleration; 6) Creating a static website; 7) Accessing S3 with IAM roles; and 8) Multipart uploading using the CLI. The compulsory tasks are creating a static website, demonstrating IAM role access, creating a lifecycle policy, enabling CORS, and multipart uploading.


Course-BTECH Type- Specialization Core – II

Course Code- CSET-463 Course Name- AWS Cloud Support Associate (Lab
2)
Year- 2024 Semester- Even VI Semester
Date- 19/01/2023 Batch- 2021-2024

CO-Mapping
CO1 CO2 CO3

Q1 √
Objective: The purpose of this lab assignment is to provide hands-on experience with Amazon S3 (Simple
Storage Service), covering various aspects such as bucket creation, object management, permissions,
versioning, lifecycle policies, CORS configuration, transfer acceleration, static website hosting, IAM roles,
and multipart uploads using the AWS CLI.

Task 1: Create a Sample S3 Bucket and Upload an Object

1.1. Create an S3 Bucket:

• Use the AWS Management Console to create a new S3 bucket.

• Choose a unique and meaningful name.

• Select an AWS region for your bucket.

1.2. Upload an Object to the S3 Bucket:

• Upload a sample file or object to the created S3 bucket using the AWS Management Console or
AWS CLI.

• Verify that the object has been successfully uploaded.

1.3. Setting Up Bucket Permissions and Policy:

• Configure bucket permissions to allow public or specific access.

• Create a bucket policy to define access controls for the bucket.
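As a sketch, a bucket policy that allows public read access to every object might look like the following (yourBucketName is a placeholder; paste the policy under Permissions > Bucket policy in the console, and note that the bucket's Block Public Access settings must also allow it):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::yourBucketName/*"
    }
  ]
}
```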

Task 2: Enable Versioning in Amazon S3


2.1. Enable Versioning:

• Enable versioning for the created S3 bucket using the AWS Management Console or AWS CLI.

• Upload multiple versions of an object to observe versioning in action.
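If you prefer the CLI, versioning can be enabled and then verified with the s3api commands below (a sketch; requires configured AWS credentials, and yourBucketName is a placeholder):

```shell
aws s3api put-bucket-versioning --bucket yourBucketName \
    --versioning-configuration Status=Enabled

# Confirm the setting: the output should show "Status": "Enabled"
aws s3api get-bucket-versioning --bucket yourBucketName
```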

Task 3: Create an S3 Lifecycle Policy

3.1. Creating a Lifecycle Policy:

• Define a lifecycle policy for the S3 bucket to automatically transition objects to different storage
classes or delete them based on defined rules.
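A minimal lifecycle configuration, as a sketch, might transition all objects to Standard-IA after 30 days, to Glacier after 90 days, and expire them after a year (the rule ID and day counts are illustrative):

```json
{
  "Rules": [
    {
      "ID": "archive-then-expire",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 30, "StorageClass": "STANDARD_IA" },
        { "Days": 90, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
```

Saved as lifecycle.json, this can be applied with `aws s3api put-bucket-lifecycle-configuration --bucket yourBucketName --lifecycle-configuration file://lifecycle.json`.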

Task 4: Enable CORS in Amazon S3

4.1. Configure CORS in S3:

• Enable Cross-Origin Resource Sharing (CORS) for the S3 bucket.

• Define CORS rules to allow or restrict access from specific domains.
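As a sketch, a CORS configuration that allows GET and PUT requests from one hypothetical domain could look like this:

```json
{
  "CORSRules": [
    {
      "AllowedOrigins": ["https://www.example.com"],
      "AllowedMethods": ["GET", "PUT"],
      "AllowedHeaders": ["*"],
      "MaxAgeSeconds": 3000
    }
  ]
}
```

Saved as cors.json, it can be applied with `aws s3api put-bucket-cors --bucket yourBucketName --cors-configuration file://cors.json`.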

Task 5: Compare Data Transfer Speeds with S3 Transfer Acceleration

5.1. Enable Transfer Acceleration:

• Enable Amazon S3 Transfer Acceleration for the bucket.

• Use AWS CLI or SDK to compare data transfer speeds with and without acceleration.
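One way to make the comparison from the CLI, as a sketch (bucket and file names are placeholders; requires AWS credentials), is to time the same upload with and without the accelerate endpoint:

```shell
aws s3api put-bucket-accelerate-configuration --bucket yourBucketName \
    --accelerate-configuration Status=Enabled

# Time a direct upload versus an accelerated upload of the same file
time aws s3 cp yourVideo.mp4 s3://yourBucketName/direct.mp4
time aws s3 cp yourVideo.mp4 s3://yourBucketName/accelerated.mp4 \
    --endpoint-url https://s3-accelerate.amazonaws.com
```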

Task 6: Create a Static Website Using Amazon S3

6.1. Static Website Configuration:

• Configure the S3 bucket to host a static website.

• Upload HTML, CSS, and other necessary files.

• Verify that the S3 bucket is serving the static website.
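The website configuration can also be set from the CLI, as a sketch (index.html and error.html are assumed file names you have uploaded):

```shell
aws s3 website s3://yourBucketName/ \
    --index-document index.html --error-document error.html
```

The site is then served at an endpoint of the form http://yourBucketName.s3-website-us-east-1.amazonaws.com (the exact host name depends on your bucket's region).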

Task 7: Accessing S3 with AWS IAM Roles

7.1. IAM Role Creation:

• Create an IAM role with the necessary permissions to access the S3 bucket.

• Attach the IAM role to an EC2 instance or another AWS service.


• Validate that the instance or service can access the S3 bucket using the IAM role.

Task 8: AWS S3 Multipart Upload Using AWS CLI

8.1. Multipart Upload:

• Use the AWS CLI to perform a multipart upload for a large object to the S3 bucket.

• Confirm that the multipart upload was successful and the object is intact.

Compulsory Tasks

1. Create a Static Website Using Amazon S3


2. Accessing S3 with AWS IAM Roles
3. S3 Lifecycle Policy
4. Enable CORS in S3
5. Multi-part uploading

Task 1:

1. Log into the AWS Management Console.


2. Create an S3 bucket.
3. Upload an object to S3 Bucket.
4. Access the object on the browser.
5. Change S3 object permissions.
6. Setup the bucket policy and permission and test the object accessibility.
How to enable versioning in Amazon S3: This lab walks you through the steps to enable versioning on an AWS S3 bucket. Versioning allows you to keep multiple versions of an object in one bucket.

Task Details

1. Log into the AWS Management Console.


2. Create an S3 bucket.
3. Enable object versioning on the bucket.
4. Upload a text file to the S3 Bucket.
5. Test object versioning by changing the text file and re-uploading it.

Architecture Diagram

Creating an S3 Lifecycle Policy: This walks you through the steps on how to create a Lifecycle Rule for an
object in an S3 Bucket.

Lab Tasks

1. Log into the AWS Management Console.


2. Create an S3 bucket and upload an object into the bucket.
3. Create a Lifecycle Rule on the object.
4. Configure transition types.
5. Configure object expiration.
6. Test the Lifecycle Rule on the uploaded object.

Architecture Diagram
Enable CORS in Amazon S3: This lab walks you through the steps to Enable Cross-Origin Resource
Sharing (CORS) in Amazon S3.

Lab Tasks

1. Login to AWS Management Console.


2. Create an S3 source bucket.
3. Enable versioning for the source bucket.
4. Enable CORS configuration and replication for the source bucket.
5. Create an S3 target bucket.
6. Enable versioning for the target bucket.

Architecture Diagram

Comparing Data Transfer Speeds with S3 Transfer Acceleration: This lab walks you through the steps to
create an S3 Bucket to compare the speeds of Direct Upload and Transfer Accelerated Upload of a file.

Lab Tasks

1. Prepare a short video on your local machine.


2. Create an Amazon S3 Bucket.
3. Upload the short video through Direct Upload.
4. Enable Transfer Acceleration.
5. Upload the same video after enabling Transfer Acceleration.
6. Compare the upload speeds.

Architecture Diagram
How to create a static website using Amazon S3: This lab walks you through creating a static HTML website using AWS S3 and making it accessible from the internet.

Task Details

1. Log into the AWS Management Console.


2. Create an S3 bucket and upload a sample HTML page to the bucket.
3. Enable static website settings in the S3 bucket.
4. Make the bucket public.
5. Test the website URL.

Accessing S3 with AWS IAM Roles: This walks you through the steps to create an AWS S3 bucket and
demonstrates how to access the bucket using AWS CLI commands from EC2 instance and IAM roles.

Introduction

IAM Policy

1. An IAM (Identity and access management) policy is an entity in AWS, that enables you to manage
access to AWS services and resources in a secure fashion.
2. Policies are stored on AWS in JSON format and are attached to resources as identity-based policies.
3. You can attach an IAM policy to different entities such as an IAM group, user, or role.
4. IAM policies give us the ability to restrict users or groups to only the specific services that
they need.

Policy Types
There are two important types of policies:

• Identity-Based-Policies
• Resource-Based-Policies

Identity-Based Policies

1. Identity-based policies are policies that you can attach to an AWS identity (such as a user, group
of users, or role).
2. These policies control what actions an entity can perform, which resources they can use, and the
conditions in which they can use said resources.
3. Identity-based policies are further classified as:
o AWS Managed Policies
o Customer Managed Policies

AWS Managed Policies

1. AWS Managed policies are those policies that are created and managed by AWS itself.
2. If you are new to IAM policies, you can start with AWS managed policies before managing your
own.

Customer Managed Policies

1. Customer managed policies are policies that are created and managed by you in your AWS account.
2. Customer managed policies provide more precise control than AWS managed policies.
3. You can create and edit an IAM policy in the visual editor or by creating the JSON policy document
directly.
4. You can create your own IAM policy using the following
link: https://awspolicygen.s3.amazonaws.com/policygen.html

Resource-Based-Policy

1. Resource-based policies are policies that we attach to a resource such as an Amazon S3 bucket.
2. Resource-based policies grant the specified permission to perform specific actions on particular
resources and define under what conditions these policies apply to them.
3. Resource-based policies are inline policies, attached directly to the resource.
4. There are currently no AWS-managed resource-based policies.
5. Within IAM itself, the only resource-based policy type is the trust policy, which is attached to an IAM
role.
6. An IAM role is both an identity and a resource that supports resource-based policies.

IAM Role

1. An IAM role is an AWS IAM identity (that we can create in our AWS account) that has specific
permissions.
2. It is similar to an IAM user, which determines what the identity can and cannot do in AWS.
3. Instead of being tied to one particular user or group, a role can be assumed by anyone who needs it.
4. The advantage of having a role is that we do not have standard long-term credentials such as a
password or access keys associated with it.
5. When resources assume a particular role, it provides us with temporary security credentials for our
role session.
6. We can use roles to grant access to users, applications, or services that do not normally have access to
our AWS resources.
7. We can attach one or more policies with roles, depending on our requirements.
8. For example, we can create a role with S3 full access and attach it to an EC2 instance to access S3
buckets.
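When you create such a role through the console, AWS generates the role's trust policy for you; for reference, the trust policy that lets EC2 assume a role is roughly the following (a sketch of the standard document, not something you need to type in this lab):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```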

Lab Tasks

1. Create an IAM role with S3 full access.


2. Create an EC2 instance and attach the S3 role created in the first step.
3. Create an S3 bucket and upload some files to the bucket.
4. Access the bucket using AWS CLI via our EC2 instance.
5. List the objects in the S3 bucket using the AWS CLI from the EC2 instance.

Architecture Diagram

IAM Configuration

Services -> IAM


Creating IAM Role

Select Roles in the left pane and click on Create Role to create a new IAM role.
In the Create Role section, choose AWS Service and then select EC2 service for the role. Click on Next:
Permissions

Type S3fullaccess in the search bar and then click on AmazonS3FullAccess.


Click on Next: Tags.

• Key: Name
• Value: ec2S3role

Click on Next: Review.

On the Create Role Page:

• Role Name: S3Role


o Note: You can give the role any name you like and then attach it to the EC2 instance.
• Role description : IAM Role to access S3

Click on Create Role.

You have successfully created the IAM role to access the S3 bucket.
See the highlighted role.

EC2 Configuration

Services -> EC2


Region: N.Virginia

Under the left sub-menu, click on Instances and then click on Launch Instance

Choose an Amazon Machine Image (AMI): Search for Amazon Linux 2 AMI in the search box and click
on the Select button.

Choose an Instance Type: Select t2.micro and then click on Next: Configure Instance Details

Configure Instance Details:

• Scroll down to the IAM role and then select the role that we have created in the above step.
• Leave other fields as default.
Click on Next: Add Storage
Add Storage: No need to change anything in this step. Click on Next: Add Tags
Add Tags: Click on Add Tag

• Key: Name
• Value: S3EC2server

Click on Next: Configure Security Group:


Configure Security Group:
Choose Create new security group

• Name: S3server-SG

To add SSH:

• Choose Type SSH:


• Source : Custom - 0.0.0.0/0

Click on Review and Launch

Review and Launch: Review all settings and click on Launch.


Key Pair : Select Create a new key pair and click on Download Key Pair.

• Key pair name: ec2

Click on Launch Instances.


Navigate to Instances. Once the Instance State changes from pending to running, the EC2 instance is ready.

You can tell that the instance is running by checking the instance status (example below).

S3 Configuration

Services - > S3

Create a bucket with all default settings. Give it a bucket name yourBucketName.
Note: the bucket name must be globally unique.

Accessing the S3 bucket via EC2 Instance

SSH into the EC2 Instance

To SSH into the server, please follow the steps in SSH into EC2 Instance.
Move file from current EC2 instance to S3 bucket

Once logged in, switch to the root user:

Shell
sudo su
Run the below command to find your S3 bucket via CLI.
You can see yourBucketName below.

AWS CLI
aws s3 ls
The output lists your buckets, which shows that we can access the S3 bucket with
the help of the role attached to the EC2 instance.

Create a new text file and upload it to the bucket via AWS CLI (using the following set of commands):

Shell
touch test.txt
echo "Hello World" >> test.txt
cat test.txt

Move the file from the EC2 instance to the S3 bucket:


replace yourBucketName below

AWS CLI
aws s3 mv test.txt s3://yourBucketName

Check for the new file in the S3 bucket.

Service -> S3

Click on yourBucketName

Repeat the above steps to create some more files, such as new.txt and smile.txt, and upload them to the S3 bucket
using the commands below:

Shell
touch new.txt smile.txt
AWS CLI
aws s3 mv new.txt s3://yourBucketName
aws s3 mv smile.txt s3://yourBucketName

You can confirm the files uploaded to S3 bucket by navigating to the bucket in the AWS console.
Services -> S3

You can also list the files uploaded to S3 bucket via CLI from the EC2 instance with the following
command:

AWS CLI
aws s3 ls s3://yourBucketName

Move file from S3 bucket to current EC2 instance

Check the S3 bucket, now we have three files.

Move test.txt from the S3 bucket to the current EC2 instance.

• a single . means your current working directory

AWS CLI
aws s3 mv s3://yourBucketName/test.txt .

Check the S3 bucket, now we have two files.

AWS S3 Multipart Upload using AWS CLI: This Lab walks you through the steps on how to upload a file
to an S3 bucket using multipart uploading.

Tasks

1. Log in to the AWS Management Console.


2. Create an S3 bucket
3. Create an EC2 instance
4. SSH into the EC2 instance
5. Create a directory
6. Copy a file from S3 to EC2
7. Split the file into many parts
8. Initiate Multipart upload
9. Upload the individual parts
10. Combine individual parts to a single file
11. View the file in the S3 bucket
Architecture Diagram

IAM Configuration

Services -> IAM

Creating IAM Role

Select Roles in the left pane and click on Create Role to create a new IAM role.
In the Create Role section, choose AWS Service and then select EC2 service for the role. Click on Next:
Permissions as shown below in the screenshot:

Type S3fullaccess in the search bar and then click on AmazonS3FullAccess.


Click on Next: Tags.

• Key: Name
• Value: EC2-S3-fullAccess

Click on Next: Review.

On the Create Role Page:

• Role Name: EC2S3Role


o Note: You can give the role any name you like and then attach it to the EC2 instance.
• Role description : IAM Role to access S3

Click on Create Role.


You have successfully created the IAM role to access the S3 bucket.
See the highlighted role.

S3 Configuration

Services - > S3

Create a bucket with all default settings. Give it a bucket name yourBucketName.

• Note: the bucket name must be globally unique.

Click on Create

EC2 Configuration

Services -> EC2


Region: N.Virginia

Launching an EC2 Instance

Under the left sub-menu, click on Instances and then click on Launch Instance

Choose an Amazon Machine Image (AMI): Search for Amazon Linux 2 AMI in the search box and click
on the Select button.

Choose an Instance Type: Select t2.micro and then click on Next: Configure Instance Details

Configure Instance Details:

• Scroll down to the IAM role and then select the role that we have created in the above step.
• Leave other fields as default.

Scroll down to Advanced Details


Under the user data section, enter the following script to update packages and create a working directory on the EC2
instance (the video itself is uploaded from your local machine in a later step).

Shell
#!/bin/bash
# User data runs as root, so no sudo is needed
yum update -y
mkdir /home/ec2-user/tmp/
Click on Next: Add Storage

Add Storage: No need to change anything in this step. Click on Next: Add Tags

Add Tags: Click on Add Tag

• Key: Name
• Value: Multipart_Server

Click on Next: Configure Security Group:

Configure Security Group:


Choose Create new security group

• Name: Multipart_Server-SG
• Description: Multi-part Server SSH Security Group

To add SSH:

• Choose Type SSH:


• Source : Anywhere

Click on Review and Launch


Review and Launch: Review all settings and click on Launch.
Key Pair : Select Create a new key pair and click on Download Key Pair.

• Key pair name: ec2

Click on Launch Instances.


Navigate to Instances. Once the Instance State changes from pending to running, the EC2 instance is ready.

You can tell that the instance is running by checking the instance status (example below).

Accessing the S3 bucket via EC2 Instance

SSH into the EC2 Instance

To SSH into the server, please follow the steps in SSH into EC2 Instance.
Upload a short video to EC2

Open another terminal on your local machine

• change to the directory that contains your short video and ec2.pem


o my ec2.pem and short video are located in the ~/Downloads/ directory
• upload the video to your EC2 server
o replace yourEC2PublicIPAddress with your EC2 instance's public IPv4 address
o replace yourVideo.mp4 with your video name
o replace ec2.pem with your pem file name

Your local machine Shell


cd ~/Downloads/
scp -i ec2.pem ~/Downloads/yourVideo.mp4 ec2-user@yourEC2PublicIPAddress:/home/ec2-user
Go to the terminal logged into your EC2 server

Move your video to directory tmp

• replace yourVideo.mp4 with your video name

Shell
sudo su
ls -l
mv yourVideo.mp4 tmp
Change directory to tmp

Shell
cd tmp
ll
Notice this is a 56.4MB video

Split the Original File

Split the file into chunks

• The split command will split a large file into many pieces (chunks) based on the option.
• split [options] [filename]

Here we are dividing the 56.4 MB file into 10 MB chunks (the -b option sets the chunk size).

Shell
split -b 10M yourVideo.mp4
View the chunked files

Shell
ls -lh
Info: Here xaa through xaf are the chunk files, named alphabetically. Each file is 10MB
in size except the last one. The number of chunks depends on the size of your original file and the byte
value used to partition the chunks.
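The split-and-reassemble principle can be tried locally before touching S3. This sketch (hypothetical file names, tiny 10-byte chunks instead of 10 MB) splits a file, concatenates the chunks back in alphabetical order, and verifies the result is byte-identical to the original, which is essentially what S3 does when it joins multipart chunks by part number:

```shell
# Create a small sample file and split it into 10-byte chunks
printf 'abcdefghijklmnopqrstuvwxyz' > original.bin
split -b 10 original.bin chunk_        # produces chunk_aa, chunk_ab, chunk_ac

# The shell glob expands alphabetically, matching the part-number order
cat chunk_* > rebuilt.bin

# cmp exits 0 only if the files are byte-identical
cmp original.bin rebuilt.bin && echo "chunks reassemble cleanly"
```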

Create a Multipart Upload

Get your bucket name first.

AWS CLI
aws s3 ls
We initiate the multipart upload using an AWS CLI command, which generates an UploadId that
will be used later.

• Syntax: aws s3api create-multipart-upload --bucket [bucket name] --key [original file name]

Note: Replace the example bucket name below with your bucket name.
Note: Replace the example file name below with your file name.

AWS CLI
aws s3api create-multipart-upload --bucket yourBucketName --key yourVideo.mp4
Note: Please copy the UploadId and save it for later use.

• My UploadId: _igJ9Rd_RMsonoL0FgYs1zZ6pqrzJyUudKYMDYNS_Cf2k4ktHbNaYeQgJtVjpJwHAmAPIjeYZTvLUjsAndKqOToyOzRSqeZONdhLy7uPD1a2qX9kdo8_NYkxpmj_xe2xx250xRZjDAm32N6X8bCYlA--

Uploading the File Chunks

Next, we need to upload each file chunk one by one, using the part number. The part number is assigned
based on the alphabetic order of the file.

Chunk File Name    Part Number
xaa                1
xab                2
xac                3
xad                4
xae                5
xaf                6

Syntax:

AWS CLI
aws s3api upload-part --bucket [bucket name] --key [file name] --part-number [number] --body [chunk file name] --upload-id [id]
Example:
Note: Replace the example bucket name below with your bucket name.
Note: Replace the example file name below with your file name.
Note: Replace the example UploadId below with your UploadId.

AWS CLI
aws s3api upload-part --bucket yourBucketName --key yourVideo.mp4 --part-number 1 --body xaa --upload-id yourUploadId
Note: Copy the ETag id and Part number for later use.

Repeat the above CLI command for each file chunk [Replace --part-number & --body values with the above
table values]

Press the UP Arrow Key on your computer to get back to the previous command. No need to enter the
Upload ID again, just change the Part Number and Body Value.

Each time you upload a chunk, don't forget to save the ETag value.

My ETags:

• "48dc187fd8b0e41deec06891ed7e0c02"
• "07d54f5c82a1c59f5df23d453e8775d7"
• "fa79d59ab3e32c6dd8b9c53fa9747f54"
• "3898e88d3d5f5ba93a9b7175e859b42d"
• "7dcd2ad6dd3f28f146cdf87775754dcc"
• "fc511daca99afd0be81a6d606ee95c2d"

Create a Multipart JSON file


Create a file with all part numbers with their Etag values.
Creating a file named list.json

EC2 Shell
vim list.json
Copy the below JSON Script and paste it in the list.json file.

Note: Replace the ETag ID according to the part number, which you received after uploading each chunk.

{
  "Parts": [
    {
      "PartNumber": 1,
      "ETag": "\"48dc187fd8b0e41deec06891ed7e0c02\""
    },
    {
      "PartNumber": 2,
      "ETag": "\"07d54f5c82a1c59f5df23d453e8775d7\""
    },
    {
      "PartNumber": 3,
      "ETag": "\"fa79d59ab3e32c6dd8b9c53fa9747f54\""
    },
    {
      "PartNumber": 4,
      "ETag": "\"3898e88d3d5f5ba93a9b7175e859b42d\""
    },
    {
      "PartNumber": 5,
      "ETag": "\"7dcd2ad6dd3f28f146cdf87775754dcc\""
    },
    {
      "PartNumber": 6,
      "ETag": "\"fc511daca99afd0be81a6d606ee95c2d\""
    }
  ]
}
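As an alternative to typing each ETag by hand, you can, as a sketch, generate the parts file directly from S3's own record of the upload using list-parts with a JMESPath query (bucket name, key, and UploadId are placeholders; the returned ETag strings already include the embedded quotes the completion call expects):

```shell
aws s3api list-parts --bucket yourBucketName --key yourVideo.mp4 \
    --upload-id yourUploadId \
    --query '{Parts: Parts[].{PartNumber: PartNumber, ETag: ETag}}' \
    > list.json
```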

Complete the Multipart Upload

Now we are going to join all the chunks together with the help of the JSON file we created in the above
step.

Syntax:

AWS CLI
aws s3api complete-multipart-upload --multipart-upload [json file path] --bucket [bucket name] --key [original file name] --upload-id [upload id]
Example:
Note: Replace the example bucket name below with your bucket name.
Note: Replace the example file name below with your file name.
Note: Replace the example UploadId below with your UploadId.
Note: Replace the example list.json with your JSON file name.

AWS CLI
aws s3api complete-multipart-upload --multipart-upload file://yourJsonName --bucket yourBucketName --key yourVideo.mp4 --upload-id yourUploadId
