
AWS Foundation Course – Exercises

1. For doubt clarification, please log in to https://im.htcindia.com with your eportal credentials.
2. Fill in your profile by clicking the image icon in the top-left corner.
3. Select the group chat menu icon and fill in the form (using the server conference.im.htcindia.com).
4. Click the Show Group Chat button and choose LnB2020-Angular.
5. In the chat window you can ask your questions to the faculty.

Instructions to strictly follow:


1. Create a personal AWS account.
2. Use only free-tier eligible AWS resources.
3. Stop, terminate, or delete the AWS resources you used at the end of each day's lab.
4. Never store your account credentials in GitHub or any other online platform.
5. Contact the instructor for any clarifications.

Environment setup

1. Log in to the AWS Management Console with the given credentials.
2. Choose the region you want to work in.
3. Look through the various AWS services available on the AWS cloud platform.

Amazon S3 (Simple Storage Service)

1. Create an S3 bucket.
Upload objects (files) to the bucket. Note down the region where the bucket is created.
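
If you want to repeat this exercise from the AWS CLI as well, a minimal sketch follows; the bucket name my-foundation-lab-bucket, the file name, and the region are placeholders, and the commands assume the CLI is already configured with your credentials.

# create the bucket in a chosen region (bucket names must be globally unique)
aws s3 mb s3://my-foundation-lab-bucket --region us-east-1
# upload a local file as an object
aws s3 cp ./sample.txt s3://my-foundation-lab-bucket/sample.txt
# confirm which region the bucket was created in
aws s3api get-bucket-location --bucket my-foundation-lab-bucket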

2. Identify the bucket ARN.


3. Configure the bucket to enable versioning. Test the feature.
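
Versioning can also be enabled and tested from the CLI; this is a sketch with a placeholder bucket and object name.

aws s3api put-bucket-versioning --bucket my-foundation-lab-bucket \
    --versioning-configuration Status=Enabled
# upload the same key twice, then list the stored versions
aws s3 cp ./sample.txt s3://my-foundation-lab-bucket/sample.txt
aws s3 cp ./sample.txt s3://my-foundation-lab-bucket/sample.txt
aws s3api list-object-versions --bucket my-foundation-lab-bucket --prefix sample.txt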
4. Create an S3 bucket to host a static website.
Configure the bucket for static website hosting.
Access the website pages from a browser.
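
A rough CLI sketch for the hosting step is shown below; it assumes index.html and error.html are already uploaded and that the bucket allows public reads (via a bucket policy), which you still need to configure separately.

aws s3 website s3://my-foundation-lab-bucket/ \
    --index-document index.html --error-document error.html
# the site is then served from an endpoint of the form
#   http://<bucket-name>.s3-website-<region>.amazonaws.com
# (the exact endpoint format varies slightly by region)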

5. Create a lifecycle rule for an S3 bucket. [For the entire bucket and for specific folders]
Rules
1. All current versions of objects in the bucket should move to Standard-IA after 30 days.
2. After 60 days they should move to Glacier.
3. All objects should be deleted after 1 year.
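
A lifecycle configuration like this can also be applied from the CLI. The JSON below is a sketch for the whole bucket (an empty prefix); the bucket name and rule ID are placeholders.

cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "archive-and-expire",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "Transitions": [
        { "Days": 30, "StorageClass": "STANDARD_IA" },
        { "Days": 60, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
EOF
aws s3api put-bucket-lifecycle-configuration --bucket my-foundation-lab-bucket \
    --lifecycle-configuration file://lifecycle.json
# to target a specific folder instead, set "Prefix" to e.g. "logs/"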

6. Create two S3 buckets in two different AWS regions.
Create a cross-region replication rule between these two buckets.
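
Replication requires versioning enabled on both buckets and an IAM role that S3 can assume; the role ARN, account ID, and bucket names below are placeholders. A rough CLI sketch:

aws s3api put-bucket-versioning --bucket source-bucket --versioning-configuration Status=Enabled
aws s3api put-bucket-versioning --bucket destination-bucket --versioning-configuration Status=Enabled
cat > replication.json <<'EOF'
{
  "Role": "arn:aws:iam::123456789012:role/s3-replication-role",
  "Rules": [
    {
      "ID": "replicate-all",
      "Prefix": "",
      "Status": "Enabled",
      "Destination": { "Bucket": "arn:aws:s3:::destination-bucket" }
    }
  ]
}
EOF
aws s3api put-bucket-replication --bucket source-bucket \
    --replication-configuration file://replication.json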

7. Enable the transfer acceleration feature for a bucket.
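
From the CLI, the equivalent (with a placeholder bucket name) would be roughly:

aws s3api put-bucket-accelerate-configuration --bucket my-foundation-lab-bucket \
    --accelerate-configuration Status=Enabled
# check the current status
aws s3api get-bucket-accelerate-configuration --bucket my-foundation-lab-bucket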

8. Choose an S3 bucket.
Upload a .csv/.json file to the bucket.
Use S3 Select to perform basic analytics on it.
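
A sample S3 Select query from the CLI might look like the sketch below; the bucket name and object key are placeholders, and the file is assumed to be a CSV with a header row.

aws s3api select-object-content \
    --bucket my-foundation-lab-bucket \
    --key data/videos.csv \
    --expression "SELECT * FROM s3object s LIMIT 5" \
    --expression-type SQL \
    --input-serialization '{"CSV": {"FileHeaderInfo": "USE"}}' \
    --output-serialization '{"CSV": {}}' \
    results.csv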
9. Using S3 Batch Operations, copy the entire bucket content to another bucket with the help of a bucket inventory report.

IAM (Identity and Access Management)

1. Create an IAM policy to perform the following operations on a given S3 bucket.

a) Full read access to the bucket and its contents.
b) Permission to upload objects to the bucket.
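
One possible policy document for these two requirements is sketched below; the bucket name and policy name are placeholders, and the action list can be tightened or broadened as the instructor prefers.

cat > s3-read-upload-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadBucketAndObjects",
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject"],
      "Resource": [
        "arn:aws:s3:::my-foundation-lab-bucket",
        "arn:aws:s3:::my-foundation-lab-bucket/*"
      ]
    },
    {
      "Sid": "UploadObjects",
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": ["arn:aws:s3:::my-foundation-lab-bucket/*"]
    }
  ]
}
EOF
aws iam create-policy --policy-name s3-read-upload \
    --policy-document file://s3-read-upload-policy.json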

2. Create an IAM user and test the above policy.
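
To test from the CLI, you could create a user and attach the policy created above; the user name and account ID here are placeholders.

aws iam create-user --user-name s3-lab-user
aws iam attach-user-policy --user-name s3-lab-user \
    --policy-arn arn:aws:iam::123456789012:policy/s3-read-upload
# create access keys for the user so you can test with a separate CLI profile
aws iam create-access-key --user-name s3-lab-user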

3. Create a bucket policy to allow an IAM user read-only access to a bucket.
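
A bucket policy along these lines (placeholder account ID, user name, and bucket name) could be attached with put-bucket-policy:

cat > readonly-bucket-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadOnlyForOneUser",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:user/s3-lab-user" },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::my-foundation-lab-bucket",
        "arn:aws:s3:::my-foundation-lab-bucket/*"
      ]
    }
  ]
}
EOF
aws s3api put-bucket-policy --bucket my-foundation-lab-bucket \
    --policy file://readonly-bucket-policy.json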

4. Create a bucket policy that allows everyone read-only access to a bucket and grants full access to a specific IAM user.

5. Write policies that allow:

1. All IAM users read-only access to a bucket and its contents.
2. A group of users full access to a folder in the bucket.

EC2 Lab Exercises


1. Launch an EC2 Linux (free-tier eligible) instance.
2. Note the public IP assigned to the instance.
3. Create a new EBS volume (2 GB). Type: General Purpose SSD (gp2). (Equivalent CLI commands for several of these steps are sketched after this list.)
4. Attach the new EBS volume to the launched instance.
5. Connect to the instance using SSH.
(You should have PuTTY and PuTTYgen installed on your local PC.)
6. Install a web server on the Linux instance and access it using a web browser.
Issue the following commands to install the web server:
sudo yum update -y
sudo yum install httpd -y
sudo service httpd start
sudo service httpd status
sudo chkconfig httpd on
7. Edit the inbound traffic rules in the security group to allow HTTP on port 80.
8. Open a browser and enter your instance's public IP to access your web server.
9. Detach the volume from the instance and attach it to another instance.
10. Create a snapshot of the EC2 instance's EBS volume.
11. Create an EBS volume from a snapshot and attach it to an EC2 instance.
12. Create an AWS Classic Load Balancer.
13. Create a launch configuration and set up an Auto Scaling group.
14. Install the AWS CLI and test it with the S3, EC2, and IAM services.
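
Several of these steps have CLI equivalents. The sketch below assumes an instance ID i-0123456789abcdef0, a volume ID vol-0123456789abcdef0, a snapshot ID snap-0123456789abcdef0, a security group sg-0123456789abcdef0, and availability zone us-east-1a; all of these are placeholders you would replace with your own values.

# steps 3/4: create a 2 GB gp2 volume and attach it to the instance
aws ec2 create-volume --size 2 --volume-type gp2 --availability-zone us-east-1a
aws ec2 attach-volume --volume-id vol-0123456789abcdef0 \
    --instance-id i-0123456789abcdef0 --device /dev/sdf
# step 7: open HTTP port 80 in the instance's security group
aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 \
    --protocol tcp --port 80 --cidr 0.0.0.0/0
# steps 10/11: snapshot a volume, then create a new volume from that snapshot
aws ec2 create-snapshot --volume-id vol-0123456789abcdef0 --description "lab snapshot"
aws ec2 create-volume --snapshot-id snap-0123456789abcdef0 --availability-zone us-east-1a
# step 14: quick smoke tests after running `aws configure`
aws s3 ls
aws ec2 describe-instances
aws iam list-users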

DynamoDB Exercises
15. Create a DynamoDB table – Videos (authorId, videoId, name, length, publishdate).
16. Add items to the table.
17. Test the scan/query options.
18. Edit/delete items in the table.
19. Create a DynamoDB global table.
20. Try putting/getting items using AWS CLI commands.
21. Try query/scan using the AWS CLI.
22. Load a JSON file into the DynamoDB table.
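
The CLI commands below sketch exercises 15-22. The key design (authorId as partition key, videoId as sort key), the on-demand billing mode, and the sample item values are assumptions made for illustration.

# create the Videos table
aws dynamodb create-table --table-name Videos \
    --attribute-definitions AttributeName=authorId,AttributeType=S AttributeName=videoId,AttributeType=S \
    --key-schema AttributeName=authorId,KeyType=HASH AttributeName=videoId,KeyType=RANGE \
    --billing-mode PAY_PER_REQUEST
# put and get a single item
aws dynamodb put-item --table-name Videos --item \
    '{"authorId":{"S":"a1"},"videoId":{"S":"v1"},"name":{"S":"Intro"},"length":{"N":"300"},"publishdate":{"S":"2020-05-01"}}'
aws dynamodb get-item --table-name Videos --key '{"authorId":{"S":"a1"},"videoId":{"S":"v1"}}'
# query by partition key, and scan the whole table
aws dynamodb query --table-name Videos \
    --key-condition-expression "authorId = :a" \
    --expression-attribute-values '{":a":{"S":"a1"}}'
aws dynamodb scan --table-name Videos
# load items from a JSON file (items.json must use the BatchWriteItem request format)
aws dynamodb batch-write-item --request-items file://items.json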

AWS Lambda and API Gateway


1. Create a Lambda function to return a “Hello World” message. Test it.
2. Create a Lambda function which inserts a log entry into a DynamoDB table when an object is uploaded
to an S3 bucket.
3. Create an API using API Gateway to fetch records from a DynamoDB table using Lambda integration.
Pass a parameter to the API and fetch the relevant records.
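
For the "Hello World" exercise, one way to create and test the function from the CLI is sketched below. The execution role ARN is a placeholder for a role you have already created with basic Lambda permissions, and the function name and runtime version are assumptions.

# write a minimal Python handler and package it
cat > lambda_function.py <<'EOF'
def lambda_handler(event, context):
    # return a simple message so the test invocation has something to show
    return {"message": "Hello World"}
EOF
zip function.zip lambda_function.py
# create the function (placeholder role ARN)
aws lambda create-function --function-name hello-world \
    --runtime python3.12 --handler lambda_function.lambda_handler \
    --zip-file fileb://function.zip \
    --role arn:aws:iam::123456789012:role/lambda-basic-execution
# invoke it and inspect the response
aws lambda invoke --function-name hello-world response.json
cat response.json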

CloudTrail, CloudWatch and Simple Notification Service (SNS)


1. Create a trail to record API calls in your AWS account. Log all the events to an S3 bucket.
2. Validate the log files to check the integrity of the CloudTrail logs.
3. Create a CloudWatch alarm that triggers when an EC2 instance's CPU utilization falls below 30%. Send an email
notification and terminate the instance.
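
CLI sketches for the three exercises are given below. The trail name, log bucket, topic name, account ID, region, instance ID, e-mail address, and start time are placeholders, and the log bucket is assumed to already carry the bucket policy CloudTrail requires.

# 1. create a trail that logs to an S3 bucket and start logging
aws cloudtrail create-trail --name lab-trail --s3-bucket-name my-cloudtrail-log-bucket
aws cloudtrail start-logging --name lab-trail
# 2. check log file integrity (log file validation must be enabled on the trail)
aws cloudtrail update-trail --name lab-trail --enable-log-file-validation
aws cloudtrail validate-logs --trail-arn arn:aws:cloudtrail:us-east-1:123456789012:trail/lab-trail \
    --start-time 2020-06-01T00:00:00Z
# 3. SNS topic and e-mail subscription for the notification
aws sns create-topic --name cpu-low-alerts
aws sns subscribe --topic-arn arn:aws:sns:us-east-1:123456789012:cpu-low-alerts \
    --protocol email --notification-endpoint you@example.com
# alarm: notify and terminate when average CPU stays below 30%
aws cloudwatch put-metric-alarm --alarm-name cpu-below-30 \
    --namespace AWS/EC2 --metric-name CPUUtilization \
    --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
    --statistic Average --period 300 --evaluation-periods 2 \
    --threshold 30 --comparison-operator LessThanThreshold \
    --alarm-actions arn:aws:sns:us-east-1:123456789012:cpu-low-alerts \
                    arn:aws:automate:us-east-1:ec2:terminate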
