Integrating Local To S3

The document outlines the steps to integrate local systems with Amazon S3 using AWS IAM for user access. It details the creation of a user with S3 permissions, generating access keys, and utilizing the boto3 library in Python to manage S3 buckets and objects. Additionally, it covers operations such as creating, listing, and deleting buckets and objects within S3.

Uploaded by Lucky Kumar

Integrating Local to S3

1. To access S3 from a local machine, we'll need an access key ID and a secret access key.
2. Open the IAM service.

3. Now create a user. To do so, select Users.


4. Select Create user.

5. Enter a user name, then click Next.


6. Now attach policies to the new user: select Attach policies directly.

7. Under Permissions policies, search for S3 and select AmazonS3FullAccess from the list.
8. Select the policy and click Next.

9. In Review and Create, click Create user.


10. In the IAM Users list, select the newly created user.

11. In the Users tab, click Create access key.

12. Select CLI as the use case, and click Next.


13. Click Create access key.

14. Now from Retrieve access keys, copy Access key and Secret access key.
15. Now open any Python editor; here we are using Jupyter Notebook.
16. Store the access key and secret key in variables.

17. Now we'll use the boto3 library, the AWS SDK (Software Development
Kit) for Python. Boto3 makes it easy to integrate your Python
application, library, or script with AWS services, including Amazon S3,
Amazon EC2, Amazon DynamoDB, and more.
Here we'll use boto3 to create a bucket, list buckets, and read the
contents of a bucket.
18. Use the AWS documentation for reference on tools, code, and services.
19. Search for the S3 client.

20. We have different options, such as create bucket, delete bucket, list buckets,
etc.
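As one illustration of those options, creating a bucket can be sketched like this. The helper name, bucket name, and default region are hypothetical:

```python
def create_demo_bucket(s3, bucket_name, region="ap-south-1"):
    """Create a bucket; outside us-east-1, S3 requires a LocationConstraint."""
    return s3.create_bucket(
        Bucket=bucket_name,
        CreateBucketConfiguration={"LocationConstraint": region},
    )
```

Note that for us-east-1 the `CreateBucketConfiguration` argument must be omitted entirely; bucket names are also globally unique, so the call fails if the name is already taken.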
21. We'll be referring to this documentation in the coming topics.
22. First of all, let's create some buckets and objects inside them in AWS S3.
23. We have created these two buckets.
24. The objects inside the folder in the first bucket are:

25. The objects inside the folder in the second bucket are:

26. We will access these objects and buckets from our local machine using the boto3 library.
27. Listing buckets

28. The response is a dictionary, so we can access it like any other dictionary.

29. To access the buckets:
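The dictionary access in steps 28–29 can be sketched against the documented shape of the `list_buckets` response. The bucket names below are made-up illustrations; in practice the dictionary comes from calling `s3.list_buckets()`:

```python
# Illustrative response shaped like the boto3 list_buckets output;
# in practice: response = s3.list_buckets()
response = {
    "Buckets": [
        {"Name": "demo-bucket-1", "CreationDate": "2024-01-01"},
        {"Name": "demo-bucket-2", "CreationDate": "2024-01-02"},
    ],
    "ResponseMetadata": {"HTTPStatusCode": 200},
}

# Plain dictionary access, as with any Python dict.
bucket_names = [bucket["Name"] for bucket in response["Buckets"]]
print(bucket_names)  # -> ['demo-bucket-1', 'demo-bucket-2']
```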


30. To list objects inside a bucket
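Listing object keys can be sketched as below. The helper name is hypothetical; since `list_objects_v2` returns at most 1,000 keys per call, a paginator handles larger buckets:

```python
def list_object_keys(s3, bucket, prefix=""):
    """Return the keys of all objects under a prefix, page by page."""
    keys = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        # "Contents" is absent from a page that holds no objects.
        for obj in page.get("Contents", []):
            keys.append(obj["Key"])
    return keys
```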
31. To list versions of objects
32. Deleting objects with a version ID.
We have two versions of the tumor file, and we want to delete both versions.
33. Here we got a 200 response, which means the file was deleted successfully.
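Steps 31–32 can be sketched as one helper that lists every version of a key and deletes each one by its version ID (the helper name is hypothetical):

```python
def delete_all_versions(s3, bucket, key):
    """Delete every version of one object in a versioned bucket."""
    response = s3.list_object_versions(Bucket=bucket, Prefix=key)
    for version in response.get("Versions", []):
        if version["Key"] == key:  # Prefix can also match longer keys
            s3.delete_object(
                Bucket=bucket,
                Key=key,
                VersionId=version["VersionId"],
            )
```

Deleting with an explicit `VersionId` removes that version permanently, rather than adding a delete marker the way a plain delete does in a versioned bucket.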

34. We can also confirm this through the S3 console (GUI).


Deleted:

35. To read an S3 object


36. To read the body

37. This is a binary file, so we need to import io and convert it into a file-like object.
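Steps 35–37 can be sketched together (the helper names are hypothetical): `get_object` returns the payload as a streaming `Body`, and wrapping the raw bytes in `io.BytesIO` yields the file-like object that binary-reading libraries expect:

```python
import io

def read_s3_object(s3, bucket, key):
    """Fetch an object and return its raw bytes (steps 35-36)."""
    response = s3.get_object(Bucket=bucket, Key=key)
    return response["Body"].read()

def as_file(raw_bytes):
    """Wrap raw bytes in a seekable file-like buffer (step 37)."""
    return io.BytesIO(raw_bytes)
```

For example, an image stored in S3 could be passed to a library that expects an open file: `image_file = as_file(read_s3_object(s3, "my-bucket", "photo.png"))`.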
