Yubin Exp5 Iot
Experiment 5
Student Name: Yubin UID: 22BCS16626
Branch: BE-CSE Section/Group: IOT-636-B
Semester: Sixth Date of Performance: 24-02-2025
Subject Name: Foundation of Cloud IOT Edge Subject Code: 22CSP-367
1) Aim: Set up a system that sends IoT sensor data to AWS IoT Core and stores it in an S3 bucket.
2) Objective: To demonstrate the process of integrating IoT sensors with AWS IoT Core,
transmitting sensor data, and storing the data in AWS S3 for further analysis.
3) Pre-requisites:
a) Basic knowledge of cloud computing.
b) Basic knowledge of Linux and Windows operating systems.
c) An AWS account.
4) Procedure:
1) Go to the AWS console and search for S3 in the search bar.
2) From the list of options that appears, select S3.
3) Click on the Create Bucket option, provide a name for the bucket, and create the bucket.
4) Now search for IoT Core in the search bar and select IoT Core from the list of options that appears.
5) Now click on the Rules engine option.
6) Click on the Create Rule option and provide a name and description for the rule. On the next page, click on SQL Version and select 2016-03-23. Inside the SQL Statement box, type SELECT * FROM 'iotdevice/+/datas3'.
7) Choose the 'Store a message in an Amazon S3 bucket' option from the rule actions list.
8) Select the bucket which you created and provide the key as ${cast(topic(2) AS DECIMAL)}/${timestamp()} (the publishing sketch after this procedure shows how a topic such as iotdevice/55/datas3 maps to a folder in the bucket).
9) Now create a new IAM role for the action and select that role, so the rule has permission to write to the bucket.
14) Now go to the chosen S3 bucket. A folder titled 55/ will be there, since the rule's key expression casts topic(2) (the "55" in a topic such as iotdevice/55/datas3) to a decimal and uses it as the folder name. Open the folder and a list of files will be available. Download any one file to see the data (a programmatic alternative with boto3 is sketched after the conclusion).
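
The console steps above only configure the routing; objects appear in the bucket once messages are actually published to a topic matching iotdevice/+/datas3 (for example from the MQTT test client in the IoT Core console). As a minimal device-side sketch, the snippet below publishes one sample reading with the AWS IoT Device SDK v2 for Python; the endpoint, certificate file names, client ID, and payload fields are placeholders and assumptions, not values taken from this experiment.

import json
import time

from awscrt import mqtt
from awsiot import mqtt_connection_builder

# All of the values below are placeholders / assumptions for illustration.
ENDPOINT = "xxxxxxxxxxxxxx-ats.iot.us-east-1.amazonaws.com"   # your account's IoT data endpoint
TOPIC = "iotdevice/55/datas3"        # matches SELECT * FROM 'iotdevice/+/datas3'

# Mutual-TLS connection using a device certificate registered in IoT Core.
connection = mqtt_connection_builder.mtls_from_path(
    endpoint=ENDPOINT,
    cert_filepath="device-certificate.pem.crt",
    pri_key_filepath="device-private.pem.key",
    ca_filepath="AmazonRootCA1.pem",
    client_id="demo-sensor",
    clean_session=False,
    keep_alive_secs=30,
)
connection.connect().result()

# One illustrative sensor reading; the rule stores the raw payload in S3
# under the key ${cast(topic(2) AS DECIMAL)}/${timestamp()}.
payload = json.dumps({"temperature": 26.4, "humidity": 61, "ts": int(time.time())})
publish_future, _packet_id = connection.publish(
    topic=TOPIC, payload=payload, qos=mqtt.QoS.AT_LEAST_ONCE)
publish_future.result()              # wait for the PUBACK from IoT Core

connection.disconnect().result()

Publishing to iotdevice/55/datas3 is what produces the 55/ folder seen in step 14: topic(2) is "55", and the key expression turns it into the folder name followed by the message timestamp.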
5) Output:
6) Conclusion:
As a result, AWS IoT Core together with the IoT Rules engine helps filter IoT topics and store the data in AWS S3. AWS IoT Core can receive and send millions of IoT messages at a time, and the IoT Rules engine can filter MQTT topics from IoT devices and route the messages, along with a timestamp, to other AWS services. AWS S3 is used for data backup and archival.
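
Since S3 serves as the backup and archive target, the objects written by the rule can also be listed and downloaded programmatically instead of through the console, complementing step 14. The sketch below uses boto3; the bucket name is a placeholder for the bucket created in step 3, and credentials are assumed to come from the standard AWS credential chain.

import boto3

BUCKET = "my-iot-sensor-bucket"   # placeholder: the bucket created in step 3
PREFIX = "55/"                    # folder produced by cast(topic(2) AS DECIMAL)

s3 = boto3.client("s3")
response = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
objects = response.get("Contents", [])

# Print every stored message object under the 55/ prefix.
for obj in objects:
    print(obj["Key"], obj["Size"], obj["LastModified"])

# Download the most recently written payload for inspection.
if objects:
    latest = max(objects, key=lambda o: o["LastModified"])
    s3.download_file(BUCKET, latest["Key"], "latest_message.txt")
    print("Downloaded", latest["Key"], "to latest_message.txt")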