Modul 2 - IoT Application - Update
Background
Today’s business world is changing with the adoption of IoT, which helps capture a
tremendous amount of data from multiple sources. However, the sheer volume of data from
countless IoT devices makes it complex to collect, process, and analyse. Realizing the full
potential of IoT devices will require investment in innovative technologies, and IoT
convergence can redefine how industries, businesses, and economies function. Startup
Company values this project very much.
According to feedback from the operations team, t2.micro instances are enough to run
Startup Service reliably, but you need to monitor the traffic closely in case of unexpected
traffic and ensure Startup Service stays available.
Task 1
1. Share your AWS (Amazon Web Services) Professional account to this link:
https://s.id/akuncc2023/.
2. Understand the architectural problems that have already been prepared.
3. Read the documentation thoroughly (outline below).
4. Log in to the Amazon Web Service Console.
5. Set up the role for accessing IoT and NoSQL. You can read the IAM (Identity and
Access Management) service details.
6. Set up the IoT service. You can read IoT Service details.
7. Set up the bucket to collect the private key. You can read Bucket Service details.
8. Store the Amazon root CA, Certification.pem, Private Key.pem, and Endpoint to the
bucket.
9. IoT Devices must be connected to IoT Services. You can read IoT Devices' details.
10. Set up the IoT rule that stores data to NoSQL using Lambda. You can read the IoT
Rule service details.
11. Set up the Lambda Function that processes data from the IoT service. You can read
the Lambda service details.
12. Configure IoT Analytics, which includes Channel, Pipeline, Data Store, and Datasets.
See the details of IoT Analytics.
13. Set up the networking using CloudFormation. You can read the Networking
CloudFormation service details.
14. Set up the application and Load Balancer using CloudFormation. You can read the
Application CloudFormation service details.
15. Monitor the application's performance from the load balancer and upload the link to
https://s.id/lbccmod3-2023.
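Step 8 in the list above can be scripted once the files are downloaded. The sketch below is a hypothetical helper, assuming boto3 is available and using example file names (the endpoint value is assumed to be saved as a text file); adjust both to your actual artifacts and bucket name.

```python
# Assumed local file names based on the task description; adjust to your downloads.
ARTIFACTS = [
    "AmazonRootCA1.pem",
    "Certification.pem",
    "Private Key.pem",
    "endpoint.txt",
]

def upload_plan(bucket: str, prefix: str = "iot-keys/") -> dict:
    """Map each local artifact to the S3 key it will be stored under."""
    return {name: f"{prefix}{name}" for name in ARTIFACTS}

def upload_all(bucket: str) -> None:
    # boto3 is preinstalled on AWS-provided environments; imported lazily so
    # the planning helper above stays testable without AWS credentials.
    import boto3
    s3 = boto3.client("s3")
    for local, key in upload_plan(bucket).items():
        s3.upload_file(local, bucket, key)
```

Calling `upload_all("your-bucket-name")` from a machine holding the four files would then push them all under the `iot-keys/` prefix.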
Technical Details
1. The server application source code is available on GitHub. You can access it from this
link: https://github.com/handipradana/chartiot2023.git,
2. The default region is N. Virginia (us-east-1),
3. The server application is designed to handle 1000 concurrent requests,
4. The base OS that has been chosen is Amazon Linux (https://aws.amazon.com/amazon-
linux-ami/). This distribution was selected for its broad industry support, stability,
availability of support, and excellent integration with AWS.
Reference
1. For the IoT Services documentation, see:
https://docs.aws.amazon.com/iot/index.html
2. For the IoT Analytics documentation, see:
https://docs.aws.amazon.com/iotanalytics/latest/userguide/welcome.html
3. For the S3 Bucket Policy documentation, see:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/example-bucket-policies.html
4. For the AWS Backup documentation, see:
https://docs.aws.amazon.com/aws-backup/latest/devguide/whatisbackup.html
5. For the CloudFormation documentation, see:
https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/Welcome.html
Architecture
The IoT device must be connected to the AWS IoT Core service. You must upload the Arduino
source code to the microcontroller; download it from this link: https://s.id/codeesp32. Before
uploading to the ESP32, change the parameters marked with XXXXXX in the secrets.h file. If
successful, you can see the output:
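The exact message format is defined by the Arduino sketch in the linked download. Purely as an illustration, a reading published to the esp32/pub topic is typically a small JSON object like the one built below (the field names here are assumptions, not taken from the actual sketch):

```python
import json
import time

def build_payload(temperature: float, humidity: float) -> str:
    """Build the kind of JSON message an ESP32 sketch might publish to
    esp32/pub. Field names are illustrative; match the real Arduino code."""
    return json.dumps({
        "timestamp": int(time.time()),   # epoch seconds, used later as the key
        "temperature": temperature,
        "humidity": humidity,
    })
```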
IoT Rules
The rules store data from the device to DynamoDB using a Lambda Function and store
data to IoT Analytics. Set up the 2 rules with the following names:
1. lks-lambda-yourname, as an example: lks-lambda-handi. Set the SQL query
from topic esp32/pub and set the description as Rules for IoT Devices connect
to Lambda.
2. lks-analytics-yourname, as an example: lks-analytics-handi. Set the SQL query
from topic esp32/pub and set the description as Rules for IoT Devices connect
to Analytics.
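If you prefer to script the first rule rather than use the console, a boto3 sketch could build the payload below. The Lambda ARN is a placeholder you must replace, and the rule name you pass should be checked against the IoT console's naming validation.

```python
def topic_rule_payload(lambda_arn: str) -> dict:
    """Payload for iot.create_topic_rule for the lks-lambda-* rule.
    The Lambda ARN is an assumption; substitute your function's ARN."""
    return {
        "sql": "SELECT * FROM 'esp32/pub'",
        "description": "Rules for IoT Devices connect to Lambda",
        "actions": [{"lambda": {"functionArn": lambda_arn}}],
        "ruleDisabled": False,
    }

def create_rule(rule_name: str, lambda_arn: str) -> None:
    # boto3 imported lazily so the payload builder stays testable offline.
    import boto3
    boto3.client("iot").create_topic_rule(
        ruleName=rule_name,
        topicRulePayload=topic_rule_payload(lambda_arn),
    )
```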
DynamoDB
DynamoDB stores the data saved from the IoT devices by the Lambda Function. Configure
the table according to the following appendix table.
Table Name: sensor
Partition Key: timestamp
Table Class: DynamoDB Standard-IA
Read/Write Capacity settings: Provisioned
Minimum Capacity units: 1
Maximum Capacity units: 5
Target Utilization: 50%
After configuring the table above, your task is to set up indexes, enable point-in-time
recovery (PITR) and AWS Backup, and then export the table to your main bucket.
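As one way to capture the table settings in code, the boto3 keyword arguments might look like the sketch below. Treating the partition key as a String is an assumption, and the 1–5 unit autoscaling at 50% target utilization is attached separately through Application Auto Scaling, not in `create_table` itself.

```python
def sensor_table_spec() -> dict:
    """Keyword arguments for dynamodb.create_table matching the table above.
    String key type is an assumption; autoscaling (1-5 units, 50% target)
    is configured separately via Application Auto Scaling."""
    return {
        "TableName": "sensor",
        "KeySchema": [{"AttributeName": "timestamp", "KeyType": "HASH"}],
        "AttributeDefinitions": [
            {"AttributeName": "timestamp", "AttributeType": "S"},
        ],
        "TableClass": "STANDARD_INFREQUENT_ACCESS",  # DynamoDB Standard-IA
        "ProvisionedThroughput": {
            "ReadCapacityUnits": 1,   # the autoscaling minimum
            "WriteCapacityUnits": 1,
        },
    }
```

Applying it would be `boto3.client("dynamodb").create_table(**sensor_table_spec())`.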
Lambda Function
The AWS Lambda function provides a trigger from the AWS IoT Core service to save data to
DynamoDB. Create a Lambda function with the name lambda-yourname, as an example:
lambda-handi, with a Python 3.8 runtime, and use the role you created earlier. The
program code below is incomplete; complete it in the marked section so that it can save
data from IoT Core to DynamoDB.
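The official skeleton is provided in the competition materials. As a minimal sketch of what a completed handler could look like (assuming the IoT rule passes the device message straight through as the event), note that floats must be routed through Decimal, because DynamoDB rejects Python floats:

```python
import json
from decimal import Decimal

TABLE_NAME = "sensor"  # per the DynamoDB section above

def build_item(message: dict) -> dict:
    """Convert an IoT Core message into a DynamoDB item. Re-parsing the JSON
    with parse_float=Decimal turns every float into a Decimal."""
    if "timestamp" not in message:
        raise ValueError("message is missing the partition key 'timestamp'")
    return json.loads(json.dumps(message), parse_float=Decimal)

def lambda_handler(event, context):
    # boto3 ships with the Lambda Python runtime; imported lazily so the
    # pure build_item helper stays testable without AWS credentials.
    import boto3
    table = boto3.resource("dynamodb").Table(TABLE_NAME)
    table.put_item(Item=build_item(event))
    return {"statusCode": 200}
```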
IoT Analytics
AWS IoT Analytics automates the steps required to analyze data from IoT devices. AWS IoT
Analytics filters, transforms, and enriches IoT data before storing it in a time-series data store
for analysis. You can set up the service to collect only the data you need from your devices,
apply mathematical transforms to process the data, and enrich the data with device-specific
metadata such as device type and location before storing it. Configure the channel, pipeline,
data store, and datasets.
- Channels
A channel collects and archives raw, unprocessed message data before publishing it to a
pipeline. Set the channel name to lks_channel_name (for the example, lks_channel_handi),
set the channel storage to Service-managed storage kept Indefinitely, and set the topic from
IoT Core to esp32/pub. After that, set the IAM role: choose the existing role, because you
created the role in the first step.
- Data Store
A data store receives and stores your messages. It is not a database but a scalable and
queryable repository of your messages. You can create multiple data stores to store
messages that come from different devices or locations, or you can use a single data store
to receive all your AWS IoT messages. Configure it according to the following appendix table.
Data Store Id: lks_datastore_name
Storage Type: Service managed storage & indefinitely
Classification: Parquet
Inference Source: Existing channels
Channel: Existing channels
Classification: Parquet
Inference source: Existing data stores
Data Store: Your data store
Parquet schema: Data from topic esp32/pub
Custom data partitioning: Enable
Sample Source: Existing channels
Channel: Existing channels
- Pipeline
The simplest functional pipeline connects a channel to a data store, which makes it a
pipeline with two activities: a channel activity and a datastore activity. Adding additional
activities to your pipeline can achieve more powerful message processing. Configure the
pipeline name as lks_pipeline_name (the example is lks_pipeline_handi), the pipeline
source from your channel, and the pipeline output to your data store. Set the pipeline
activity to Select attributes from the message to update the incoming message.
- Datasets
You retrieve data from a data store by creating an SQL dataset or a container dataset. AWS
IoT Analytics can query the data to answer analytical questions. Although a data store is
not a database, you use SQL expressions to query the data and produce results stored in a
dataset. Configure the settings according to the following appendix table.
Dataset Name: lks_datasets_yourname
Data store source: Your data store
Author Query: SELECT * FROM <your data store name> LIMIT 10
Data Selection Filter: None
Query Schedule Frequency: every 1 minute
Keep result: Indefinitely
Versioning: Limited, 3 versions
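Translated into boto3, the dataset could be created from a spec like the one below. This is a sketch: the action name and the cron expression for a one-minute schedule are assumptions, and the example names follow the lks_*_handi pattern.

```python
def dataset_spec(datastore: str) -> dict:
    """Keyword arguments for iotanalytics.create_dataset matching the
    table above. Action name and cron expression are assumptions."""
    return {
        "datasetName": "lks_datasets_handi",
        "actions": [{
            "actionName": "query",
            "queryAction": {"sqlQuery": f"SELECT * FROM {datastore} LIMIT 10"},
        }],
        # Assumed cron for "every 1 minute"; verify in the console.
        "triggers": [{"schedule": {"expression": "cron(0/1 * * * ? *)"}}],
        "retentionPeriod": {"unlimited": True},          # keep indefinitely
        "versioningConfiguration": {"unlimited": False, "maxVersions": 3},
    }
```

It would be applied with `boto3.client("iotanalytics").create_dataset(**dataset_spec("lks_datastore_handi"))`.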
Networking
You must build the network infrastructure before you start the Application. Create the
infrastructure using CloudFormation, set up the stack using template.yaml, and name the
stack lkscf-yourname (for the example, lkscf-handi), according to the following
conditions:
- VPC (Virtual Private Cloud) Name is lksvpc,
- VPC with CIDR 175.20.0.0/15,
- 2 Availability Zones,
- DNS Hostnames enabled,
- DNS Resolution enabled,
- 2 Public Subnet multiple AZ from IP Address 175.20.0.0/25 and 175.20.0.128/25,
- 2 Private Subnet multiple AZ from IP Address 175.20.1.0/26 and 175.20.1.64/26,
- Internet Gateway name is lksigw,
- NAT Gateway name is natgw,
- 1 Public Route Table, the name is lkspublic,
- 1 Private Route Table, the name is lksprivate,
- The results can be seen from the outputs CloudFormation.
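Before writing template.yaml, it is worth sanity-checking that the requested subnets actually fit inside the unusual /15 VPC CIDR and do not overlap. A quick check with Python's ipaddress module:

```python
import ipaddress

VPC = ipaddress.ip_network("175.20.0.0/15")
SUBNETS = [
    ipaddress.ip_network(c) for c in (
        "175.20.0.0/25", "175.20.0.128/25",   # public subnets, one per AZ
        "175.20.1.0/26", "175.20.1.64/26",    # private subnets, one per AZ
    )
]

def plan_is_valid() -> bool:
    """Every subnet must sit inside the VPC CIDR and none may overlap."""
    inside = all(s.subnet_of(VPC) for s in SUBNETS)
    disjoint = all(
        not a.overlaps(b)
        for i, a in enumerate(SUBNETS) for b in SUBNETS[i + 1:]
    )
    return inside and disjoint
```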
Application
After configuring the networking, update the CloudFormation stack and set up the instances
to deploy the application from GitHub. The application must be deployed from user data.
After that, set up the load balancer according to the following conditions:
- Set the Security Group for the Load Balancer: allow ports 80 and 443 to be public. The
name is SG-LB, and no other ports are allowed,
- Set the Security Group for the Application: allow port 3000 to be public. The name is
SG-Apps, and no other ports are allowed,
- Deploy the application using 2 instances in a private subnet; you must set them up using
user data. Name the instances lksapps1a and lksapps1b,
- Set the target group using HTTP protocol and port 3000,
- The application must be running behind the public Load Balancer. Set the name as lks-public-lb,
- The scheme is internet-facing.
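The target-group requirement above can be captured as boto3 keyword arguments. This is a sketch: the target group name and the health-check path are assumptions, so adjust them to your setup and the application's routes.

```python
def target_group_spec(vpc_id: str) -> dict:
    """Keyword arguments for elbv2.create_target_group per the conditions
    above. Name and health-check path are assumptions."""
    return {
        "Name": "lks-tg-apps",          # assumed name
        "Protocol": "HTTP",
        "Port": 3000,                   # the application port
        "VpcId": vpc_id,
        "TargetType": "instance",
        "HealthCheckProtocol": "HTTP",
        "HealthCheckPath": "/",         # assumed; use a route the app serves
    }
```

`boto3.client("elbv2").create_target_group(**target_group_spec(vpc_id))` would then create it; register lksapps1a and lksapps1b as targets and attach the group to lks-public-lb afterwards.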
Results
If your instances deploy successfully, you can see the Dashboard.