
Task 2 - Sending Data from ESP to oneM2M and Creating a Data Pipeline

Submission Deadline: 8th November, 11:59 PM

Objective:

In this task, you will send data points from an ESP device to oneM2M and set up the
complete data flow: ESP -> oneM2M -> PostgreSQL -> Grafana. You are required to
collect and analyze the data, clean it, and build a basic machine learning model.

Submission Guidelines:

• All work must be original and free from plagiarism. You are expected to write your own code and explanations. Include code, comments, markdown explanations, and results at every stage of the task.
• The submission must consist of a Jupyter Notebook for the ML model and a PDF showing the Grafana dashboard along with all the steps used.

Tasks

1. ESP to oneM2M Data Flow
o Send data points from the ESP device to the oneM2M platform.
o Ensure that the data is correctly received and processed by oneM2M.
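In oneM2M's HTTP binding, a sensor reading is typically posted as a contentInstance (resource type 4) under a container. The sketch below builds such a request in Python; the CSE URL, AE name (`SensorAE`), container name (`readings`), and originator credential are assumptions — substitute the values from your own CSE configuration. On the ESP itself the same request would be issued with the board's HTTP client library.

```python
import json

# Assumed oneM2M resource tree: CSE -> AE "SensorAE" -> container "readings".
# Adjust the URL and credentials to match your CSE deployment.
CSE_URL = "http://127.0.0.1:8080/cse-in/SensorAE/readings"

def build_cin_request(value: float) -> tuple[dict, str]:
    """Build headers and body for creating a oneM2M contentInstance (ty=4)."""
    headers = {
        "X-M2M-Origin": "CSensorAE",              # originator (assumed credential)
        "X-M2M-RI": "req-0001",                   # unique request identifier
        "Content-Type": "application/json;ty=4",  # ty=4 marks a contentInstance
    }
    body = json.dumps({"m2m:cin": {"con": str(value)}})
    return headers, body

headers, body = build_cin_request(23.5)
print(body)
```

Sending the request is then a single `requests.post(CSE_URL, headers=headers, data=body)` from a gateway script, or the equivalent HTTP call from the ESP firmware.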
2. Store Data in PostgreSQL
o Configure PostgreSQL as the backend database.
o Store the data points received from the ESP in the PostgreSQL database.
o Collect at least 1000 data points for analysis.
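The storage step can be sketched as below. To keep the example self-contained it uses Python's built-in sqlite3; for the actual task you would connect with psycopg2 (or another PostgreSQL driver), where the placeholder syntax is `%s` rather than `?` and the timestamp column would be `TIMESTAMPTZ`. The table and column names are illustrative assumptions.

```python
import sqlite3
from datetime import datetime, timezone

# sqlite3 stands in for PostgreSQL here so the sketch runs anywhere.
# With psycopg2: psycopg2.connect(...), placeholders become %s.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE IF NOT EXISTS readings (
        ts     TEXT NOT NULL,   -- use TIMESTAMPTZ in PostgreSQL
        sensor TEXT NOT NULL,
        value  REAL NOT NULL
    )
""")

def store_reading(sensor: str, value: float) -> None:
    """Insert one timestamped sensor reading."""
    conn.execute(
        "INSERT INTO readings (ts, sensor, value) VALUES (?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), sensor, value),
    )
    conn.commit()

# Simulated batch -- in the real pipeline these arrive from oneM2M.
for i in range(5):
    store_reading("temperature", 20.0 + i * 0.1)

count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)
```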
3. Connect PostgreSQL to Grafana
o Link your PostgreSQL database to Grafana.
o Build a dashboard with at least 5 charts:
§ Include at least 3 different types of charts (e.g., line charts, bar charts, pie charts).
§ Use the collected sensor data to display meaningful visualizations.
4. Data Cleaning and Preprocessing
o Perform data cleaning on the 1000 data points:
§ Handle missing data.
§ Identify and deal with outliers or erroneous values.
o Provide detailed explanations of the cleaning process and any
transformations applied to the data.
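A minimal cleaning pass, sketched with a toy sample in place of your 1000 collected points: drop missing values (interpolation is a reasonable alternative), then flag outliers with the standard interquartile-range (IQR) rule. The sample values are made up for illustration.

```python
import statistics

# Toy sample standing in for the collected points; None marks missing data.
raw = [21.0, 21.2, None, 21.1, 95.0, 21.3, None, 21.0, 21.4, 21.2]

# 1) Handle missing data: drop None entries.
present = [v for v in raw if v is not None]

# 2) Identify outliers with the IQR rule: keep values inside
#    [Q1 - 1.5*IQR, Q3 + 1.5*IQR].
q1, _, q3 = statistics.quantiles(present, n=4)
iqr = q3 - q1
lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
cleaned = [v for v in present if lo <= v <= hi]

print(len(present), len(cleaned))
```

Here the erroneous spike of 95.0 falls outside the IQR fence and is removed; in the notebook, each such decision should be accompanied by the explanation the task asks for.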
5. Build a Machine Learning Model
o Using the cleaned data, develop a basic machine learning model to
predict one of the sensor measurements (e.g., PM2.5, CO2, temperature,
or humidity).
o Clearly explain the choice of model (e.g., linear regression, decision tree,
etc.).
o Visualize the model’s performance using appropriate metrics (e.g., RMSE
and R² for regression, or accuracy and a confusion matrix if you frame
the problem as classification).
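As one possible starting point, the sketch below fits a simple least-squares linear regression (one feature, closed form) and reports RMSE. The humidity/temperature values are synthetic placeholders; with your real data you would likely use a library such as scikit-learn instead of the hand-rolled formula.

```python
import math

# Synthetic illustration: predict temperature from humidity with
# one-feature least-squares regression, y = a + b*x. Values are made up.
humidity    = [40.0, 45.0, 50.0, 55.0, 60.0, 65.0]
temperature = [26.0, 25.0, 24.2, 23.1, 22.0, 21.1]

n = len(humidity)
mean_x = sum(humidity) / n
mean_y = sum(temperature) / n

# Closed-form slope and intercept for a single feature.
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(humidity, temperature)) \
    / sum((x - mean_x) ** 2 for x in humidity)
a = mean_y - b * mean_x

preds = [a + b * x for x in humidity]
rmse = math.sqrt(sum((p - y) ** 2 for p, y in zip(preds, temperature)) / n)
print(round(b, 3), round(rmse, 3))
```

The negative slope reflects the (synthetic) inverse humidity–temperature relationship; in the notebook, plot predictions against actual values and report RMSE/R² on a held-out split rather than on the training data.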
