User Manual For Backend

The document provides details about the backend functionality for cleaning and analyzing data uploaded to the Orefox application. It describes processes for defining data units, cleaning data by removing non-numeric values or columns, performing principal component analysis and correlation analysis, and running K-means clustering. It also notes that report generation functionality for cleaning and analysis is not yet sending data to the server correctly.


We began by integrating the front-end web application into the Orefox application. This was done by adding a drag-and-drop icon to the sidebar. In the dataset upload option, we added a file item that, when clicked, triggers a function that redirects the browser to the Geochem home page.

To upload the dataset, click "Upload File" and then select "Merge." The webpage then returns to the drag-and-drop view.

Now that the data has been uploaded, we can proceed with the cleaning process.

Cleaning Tab:

Define units:

When we drag an item from the sidebar onto the main page and click on it, a pop-up appears prompting us to input some data.

Clicking the Submit button in the pop-up triggers a JavaScript function in script.js that sends the request to the server.

For input entries, the model is referenced from the HTML. When Submit is clicked, the value of the selection field is combined with the other inputs and sent to the server in an AJAX request. The server response is then handled as either a success or an error.
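The payload assembly described above can be sketched as a small helper. This is a minimal illustration, not the actual script.js code: the field names (`unit`, `column`, `project_id`) and the endpoint in the comment are assumptions.

```javascript
// Hypothetical sketch of the define-units submit handler's payload assembly.
// Field names and the endpoint URL below are illustrative assumptions.
function buildDefineUnitsPayload(selectedUnit, selectedColumn, projectId) {
  if (!selectedUnit || !selectedColumn) {
    // Guard against submitting the pop-up with an empty selection.
    throw new Error("Both a unit and a column must be selected");
  }
  return {
    action: "define_units",
    unit: selectedUnit,
    column: selectedColumn,
    project_id: projectId,
  };
}

// In the real handler this object would then be sent via AJAX, e.g.:
// $.ajax({ url: "/geochem/define_units/", method: "POST", data: payload })
//   .done(showSuccess).fail(showError);
```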

The cleaning operations, from removing non-numeric values to removing entire columns, all follow largely the same pattern.

To call the model of any object that we drag onto the main area, we click on it and refer to its HTML file (drag_drop.html). This file includes a table of elements, i.e. the dataset's columns.

To accomplish this, we iterate over the columns held in memory, inserting them into the table along with a parent checkbox. When the parent checkbox is checked, a JavaScript function in script.js checks all of its child checkboxes. When the Submit button is clicked, a JavaScript function first counts the checked child checkboxes and then converts them from an array into an object.

Next, we extract values from the project's metadata and send all of the data to the server using AJAX. When the response arrives, a success or error message is displayed.
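The checkbox flow above can be sketched with plain data instead of the DOM. This is an illustrative sketch only; the shape of each checkbox record (`column`, `checked`) is an assumption.

```javascript
// Sketch of the column-cleaning submit logic. Checkbox state is modelled as
// plain objects so the transformation can be shown without a browser DOM.
function toggleChildren(parentChecked, childCheckboxes) {
  // Checking (or unchecking) the parent applies the same state to every child.
  return childCheckboxes.map((box) => ({ ...box, checked: parentChecked }));
}

function checkedColumnsToObject(childCheckboxes) {
  // Count the checked children, then convert the array into an object keyed
  // by index, mirroring the array-to-object step described above.
  const checked = childCheckboxes.filter((box) => box.checked);
  const asObject = {};
  checked.forEach((box, i) => {
    asObject[i] = box.column;
  });
  return { count: checked.length, columns: asObject };
}
```

The resulting object would then be merged with the project metadata and posted via AJAX as described above.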
Analysis Tab:

The Analysis tab provides simple statistical information and a summary of the float columns in the dataset.

Currently, the requests for two functions in this tab, simple data and summarise field data, are not reaching the server: attaching the request to the server returned an AJAX error. As a workaround, I added some values in the response handling for downloading the report.

Principal Component Analysis:

This involves calling the data-model-view, which displays an empty table in the HTML view. We dynamically insert the columns into it through JavaScript. After the Submit button is clicked, the data is processed by a function in the script. To send values from JavaScript to the server, the necessary data is prepared along with additional information such as the report and project details. Once the server receives this data, it responds with a success or error message. If successful, the server also returns values for downloading a report in the response.
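The two client-side steps above, filling the empty table and preparing the request, can be sketched as follows. The row markup and payload fields are illustrative assumptions, not the actual drag_drop.html or script.js code.

```javascript
// Sketch of filling the empty PCA table client-side. The real view renders
// drag_drop.html; the <tr> markup here is an illustrative assumption.
function renderColumnRows(columns) {
  return columns
    .map(
      (name) =>
        `<tr><td><input type="checkbox" value="${name}"></td><td>${name}</td></tr>`
    )
    .join("\n");
}

// Sketch of combining the chosen columns with report and project details
// before the AJAX call. Field names are assumptions.
function buildPcaPayload(selectedColumns, project, report) {
  return { action: "pca", columns: selectedColumns, project: project, report: report };
}
```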

Correlation coefficients:

This function utilises the data-model-view approach. In the HTML view, an empty table is displayed. We dynamically insert the columns into it using JavaScript and create a dropdown menu of the columns for which to calculate the correlation coefficient. After the Submit button is clicked, a function in the script is triggered. To send data to the server, JavaScript prepares the necessary values along with additional information, such as the report and project details. Once the data is sent, the server responds with either a success or an error message. If successful, the server also returns values for downloading a report in the response.
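The dropdown built for the column choice can be sketched as a small string-building helper. The `<select>` id and option markup are assumptions about what the real script produces.

```javascript
// Sketch of the correlation-coefficient column dropdown. The element id
// "corr-column" is an illustrative assumption.
function buildColumnDropdown(columns) {
  const options = columns
    .map((name) => `<option value="${name}">${name}</option>`)
    .join("");
  return `<select id="corr-column">${options}</select>`;
}
```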

K-means clustering:

Clicking on the K-means HTML element triggers a pop-up. This pop-up displays two checkboxes and three input fields for entering the parameters needed for the algorithm to run correctly.

When the user clicks "submit," a JavaScript function is triggered that reads all the input fields and
creates an object to be sent to the server.
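The read-and-validate step above can be sketched as follows. The parameter names (`k`, `maxIter`, `seed`) and the meaning of the two checkboxes are assumptions based on the "two checkboxes and three input fields" described above, not the actual form.

```javascript
// Sketch of the K-means submit handler: read the raw field values, validate
// them, and build the request object. All field names are assumptions.
function buildKMeansRequest(fields) {
  const k = parseInt(fields.k, 10);
  const maxIter = parseInt(fields.maxIter, 10);
  if (!Number.isInteger(k) || k < 1) {
    throw new Error("Number of clusters must be a positive integer");
  }
  if (!Number.isInteger(maxIter) || maxIter < 1) {
    throw new Error("Max iterations must be a positive integer");
  }
  return {
    action: "kmeans",
    k: k,
    max_iterations: maxIter,
    seed: fields.seed === "" ? null : Number(fields.seed),
    // The two checkboxes are assumed here to control standardisation and
    // cluster plotting; the real options may differ.
    standardise: Boolean(fields.standardise),
    plot_clusters: Boolean(fields.plotClusters),
  };
}
```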

Note: additional details still need to be added here, including the steps that follow K-means and the implementation of the cleaning-report and analysis-report functionality, which is not yet sending data to the server correctly.
