
A COMPETENCY LEARNING AND STUDENT CENTRIC PREDICTIVE MODEL FOR EVALUATING STUDENT PERFORMANCE USING ENSEMBLE LEARNING

Dr. Saravana Chandrasekaran
Asst. Professor, M.E., Ph.D., SRMIST
INTRODUCTION
● Purpose of Prediction in Education: Helps educators forecast student performance
accurately to better allocate resources and improve outcomes.
● Data-Driven Approaches: Modern educational institutions are increasingly adopting
data-driven strategies to enhance student success.
● Complex Learning Environments: Predictive models adapt to the evolving
complexities of today’s learning settings.
● Focus on GPA Prediction: Hybrid models play a critical role in accurately predicting
student GPAs.
● Student-Teacher Relationship Insights: Models assess multiple factors impacting
student-teacher relationships to enable proactive interventions.
CONT….
● Comprehensive Approach: Considers personal, academic, and physical factors
affecting student performance.
● Educational Data Mining (EDM): EDM techniques improve the accuracy of grade
predictions across courses.
● Customized Educational Experiences: Hybrid regression models offer more
precise, tailored learning paths.
● Machine Learning and Cognitive Insights: Provides teachers with information
on students' cognitive abilities and learning techniques.
● Smart Campus Frameworks: Emphasis on inclusive data processing and
prediction models for smart campus settings to support online student success.
SCOPE
● Develop Predictive Models: Focus on creating models to forecast student
performance accurately.
● Adapt to Complex Learning Environments: Adjust models to dynamic educational
settings and diverse learning needs.
● Incorporate Hybrid Models: Use a mix of regression models and machine learning
techniques for accurate predictions.
● Educational Data Mining (EDM): Leverage EDM methods to enhance the accuracy of
predictions in course grades and GPA.
● Focus on Student-Teacher Dynamics: Analyze factors impacting student-teacher
interactions for proactive intervention strategies.
● Smart Campus Application: Apply models to smart campus environments for real-
time, data-driven insights.
MOTIVATION
● Improved Resource Allocation: Allow schools to distribute resources based on
predicted student performance.
● Enhance Student Outcomes: Proactively address areas where students need support
to improve overall performance.
● Support Customized Learning: Enable personalized learning plans based on
predictive insights about students' strengths and weaknesses.
● Facilitate Proactive Interventions: Identify at-risk students early for timely
interventions.
● Data-Driven Educational Strategies: Promote evidence-based decision-making in
educational planning and policy.
● Advancement of Predictive Analytics in Education: Contribute to the growing field of
predictive analytics and EDM in academia.
METHODOLOGY
Predictive Ensemble Module & Feature Selection
1. Greedy Feature Selection Control: Terminates after k performance declines, reverting to the best-
performing feature set.
2. Tree-Based Ensemble Model: Constructs an ensemble using various reliable classification and clustering
models.
3. Message Propagation over Bipartite Model: Aggregates results to enhance predictive accuracy and
stability.
4. Hybrid Approach with Supervised & Unsupervised Models: Uses clustering for informed grouping and
prediction, supporting ensemble diversity.
5. Optimized Parameters: Grid search ensures the best parameter settings for each model, enhancing
reliability.
6. Handling Positive Traits: Reduces feature loss impact to maintain traits that positively affect model
accuracy.
7. Consistency of Outcomes: Aims for uniformity by combining outputs of diverse models, enhancing
prediction reliability.
8. Mitigating Individual Algorithm Bias: Ensemble approach minimizes bias from any single model,
increasing robustness.
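The greedy feature-selection control in step 1 can be sketched as follows. This is a minimal illustration, not the authors' exact procedure: the scoring function `score_fn`, the candidate feature list, and the decline budget `k` are all assumptions for the example.

```python
def greedy_feature_selection(features, score_fn, k=3):
    """Greedily add one feature at a time; terminate after k consecutive
    rounds without improvement and revert to the best-performing set."""
    selected = []
    remaining = list(features)
    best_set, best_score = [], float("-inf")
    declines = 0
    while remaining and declines < k:
        # Add the candidate feature whose inclusion scores highest.
        cand = max(remaining, key=lambda f: score_fn(selected + [f]))
        selected.append(cand)
        remaining.remove(cand)
        score = score_fn(selected)
        if score > best_score:
            best_score, best_set = score, list(selected)
            declines = 0          # improvement resets the decline counter
        else:
            declines += 1         # k declines in a row stops the search
    return best_set, best_score
```

Reverting to `best_set` rather than the last-tried set is what limits the impact of feature loss noted in step 6.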
CONT…
Machine Learning Pipeline & Data Preprocessing

1. Pipeline Structure: Data flows from collection to preprocessing, through multiple models, then to a final
predictive output.
2. Data Collection & Preparation: Initial step gathers relevant data, setting a foundation for predictive
analysis.
3. Data Cleaning: Missing values detected with .isna() and filled with .fillna(); duplicate rows flagged
with .duplicated() and dropped.
4. Label Encoding for Categorical Data: Encodes non-numerical features using LabelEncoder() for model
compatibility.
5. Scaling Numerical Features: Uses MinMaxScaler or StandardScaler to prevent feature dominance due
to scale differences.
6. Train-Test Data Split: Ensures model validation by splitting data into training and testing sets.
7. Feature Selection: Filters out irrelevant variables, improving model focus on significant data features.
8. Robust Data Foundation: Ensures data consistency and quality for machine learning models.
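Steps 3–6 above can be sketched with pandas and scikit-learn. The column names, the per-column-mode fill strategy, and the 75/25 split are illustrative assumptions; for brevity the scaler is fit on all rows, whereas in practice it should be fit on the training split only.

```python
import pandas as pd
from sklearn.preprocessing import LabelEncoder, MinMaxScaler
from sklearn.model_selection import train_test_split

def preprocess(df, target):
    df = df.drop_duplicates().copy()    # rows flagged by .duplicated() are dropped
    df = df.fillna(df.mode().iloc[0])   # gaps reported by .isna() filled with column modes
    for col in df.select_dtypes(include="object").columns:
        df[col] = LabelEncoder().fit_transform(df[col])  # categorical -> integer codes
    X = df.drop(columns=[target])
    y = df[target]
    X[X.columns] = MinMaxScaler().fit_transform(X)  # rescale features to [0, 1]
    return train_test_split(X, y, test_size=0.25, random_state=42)
```

Scaling before splitting keeps the example short but leaks test-set statistics into training; a production pipeline would wrap the scaler and encoder in an `sklearn.pipeline.Pipeline` fit on the training data.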
CONT….
Model Training and Evaluation

1. Model Ensemble Selection: Utilizes Random Forest, LSTM, Extra Trees, Gradient Boosting, and AdaBoost for
robust predictions.
2. Performance Monitoring: Tracks training and validation accuracy via learning curves to detect overfitting or
underfitting.
3. Evaluation Metrics: Accuracy, ROC AUC, learning curves, and confusion matrix used to assess each model’s
performance.
4. High-Accuracy Models: Random Forest, Extra Trees, and Gradient Boosting achieved 90% accuracy, showing
strong predictive power.
5. LSTM Model Success: Achieved 96.5% accuracy, highlighting its capability to handle complex, sequential data.
6. Overfitting Control: Regularization and tuning applied to minimize overfitting risks, maintaining model
generalization.
7. Learning Curve Analysis: Demonstrates effective model learning through consistent rise in training and validation
accuracy.
8. Ensemble’s Predictive Advantage: Combines models to leverage strengths of each, providing a balanced,
effective prediction system.
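The training and evaluation loop for the tree-based ensemble members (steps 1 and 3) can be sketched with scikit-learn. The synthetic dataset and hyperparameters below are placeholders standing in for the student-performance features; the LSTM member would require a separate deep-learning stack and is omitted here.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, ExtraTreesClassifier,
                              GradientBoostingClassifier, RandomForestClassifier)
from sklearn.metrics import accuracy_score, confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split

# Placeholder data standing in for the student-performance features.
X, y = make_classification(n_samples=600, n_features=12, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "RandomForest": RandomForestClassifier(n_estimators=100, random_state=0),
    "ExtraTrees": ExtraTreesClassifier(n_estimators=100, random_state=0),
    "GradientBoosting": GradientBoostingClassifier(random_state=0),
    "AdaBoost": AdaBoostClassifier(random_state=0),
}
results = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    results[name] = {
        "accuracy": accuracy_score(y_te, pred),
        "roc_auc": roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]),
        "confusion": confusion_matrix(y_te, pred),
    }
```

Learning curves for the overfitting check in step 2 can be produced with `sklearn.model_selection.learning_curve` on the same estimators.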
RESULT
1. Predicts educational outcomes by analyzing student performance using diverse datasets.
2. Datasets include:
- Student Information: Enrollment, module details, presentation info, student IDs, past attempts, and
outcomes.
- Assessments: Assessment IDs, types, dates, weights, and individual student scores.
- Virtual Learning Environment (VLE): Module codes, activity types, interaction timelines, and start weeks.
- Student VLE Interactions: Virtual classroom interaction data, site IDs, student IDs, and click counts.
3. Ensemble models like Random Forest, AdaBoost, Extra Trees, and Gradient Boosting showed high
accuracy and robustness.
4. LSTM Neural Network excelled in predicting long-term student performance due to its ability to capture
sequential data patterns.
CONT..
5. Integrates deep learning with ensemble methods to improve accuracy and interpret variables influencing
student outcomes.
6. Provides data-driven insights for targeted interventions to support academic progress.
7. Metrics include precision, recall, and F1 score; the LSTM achieved the highest accuracy, precision,
recall, and F1 score of 0.80.
8. The LSTM Neural Network outperformed other models in classification accuracy, making it ideal for
long-term academic forecasting.
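The classification metrics in item 7 can be computed with scikit-learn. The label vectors below are invented purely to show the calls; they are not results from the study.

```python
from sklearn.metrics import precision_score, recall_score, f1_score

# Hypothetical pass/fail labels, not the study's data.
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 0, 0, 1, 1, 1]

precision = precision_score(y_true, y_pred)  # TP / (TP + FP)
recall = recall_score(y_true, y_pred)        # TP / (TP + FN)
f1 = f1_score(y_true, y_pred)                # harmonic mean of precision and recall
```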
CONCLUSION
1. Research confirms deep learning and ensemble models are effective for predicting
student performance.

2. These models help institutions improve student support and optimize resource
allocation.

3. Promotes data-driven decision-making in educational settings.

4. Real-time prediction of student performance addresses a critical challenge in education.

5. Early analysis helps identify students' strengths and weaknesses to improve exam
outcomes.
CONT…
6. Proposed an ensemble machine learning approach for academic performance prediction.

7. Approach includes two modules: ensemble prediction and ensemble feature engineering.

8. Trials show superior performance compared to single machine learning techniques.

9. System enhances forecast accuracy significantly.

10. Future research will focus on applying this method to diverse large-scale datasets.

11. Further work will also assess fairness across protected characteristics like race and gender.

12. Investigation will explore the relationship between different concepts of fairness.
