Competency Learning and Student Centric
1. Pipeline Structure: Data flows from collection to preprocessing, through multiple models, then to a final
predictive output.
2. Data Collection & Preparation: Initial step gathers relevant data, setting a foundation for predictive
analysis.
3. Data Cleaning: Missing values inspected with .isna() and filled with .fillna(); duplicate rows flagged with
.duplicated() and removed with .drop_duplicates().
4. Label Encoding for Categorical Data: Encodes non-numerical features using LabelEncoder() for model
compatibility.
5. Scaling Numerical Features: Uses MinMaxScaler or StandardScaler to prevent feature dominance due
to scale differences.
6. Train-Test Data Split: Ensures model validation by splitting data into training and testing sets.
7. Feature Selection: Filters out irrelevant variables, improving model focus on significant data features.
8. Robust Data Foundation: Ensures data consistency and quality for machine learning models (a minimal preprocessing sketch follows this list).
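A minimal Python sketch of steps 3–7, assuming a pandas/scikit-learn workflow; the file name, the target column name ("final_result"), the imputation strategy, and k=10 are illustrative assumptions rather than details taken from the original pipeline.

```python
# Minimal preprocessing sketch (pandas / scikit-learn); names and choices below
# are illustrative assumptions, not the original pipeline's exact settings.
import pandas as pd
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder, MinMaxScaler

df = pd.read_csv("student_data.csv")        # hypothetical input file
print(df.isna().sum())                      # inspect missing values per column

df = df.fillna(df.mode().iloc[0])           # simple most-frequent-value imputation
df = df.drop_duplicates()                   # .duplicated() flags the rows dropped here

# Label-encode categorical (object-typed) columns for model compatibility
for col in df.select_dtypes(include="object").columns:
    df[col] = LabelEncoder().fit_transform(df[col])

# Separate features and target, then scale features to [0, 1]
X = df.drop(columns=["final_result"])       # "final_result" is an assumed target column
y = df["final_result"]
X_scaled = MinMaxScaler().fit_transform(X)

# Simple univariate feature selection (k is an illustrative choice)
X_selected = SelectKBest(f_classif, k=10).fit_transform(X_scaled, y)

# Hold out a test set for validation
X_train, X_test, y_train, y_test = train_test_split(
    X_selected, y, test_size=0.2, random_state=42, stratify=y
)
```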
CONT…
Model Training and Evaluation
1. Model Ensemble Selection: Utilizes Random Forest, LSTM, Extra Trees, Gradient Boosting, and AdaBoost for
robust predictions.
2. Performance Monitoring: Tracks training and validation accuracy via learning curves to detect overfitting or
underfitting (a learning-curve sketch follows this list).
3. Evaluation Metrics: Accuracy, ROC AUC, learning curves, and confusion matrix used to assess each model’s
performance (see the scikit-learn sketch after this list).
4. High-Accuracy Models: Random Forest, Extra Trees, and Gradient Boosting achieved 90% accuracy, showing
strong predictive power.
5. LSTM Model Success: Achieved 96.5% accuracy, highlighting its capability to handle complex, sequential data.
6. Overfitting Control: Regularization and tuning applied to minimize overfitting risks, maintaining model
generalization.
7. Learning Curve Analysis: Demonstrates effective model learning through consistent rise in training and validation
accuracy.
8. Ensemble’s Predictive Advantage: Combines models to leverage strengths of each, providing a balanced,
effective prediction system.
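A hedged scikit-learn sketch of how the tree-based ensemble members named above could be trained and scored with the listed metrics; the hyperparameters and the soft-voting combination are illustrative assumptions (the slides do not state how the individual models are combined), and X_train / X_test / y_train / y_test come from the preprocessing sketch.

```python
# Hedged ensemble training and evaluation sketch (scikit-learn).
from sklearn.ensemble import (
    AdaBoostClassifier, ExtraTreesClassifier,
    GradientBoostingClassifier, RandomForestClassifier, VotingClassifier,
)
from sklearn.metrics import accuracy_score, confusion_matrix, roc_auc_score

models = {
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "extra_trees": ExtraTreesClassifier(n_estimators=200, random_state=42),
    "gradient_boosting": GradientBoostingClassifier(random_state=42),
    "adaboost": AdaBoostClassifier(random_state=42),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    proba = model.predict_proba(X_test)
    # multi_class="ovr" assumes a multi-class target; for a binary target
    # pass proba[:, 1] instead of the full probability matrix.
    print(name,
          "accuracy:", accuracy_score(y_test, pred),
          "ROC AUC:", roc_auc_score(y_test, proba, multi_class="ovr"))
    print(confusion_matrix(y_test, pred))

# One possible combination: soft voting over predicted class probabilities
ensemble = VotingClassifier(estimators=list(models.items()), voting="soft")
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", ensemble.score(X_test, y_test))
```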
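A sketch of the learning-curve check used to watch for overfitting or underfitting, again reusing X_train / y_train from the preprocessing sketch; the 5-fold cross-validation setup and train-size grid are assumptions.

```python
# Learning-curve check for over/underfitting (scikit-learn).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import learning_curve

sizes, train_scores, val_scores = learning_curve(
    RandomForestClassifier(n_estimators=200, random_state=42),
    X_train, y_train,
    cv=5, scoring="accuracy",
    train_sizes=np.linspace(0.1, 1.0, 5),
)
for n, tr, va in zip(sizes, train_scores.mean(axis=1), val_scores.mean(axis=1)):
    # A large, persistent gap between tr and va suggests overfitting;
    # both scores staying low suggests underfitting.
    print(f"train_size={n}: train_acc={tr:.3f}, val_acc={va:.3f}")
```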
RESULTS
1. Predicts educational outcomes by analyzing student performance using diverse datasets.
2. Datasets include:
- Student Information: Enrollment, module details, presentation info, student IDs, past attempts, and
outcomes.
- Assessments: Assessment IDs, types, dates, weights, and individual student scores.
- Virtual Learning Environment (VLE): Module codes, activity types, interaction timelines, and start weeks.
- Student VLE Interactions: Virtual classroom interaction data, site IDs, student IDs, and click counts.
3. Ensemble models like Random Forest, AdaBoost, Extra Trees, and Gradient Boosting showed high
accuracy and robustness.
4. LSTM Neural Network excelled in predicting long-term student performance due to its ability to capture
sequential data patterns (a hedged Keras sketch follows this list).
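A hedged Keras sketch of an LSTM classifier over per-student weekly interaction sequences; the sequence construction, synthetic shapes, layer sizes, and training settings are illustrative assumptions, not the architecture reported on these slides.

```python
# Hedged LSTM sketch (Keras) over per-student weekly interaction sequences.
import numpy as np
from tensorflow.keras import layers, models

n_students, n_weeks, n_features, n_classes = 1000, 39, 4, 4   # synthetic shapes

# X_seq: weekly VLE interaction features per student; y_seq: final outcome label
X_seq = np.random.rand(n_students, n_weeks, n_features).astype("float32")
y_seq = np.random.randint(0, n_classes, size=n_students)

model = models.Sequential([
    layers.Input(shape=(n_weeks, n_features)),
    layers.LSTM(64),
    layers.Dropout(0.3),                      # regularization against overfitting
    layers.Dense(32, activation="relu"),
    layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X_seq, y_seq, validation_split=0.2, epochs=20, batch_size=32)
```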
CONT…
5. Integrates deep learning with ensemble methods to improve accuracy and
interpret variables influencing student outcomes.
6. These models help institutions improve student support and optimize resource
allocation.
7. Early analysis helps identify students' strengths and weaknesses to improve exam
outcomes.
CONT…
8. Proposed an ensemble machine learning approach for academic performance prediction.
9. Approach includes two modules: ensemble prediction and ensemble feature engineering.
10. Future research will focus on applying this method to diverse large-scale datasets.
11. Further work will also assess fairness across protected characteristics like race and gender.
12. Investigation will explore the relationship between different concepts of fairness.