Final Project 555

Uploaded by Aftab Khan

1. Personalized AI Tutoring System for Education:

Related Projects:

• AI-Based Intelligent Tutoring Systems (ITS): Several projects have explored using AI to create
intelligent tutors that can provide personalized feedback and assistance to students based on
their learning styles. A well-known example is Knewton, which customizes lessons for
individual students using data analytics and machine learning. The system adapts in real-time
based on student performance.
• AutoTutor: This system uses NLP and cognitive psychology models to simulate human tutors. It
adapts its teaching strategies and feedback based on the learner's responses.
• Coursera Machine Learning Projects: Some courses on platforms like Coursera have projects
where students build AI models to predict and improve student performance in online
education, using data on student behavior and quiz scores.

Extensions for Final Year Project:

• Expand on the above by adding features like speech recognition to understand spoken
questions from students, or emotion detection to recognize when students are frustrated or
confused.
• Integration with a real-time recommendation engine for study materials based on students’
weaknesses and learning preferences.
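The recommendation-engine extension can be sketched in a few lines: rank topics by error rate and surface materials for the weakest ones. The topic names, accuracy scores, and MATERIALS table below are illustrative assumptions, not part of any existing system:

```python
# Minimal sketch of a study-material recommender keyed on quiz weaknesses.
# Accuracy is the fraction of recent answers the student got right per topic.

MATERIALS = {
    "algebra":  ["Linear equations video", "Algebra drills"],
    "geometry": ["Angles primer", "Proof walkthrough"],
    "calculus": ["Limits notes", "Derivative practice set"],
}

def recommend(topic_accuracy, k=2):
    """Return materials for the k topics with the lowest accuracy."""
    weakest = sorted(topic_accuracy, key=lambda t: topic_accuracy[t])[:k]
    return [m for t in weakest for m in MATERIALS.get(t, [])]

# A student strong in algebra but weak in geometry and calculus:
stats = {"algebra": 0.9, "geometry": 0.4, "calculus": 0.55}
print(recommend(stats))
```

A production version would weight this by learning preferences and recency, but the ranking-by-weakness core stays the same.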

Usable Technologies:

• Programming Language: Python for back-end and AI models
• NLP Models: Hugging Face's transformers, OpenAI GPT models, or Google’s BERT
• Speech Recognition: Google Speech-to-Text or IBM Watson
• Backend Framework: Django or Flask
• Database: MongoDB or PostgreSQL (to store student data and responses)
• Front-end Technologies: React.js, HTML, CSS
• ML Library: TensorFlow or PyTorch for implementing adaptive learning algorithms
• Deployment: AWS or Google Cloud for hosting the model and the web app
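To illustrate how these pieces fit together, here is a minimal, stdlib-only sketch of the request flow: the front-end posts a question, the back-end calls an NLP model, the exchange is stored, and feedback is returned. The nlp_answer stub is a placeholder standing in for a real Hugging Face pipeline or GPT API call, and the payload shape is an assumption:

```python
# Request-flow sketch: front-end -> back-end -> NLP model -> storage -> response.
import json

def nlp_answer(question: str) -> str:
    # Stub standing in for a transformers pipeline or GPT API call.
    return f"Let's work through: {question}"

def handle_request(raw_body: str, db: list) -> str:
    """Parse a JSON request, answer it, log the exchange, return JSON."""
    payload = json.loads(raw_body)
    answer = nlp_answer(payload["question"])
    db.append({"student": payload["student_id"],
               "q": payload["question"], "a": answer})
    return json.dumps({"answer": answer})

db = []
resp = handle_request('{"student_id": 1, "question": "What is a derivative?"}', db)
print(resp)
```

In the real stack, handle_request becomes a Django/Flask view and db becomes MongoDB or PostgreSQL, but the contract between the layers is the same.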

Challenges:

• Long training times for models that generate personalized feedback.
• Ensuring low-latency, real-time interaction with students.

2. AI-Powered Drone Navigation for Disaster Management:

Related Projects:

• Skydio Autonomous Drones: Skydio uses AI-based obstacle detection and navigation for their
drones. Their technology is used in search and rescue missions where the drone autonomously
navigates difficult environments.
• Drone for Search and Rescue Using Deep Learning: A project that uses drones equipped with
deep learning models to detect human bodies in disaster areas through thermal or infrared
cameras.
• Multi-Agent System for Disaster Relief: Research at universities has developed multi-agent
drones that can work together to map out disaster-stricken areas in real-time, coordinating
between themselves to optimize search-and-rescue efforts.

Extensions for Final Year Project:

• Use reinforcement learning to improve drone decision-making during missions, enabling them
to handle unpredictable environments better.
• Add a human detection model based on YOLOv5 or other real-time object detection
algorithms, and integrate with cloud services for live mapping and rescue operation updates.
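Whatever detector is used, the real-time pipeline ultimately reduces to filtering and de-duplicating candidate boxes. The sketch below shows that post-processing step in pure Python; in the actual project the (x1, y1, x2, y2, confidence) tuples would come from a YOLOv5 forward pass, and the thresholds are illustrative:

```python
# Detection post-processing sketch: keep confident boxes, suppress
# near-duplicates by intersection-over-union (IoU).

def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def filter_detections(boxes, conf_thresh=0.5, iou_thresh=0.5):
    """Greedy non-max suppression over (x1, y1, x2, y2, conf) tuples."""
    kept = []
    for box in sorted(boxes, key=lambda b: -b[4]):
        if box[4] < conf_thresh:
            continue
        if all(iou(box[:4], k[:4]) < iou_thresh for k in kept):
            kept.append(box)
    return kept

raw = [(10, 10, 50, 50, 0.9), (12, 12, 52, 52, 0.8), (100, 100, 140, 140, 0.3)]
print(filter_detections(raw))
```

YOLOv5 performs this suppression internally, but implementing it once by hand makes the model's outputs much easier to debug in the field.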

Usable Technologies:

• Drone Hardware: DJI Tello or Parrot drones (easy to use and integrate with Python)
• Reinforcement Learning: TensorFlow or PyTorch for drone navigation models
• Computer Vision: OpenCV for image processing, YOLOv5 for real-time object (human)
detection
• Simulation Tools: Gazebo or AirSim (for testing drone algorithms before hardware
implementation)
• Cloud Services: AWS for live data processing and mapping (if needed)
• Programming Language: Python for AI/ML models and navigation control
• Drone SDK: DroneKit or DJI SDK for drone programming and integration
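A hedged sketch of the mission control loop such an SDK enables. The Vehicle class here is a stub so the logic runs without hardware; with DroneKit you would instead use dronekit.connect() and the real vehicle object, and the command names are illustrative:

```python
# Mission-loop skeleton: arm, take off, visit waypoints, land.
# Vehicle is a stand-in stub for a DroneKit vehicle handle.

class Vehicle:
    def __init__(self):
        self.log = []          # records issued commands for inspection

    def command(self, name):
        self.log.append(name)

def run_mission(vehicle, waypoints):
    """Drive the vehicle through the standard mission phases."""
    vehicle.command("arm")
    vehicle.command("takeoff")
    for wp in waypoints:
        vehicle.command(f"goto {wp}")
    vehicle.command("land")
    return vehicle.log

v = Vehicle()
print(run_mission(v, [(31.5, 74.3), (31.6, 74.4)]))
```

Keeping the mission logic behind a small vehicle interface like this also makes it easy to swap the stub for AirSim during simulation and for real hardware later.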
Challenges:

• Training RL models to handle complex terrains and obstacles.
• Maintaining real-time object detection accuracy across varying environmental conditions.

Final Remarks:

• Tutoring System (Project 1) is the more manageable project in terms of real-time AI processing
and integration.
• Drone Navigation (Project 2) involves hardware, which adds time for integration and physical
testing, making it the longer project.

Project 1: Personalized AI Tutoring System (6 months total)

Learning Phase (10 weeks)

1. Week 1-2: Python Basics & Introduction to Machine Learning
a. Goal: Master Python and basic machine learning principles.
b. Technology: Python, NumPy, Pandas
c. Tasks:
i. Team 1 (2 members): Focus on Python and basic ML.
ii. Team 2 (3 members): Web development fundamentals.
d. Resources: Codecademy (Python), Coursera ML course.
2. Week 3-4: Web Development (Frontend & Backend)
a. Goal: Learn front-end and back-end technologies to build the tutoring
platform.
b. Technology: HTML, CSS, JavaScript, Django/Flask.
c. Tasks:
i. Team 1: Focus on Django/Flask for server setup.
ii. Team 2: Focus on building the user interface with HTML, CSS, JavaScript.
d. Resources: freeCodeCamp (Frontend), Django/Flask tutorials.
3. Week 5-6: Machine Learning for Personalized Learning
a. Goal: Understand supervised learning and recommendation systems for AI
tutoring.
b. Technology: TensorFlow, Keras.
c. Tasks:
i. Team 1: Learn how to build models that adapt to student performance.
ii. Team 2: Focus on integrating ML models with the web interface.
d. Resources: Hands-On Machine Learning with Scikit-Learn and Keras.
4. Week 7-8: Natural Language Processing (NLP)
a. Goal: Learn NLP techniques to process student queries and provide feedback.
b. Technology: Hugging Face, SpaCy, OpenAI GPT.
c. Tasks:
i. Team 1: Develop NLP models for understanding student input.
ii. Team 2: Work on integrating NLP models into the web platform.
d. Resources: Hugging Face tutorials.

5. Week 9-10: Full-Stack Integration & Database
a. Goal: Learn to integrate front-end, back-end, and machine learning models.
b. Technology: Django, MongoDB/PostgreSQL, API integration.
c. Tasks:
i. Team 1: Focus on back-end integration and database setup.
ii. Team 2: Ensure the front-end communicates with the back-end.
d. Resources: Full-Stack Django tutorial.
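As a concrete taste of the Week 5-6 material, adaptation to student performance can start as simply as an exponential moving average of quiz results that drives question difficulty. In the full project this policy would be replaced by a trained Keras model; the alpha value and thresholds below are illustrative assumptions:

```python
# Adaptive-difficulty sketch: track per-topic mastery with an exponential
# moving average (EMA) of correct/incorrect answers, then choose difficulty.

def update_mastery(mastery, topic, correct, alpha=0.3):
    """EMA update: new = (1 - alpha) * old + alpha * outcome."""
    prev = mastery.get(topic, 0.5)          # unknown topics start at 0.5
    mastery[topic] = (1 - alpha) * prev + alpha * (1.0 if correct else 0.0)
    return mastery[topic]

def next_difficulty(score):
    if score < 0.4:
        return "easy"
    if score < 0.75:
        return "medium"
    return "hard"

mastery = {}
for correct in [True, True, False, True]:
    s = update_mastery(mastery, "algebra", correct)
print(round(s, 3), next_difficulty(s))
```

The EMA gives recent answers more weight than old ones, which is exactly the behavior an adaptive tutor needs: a student who has improved is not held back by early mistakes.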

Development Phase (14 weeks)

1. Week 11-12: Initial Setup & Task Distribution
a. Goal: Set up project repositories, assign tasks, and outline project scope.
b. Tasks:
i. Divide team members: 2 on back-end, 2 on NLP/ML, 1 on front-end.
2. Week 13-15: Building Frontend and Backend
a. Goal: Create a functional front-end with basic back-end server connectivity.
b. Tasks:
i. Front-end team: Develop a UI for student interaction.
ii. Back-end team: Set up user authentication, data handling.
iii. ML/NLP team: Implement basic student interaction model.
3. Week 16-18: Implementing Core AI Features
a. Goal: Add personalized learning and feedback features.
b. Tasks:
i. Front-end team: Handle dynamic content based on AI feedback.
ii. Back-end team: Enable data storage for student progress.
iii. ML/NLP team: Train models to adapt to student behavior.
4. Week 19-21: Testing & Debugging
a. Goal: Conduct extensive testing of models and platform usability.
b. Tasks:
i. Front-end and back-end teams: Debug UI interactions and ensure data
flows smoothly.
ii. ML/NLP team: Fine-tune models and fix performance issues.
5. Week 22-23: Final Integration and Optimization
a. Goal: Finalize the project with full integration of all features.
b. Tasks: Ensure all teams' components work seamlessly together.
6. Week 24: Deployment and Presentation
a. Goal: Deploy the tutoring platform on a cloud service (e.g., AWS) and prepare
the final presentation.
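The "data storage for student progress" task from Weeks 16-18 can be prototyped with stdlib sqlite3 before committing to MongoDB or PostgreSQL; the schema below is a minimal assumption, not the project's final design:

```python
# Progress-storage sketch using an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE progress (student_id INT, topic TEXT, score REAL)")

def record(student_id, topic, score):
    """Append one quiz result for a student."""
    conn.execute("INSERT INTO progress VALUES (?, ?, ?)",
                 (student_id, topic, score))

def average(student_id, topic):
    """Mean score for one student on one topic (None if no rows)."""
    row = conn.execute(
        "SELECT AVG(score) FROM progress WHERE student_id = ? AND topic = ?",
        (student_id, topic)).fetchone()
    return row[0]

record(1, "algebra", 0.6)
record(1, "algebra", 0.8)
print(round(average(1, "algebra"), 2))
```

Because the queries are plain SQL, migrating this prototype to PostgreSQL later is mostly a connection-string change.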

Project 2: AI-Powered Drone Navigation for Disaster Management (6 months total)

Learning Phase (10 weeks)

1. Week 1-2: Python and Reinforcement Learning Basics
a. Goal: Learn Python programming and reinforcement learning (RL) concepts.
b. Technology: Python, TensorFlow, PyTorch.
c. Tasks:
i. All members learn basic Python and RL.
d. Resources: Coursera's "Reinforcement Learning Specialization."
2. Week 3-4: Drone Programming
a. Goal: Understand drone hardware and SDK integration.
b. Technology: DroneKit or DJI SDK.
c. Tasks:
i. 2 members focus on drone hardware and SDK.
d. Resources: DroneKit or DJI SDK tutorials.
3. Week 5-6: Computer Vision & Object Detection
a. Goal: Learn computer vision and real-time object detection.
b. Technology: OpenCV, YOLO.
c. Tasks:
i. 3 members focus on training models for human detection.
d. Resources: PyImageSearch tutorials (OpenCV), YOLOv5 guides.
4. Week 7-8: Reinforcement Learning for Drone Navigation
a. Goal: Master RL concepts for obstacle avoidance and autonomous navigation.
b. Technology: TensorFlow or PyTorch, RL libraries.
c. Tasks:
i. 2 members focus on RL model development.
d. Resources: AirSim (Drone Simulation).
5. Week 9-10: Integration of ML with Drone Hardware
a. Goal: Learn to integrate RL and object detection with drone control.
b. Technology: TensorFlow, DroneKit.
c. Tasks:
i. All members work on integrating models with drone hardware.
d. Resources: TensorFlow/DroneKit integration tutorials.
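The RL update rule from Weeks 7-8 can be tried at toy scale before moving to AirSim: the tabular Q-learning sketch below teaches an agent to move along a 1-D corridor toward a goal cell. All constants (learning rate, discount, epsilon, episode count) are illustrative:

```python
# Tabular Q-learning on a 1-D corridor: cells 0..4, goal at cell 4,
# actions are step left (-1) or right (+1). Deep RL replaces the table
# with a network, but the update rule is identical.
import random

random.seed(0)
N, GOAL = 5, 4
Q = {(s, a): 0.0 for s in range(N) for a in (-1, 1)}

def step(s, a):
    """Environment step: clamp to the corridor, reward reaching the goal."""
    s2 = min(max(s + a, 0), N - 1)
    return s2, (1.0 if s2 == GOAL else -0.1)

for _ in range(500):                     # training episodes
    s = 0
    while s != GOAL:
        if random.random() < 0.2:        # epsilon-greedy exploration
            a = random.choice((-1, 1))
        else:
            a = max((-1, 1), key=lambda x: Q[(s, x)])
        s2, r = step(s, a)
        # Q-learning update: Q += lr * (reward + gamma * max_a' Q(s') - Q)
        Q[(s, a)] += 0.5 * (r + 0.9 * max(Q[(s2, -1)], Q[(s2, 1)]) - Q[(s, a)])
        s = s2

policy = [max((-1, 1), key=lambda a: Q[(s, a)]) for s in range(N - 1)]
print(policy)                            # expected to converge to moving right
```

Scaling this from a corridor to a drone means replacing the state with sensor readings and the table with a TensorFlow/PyTorch network, which is what the AirSim weeks cover.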

Development Phase (14 weeks)

1. Week 11-12: Initial Setup & Task Distribution
a. Goal: Set up drone, repository, assign tasks.
b. Tasks:
i. Divide team members: 2 focus on RL and navigation, 2 on object
detection, 1 on drone control.
2. Week 13-15: Basic Drone Navigation Setup
a. Goal: Set up basic drone movement and obstacle avoidance using RL.
b. Tasks:
i. RL team: Develop initial navigation model.
ii. Drone team: Test basic flight maneuvers.
3. Week 16-18: Object Detection and Integration
a. Goal: Implement human detection and integrate with drone's navigation
system.
b. Tasks:
i. Object detection team: Train and test YOLO for real-time detection.
ii. RL team: Improve RL models for obstacle handling.
4. Week 19-21: Testing and Simulation
a. Goal: Run simulations in controlled environments.
b. Tasks:
i. Simulate disaster scenarios in Gazebo/AirSim.
ii. Test the human detection system in real-time.
5. Week 22-23: Final Integration and Testing
a. Goal: Integrate RL and object detection models into the drone system.
b. Tasks: Ensure seamless real-world operation of the drone.
6. Week 24: Live Demo and Project Completion
a. Goal: Conduct a live demo showcasing the drone's autonomous navigation
and human detection capabilities.

Team Member Roles:

• Member 1: AI/ML development (focus on tutoring system and drone navigation).
• Member 2: Front-end/UI developer (focus on tutoring system).
• Member 3: Back-end developer (focus on database, server-side for tutoring).
• Member 4: Drone control and navigation (focus on drone movement).
• Member 5: Computer vision and object detection (focus on both projects).

By following this schedule, the team can efficiently learn the required technologies and
develop two functional projects in about 6 months. Each team member focuses on a
specific area, making the project manageable even for those starting from scratch.
