- 🌱 I’m currently learning AI, machine learning, deep learning, and computer vision (AI/ML/DL-CV)
- 👯 I’m looking to collaborate on building frameworks or AI models
- 📫 How to reach me: [email protected].
PhD Research Collaboration • RAG • Digital Twin Viewer
“From blank canvas to data-driven clarity in one click.”
- Context: Partnered on a PhD-level research project to evaluate wearable exoskeletons for heavy-lift and repetitive industrial tasks.
- Backend (RAG):
• Built a Retrieval-Augmented Generation pipeline that ingests technical specs, runs an Analytic Hierarchy Process scoring engine, and auto-generates comparative risk reports.
  • Enabled stakeholders to compare multiple devices across criteria like exertion, stability, comfort, and cognitive load.
- Frontend (3D Digital Twin):
• Developed an interactive React component rendering human silhouettes with dynamic heat-map overlays—green where support is strongest, red where risk peaks.
  • Added real-time criteria toggles and “See More” drill-downs for joint-level analysis.
- Impact: Transformed a static questionnaire into a living decision-support tool, giving researchers and engineers instant visual feedback on exoskeleton performance.
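The AHP scoring engine mentioned above can be sketched roughly as follows. This is an illustrative outline, not the actual ExoSelect code: it derives criteria weights from a pairwise comparison matrix using the column-normalization approximation, then scores a device as a weighted sum. The criteria names and rating values are assumptions for the example.

```typescript
// Sketch of an AHP (Analytic Hierarchy Process) priority-weight calculation.
// Given an n x n pairwise comparison matrix for criteria (e.g. how much more
// important "exertion" is than "stability"), derive normalized weights.
function ahpWeights(pairwise: number[][]): number[] {
  const n = pairwise.length;
  // Sum each column of the comparison matrix.
  const colSums = pairwise[0].map((_, j) =>
    pairwise.reduce((sum, row) => sum + row[j], 0)
  );
  // Normalize each entry by its column sum, then average across each row.
  return pairwise.map(row =>
    row.reduce((sum, v, j) => sum + v / colSums[j], 0) / n
  );
}

// Score a device as the weighted sum of its per-criterion ratings.
function scoreDevice(ratings: number[], weights: number[]): number {
  return ratings.reduce((sum, r, i) => sum + r * weights[i], 0);
}

// Example with three hypothetical criteria: exertion, stability, comfort.
const pairwise = [
  [1, 3, 5],
  [1 / 3, 1, 3],
  [1 / 5, 1 / 3, 1],
];
const weights = ahpWeights(pairwise); // sums to 1; exertion weighted highest
console.log(scoreDevice([0.8, 0.6, 0.9], weights));
```

The column-normalization method is a common lightweight stand-in for the full eigenvector computation; a production engine would also check the consistency ratio of the comparison matrix.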
Tech Stack: NestJS • TypeORM • PostgreSQL • LangChain • OpenAI • React • Three.js • RAG • AHP
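The green-to-red heat-map overlay can be captured by a simple risk-to-color mapping like the sketch below. This is a hedged illustration, not the ExoSelect implementation: it assumes risk is normalized to [0, 1] and interpolates linearly from green (strong support) through yellow to red (peak risk).

```typescript
// Hypothetical risk-to-color mapping for a heat-map overlay.
// risk = 0 -> pure green (strongest support), risk = 1 -> pure red (peak risk).
function riskToColor(risk: number): string {
  const r = Math.min(Math.max(risk, 0), 1); // clamp to [0, 1]
  const red = Math.round(255 * Math.min(1, 2 * r));         // ramps up over the first half
  const green = Math.round(255 * Math.min(1, 2 * (1 - r))); // ramps down over the second half
  return `rgb(${red}, ${green}, 0)`;
}

console.log(riskToColor(0));   // "rgb(0, 255, 0)"  — green
console.log(riskToColor(0.5)); // "rgb(255, 255, 0)" — yellow
console.log(riskToColor(1));   // "rgb(255, 0, 0)"  — red
```

In a React/Three.js viewer, a function like this would feed per-region material colors on the human silhouette mesh as criteria toggles change.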
Proud moment: I’d never tackled RAG before this—and today, ExoSelect powers data-driven choices in cutting-edge human-augmentation research.





