Inspiration

XR promises immersion, but it quietly assumes two hands, fine motor control, visual-first interfaces, and static body postures. Players with upper-limb disabilities or temporary injuries, stroke survivors, amputees, and neurodivergent users fatigued by hand input are routinely excluded. XChannels is an accessibility-first interaction framework that expands how people control immersive experiences: instead of assuming hand-based input, it distributes interaction across multiple “channels” of the body, including the feet, the voice, and context-aware AI.

What it does

As a proof of concept, we built a multiplayer soccer experience where players use wearable foot controllers to move, kick, and pass, while AI-powered voice or text commands handle game actions. Players with arm or hand disabilities can play alongside non-disabled players in the same match, under the same rules. XChannels also enables cross-platform play, allowing VR and PC users to share one immersive space.

How we built it

We built XChannels as a full VR football game with AI voice control and hardware-integrated foot tracking, using Unity 3D. The core architecture consists of:

  1. VR Framework: Built on Unity with BNG VR Interaction Framework for hand tracking and VR controls
  2. Hardware Integration: Custom IMU sensor shoes (FoottrollerNet/FoottrollerCtrl) providing real foot-movement tracking, including right/left foot heading and tilt angles, touch-state detection (TSLF/TSRF) for kick detection, and UDP-based real-time communication between the hardware and the game
  3. Network Multiplayer: Implemented using Mirror networking framework for real-time multiplayer football gameplay
  4. AI Voice Assistant: Integrated Firebase AI (Gemini 2.5-flash) for natural language processing of voice commands
  5. Audio Pipeline: Custom audio recording system that captures WAV files and sends them to AI for analysis
  6. World System: Cesium for Unity integration allowing players to teleport to real-world locations
  7. Physics-Based Football: Combination of VR hand controls and IMU foot tracking for realistic football mechanics

The system combines physical foot movements detected by the IMU sensors with VR hand controls, voice commands processed by the AI, and multiplayer networking into a complete football simulation.
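To make the hardware integration concrete, here is a minimal sketch of the receiving side of the UDP link described above. The actual FoottrollerNet wire format is not documented here, so the JSON packet layout and field names (`LH`, `LT`, `TSLF`, etc.) are assumptions for illustration only:

```python
import json
import socket

def parse_foot_packet(datagram: bytes) -> dict:
    """Decode one sensor update from the IMU shoes.

    Assumed packet layout: one JSON object per datagram carrying each
    foot's heading and tilt angles plus its touch state (TSLF/TSRF).
    """
    msg = json.loads(datagram.decode("utf-8"))
    return {
        "left":  {"heading": msg["LH"], "tilt": msg["LT"], "touch": bool(msg["TSLF"])},
        "right": {"heading": msg["RH"], "tilt": msg["RT"], "touch": bool(msg["TSRF"])},
    }

def listen(port: int = 9999) -> None:
    """Receive sensor updates over UDP and hand them to the game loop."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        state = parse_foot_packet(data)
        # feed `state` into movement and kick handling here
```

UDP is a natural fit here: a dropped sensor update is harmless because the next one arrives milliseconds later, and avoiding TCP retransmission keeps input latency low for VR.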

Challenges we ran into

  1. Hardware-Software Integration: Synchronizing IMU sensor data from physical shoes with Unity's physics system required custom UDP networking protocols
  2. Real-time Sensor Data Processing: Managing continuous streams of heading, tilt, and touch state data while maintaining 60fps VR performance
  3. Multi-Input Coordination: Balancing input from IMU shoes, VR controllers, and voice commands without conflicts or latency
  4. Network Synchronization: Ensuring foot tracking data, ball physics, and player interactions stay synchronized across multiple VR clients
  5. Sensor Calibration: Implementing reliable calibration systems for IMU sensors to account for different player orientations and play styles
  6. UDP Communication Reliability: Managing connection timeouts, reconnection logic, and heartbeat protocols for stable hardware communication
  7. AI Response Integration: Creating seamless integration between voice AI responses and hardware-controlled gameplay actions
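The heartbeat and reconnection logic from challenge 6 boils down to two timers: how long since we last heard from the shoes, and how long since we last pinged them. A minimal sketch of that bookkeeping (the interval and timeout values are illustrative, not the project's real settings):

```python
HEARTBEAT_INTERVAL = 0.5   # seconds between heartbeats sent to the shoes (assumed)
CONNECTION_TIMEOUT = 2.0   # seconds of silence before declaring a drop (assumed)

class LinkMonitor:
    """Track liveness of the UDP link to the IMU shoes."""

    def __init__(self, now: float):
        self.last_rx = now   # time of the last packet from the hardware
        self.last_tx = now   # time we last sent a heartbeat

    def on_packet(self, now: float) -> None:
        """Any inbound packet counts as proof of life."""
        self.last_rx = now

    def should_send_heartbeat(self, now: float) -> bool:
        """True when it is time to ping the hardware again."""
        if now - self.last_tx >= HEARTBEAT_INTERVAL:
            self.last_tx = now
            return True
        return False

    def is_disconnected(self, now: float) -> bool:
        """True once the hardware has been silent past the timeout."""
        return now - self.last_rx > CONNECTION_TIMEOUT
```

When `is_disconnected` fires, the game can pause foot input and start a reconnection loop rather than acting on stale sensor data.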

Accomplishments that we're proud of

We shipped a working prototype in which players with and without upper-limb disabilities compete in the same soccer match under the same rules: custom IMU foot controllers, an AI voice assistant, and cross-platform VR/PC multiplayer all working together end to end.

What we learned

  1. Hardware-VR Integration: Gained deep experience in bridging physical sensor hardware with immersive VR environments
  2. IMU Data Processing: Understanding sensor fusion, angle calculations, and real-time motion tracking in game contexts
  3. UDP Networking: Mastered low-latency communication protocols for time-sensitive hardware input
  4. Multi-Modal Interaction Design: Designing intuitive interfaces that combine voice, hand, and foot inputs seamlessly
  5. Sensor Calibration Algorithms: Developing user-friendly calibration systems for varying physical setups
  6. Real-time Data Streaming: Optimizing continuous sensor data processing without impacting VR performance
  7. Cross-Platform Hardware Support: Managing hardware compatibility across different VR platforms and operating systems
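The calibration lesson above comes down to one idea: sample each foot's raw heading while the player stands in a neutral pose, then report all later readings relative to that baseline. A hedged sketch of that step (the function names are ours, not the project's):

```python
def capture_baseline(samples: list[float]) -> float:
    """Average a short burst of raw heading samples taken while the
    player holds the neutral calibration pose."""
    return sum(samples) / len(samples)

def relative_heading(raw: float, baseline: float) -> float:
    """Heading relative to the calibrated baseline, wrapped to (-180, 180]
    so that 'slightly left' and 'slightly right' stay near zero."""
    delta = (raw - baseline) % 360.0
    if delta > 180.0:
        delta -= 360.0
    return delta
```

Averaging a burst of samples smooths IMU jitter, and the wrap-around keeps a player who calibrated facing 350° from seeing a huge jump when they rotate past 0°.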

What's next for XChannels

  1. Advanced Foot Mechanics: Implement more sophisticated kick detection algorithms using machine learning on IMU data
  2. Haptic Feedback Integration: Add tactile feedback to the shoes for better immersion during ball contact
  3. Player Biometrics: Incorporate additional sensors for heart rate and fatigue monitoring during gameplay
  4. AI Coaching: Use foot tracking data combined with AI to provide personalized football technique coaching
  5. Tournament Analytics: Advanced statistical analysis of player movements and performance metrics
  6. Wireless Optimization: Develop more efficient wireless communication protocols for reduced latency
  7. Multi-Sport Expansion: Extend the foot tracking system to other sports such as basketball and martial arts
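As a baseline for the ML-based kick detection in item 1, a simple threshold detector of the kind it would replace can be sketched as follows: flag a kick when the foot's touch state drops (the foot leaves the ground) while its tilt angle is changing fast. The threshold value is illustrative, not a tuned parameter from the project:

```python
TILT_RATE_THRESHOLD = 120.0  # deg/s; assumed, would be tuned per player

class KickDetector:
    """Heuristic kick detector over touch-state and tilt samples."""

    def __init__(self):
        self.prev_touch = True   # assume the foot starts planted
        self.prev_tilt = 0.0

    def update(self, touch: bool, tilt: float, dt: float) -> bool:
        """Return True when this sample looks like the start of a kick:
        a lift-off (planted -> airborne) with a fast tilt change."""
        tilt_rate = abs(tilt - self.prev_tilt) / dt
        kicked = self.prev_touch and not touch and tilt_rate > TILT_RATE_THRESHOLD
        self.prev_touch, self.prev_tilt = touch, tilt
        return kicked
```

A learned model would replace the single threshold with a classifier over short windows of heading, tilt, and touch data, which is what makes the roadmap item worthwhile.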