Accurate gait analysis plays a key role in rehabilitation and healthcare monitoring, but wearable-based measurements often suffer from calibration errors and variability across subjects. This paper proposes a visual-calibration-driven gait analysis model that integrates wearable sensor data with vision-based calibration cues to enhance accuracy and reliability. The framework leverages multimodal data fusion to reduce drift and misalignment in wearable measurements. Experimental evaluation demonstrates improved gait parameter estimation and robustness across diverse walking conditions, suggesting that the framework is a promising solution for real-time, personalized gait monitoring in clinical and home settings.
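To make the fusion idea concrete, the following is a minimal sketch, not the paper's actual method: it assumes a simple Kalman-style update in which drift-prone wearable stride estimates are corrected by intermittent vision-based calibration cues. The function name `fuse_stride_estimates` and the noise parameters `q` and `r` are hypothetical.

```python
# Illustrative sketch only: the paper does not specify its fusion algorithm.
# A 1-D Kalman-style update corrects a drifting wearable estimate
# (e.g., IMU-derived stride length) with sparse vision-based cues.
import numpy as np

def fuse_stride_estimates(imu_strides, vision_strides, q=0.02, r=0.05):
    """Fuse per-step wearable stride estimates with sparse vision cues.

    imu_strides    : stride lengths from the wearable sensor (drift-prone)
    vision_strides : same length; np.nan where no vision cue is available
    q, r           : assumed process / measurement noise variances
    """
    x = imu_strides[0]          # state: calibrated stride length
    p = 1.0                     # state variance
    prev_imu = imu_strides[0]
    fused = []
    for imu, vis in zip(imu_strides, vision_strides):
        # Predict: propagate the wearable's relative change, inflate uncertainty
        x += imu - prev_imu
        prev_imu = imu
        p += q
        # Correct: when a vision-based cue arrives, pull the state toward it
        if not np.isnan(vis):
            k = p / (p + r)     # Kalman gain
            x += k * (vis - x)
            p *= (1.0 - k)
        fused.append(x)
    return np.array(fused)

# Example: wearable readings drift upward; a vision cue arrives every 5th step.
imu = 1.30 + 0.01 * np.arange(20)   # drifting stride estimates (m)
vision = np.full(20, np.nan)
vision[::5] = 1.30                  # occasional vision-based corrections
print(fuse_stride_estimates(imu, vision))
```

In this toy setup the fused estimate tracks step-to-step changes reported by the wearable while the periodic vision cues pull it back toward the true stride length, which is the intuition behind using vision to counteract wearable drift and misalignment.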