Smartwatch sensors generate continuous multimodal data streams that can be exploited for real-time human activity monitoring. This paper proposes a multihead attention-based deep learning framework that captures both temporal dynamics and cross-sensor dependencies. The model applies multihead attention to learn discriminative feature representations from raw smartwatch signals, enabling accurate recognition of diverse daily activities. Extensive experiments demonstrate that the framework outperforms conventional deep models in accuracy and robustness, highlighting its potential for online, real-world activity monitoring in IoT-enabled healthcare and lifestyle applications.
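To illustrate the core mechanism, the sketch below shows scaled dot-product multihead attention applied over a windowed sensor sequence, followed by temporal mean pooling and a linear activity classifier. This is a minimal NumPy sketch under stated assumptions, not the authors' implementation: the window length `T`, feature dimension `D`, head count `H`, activity count `C`, and all weight matrices are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def multihead_attention(x, n_heads, wq, wk, wv, wo):
    """Scaled dot-product attention with n_heads over a (T, D) sequence."""
    T, D = x.shape
    dh = D // n_heads
    # Project to queries/keys/values and split into heads: (h, T, dh)
    q = (x @ wq).reshape(T, n_heads, dh).transpose(1, 0, 2)
    k = (x @ wk).reshape(T, n_heads, dh).transpose(1, 0, 2)
    v = (x @ wv).reshape(T, n_heads, dh).transpose(1, 0, 2)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(dh)          # (h, T, T)
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)                # row-wise softmax
    out = (weights @ v).transpose(1, 0, 2).reshape(T, D)     # concat heads
    return out @ wo

# Hypothetical sizes: 128-sample window, 12 sensor channels, 4 heads, 6 activities
T, D, H, C = 128, 12, 4, 6
x = rng.standard_normal((T, D))          # one embedded smartwatch window
wq, wk, wv, wo = (rng.standard_normal((D, D)) * 0.1 for _ in range(4))
feat = multihead_attention(x, H, wq, wk, wv, wo).mean(axis=0)  # temporal pooling
logits = feat @ rng.standard_normal((D, C))
pred = int(np.argmax(logits))            # predicted activity index
```

Each head attends over the full window, so different heads can specialize in different temporal patterns or sensor-channel interactions, which is the intuition behind capturing cross-sensor dependencies.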
