Abstract
User experience (UX) evaluation plays a crucial role in understanding how users interact with digital platforms and in improving product design. Traditional UX evaluation methods, such as surveys and interaction logs, typically rely on a single data source, which limits the depth of analysis. This study explores the integration of multimodal data processing techniques into UX research, aiming to improve the accuracy and comprehensiveness of UX evaluations. By combining interaction logs, visual attention data, and physiological measurements, the approach provides a more holistic view of user behavior, emotional responses, and satisfaction: interaction logs supply objective records of user actions, while eye-tracking and physiological signals capture users' emotional states, yielding richer insight into usability. The study shows that multimodal integration identifies patterns that traditional methods overlook, such as emotional responses to specific interface elements and real-time user feedback, and that it improves the precision of UX assessment by linking objective behavior with subjective emotional response. It also discusses the challenges of synchronizing heterogeneous data streams and the ethical concerns raised by collecting physiological data. The integration of these sources shows strong potential to enhance the design process, enabling designers to make decisions grounded in comprehensive evidence. Finally, this research underscores the future potential of multimodal analytics in UX research, suggesting further exploration of additional data modalities and real-time applications across diverse digital environments.