Multimodal Data
Daniele Di Mitri’s work focuses on how multimodal data can support learning. Multimodal data are data collected from multiple sources, such as smartwatches, sensors and video. As Di Mitri explains: 'Since my background is in Artificial Intelligence, I am very interested in using multimodal data not simply for analytics but also to generate adaptive feedback through machine learning techniques. For this reason, I follow the Artificial Intelligence in Education and Intelligent Tutoring Systems communities.' He is working, among other things, on how multimodal data can improve CPR training.
He explains his work in this video.
The prize rewards young researchers who actively connect and position their research within the field and society. The other two finalists, selected by EC-TEL and UNIR iTED, were Sambit Praharaj (also a PhD candidate at the Welten Institute) and Alejandro Ortega-Arranz (Universidad de Valladolid).
The research Multimodal Tutor for CPR forms part of the European project SafePat, which develops excellence in patient safety in cross-border regions through standardised procedures, policies and innovative tools.