The projects showcased here span independent, academic, and industry work, ranging from research studies to technical implementations.
A multimodal analysis of visuo-auditory & spatial cues and their role in guiding human attention during real-world social interactions.
A pilot study on how visuo-auditory robot behaviors influence human attention and intention understanding during collaborative tasks.
Designing multimodal interaction scenarios to investigate joint attention mechanisms in human-robot museum guidance contexts.
A collaborative, open-science audiovisual production for cognitive and behavioral research.
Analyzing how visuospatial features guide collective viewer attention using eye-tracking and semantic modeling.
A knowledge-driven pipeline for cross-domain event annotation and visual attention analysis.
Exploring how humans and computational models perceive similarity in everyday hand actions using motion features.
Python-based multimodal data processing for the EU Horizon 2020 DREAM project on robot-assisted autism therapy.
How local motion kinematics and global form contribute to biological motion perception, studied via point-light walker experiments.
How eye-tracking neuroscience transformed brand engagement for luxury retail clients.
A comprehensive, research-driven consultation to increase discovery, engagement, and usage of a location-based community platform.
Investigating how virtual driving affects perceived body scale and affordance judgments in real-world aperture navigation.
Designing communication strategies and assistive systems to optimize traffic flow in shared spaces by understanding driver behavior and urban movement patterns.
Designing an Android app that enhances transparency between hospitals and patients’ families by integrating patient information into the Hospital Information System (HIS).
Exploring the feasibility of the Oculus Rift for depth-based cognitive experiments.