Lab Director:
Roger Azevedo, Ph.D.
Lab Location:
Partnership 2, 3rd Floor
Lab Contact:
smartlab.ucfls@gmail.com
The SMART Lab studies interactions between humans and intelligent learning and training systems, using advanced statistical methods to measure these processes and their impact on learning, training, performance, and transfer. We conduct basic and applied psychological research in laboratory and real-world environments (e.g., classrooms, medical simulators), collecting multimodal data such as eye movements, physiological signals, verbalizations and discourse, human-machine interactions, log files, facial expressions of emotion, screen recordings, and non-verbal behaviors. These complex data are used to (1) develop theoretical, computational, and mathematical models of human-machine interaction; (2) examine the nature of temporally unfolding self- and other-regulatory processes (e.g., between humans and between humans and artificial agents); and (3) design adaptive intelligent learning and training systems capable of detecting, tracking, modeling, and fostering human and machine learning, intelligence, and performance across tasks, domains, and contexts, augmenting the knowledge and skills of the 21st-century workforce (e.g., empathy training in healthcare).
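As an illustration of the kind of preprocessing such multimodal research typically requires, the sketch below aligns two hypothetical time-stamped streams (eye-tracking fixations and heart-rate samples) onto a common timeline by nearest timestamp. All names and data here are illustrative assumptions, not the lab's actual pipeline.

```python
# Minimal sketch (hypothetical data and names): align two multimodal
# streams -- e.g., eye-tracking fixations and heart-rate samples --
# by nearest timestamp, a typical first step before modeling
# temporally unfolding regulatory processes.
from bisect import bisect_left

def align_nearest(base, other):
    """For each (timestamp, value) in `base`, attach the sample from
    `other` whose timestamp is closest. Both streams must be sorted."""
    times = [t for t, _ in other]
    aligned = []
    for t, v in base:
        i = bisect_left(times, t)
        # Candidate neighbors are at indices i-1 and i; keep the nearer one.
        best = min(
            (j for j in (i - 1, i) if 0 <= j < len(times)),
            key=lambda j: abs(times[j] - t),
        )
        aligned.append((t, v, other[best][1]))
    return aligned

# Fixation durations (ms) paired with heart-rate readings (bpm):
fixations = [(0.0, 210), (1.5, 340), (3.1, 180)]
heart_rate = [(0.2, 72), (1.4, 75), (2.9, 78)]
print(align_nearest(fixations, heart_rate))
# -> [(0.0, 210, 72), (1.5, 340, 75), (3.1, 180, 78)]
```

In practice, dedicated tools (e.g., an as-of join in a dataframe library) would replace this hand-rolled alignment, but the nearest-timestamp idea is the same.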
People
Faculty:
- Roger Azevedo, Ph.D.
Post-doctoral Researchers:
- Daryn Dever, Ph.D.
- Megan Wiedbusch, Ph.D.
Highlights
Research Areas:
Adaptive Intelligent Learning Systems
Advanced Learning and Training Technologies
Artificial Intelligence
Digital Twins
Human Digital Twins
Human-Human and Human-Artificial Agents
Human-Machine AI Collaboration
Intelligent Environments for Education and Training Across Humans and Contexts
Metacognition and Self-Regulated Learning
Multimodal Process Data in Human-Machine Interactions
Capabilities & Advanced Technologies:
Eye-tracking Systems
Intelligent Learning Systems
Machine Learning
Measurement Tools
Multimodal Data
Training Platforms
Application Areas:
Academics
Defense
Education
Healthcare