Johns Hopkins University

  • Magic Mirror System for Anatomy Education


    Research collaborators: JHU Departments of Orthopaedic Surgery and Biology, Garrison Forest High School, Technical University of Munich

    In this project, I am working on the development and refinement of “Magic Mirror”, an AR-based system for an interactive human anatomy learning experience. The concept is that the user stands in front of a large screen (TV) equipped with a Microsoft Kinect sensor, and the screen shows an image of the user’s body so that it acts like a mirror. The system generates in-situ visualizations of medical information directly on top of the user’s body, and provides an affordable, portable, and robust solution for medical education.
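
    The sketch below illustrates the core overlay step under simplifying assumptions: skeleton joints are assumed to have already been obtained from the Kinect SDK and mapped into color-image coordinates, and the anatomical texture is a placeholder RGBA array. It is not the production rendering pipeline, only a minimal Python/OpenCV illustration of anchoring a visualization to a tracked joint.

      # Minimal sketch: anchor an anatomical overlay to a tracked joint in the
      # mirror image. Joint coordinates are assumed to come from the Kinect SDK
      # (hard-coded here for illustration); `frame` is the color camera image.
      import numpy as np
      import cv2

      def overlay_organ(frame, organ_rgba, spine_xy, scale=1.0, alpha=0.6):
          """Alpha-blend an RGBA organ texture centered on the spine joint."""
          h, w = organ_rgba.shape[:2]
          h, w = int(h * scale), int(w * scale)
          organ = cv2.resize(organ_rgba, (w, h))
          x0, y0 = int(spine_xy[0] - w / 2), int(spine_xy[1] - h / 2)
          # Clip the overlay region to the frame boundaries.
          x1, y1 = max(x0, 0), max(y0, 0)
          x2, y2 = min(x0 + w, frame.shape[1]), min(y0 + h, frame.shape[0])
          roi = frame[y1:y2, x1:x2].astype(np.float32)
          patch = organ[y1 - y0:y2 - y0, x1 - x0:x2 - x0].astype(np.float32)
          mask = (patch[..., 3:4] / 255.0) * alpha       # per-pixel transparency
          frame[y1:y2, x1:x2] = (patch[..., :3] * mask + roi * (1.0 - mask)).astype(np.uint8)
          return frame

      # Example: a 1080p mirror frame, a placeholder organ texture, and a spine
      # joint already mapped from Kinect depth space to color-image coordinates.
      frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
      organ = np.zeros((300, 200, 4), dtype=np.uint8)    # placeholder RGBA texture
      frame = overlay_organ(frame, organ, spine_xy=(960, 540), scale=1.2)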

  • Magic Mirror System for Motor Rehabilitation


    Research collaborators: JHU Departments of Rehabilitation and Neurology, Applied Physics Laboratory, University of Ottawa, Technical University of Munich

    In this project, we would like to extend the Magic Mirror into a "personalized" system that can be used in rehabilitation clinics or at home, reducing clinic visits and the associated costs. The system combines augmented- and virtual-reality in-situ visualization with automatic tracking of the patient’s upper extremities in a gaming context, and facilitates unobtrusive, cost-effective, patient-specific, and patient-friendly methods for objective evaluation and rehabilitation. The extended Magic Mirror will also be able to automatically monitor exercises and training progression, providing real-time feedback and checking whether the exercises are performed correctly.
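
    As a minimal sketch of the kind of rule such monitoring can build on, the snippet below computes an elbow flexion angle from three Kinect joints and compares it against a prescribed range of motion; the joint positions, thresholds, and feedback messages are illustrative only and are not taken from the actual system or any clinical protocol.

      # Illustrative check for one upper-extremity exercise: the elbow flexion
      # angle is computed from three Kinect joints and compared against a
      # prescribed range of motion. All numbers are made up for the example.
      import numpy as np

      def joint_angle(a, b, c):
          """Angle (degrees) at joint b formed by the segments b->a and b->c."""
          v1 = np.asarray(a, float) - np.asarray(b, float)
          v2 = np.asarray(c, float) - np.asarray(b, float)
          cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
          return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

      def check_elbow_flexion(shoulder, elbow, wrist, target_range=(30.0, 160.0)):
          """Return the elbow angle and a real-time feedback message."""
          angle = joint_angle(shoulder, elbow, wrist)
          if angle < target_range[0]:
              return angle, "Extend the arm a little"
          if angle > target_range[1]:
              return angle, "Bend the elbow further"
          return angle, "Good - hold this position"

      # Example with made-up 3D joint positions (meters, Kinect camera space).
      angle, feedback = check_elbow_flexion(
          (0.20, 0.50, 2.00), (0.25, 0.20, 2.00), (0.40, 0.10, 1.90))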

University of Central Florida

  • Gesture Assessment of Teachers in TeachLivE Immersive Environment


    Research collaborator: UCF School of Education

    My PhD dissertation research was on "multimodal teaching assessment in virtual (and real) classrooms". I proposed a real-time feedback application that assists teachers with their nonverbal interaction in the virtual classroom, TeachLivE. Feedback is provided whenever the teacher exhibits a closed, defensive stance. Using machine learning techniques, my method employs depth information from the environment and the participating teacher to provide real-time feedback (in visual and haptic form) for teacher professional development.
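
    As an illustration only (the dissertation method is learned from depth data rather than hand-coded), the snippet below shows the kind of geometric cue, here a crossed-arms check over Kinect skeleton joints, that such feedback can be triggered on; the joint values and coordinate convention are assumptions for the example.

      # Illustrative crossed-arms (closed stance) check over Kinect skeleton
      # joints; the actual method is learned from depth data, so this rule only
      # shows the kind of nonverbal cue the feedback is triggered on.
      import numpy as np

      def arms_crossed(joints, margin=0.05):
          """True if each wrist has crossed the body midline (meters)."""
          spine_x = joints["spine_mid"][0]
          # Convention assumption: the left wrist normally sits at x < spine_x and
          # the right wrist at x > spine_x; flip the comparisons for a mirrored frame.
          return (joints["wrist_left"][0] > spine_x + margin and
                  joints["wrist_right"][0] < spine_x - margin)

      joints = {
          "spine_mid":   np.array([0.00, 0.30, 2.10]),
          "wrist_left":  np.array([0.12, 0.25, 1.95]),   # crossed over the midline
          "wrist_right": np.array([-0.10, 0.24, 1.96]),
      }
      if arms_crossed(joints):
          print("Closed stance detected - trigger visual/haptic feedback")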

    I have also been working on other aspects of student-teacher interaction in TeachLivE, including conversational turn-taking, the virtual students' talk time, and acoustic features (pitch and intensity), to evaluate teacher behavior in the virtual classroom using multimodal learning analytics.
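
    The sketch below shows the kind of low-level acoustic features involved, per-frame RMS intensity and an energy-based talk-time estimate, computed with plain NumPy; the frame sizes and threshold are illustrative, and pitch extraction (e.g., autocorrelation- or YIN-based) is omitted for brevity rather than drawn from the actual analysis pipeline.

      # Per-frame RMS intensity and an energy-based talk-time estimate over a
      # mono audio signal; frame sizes and the speech threshold are illustrative.
      import numpy as np

      def frame_rms(signal, frame_len, hop):
          """RMS intensity of each analysis frame."""
          n_frames = 1 + (len(signal) - frame_len) // hop
          frames = np.stack([signal[i * hop:i * hop + frame_len] for i in range(n_frames)])
          return np.sqrt(np.mean(frames.astype(np.float64) ** 2, axis=1))

      def talk_time_fraction(signal, sample_rate=16000, threshold=0.02):
          """Fraction of frames whose intensity exceeds a simple speech threshold."""
          frame_len = int(0.025 * sample_rate)     # 25 ms analysis window
          hop = int(0.010 * sample_rate)           # 10 ms hop
          rms = frame_rms(signal, frame_len, hop)
          return float(np.mean(rms > threshold))

      # Example on one second of synthetic audio: silence followed by a tone.
      sr = 16000
      t = np.arange(sr // 2) / sr
      audio = np.concatenate([np.zeros(sr // 2), 0.1 * np.sin(2 * np.pi * 220.0 * t)])
      print(talk_time_fraction(audio, sr))         # roughly 0.5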