Maria Eugenia Cabrera

PhD in Industrial Engineering 
Research Associate at University of Washington

 

Email:

mecu@cs.washington.edu 

Address:

185 Stevens Way

Seattle, WA 98195

Preferred name:

Maru Cabrera


Hello! I'm Maru Cabrera

 

I recently obtained my PhD from Purdue University, in the Intelligent Systems and Assistive Technologies (ISAT) Lab in the School of Industrial Engineering, under the supervision of Dr. Juan P. Wachs. My dissertation focused on one-shot gesture recognition. I am currently a postdoc working with Dr. Maya Cakmak in the Human Centered Robotics Lab (HCR).



My research interests include human-robot interaction (HRI) and multimodal interaction based on embodiment, including gestures. More specifically, I aim to conduct research on naturalistic and novel approaches that incorporate cognitive and physiological aspects of human performance into collaborative tasks with robots or other humans, whether co-located or remote.

 
EDUCATION

PhD in Industrial Engineering

at Purdue University. West Lafayette, IN. United States.

3.6/4 GPA

        Dissertation title: "Gist of a Gest: Learning Gestures for the First Time"

        Preliminary Examination in July 2016

        Graduation Date: August 4th, 2018

2014-2018

Doctoral Degree

MS in Electronic Engineering

at Simon Bolivar University. Caracas, Venezuela.

4.8/5 GPA

        Master's Thesis: "Gesture Recognition System based on the Human Hand"

2008-2011

Master's Degree

2002-2008

Bachelor's Degree

BSc in Electronic Engineering

at Simon Bolivar University. Caracas, Venezuela

Cum Laude 4.3/5 GPA

        Bachelor's Thesis: "Designing and Building an Instrumentation and Control System for the PoiseBot Robotic Platform"

EXPERIENCE
2018-Present

Research Associate

Paul G. Allen School of Computer Science and Engineering, University of Washington.

Seattle, WA. USA

2014-2018

Graduate Teaching and Research Assistant (TA/RA)

School of Industrial Engineering at

Purdue University. West Lafayette, IN. USA

I was a TA for undergraduate and graduate courses (IE-343 Engineering Economics and IE-574 Robotics and Flexible Assembly, respectively). As part of my RA work, I was a member of a multidisciplinary team working on Surgical Telementoring with Augmented Reality.

2012-2013

Instructor Level Professor

Electronics and Circuits Department at

Simon Bolivar University. Caracas, Venezuela

I was in charge of core courses such as Electrical Circuit Analysis and Laboratory of Electrical Measurements. My responsibilities included preparing lectures, assignments, and exams, as well as coordinating with professors teaching other sections to keep a consistent syllabus.

2008-2011

Graduate Teaching Assistant (TA)

Electronics and Circuits Department at

Simon Bolivar University. Caracas, Venezuela

As a TA, my responsibilities included providing assistance during lab hours, grading, and teaching review sessions before exams.

 
 
AWARDS

Honorable Mention 2018 Graduate Mentoring Award

Purdue University

Received in May 2018. Article published by Purdue IE News

2018 Outstanding Graduate Research Award

College of Engineering, Purdue University

Received in April 2018. Article published by Purdue IE News

Travel Grant Recipient from the 2017 RSS Workshop on Women in Robotics. Boston, MA. United States

Attended RSS in Boston in July 2017. Poster presentation.

PhD Consortium Fellowship from 2017 IEEE FG Conference. Washington DC, United States.

Attended FG in Washington, DC in May 2017. Poster and oral presentation.

2016-2017 Fellowship Recipient.

Dr. Theodore J. and Isabel M. Williams Fellowship in Industrial Control Systems.

Industrial Engineering, Purdue University.

Awarded in Fall 2016.

1st Place Poster Presentation

IEGSO Research Symposium. Purdue University

April 2016. Poster presentation.

3-Minute Student Presentation Winner at the

2015 AAAI Conference. Austin, TX. United States

Attended AAAI in Austin in January 2015. Poster presentation.

PUBLICATIONS

  • Cabrera, M. E., & Wachs, J. P. (2018, May). Biomechanical-Based Approach to Data Augmentation for One-Shot Gesture Recognition. In 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018), Xi'an, China, 15 – 19 May 2018 (pp. 38-44). IEEE. Link

  • Rojas-Muñoz, E., Cabrera, M. E., Andersen, D., Popescu, V., Marley, S., Mullis, B., Zarzaur, B., & Wachs, J. (2018). Surgical Telementoring Without Encumbrance: A Comparative Study of See-through Augmented Reality-based Approaches. Annals of Surgery. Link

  • Cabrera, M. E., Voyles, R. M., & Wachs, J. P. (2018, March). Coherence in One-Shot Gesture Recognition for Human-Robot Interaction. In Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction HRI (pp. 75-76). ACM. Link

  • Cabrera, M. E., Novak, K., Foti, D., Voyles, R., & Wachs, J. P. (2017, May). What makes a gesture a gesture? Neural signatures involved in gesture recognition. In 2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017) Washington DC, USA 22 – 26 May 2017 (pp. 748-753). IEEE. Link

  • Cabrera, M. E., Sanchez-Tamayo, N., Voyles, R., & Wachs, J. P. (2017, May). One-Shot Gesture Recognition: One Step Towards Adaptive Learning. In 2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017) Washington DC, USA 22 – 26 May 2017 (pp. 784-789). IEEE. Link

  • Cabrera, M. E., & Wachs, J. P. (2017). A Human-Centered Approach to One-Shot Gesture Learning. Frontiers in Robotics and AI, 4, 8. Link

  • Andersen, D., Popescu, V., Cabrera, M. E., Shanghavi, A., Mullis, B., Marley, S., Gomez, G., & Wachs, J. P. (2017). An Augmented Reality-Based Approach for Surgical Telementoring in Austere Environments. Military Medicine, 182(S1), 310-315. Link

  • Andersen, D., Popescu, V., Lin, C., Cabrera, M. E., Shanghavi, A., & Wachs, J. (2016, September). A Hand-Held, Self-Contained Simulated Transparent Display. In 2016 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct)  Merida, Mexico, 19 – 23 September 2016 (pp. 96-101). IEEE. Link

  • Zhou, T., Cabrera, M. E., Low, T., Sundaram, C., & Wachs, J. (2016). A comparative study for telerobotic surgery using free hand gestures. Journal of Human-Robot Interaction, 5(2), 1-28. Link

  • Andersen, D., Popescu, V., Cabrera, M. E., Shanghavi, A., Gomez, G., Marley, S., Mullis, B., & Wachs, J. P. (2016). Medical telementoring using an augmented reality transparent display. Surgery, 159(6), 1646-1653. Link

  • Cabrera, M. E., & Wachs, J. P. (2016, August). Embodied gesture learning from one-shot. In 25th IEEE International Symposium on Robot and Human Interactive Communication RO-MAN New York, USA, 25 – 28 August 2016 (pp. 1092-1097). IEEE. Link

  • Andersen, D., Popescu, V., Cabrera, M. E., Shanghavi, A., Gómez, G., Marley, S., Mullis, B., & Wachs, J. P. (2016, April). Avoiding Focus Shifts in Surgical Telementoring Using an Augmented Reality Transparent Display. In Medicine Meets Virtual Reality 22 MMVR (Vol. 220, pp. 9-14). Link

  • Zhou, T., Cabrera, M. E., & Wachs, J. P. (2016). A Comparative Study for Touchless Telerobotic Surgery. In Computer-Assisted Musculoskeletal Surgery (pp. 235-255). Springer, Cham. Link

  • Zhou, T., Cabrera, M. E., & Wachs, J. P. (2015, January). Touchless telerobotic surgery – is it possible at all? In Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence, Austin, USA, 26 – 31 January 2015 (pp. 4228-4230). Link

  • Andersen, D., Popescu, V., Cabrera, M. E., Shanghavi, A., Gomez, G., Marley, S., Mullis, B., & Wachs, J. (2015). Virtual annotations of the surgical field through an augmented reality transparent display. The Visual Computer, 32(11), 1481-1498. Link

  • Ruiz, E., Acuña, R., Certad, N., Terrones, A., & Cabrera, M. E. (2013, October). Development of a control platform for the mobile robot Roomba using ROS and a Kinect sensor. In Robotics Symposium and Competition (LARS/LARC), 2013 Latin American (pp. 55-60). IEEE. Link

  • Ralev, D., Cappelletto, J., Grieco, J. C., Certad, N., & Cabrera, M. E. (2013, October). Analysis of oscillators for the generation of rhythmic patterns in legged robot locomotion. In Robotics Symposium and Competition (LARS/LARC), 2013 Latin American (pp. 124-128). IEEE. Link

  • Cabrera, M. E., Bogado, J. M., Fermin, L., Acuna, R., & Ralev, D. (2012). Glove-based gesture recognition system. In Proceedings of the 15th International Conference on Climbing and Walking Robots and the Support Technologies for Mobile Machines Baltimore, USA, 23 – 26 July 2012 (pp. 747-753). Link

  • Alves, R., Ruella, C., Cabrera, M. E., Fermin, L., Cappelletto, J., Diaz, M., Grieco, J., Fernandez-Lopez, G. & Armada, M. (2010). Communication System for the Underwater Platform PoseiBoT. In Proceedings of the Twelfth International Conference on Climbing and Walking Robots and the Support Technologies for Mobile Machines Istanbul, Turkey, 9 – 11 September 2009 (pp. 358-366). Link

 
 
RESEARCH OVERVIEW

Overview of One-Shot Gesture Recognition

Humans can intuitively understand meaning and generalize from a single observation, whereas machines require many examples to learn and recognize a new physical expression. This gap is one of the main roadblocks to natural human-machine interaction, particularly for gestures, which are an intrinsic part of human communication. To achieve natural interaction with machines, a framework must be developed that includes the adaptability humans display when understanding a gesture from a single observation.

This framework incorporates the human processes associated with gesture perception and production. From the single gesture example, key points in the hands' trajectories are extracted; these have been found to correlate with spikes in visual and motor cortex activation. The key points are also used to find inverse kinematic solutions for a model of the human arm, thereby incorporating the biomechanical and kinematic aspects of human gesture production to artificially enlarge the set of gesture examples.

Leveraging these artificial examples, state-of-the-art classification algorithms can be trained and used to recognize future instances of the same gesture class.
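The steps above (observe one example, synthesize variants, train a classifier) can be sketched as follows. This is a minimal illustration only, not the actual pipeline: simple Gaussian noise stands in for the biomechanical/inverse-kinematics augmentation model, and all function names are hypothetical.

```python
import math
import random

def augment(trajectory, n_copies=20, noise=0.05, seed=0):
    """Synthesize gesture variants from a single example by perturbing
    each (x, y) trajectory point. A stand-in for the biomechanical
    data-augmentation model described above."""
    rng = random.Random(seed)
    return [
        [(x + rng.gauss(0, noise), y + rng.gauss(0, noise)) for (x, y) in trajectory]
        for _ in range(n_copies)
    ]

def mean_distance(a, b):
    """Mean point-wise Euclidean distance between equal-length trajectories."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def classify(sample, training_sets):
    """Nearest-example classifier over the augmented training sets."""
    best_label, best_d = None, float("inf")
    for label, examples in training_sets.items():
        d = min(mean_distance(sample, ex) for ex in examples)
        if d < best_d:
            best_label, best_d = label, d
    return best_label

# One observed example per gesture class (32-point trajectories):
circle = [(math.cos(t / 5), math.sin(t / 5)) for t in range(32)]
swipe = [(t / 31.0, 0.0) for t in range(32)]
training = {"circle": augment(circle), "swipe": augment(swipe)}

# A slightly shifted circle is still recognized as a circle:
print(classify([(x + 0.1, y) for (x, y) in circle], training))  # circle
```

In the actual work, a proper gesture classifier replaces this nearest-example rule; the sketch only shows why augmentation turns one observation into a usable training set.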

Overview of STAR

 

With the System for Telementoring with Augmented Reality (STAR), we aim to increase the mentor's and trainee's sense of co-presence through an augmented visual channel, leading to measurable improvements in the trainee's surgical performance. The goal of the project is to develop a system that can effectively provide remote assistance to novice surgeons or trained medics in less-than-ideal scenarios, such as rural areas or austere environments like the battlefield. The system connects the novice surgeon or medic to a remote expert, who may be in a Level 1 trauma center in a major US city.

There are three more areas where the proposed system has the potential to improve quantitative and qualitative outcomes in the military healthcare setting. First, telementoring can prevent or reduce the loss of surgical skills through retraining and competency maintenance. Second, it can deliver instructional material in simulation form, supporting doctors serving in Iraq and Afghanistan with battlefield trauma care in a portable and dynamic way. Finally, it allows recent combat medic graduates to reinforce surgical techniques that were not covered in depth during their training curricula.

Overview of Taurus Tele-operation Project

Current teleoperated robot-assisted surgery requires surgeons to manipulate joystick-like controllers at a master console, while robotic arms mimic those motions on the patient's side. This approach is becoming more popular than traditional minimally invasive surgery due to its dexterity, precision, and accurate motion planning capabilities. However, one major drawback of such systems lies in the user experience: the surgeon must retrain extensively to learn how to operate cumbersome interfaces.

To address this problem, we have developed an innovative system that uses touchless interfaces for telesurgery. This type of solution, when applied to robotic surgery, has the potential to let surgeons operate as if they were physically engaged with the surgery in situ (as is standard in traditional surgery). By relying on touchless interfaces, the system can incorporate natural gestures similar to the instinctive movements surgeons perform when operating, enhancing the user experience and overall system performance. Sensory substitution methods are also used to deliver force feedback to the user during teleoperation.
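As a simplified illustration of a touchless control loop, consider mapping tracked hand displacement to scaled end-effector motion, with a clutch so the user can reposition their hand without moving the robot. All names here are hypothetical; this is not the actual Taurus interface, which is not a public API.

```python
def hand_to_robot_delta(hand_pos, prev_hand_pos, scale=0.5, clutch_engaged=True):
    """Map the tracked hand displacement (metres) between two frames to a
    scaled robot end-effector displacement. With the clutch disengaged,
    the robot holds position while the user repositions their hand."""
    if not clutch_engaged:
        return (0.0, 0.0, 0.0)
    return tuple(scale * (h - p) for h, p in zip(hand_pos, prev_hand_pos))

# Moving the hand 10 cm along x commands a 5 cm robot motion:
print(hand_to_robot_delta((0.10, 0.0, 0.0), (0.0, 0.0, 0.0)))
```

Motion scaling of this kind is a common design choice in teleoperation: it trades workspace range for precision, which suits fine surgical manipulation.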

 
© 2018 by Maru Cabrera.