My research background spans cognitive science, social emotional technology, non-verbal human communication and expression, augmented reality, and virtual reality. Below, I detail my industry, graduate school, and undergraduate research. Each section also lists relevant research methods.
Meta - Reality Labs Face Tracking Team
June 2018 - Present
At Meta's Reality Labs, I started as a Research Assistant, was promoted to Senior Research Assistant, and then to Cognitive Research Scientist in January 2020. My work centers on human expression, specifically in the face and in language. For the face tracking team, I am a point of contact for questions regarding facial expression, visemes, research design for face tracking data collection, and data annotations.
- AR Face Tracking: Point of contact for all facial expression-related requests for Meta AR face tracking. Advise engineers and artists on how to capture and model authentic expression.
- Facebook Auto-generated Avatars: Defined facial feature types for avatars, such as facial hair and hair styles. Conducted user testing and provided facial-science expertise.
- Validating the importance of face tracking for avatar technology: Research design and data collection to demonstrate why face tracking matters for avatars, especially in multi-user VR environments.
- Data collection design for training ML models: Designing data collection materials with an emphasis on facial expression and visemes, following a Facial Action Coding System-based structure.
- Facial Action Coding System (FACS) Framework
- Observational research
- Research through design
- Competitive Testing
- A/B Testing
University of California, Santa Cruz Social Emotional Technology Lab
October 2020 - Present
In the Social Emotional Technology Lab at UCSC, I study how multi-user VR environments affect collaborative success in meeting scenarios where users are represented by stylized avatars.
My M.S. thesis, "Avatar Styles and Choices", investigates how users choose to design their avatars, using environment as a proxy, and how collaborative creativity performance is affected by both avatar design and environment. Key questions I am investigating include:
- What do different avatar styles do for people in social VR contexts? Is one's real-world identity important when choosing an avatar design?
- What matters when picking an avatar design? (e.g., that the avatar looks like me, that the avatar is fanciful/outlandish)
- In terms of avatar design, what’s useful to people in an interpersonal context in VR?
- How do others' avatar styles affect how we collaborate with them?
- How does avatar style affect creativity in groups?
- Research Through Design
- Landscape Analysis
- Ethnography, Autoethnography
- Games User Research
- Literature Reviews
- Wizard of Oz
- Think-Aloud Protocol
- Affinity Diagramming
University of California, Santa Cruz
March 2016 - March 2018
As an undergraduate, I served as a Research Assistant (RA) in three labs: developmental psychology, computational and experimental cognition, and high-level perception. Each lab is detailed below.
Developmental Psychology
- Investigation of moral reasoning in college-aged students
- Interviewing participants on controversial scenarios
- Digitally coding and interpreting data
- Contributions to the final publication
Computational & Experimental Cognition
- Research in psycholinguistics
- Investigating articulatory onsets in natural language production
- Coding video and acoustic data
High-Level Perception
- Intensive research on human perception, virtual reality, face perception, and gaze biases
- Guiding participants through the experimental process
- Coding data using MATLAB
- Biweekly team meetings
- Contributing author to final publication
- Literature reviews
- Think-aloud protocol