Research

My research journey has deep roots in Cognitive Science, with a focus on social emotional technology, the nuances of non-verbal human communication and expression, and the pioneering fields of augmented and virtual reality. In the sections that follow, I describe my research across industry, graduate, and undergraduate work, along with an overview of the notable research methods I've used in each.

Industry Research

Overview

As a Researcher on Meta's AR/VR Face Tracking Team, I leverage expertise in behavioral analysis, facial science, and research design and execution to enhance avatar technology and improve user experiences. I lead research initiatives, collaborate with numerous cross-functional teams, and integrate user insights into product strategy, driving innovations in face tracking, AI enhancements, and usability.

Key Methodologies

  • Surveys
  • Prototyping
  • A/B testing
  • Interviews
  • Competitive testing
  • Research-through-design
  • Observation
  • User studies
  • Facial Action Coding System (FACS) frameworks

Research Areas

  • Facial Expression and Behavioral Analysis - Specializing in the study of human expression, with a focus on facial movements, visemes, and linguistic cues to inform advancements in AR/VR face tracking and avatar technology.
  • User-Centered Design and Usability - Leading research initiatives to evaluate and refine features through usability testing, prototyping, and user feedback, ensuring optimized user experience and satisfaction.
  • AI and Data-Driven Enhancements - Utilizing qualitative insights and data annotations to inform strategic improvements in AI models, enhancing tracking accuracy, aesthetics, and overall system performance.
  • Data Collection Design for ML Training - Developing data collection materials that target gaps in our product's dataset, strengthening the training data behind our machine learning models.

Cross-Functional Partnership

  • Engineering
  • Product
  • Art
  • Data collection
  • Data management / annotation
  • Vendor teams

Examples + Success Stories

  • Meta Quest Audio to Expression - Launched in November 2024, Audio to Expression is an AI-based model that uses voice input to generate realistic facial expressions on virtual avatars. By analyzing audio signals, such as tone, pitch, and cadence, the model predicts corresponding facial movements, enabling avatars to mirror the speaker's emotions and expressions (the first sketch after this list illustrates the general idea). This enhances immersion and communication in AR/VR experiences by creating more lifelike interactions. As a key stakeholder in this project, I defined quality standards and designed and executed user studies, A/B tests, and surveys to ensure the technology accurately captured and reflected lifelike facial expressions, driving enhancements to user experience and system performance.
  • Boosting AI model performance through user insights - Through strategic user research and heuristic evaluations, I continuously identify more than 10 opportunities per month for improving AR/VR avatar software and artistic models. Recently, one key recommendation led to adjustments that increased user preference by 7% (the second sketch after this list shows how such a lift can be sanity-checked for significance). These insights not only enhance the accuracy and aesthetics of avatars, but have also established a feedback loop for continuous AI model optimization.
  • Establishing standards for facial expression quality - I have standardized quality benchmarks across 7+ research projects using the Facial Action Coding System (FACS) and UX evaluation frameworks, ensuring consistent performance in facial tracking models (the third sketch after this list illustrates one kind of FACS-based score). By aligning efforts across product, engineering, and design teams, these benchmarks enhanced cross-functional collaboration and contributed to significant improvements in lifelike avatar expressions and user experience.
  • Competitive analysis with Apple's Memoji - Conducted a competitive analysis of Apple’s Memoji to benchmark our avatar technology against a market leader. This analysis highlighted our strengths and pinpointed areas for improvement, particularly in artistic representation and face tracking. The insights obtained guided enhancements to our avatars' visual appeal and functionality.
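
Since the Audio to Expression model itself isn't public, here is a minimal, hypothetical sketch of the general idea: crude prosodic features (loudness and a voicing proxy) are extracted from an audio frame and mapped to blendshape-style expression weights. The feature set, blendshape names, and weight values are all illustrative stand-ins, not the production pipeline.

```python
# Hypothetical sketch of the audio-to-expression idea: extract crude
# prosodic features from an audio frame and map them to blendshape-style
# expression weights. Features, blendshape names, and weights are
# illustrative stand-ins, not Meta's production model.
import numpy as np

def audio_features(frame: np.ndarray) -> np.ndarray:
    """Crude per-frame features: loudness (RMS) and a voicing proxy (ZCR)."""
    rms = np.sqrt(np.mean(frame ** 2))                    # loudness proxy
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2    # zero-crossing rate
    return np.array([rms, zcr])

def predict_expression(features: np.ndarray) -> dict[str, float]:
    """Toy linear model squashed to [0, 1], like blendshape activations."""
    W = np.array([[4.0, 0.5],      # made-up weights: rows are blendshapes,
                  [1.5, 2.0],      # columns are the two audio features
                  [0.5, 1.0]])
    b = np.array([-1.0, -1.5, -2.0])
    activations = 1 / (1 + np.exp(-(W @ features + b)))   # sigmoid squash
    names = ["jaw_open", "lip_corner_pull", "brow_raise"] # hypothetical names
    return {n: round(float(a), 3) for n, a in zip(names, activations)}

# Demo on a synthetic 30 ms frame (a 220 Hz tone standing in for speech).
sr = 16_000
t = np.arange(int(0.03 * sr)) / sr
frame = 0.5 * np.sin(2 * np.pi * 220 * t)
print(predict_expression(audio_features(frame)))
```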
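
The exact study design behind the 7% preference lift isn't detailed here, so the following is only a generic sanity check one might run on A/B preference counts: a two-proportion z-test. The counts are made up for illustration.

```python
# Hypothetical sketch: checking whether a preference lift between two
# avatar variants is statistically significant with a two-proportion
# z-test. All counts below are invented for illustration.
import math

def two_proportion_ztest(x_a: int, n_a: int, x_b: int, n_b: int) -> tuple[float, float]:
    """Return (z, two-sided p) for H0: preference rates are equal."""
    p_pool = (x_a + x_b) / (n_a + n_b)                    # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (x_b / n_b - x_a / n_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))            # two-sided normal tail
    return z, p_value

# e.g. 52% preference for the baseline vs 59% for the adjusted model
z, p = two_proportion_ztest(x_a=260, n_a=500, x_b=295, n_b=500)
print(f"z = {z:.2f}, p = {p:.4f}")
```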
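
The benchmarks themselves are internal, so this last sketch is a hypothetical illustration of the kind of score such a benchmark might track: per-Action-Unit F1 between tracker output and human FACS coding (AU6 = cheek raiser, AU12 = lip corner puller, AU26 = jaw drop), on made-up frames.

```python
# Hypothetical sketch: scoring a face-tracking model against human FACS
# annotations. Each frame is labeled with its active Action Units, and
# per-AU F1 is one benchmark a team might track. Data are invented.

def per_au_f1(predicted: list[set], annotated: list[set], aus: list[str]) -> dict[str, float]:
    scores = {}
    for au in aus:
        tp = sum(au in p and au in a for p, a in zip(predicted, annotated))
        fp = sum(au in p and au not in a for p, a in zip(predicted, annotated))
        fn = sum(au not in p and au in a for p, a in zip(predicted, annotated))
        denom = 2 * tp + fp + fn
        scores[au] = round(2 * tp / denom, 3) if denom else 1.0  # 1.0 if AU never occurs
    return scores

# Three frames of tracker output vs human FACS coding (made-up data).
pred = [{"AU12", "AU6"}, {"AU26"}, {"AU12"}]
truth = [{"AU12"}, {"AU26", "AU12"}, {"AU12", "AU6"}]
print(per_au_f1(pred, truth, aus=["AU6", "AU12", "AU26"]))
```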

Graduate Research

Overview

With the Social Emotional Technology Lab at UCSC, I studied how multi-user VR environments can affect collaborative success in meeting use cases where users are represented by stylized avatars.

Key Methodologies

  • Experiments
  • Research-through-design
  • Landscape analysis
  • Ethnography, autoethnography
  • Secondary research
  • Wizard of Oz
  • Think-aloud protocol
  • Affinity diagramming
  • Observation

Research Area

My Master's thesis, titled 'Avatar Styles and Choices,' explored the intricacies of avatar customization in virtual environments and its impact on collaborative creativity and performance. This research delved into the motivations and preferences behind users' avatar design choices, using the virtual environment as a lens to understand these dynamics. Key questions addressed included:

  • The influence of different avatar styles on user interaction within Social VR contexts.

  • The significance of real-world identity in the selection of avatar designs, examining preferences for realistic versus fantastical avatars.

  • The role of avatar design in facilitating interpersonal connections and collaboration in VR settings.

  • The effect of varying avatar styles on group creativity and the dynamics of collaboration.

Through this investigation, the study aimed to shed light on how avatar design and environmental context shape user experiences, collaboration, and creative outcomes in virtual reality.

    → Check out my thesis, all 68 pages of scholarly delight, under the "Education" tab

Undergraduate Research

Overview

During my undergraduate studies, I had the privilege of contributing as a Research Assistant (RA) across three distinct research labs, each focusing on a different area of psychology and cognitive science: developmental psychology, computational and experimental cognition, and high-level perception.

Key Methodologies

  • Eyetracking
  • Interviews
  • Secondary research
  • Think-aloud protocol

Research Areas

Developmental Psychology

  • Research area: Exploring moral reasoning among college students.
  • Role: Conducted interviews on contentious scenarios, digitally coded and analyzed data, and played a key role in the development of the final publication.

Computational & Experimental Cognition

  • Research area: Psycholinguistics, specifically articulatory onsets in natural language production.
  • Role: Analyzed video and acoustic data to understand speech patterns and contributed to data processing and interpretation.

High-Level Perception

  • Research area: Intensive research on human perception within virtual reality, focusing on face perception and gaze biases.
  • Role: Led participants through experimental protocols, used MATLAB for data coding, engaged in bi-weekly team discussions, and contributed as an author to final publications.