Selected User Research Work

I specialize in quantitative methods but always employ a mixed-methods approach. I believe that combining quantitative and qualitative insights is the best way to understand user behavior at scale.


Why do users engage with a new ad format up to 25x the rate of traditional ads on Spotify Free?

Diary Studies, Interviewing, Logs Analysis, AB Testing, SQL, Python, Qualtrics

Problem: An initial test of this new ad format with a relatively small group of users was surprisingly successful. To grow the product, we needed to understand the root of that success.

Process: I conducted a diary study, interviews, and exploratory logs analysis to understand why users were engaging with a new ad format at substantially higher rates than traditional ads. I identified a key misalignment between qualitative findings and ad engagement rates in the log data; this misalignment suggested the presence of a previously undetected bug. I then worked with the team to run an A/B test and confirm the presence, cause, and scale of the bug. The results showed that the bug accounted for up to 90% of the events previously counted as user engagement.
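The core of an A/B test like this is a comparison of engagement rates between two arms. The sketch below is a minimal illustration of that comparison, not the team's actual analysis; the counts and arm definitions are hypothetical.

```python
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test comparing engagement rates between two test arms."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # Pooled rate under the null hypothesis that both arms engage equally
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical counts: engagement with the suspected bug present vs. patched
z, p = two_proportion_ztest(success_a=450, n_a=1000, success_b=50, n_b=1000)
```

If the patched arm's engagement rate collapses relative to the control arm and the difference is statistically significant, that is strong evidence the bug, not genuine user interest, was driving the logged engagement.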

Outcomes: My research led to a key change in how the new ads are created and to a revised projection of the new ad format's profitability.


What information do parents prioritize when making choices about genetic tests for their children?

Survey Methods, Experimental Design, Interviewing

Problem: Our team was tasked with designing a digital tool to support parents who need to make decisions about genetic testing for their child. We did not know what parents cared most about when it came to genetic testing, and we were not sure that parents would be able to articulate their preferences on a complex, and potentially novel, topic.

Process: I created a conjoint survey to study parents’ preferences about genetic testing; this method analyzes participant choices rather than self-reported attitudes. Then I conducted in-depth interviews to confirm that parents could make informed choices on the survey; this ensured that analysis of choices would reveal parents’ true preferences about genetic testing. I then analyzed parents’ choices to quantify what they prioritized most when choosing among genetic tests.
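The idea behind choice-based analysis can be illustrated with a toy example: tally how often each attribute level was chosen when it was shown. This is a simplified counting approach, not the study's actual conjoint model, and the attributes (accuracy, cost) are invented for illustration.

```python
from collections import defaultdict

# Each task shows two hypothetical test profiles; `pick` is the chosen index.
tasks = [
    ({"accuracy": "high", "cost": "high"}, {"accuracy": "low", "cost": "low"}, 0),
    ({"accuracy": "high", "cost": "low"}, {"accuracy": "low", "cost": "high"}, 0),
    ({"accuracy": "low", "cost": "low"}, {"accuracy": "high", "cost": "high"}, 1),
]

shown = defaultdict(int)
chosen = defaultdict(int)
for option_a, option_b, pick in tasks:
    for i, profile in enumerate((option_a, option_b)):
        for attr, level in profile.items():
            shown[(attr, level)] += 1   # level appeared in a task
            if i == pick:
                chosen[(attr, level)] += 1  # level was in the chosen profile

# Choice share per attribute level: how often it was picked when shown
shares = {key: chosen[key] / shown[key] for key in shown}
```

In this toy data, high accuracy is chosen every time it appears while cost matters less, which is the kind of revealed priority a conjoint analysis surfaces without asking participants to self-report.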

Outcomes: My research informed a digital tool to support decision-making about genetic testing. The findings led to the addition of a section educating users on disease risk, as well as a section educating users on the mathematical concepts necessary to understand relative risk.

See publication


How do we ensure that a quantitative metric is truly measuring the thing that we think it is measuring?

Usability Testing, Interviewing, Eye-tracking, Python

Problem: I identified flaws in the tutorial for a decision-making game that I had planned to use for my research. This was an important problem because we often assume, in cognitive research and in data science, that a click tells us something about the user’s motivation. However, if the user is confused about how an app or game works, then clicks could be random, and metrics could be misleading.

Process: I overhauled the game tutorial by introducing plain language and data visualizations that communicated the rules and complex dynamics of the game. I conducted usability tests to inform the functionality of the tutorial; I conducted interviews to measure the impact of the new language and visualizations on user understanding. I also tracked users’ eye movements while playing the game; eye-tracking can provide valuable insight into how the user makes choices.
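One common way eye-tracking data speaks to user understanding is dwell time: the share of fixation time spent inside an area of interest (AOI), such as the part of the screen showing decision-relevant information. The sketch below is a minimal illustration with invented fixation data, not the study's actual pipeline.

```python
# Hypothetical fixations: (x, y, duration_ms) on a game screen
fixations = [(120, 80, 300), (400, 300, 250), (130, 90, 450), (600, 500, 200)]

# Area of interest, e.g. the on-screen panel showing decision-relevant info
aoi = {"x_min": 100, "x_max": 200, "y_min": 50, "y_max": 150}

def dwell_proportion(fixations, aoi):
    """Share of total fixation time spent inside the area of interest."""
    total = sum(duration for _, _, duration in fixations)
    inside = sum(
        duration
        for x, y, duration in fixations
        if aoi["x_min"] <= x <= aoi["x_max"] and aoi["y_min"] <= y <= aoi["y_max"]
    )
    return inside / total
```

A high dwell proportion on decision-relevant regions is one signal that users are attending to the information they need, rather than clicking at random.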

Outcomes: Following the tutorial, 95% of end-users (n = 65) reported that they were confident in how to play the game, and their eye-tracking metrics suggested that they were motivated and able to make informed decisions during the game.