How Creators Publicly Request Feedback in an Open Online Community
TYPE
UX Research
DURATION
4 months
KEYWORDS
Online community
Feedback exchange
Collaborative and social computing
MY ROLE
Research Assistant
Advised by Prof. Steven Dow
& Regina Cheng
TECHNIQUE USED
Semi-structured Interview
Qualitative Coding
Semantic Analysis
Statistical Modeling
NLP
COLLABORATORS
Regina Cheng
MaySnow Liu
DELIVERABLE
Paper accepted to ACM CSCW 2020
OVERVIEW
Today, creative workers frequently post their work in online critique communities such as Reddit and Dribbble for feedback, especially workers with limited access to formal feedback resources. To support this process, previous research has focused on scaffolding feedback, for example by prompting feedback providers and by helping creators make sense of the feedback they receive. Our study instead aimed to understand and support feedback seekers in effectively requesting feedback in the first place.

Empirical Setting:
We focused on one particular online critique community, the r/design_critiques subreddit. In this community, feedback requests are posted as public forum messages, and other members respond with their feedback in the thread.
PROBLEM
How might we support feedback seekers to compose effective feedback requests in an online community?
PROGRESS
Study 1
"What Makes an Effective Feedback Request?"
01.
Semi-structured Interview
We conducted 12 semi-structured interviews focused on two questions:
  • What do creators usually include in a feedback request?
  • What info do creators look for when they write feedback for others?
Our Participants
12 active users in r/design_critiques community aged over 18
12.64 posts in their history on average (min = 8, max = 23)
4.04 years of experience in the community on average (min = 0.5, max = 7)
02.
Affinity Diagram
We found tensions between requesters' and providers' opinions on what makes a good feedback request.
Study 2
"How Do Different Request Strategies Affect Feedback Responses?"
01.
Qualitative Coding - Thematic Analysis
We first sampled 879 feedback requests from r/design_critiques and qualitatively coded the feedback request strategies indicated by the interview study. We iteratively developed a coding scheme describing the request strategies used in the community, based on themes emerging from Study 1. This iterative approach resulted in 7 feedback request strategies.
02.
Statistical Analysis - NLP
We then used computational methods to analyze the 3,652 feedback responses to these requests and calculated 5 measures:
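The study's five measures are not listed here. As an illustration of how such text-based measures can be computed, the sketch below derives some plausible (hypothetical, not the study's actual) measures from a feedback response using only standard-library tools:

```python
import re

# Hypothetical markers of justification; the actual measures and lexicons
# used in the study may differ.
JUSTIFICATION_MARKERS = {"because", "since", "therefore"}

def measure_response(text: str) -> dict:
    """Compute simple illustrative text measures for one feedback response."""
    words = re.findall(r"[a-zA-Z']+", text.lower())
    return {
        "length": len(words),                # overall feedback volume
        "justification": sum(w in JUSTIFICATION_MARKERS for w in words),
        "questions": text.count("?"),        # probing questions from the provider
    }

print(measure_response("I like the logo, but the font feels off because it clashes."))
```

In practice, each response would be scored this way and the per-request measures aggregated before modeling.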
03.
Regression Analysis
We looked for relationships between the request strategies and the feedback measures through regression analysis. This allowed us to map out how each request strategy makes a difference in the measures of the resulting feedback.
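The shape of this analysis can be sketched as an ordinary least squares regression of a feedback measure on binary strategy indicators. The data, strategy names, and coefficients below are entirely illustrative, not the study's results:

```python
import numpy as np

# Hypothetical toy data: each row is a request; columns are binary
# indicators for whether a given strategy was used.
strategies = ["self_critique", "design_variants", "novice_status"]
X = np.array([
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
    [1, 1, 0],
    [0, 0, 0],
    [1, 0, 1],
])
y = np.array([5.0, 4.0, 6.0, 7.0, 2.0, 8.0])  # e.g., one feedback measure

# Add an intercept column and fit ordinary least squares.
X1 = np.hstack([np.ones((len(X), 1)), X])
coefs, *_ = np.linalg.lstsq(X1, y, rcond=None)

# Each coefficient estimates how using that strategy shifts the measure.
for name, b in zip(["intercept"] + strategies, coefs):
    print(f"{name}: {b:+.2f}")
```

A real analysis would also report significance and control for confounds (e.g., request length), but the mapping from strategy indicators to per-strategy effects is the same.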
FINDING 1
In general, presenting self-critique, design variants, and novice status in a feedback request leads to better feedback.
FINDING 2
The most successful strategies are rarely used.
FINDING 3
Calling out one's novice status in the request is correlated with better-justified feedback.
DISCUSSION & IMPLICATIONS
These findings lead to design implications for future online feedback systems: systems should help creators effectively frame their requests. Specifically, online feedback systems could help feedback seekers frame requests such that: