Question 1 (Designing for Self-Disclosure): You and three of your friends have been contracted by a large social computing company, say Facebook, to build a capability, e.g., an app for their platform, that would allow self-disclosure. The purpose of this self-disclosure app is to enable survivors of sexual abuse to find support and to talk about stigmatizing feelings they are unable to share otherwise or elsewhere, often offline.
a) Include some sketches and low-fidelity prototypes to illustrate the design of this app (hand-drawn sketches are fine as long as they are legible). What features have you particularly included in your design, and why? How will these features work in your design? What features have you made sure it does not have? You may draw inspiration from existing platforms and the papers you have read in the class thus far.
b) Propose a study design, of any type, through which you will assess whether the above app, if incorporated, will be successful in achieving its goal – supporting users’ self-disclosure needs. Provide your rationale behind this study design, the challenges in executing the study, and possible ways to mitigate these challenges. Assume you have unlimited time and resources to conduct this study; however, you are external to Facebook, i.e., you do not have access to internal policies, curation practices, or server data.
Question 2 (Moderation and Regulation of Behaviors): Please read the chapter titled “Regulating Behavior in Online Communities” by Sara Kiesler, Robert Kraut, Paul Resnick, and Aniket Kittur [1]. In the chapter, Kiesler et al. provide an elaborate list of design choices that could be utilized to regulate or curb non-normative or bad behavior in online communities. The table on pp. 36–37 summarizes these design considerations, categorized into six groups: 1) Selection, sorting, highlighting; 2) Community structure; 3) Feedback and rewards; 4) Access controls; 5) Roles, rules, policies, and procedures; and 6) Presentation and framing. Building on our readings in the class so far, answer the following questions, focusing on scenarios where non-normative behavior is present and how Kiesler et al.’s design choices may help regulate it:
a) Case #1: The goal is to limit hate speech, or pro-harassment attitudes and behaviors (e.g., around gender identity, sexual orientation, or race/ethnicity), targeted at live streamers on Twitch. Discuss which of the six design choice categories will be the most appropriate here. Provide your rationale behind your choice. How does it contrast with relatively naïve regulation/moderation strategies like removals or bans? Describe how robust your chosen design choice category is – is it easy to be “gamed”; does it need considerable hands-on involvement from moderators; could it hinder community growth and participation?
b) Case #2: The goal is to limit misinformation and low-credibility information shared in Reddit communities around behaviors and strategies that can be damaging to health and well-being. Discuss which of the six design choice categories will be the most appropriate here. Provide your rationale behind your choice. How does it contrast with relatively naïve regulation/moderation strategies like removals or bans? Describe how robust your chosen design choice category is – is it easy to be “gamed”; does it need considerable hands-on involvement from moderators; could it hinder community growth and participation?
c) Comparison of Design Choices: Were the design choices you suggested for cases #1 and #2 largely similar or dissimilar? In either case, describe the reasons driving these similarities or dissimilarities. Comment on the opportunities and challenges of adopting similar or dissimilar designs when regulating different types of non-normative or bad behavior.