The "Instagram Support Experiences" project aimed to improve support mechanisms for Instagram's community of more than 2 billion users, with a particular focus on objectionable content that violates community guidelines and degrades the user experience. The design challenge was multi-faceted: reduce the negative impact of such content and improve a reporting process hampered by policy confusion, perceptions of neglect, and the urgency of high-severity issues. A key opportunity was to lean on automation, specifically an automated ranking system that would maximize the value of limited human reviews. The project's scope was extensive, aligning with Meta's broader strategy for its family of apps. Over roughly six months, Instagram's Well-Being team collaborated with Facebook's Integrity team, among others, to unify support experiences across platforms while leveraging artificial intelligence for content moderation. The goals were to improve the user experience and refine the machine learning models used for content detection, balancing user needs with business imperatives.
The design process for "Instagram Support Experiences" followed an agile development framework, using Sketch/Figma for design and Principle for prototyping. The overarching vision was broken down into smaller, quarterly experiments, each designed, built, and tested to gauge success before wider rollout. A central decision was the hypothesis of moving toward an automation-first world, flipping the traditional content moderation model: rather than routing every report to a person, automated systems would handle the bulk of cases, while an automated ranking system directed human reviewers to the reports where their judgment mattered most. This vision was crafted under resource constraints in content strategy, user research, and data science, mitigated by drawing on cross-company relationships and existing research. That collaborative effort was pivotal in shaping a comprehensive, effective design strategy.
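The ranking idea above can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the `Report` shape, the `SEVERITY_WEIGHT` table, and the score-times-severity heuristic are hypothetical stand-ins, not Meta's actual system, which is not public. The point is only the mechanic: with limited reviewer capacity, order reports by expected impact so human attention goes to the highest-severity, highest-confidence cases first.

```python
from dataclasses import dataclass

@dataclass
class Report:
    report_id: str
    category: str       # e.g. "self_harm", "spam" (hypothetical taxonomy)
    model_score: float  # classifier's predicted probability of a violation

# Hypothetical policy-severity weights; real tiers are not public.
SEVERITY_WEIGHT = {"self_harm": 3.0, "harassment": 2.0, "spam": 1.0}

def rank_for_human_review(reports, capacity):
    """Order reports so limited human reviewers see the highest-impact
    items first: model confidence scaled by policy severity."""
    ranked = sorted(
        reports,
        key=lambda r: r.model_score * SEVERITY_WEIGHT.get(r.category, 1.0),
        reverse=True,
    )
    return ranked[:capacity]

queue = rank_for_human_review(
    [
        Report("r1", "spam", 0.95),
        Report("r2", "self_harm", 0.60),
        Report("r3", "harassment", 0.40),
    ],
    capacity=2,
)
# A lower-confidence but high-severity report (0.60 * 3.0 = 1.8)
# outranks a high-confidence spam report (0.95 * 1.0 = 0.95).
```

Under this kind of scheme, automation clears the long tail while the review queue concentrates scarce human judgment on high-severity issues, which is the "flip" of the traditional model the project hypothesized.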
The final design solution for "Instagram Support Experiences" focused on mitigating the risks of automated content moderation, guided by principles of supportiveness, legitimacy, and trustworthiness. It improved transparency, fairness, and user empowerment in the reporting flow, and integrated the automated ranking system to make the best use of scarce human review resources. Preliminary results before the project's conclusion were promising: a 43% increase in users opting into report outcome notifications, a 150% increase in the use of self-remediation actions, and a 7% decrease in user-generated report volume, pointing to a more efficient, user-centric reporting system. These results underline the project's success in balancing user experience improvements with business needs, while also highlighting the ongoing challenge of refining integrity metrics at scale and the continuous need to rebalance user and business needs on a platform of this size.