The recent decision by Meta’s independent Oversight Board to overturn the removal of an Instagram carousel honoring older lesbian couples has reignited debate over how platforms handle queer expression. The carousel, written in Brazilian Portuguese, included photographs and commentary about generations of women whose partnerships were dismissed as friendships. One slide used the phrase “Toda sapatão é uma potência” (roughly, “Every dyke is a powerhouse”), with sapatão a traditionally pejorative word that some lesbian communities have reclaimed as a term of pride. Meta originally removed the post under its Hateful Conduct policy, but on appeal the Oversight Board concluded the removal was an error and ordered it reversed.
What the Oversight Board found
The Board’s ruling focused on two principal problems: misapplication of the policy’s exceptions for self-referential use of slurs and flawed review of multi-image posts. Reviewers at Meta assessed a single image within the carousel rather than the post’s overall message, and flagged the content as hate speech despite context that framed the language in an empowering way. The Oversight Board emphasized that Meta’s own standards allow community members to use historically derogatory terms in a self-referential, positive sense, and that the company’s enforcement in this case did not reflect that nuance. The Board noted that Meta had reinstated the carousel after the case was selected for review, but it still issued a formal rebuke to push for systemic change.
Why context and format matter
The case exposed technical and human moderation challenges that often misfire on LGBTQ+ content. Automated systems and rushed human reviews can strip away the narrative that surrounding images and captions supply. The Oversight Board highlighted the specific problem of the carousel format, where isolated frames may be evaluated out of sequence or without accompanying text. When moderation focuses on a single frame, it risks interpreting community-driven language as abusive rather than expressive. Advocates say this dynamic disproportionately impacts queer users discussing identity, sexuality, and words historically weaponized against marginalized groups.
Reclaimed language and moderation rules
At the heart of the dispute was the use of “sapatão”, a term with a fraught history in Brazil that has been reclaimed in many LGBTQ+ spaces. The Oversight Board made clear that exceptions in the Hateful Conduct policy exist for exactly these situations—when a community member uses a term about themselves or within an affirming context. The Board criticized Meta for failing to apply that exception here, underscoring the need for moderators and algorithms to be trained on community-specific language patterns and cultural context, rather than relying solely on keyword matches.
Carousel-specific shortcomings
Carousels combine images, captions, and ordering to produce meaning, and the ruling highlights that moderation workflows must consider that interplay. The Board pointed out that the image Meta ultimately removed did not even contain the flagged word, yet the whole post suffered because reviewers had not evaluated the complete carousel. This technical blind spot—where platforms treat each slide as an isolated item—can lead to wrongful takedowns of celebratory or educational queer content and contributes to a broader pattern of suppression on social networks.
Reactions and policy context
Advocacy groups responded swiftly. GLAAD described the episode as emblematic of a recurring pattern in which LGBTQ+ voices are wrongly suppressed on Meta’s platforms and called for better reviewer training and recognition of queer self-expression. The dispute arrives against a backdrop of wider scrutiny: in January 2025 Meta adjusted parts of its hate speech rules to allow more debate around “transgenderism and homosexuality,” drawing criticism from researchers and activists. The Oversight Board had earlier recommended removing the term “transgenderism” from policy guidance, arguing it frames identity as an ideology rather than an intrinsic characteristic. Meta said in a March compliance report that it was still assessing that recommendation, a process the Board and advocates continue to watch closely. Separately, the Board has faced controversy for permitting some anti-trans material to remain online; in an April 2025 ruling it allowed content misgendering a transgender woman and a transgender girl to stay up on Facebook and Instagram.
What this means for Meta and users
The Oversight Board’s finding is both a rebuke and a roadmap: platforms must refine moderation tools to distinguish harassment from community-specific reclamation, and they must redesign workflows so multi-part posts are evaluated holistically. Meta acknowledged the Board’s decision, said it had restored the carousel, and recognized that the phrase was used in a positive, self-referential way. The Board itself, an independent body funded through a Meta-created trust, issues binding decisions in individual cases and nonbinding policy recommendations. For LGBTQ+ users and advocates, the ruling is a reminder that progress in content moderation depends on sustained improvements in training, algorithmic context-sensitivity, and clearer policy language to prevent future errors.

