Moderators Claim Nextdoor Has Failed To Address QAnon Concerns
The neighborhood-based social network has grappled with the proliferation of conspiracy theories and falsehoods surrounding the 2020 U.S. presidential election, moderators claim.
Nextdoor, the neighborhood-based social network, is grappling with conspiracy theories and falsehoods surrounding the 2020 U.S. presidential election, content that has lately gained more visibility on Twitter and Facebook.
Moderators of the social network have struggled to deal with QAnon content floating around the site, according to screenshots from the platform obtained by The Verge. Now, in the wake of last week’s insurrection at the Capitol, Nextdoor moderators wonder when they can expect concrete action from the company to crack down on content promoting conspiracies.
Since at least October, Nextdoor moderators have asked the social network to explicitly ban QAnon content. The company’s policies currently ban only “violent content” and “misinformation related to the Election and COVID-19”; they do not explicitly address QAnon conspiracy theories, nor the discussion of conspiracy theories more generally. Last week, moderators began pressing the company more aggressively on its policies in the National Leads Forum, a private forum for the company’s moderators.
In the days following the Capitol riot, one Nextdoor user returned to an old QAnon thread to reiterate, “I am bumping this up. It’s January 8th. Any policies yet? After the past week, we need some. I also wrote an email to Next Door Leadership about this three months ago and got no response.”
A few days later, Caty K., Nextdoor’s head of community, replied after directing moderators to the company’s policy on violent content: “I want to reiterate that the broader Nextdoor team is committed to the safety of all members and communities on the platform … The violent events that took place at the US Capitol last week are no exception.”
Although QAnon theories spread on the network may imply violence, some moderators say such posts can be difficult to remove under Nextdoor’s moderation policy unless they express violence more explicitly.
“The problem is this policy is written so specific to election and Covid-19 information and does not mention any violation that can be used for things like misinformation around politics and inciting fear in the community,” one moderator wrote.
Caty K. clarified in another thread posted on Monday that “Nextdoor views QAnon as a hate group. If you see content or members advocating these ideologies, please report them to our team and we will handle. I recognize we do not have a list of groups available for you all to reference, and I will work on that to make things clearer, but for now this comment serves the purpose of confirming that QAnon content should be removed.”
Although Caty K.’s post declares QAnon a hate group in one specific thread, the company does not state Nextdoor’s stance on the conspiracy group in its public policies, where users could easily access that information.
Nextdoor did, however, tell Inman that it has communicated its classification of QAnon as a hate group to its “Leads,” community members with additional privileges such as growing a neighborhood and keeping its information up to date, voting to remove content they believe violates Nextdoor guidelines, verifying members of the community and adjusting a neighborhood’s boundaries.
In response to last week’s storming of the Capitol, the company published separate blog posts on how to report and remove content that violates Nextdoor’s guidelines and on how community guidelines are enforced within groups. The second post, published on Wednesday, specifies that posts within groups that “condone the violent acts in Washington” will result in the group’s removal from the platform.
Nextdoor’s moderation system is somewhat laissez-faire: neighborhoods are largely self-governed by unpaid “community leads” who are responsible for reporting and removing content within their communities. As a result, the company has faced criticism in the past both for removing content in error and for leaving up content that should have been taken down. For instance, content supporting the Black Lives Matter movement posted this past June was removed from the platform in error, The Verge reported.
Back in October, Recode also reported on QAnon content permeating the social network in the final weeks leading up to the presidential election, highlighting moderators’ struggles to tamp down on such content.
Another aspect of Nextdoor’s policy makes content harder to moderate: because national politics may not be discussed in the network’s main feed, those discussions take place in public or private groups, where they may be more likely to slip through the cracks.
“My concern is QAnon content, as well as other content with conspiracy theories, promotions of violence, etc., that is in *private* groups that won’t get reported because the members of the group WANT that content,” Carol C., a moderator in Colorado, wrote in one forum last week. “I saw some of this type of content in the public political groups that have since gone private.”
“Any post or content on Nextdoor that organizes or calls for violence will be immediately taken down,” a Nextdoor spokesperson told Inman in an emailed statement. “To be clear, we define ‘advocacy’ broadly to include language that glorifies or demonstrates support for violence or those who commit violence, and we encourage our members to report any accounts that they see behave in this way. Additionally, we will remove any posts that violate our election misinformation and be helpful not hurtful guidelines. This includes false or misleading claims about the results of the election, content that advocates interference with the election process, or calls for actions to prevent a peaceful transfer of power or orderly succession. Nextdoor’s Neighborhood Operations Team also uses a combination of technology and member reports to proactively identify and remove content that violates these rules.”
As President-elect Joe Biden’s inauguration draws nearer, some have become concerned that more trouble at the Capitol may be in store. Airbnb announced on Wednesday that it would block and cancel all reservations in Washington, D.C. the week of the inauguration due to an increased risk of violence and rioting.