Facebook just added more rules that can get you banned


Facebook is taking a new approach to handling harmful activity on its platform. The tech giant is giving moderators the ability to take a range of actions beyond simply banning troll accounts, as the social-media platform aims to crack down on misinformation and harassment.

Reuters first reported on the new policy on Friday, noting that the approach to harmful content “uses the tactics usually taken by Facebook’s security teams for wholesale shutdowns of networks engaged in influence operations that use false accounts to manipulate public debate, such as Russian troll farms.”

However, the tech giant is expanding that approach to also target “groups of coordinated real accounts that systemically break its rules, through mass reporting, where many users falsely report a target’s content or account to get it shut down, or brigading, a type of online harassment where users might coordinate to target an individual through mass posts or comments.”

Facebook announced the news in a blog post, explaining that it has created a new rule category called “coordinated social harm.” One of the examples cited is the German anti-vaccination movement “Querdenken,” whose members “used authentic and duplicate accounts to post and amplify violating content,” including health misinformation.

“While we aren’t banning all Querdenken content, we’re continuing to monitor the situation and will take action if we find additional violations to prevent abuse on our platform and protect people using our services,” Facebook clarified.