Facebook Bans Violent Extremist "Boogaloo" Groups, Pages, and Posts -- But Why Did It Take So Long?



According to The New Paper, a free email newsletter offering a concise round-up of the day's major headlines, Facebook has taken action against the "boogaloo" movement's use of the platform to spread misinformation, incite violence, and stoke hate:


"Facebook banned a violent anti-government network linked to the “boogaloo” movement yesterday, noting that the network (which included over 1,000 accounts, pages, and groups across Facebook’s platforms) actively sought to commit violence. The announcement comes as Facebook faces increased scrutiny over its content moderation practices (including a growing advertiser boycott)."


You can read Facebook's full announcement about the move here.


What is the Boogaloo Movement?


The Boogaloo movement is a loose group of anti-government extremists who believe a second American civil war is imminent and, in fact, want to cause violence that precipitates said civil war so that society can be remade from the ashes. They see the Covid-19 pandemic and the widespread protests of racial injustice as both evidence of a coming conflict and an opportunity to bring it about.


The Boogaloo movement doesn't have a central organization or leadership. It appears to be more of a banner with which various subgroups and individuals identify. Many of them espouse or commit violence. One man who identified with the Boogaloo movement ambushed and killed multiple law enforcement officers, using the protests over George Floyd's death as cover.


Facebook has hosted thousands of pages, groups, and posts associated with the Boogaloo movement, and much of that content aims to incite armed violence.


Why Did Facebook Wait So Long to Take Down Boogaloo?


If groups, pages, and individuals that identify with the Boogaloo movement were posting material that incites violence, why did it take so long for Facebook to remove the hateful rhetoric that violates its own policies against such speech?


Facebook says that there's a lot of stuff on Facebook, and it takes time for its content moderators and algorithms to locate and remove material that violates its policies.


I say, if Facebook can present each user with a customized newsfeed, then it can probably design an algorithm that pushes posts promoting violence and hatred so far down people's feeds that few, if any, ever see them. Or better yet, Facebook could block those posts outright with an algorithm that identifies keywords and images that clearly indicate violence.
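To make the idea concrete, here is a minimal sketch of the kind of keyword-based filter I have in mind. This is a toy illustration, not Facebook's actual system: the phrase list, thresholds, and function names are invented for the example, and a real platform would rely on machine-learning classifiers over text and images rather than simple substring matching.

```python
# Toy sketch of keyword-based moderation (illustrative only, not Facebook's system).
# The flagged phrases and thresholds below are invented for this example.

FLAGGED_PHRASES = {"boogaloo", "civil war 2", "ignite the boog"}  # hypothetical examples


def moderation_score(post_text: str) -> int:
    """Count how many flagged phrases appear in a post."""
    text = post_text.lower()
    return sum(1 for phrase in FLAGGED_PHRASES if phrase in text)


def route_post(post_text: str) -> str:
    """Decide whether a post is blocked, down-ranked, or shown normally."""
    score = moderation_score(post_text)
    if score >= 2:
        return "block"      # clearly violating content never reaches a feed
    if score == 1:
        return "down-rank"  # pushed far down the feed pending human review
    return "show"


if __name__ == "__main__":
    print(route_post("Getting ready for the boogaloo, civil war 2 is coming"))  # block
    print(route_post("Just sharing vacation photos"))                           # show
```

Even this crude approach shows the point: scoring happens before a post is ranked into anyone's feed, so violating content can be demoted or blocked automatically instead of waiting for moderators to find it after the fact.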


So why does Facebook continue taking down individual pages, groups, or posts in a labor-intensive, time-consuming, perpetually-too-late, piecemeal fashion?


Until recently, it's been more profitable to make excuses for letting most policy-violating content slip through the cracks and into people's newsfeeds.



Facebook's History of Tolerating Hate and Inciting Violence


This isn't the first time that Facebook has tolerated hateful rhetoric and speech that incited violence. Facebook has always prioritized profits.


Facebook itself admits that its platform was used to stoke genocide against the Rohingya Muslims in Myanmar.


Rumors spread on Facebook have been directly linked to violence against ethnic minorities in Sri Lanka and India.


Rodrigo Duterte, the authoritarian leader of the Philippines, has used Facebook to spread rumors of crimes by his political opponents as a pretext for arresting them.


In Libya, rival factions have used Facebook to wage information wars that fuel their real-world fighting.


And can we even measure the effects of misinformation posted to Facebook about the coronavirus, vaccines, and other health topics?


Rhetoric has consequences.



The Advertiser Boycott of Facebook Is Working


Recently, hundreds of advertisers have begun boycotting Facebook for its failure to moderate content that incites violence and stokes hatred.


Giant conglomerates like Unilever, Verizon, Coca-Cola, Ford, Microsoft, and many more have publicly committed to halting advertising on Facebook during July. And Facebook has noticed.


Even a company as big as Facebook can't ignore the loss of so many major advertisers. Facebook gets roughly 99% of its revenue from ads, so it depends almost entirely on the companies and individuals who buy them.


It remains to be seen whether these companies will continue their boycotts long enough to motivate real, lasting change at Facebook, but it is encouraging that Facebook already seems to be paying attention to the overwhelming concern about the hateful rhetoric on its platform.


It's not a coincidence that Facebook finally took strong action against the Boogaloo movement after advertisers began voting with their dollars.


Perhaps advertisers can push Facebook to finally take strong action against the ocean of misinformation leading users astray.


Eric Sentell teaches writing and rhetoric. He is the author of How to Write an Essay like an Equation and Become Your Own Fact-Checker.
Learn more about his work, sign up for a newsletter, and get free excerpts at www.EricSentell.com.


Contact Eric Sentell:

jamesericsentell@gmail.com

© 2020 by Eric Sentell.