Upset over the number of unvaccinated COVID-19 patients in his hospital, a French doctor logged on to Facebook and posted a video urging people to get vaccinated. He soon received a flurry of hateful messages, first dozens, then hundreds, then more than 1,000, from an anti-vaccine extremist group known as V_V. The group, active in France and Italy, has harassed doctors and public health officials, ransacked government offices and tried to disrupt vaccine clinics.
Concerned over the abuse of its platform, Facebook shut down several accounts linked to the group last December. But that didn’t stop V_V, which continues to use Facebook and other platforms and, like many anti-vaccine groups around the world, has expanded its portfolio to include climate change denialism and anti-democratic messaging.
“Let’s go and get them at home, they don’t need to sleep anymore,” reads one post from the group. “Fight with us!” reads another.
The largely unchecked nature of the attacks on the well-documented health benefits of vaccines highlights the limits of social media companies’ efforts, even aggressive ones, to thwart the most destructive types of disinformation.
Researchers at Reset, a US-based non-profit, identified more than 15,000 abusive or misinformation-filled Facebook posts from V_V — activity that peaked in spring 2022, months after the platform announced its actions against the organization. In a report on V_V’s activities, Reset researchers concluded that its continued presence on Facebook “raises questions about the effectiveness and sustainability of Meta’s self-reported interventions.”
Facebook’s parent company Meta said in response that its 2021 action was never intended to eliminate all V_V content, but rather to remove accounts that engaged in coordinated harassment. After the Associated Press notified Facebook of the group’s continuing activities on its platform, the company said it removed an additional 100 accounts this week.
Meta said it is trying to strike a balance between removing content from groups like V_V that clearly violates rules against harassment or dangerous misinformation and not silencing innocent users. That balance can be especially difficult to find when it comes to the contentious issue of vaccines.
A Meta spokesperson told the AP, “This is a highly hostile space and our efforts are ongoing: Since our initial removal, we have taken a number of actions against this network’s attempts to come back.”
V_V is also active on Twitter, where Reset researchers found hundreds of accounts and thousands of posts from the group. Several of the accounts were created shortly after Facebook took action against the group last winter, Reset found.
In response to Reset’s report, Twitter said it took enforcement action against several accounts linked to V_V, but did not provide details of those actions.
V_V has proven particularly resilient to efforts to stop it. Named for the film V for Vendetta, in which a lone, masked man seeks revenge on an authoritarian government, the group uses fake accounts to evade detection and often coordinates its messaging and activities on platforms such as Telegram, which lacks Facebook’s more aggressive moderation policies.
According to Jack Stubbs, a researcher at Graphika, the data analysis firm that has tracked V_V’s movements, that adaptability is one reason the group has been hard to stop.
“They understand how the Internet works,” Stubbs said.
Graphika estimated the group’s membership to be 20,000 at the end of 2021, with a small core of members involved in its online harassment efforts. In addition to Italy and France, Graphika’s team found evidence that V_V is trying to form chapters in Spain, the United Kingdom, Ireland, Brazil and Germany, where there is a similar anti-government movement known as Querdenken.
Groups and movements such as V_V and Querdenken have increasingly worried law enforcement and extremism researchers, who say there is evidence that far-right groups are exploiting skepticism about COVID-19 vaccines to expand their reach.
Increasingly, such groups are moving from online harassment to real-world action.
For example, in April, V_V used Telegram to announce plans to offer a 10,000 euro reward to vandals who paint the group’s emblem — two red V’s in a circle — on public buildings or vaccine clinics. The group then used Telegram to circulate pictures of the vandalism.
A month before Facebook took action against V_V, Italian police raided the homes of 17 anti-vaccine activists accused of using Telegram to make threats against government, medical and media figures over their support of COVID-19 restrictions.
Social media companies have struggled to respond to a wave of misinformation about vaccines since the start of the COVID-19 pandemic. Earlier this week, Facebook and Instagram suspended Children’s Health Defense, an influential anti-vaccine organization led by Robert F. Kennedy Jr.
One reason is the tricky balancing act between moderating harmful content and protecting free expression, according to Joshua Tucker, who co-directs New York University’s Center for Social Media and Politics and is a senior advisor at Kroll, a tech, government and financial consulting firm.
Striking the right balance is especially important as social media has emerged as a major source of news and information around the world. Leave up too much bad content and users may be misinformed; take down too much and users will start to distrust the platform.
“It is dangerous for us as a society to move in a direction in which no one feels they can trust information,” Tucker said.