YouTube and Facebook on Thursday pledged to take additional steps to remove violent content from their platforms as part of efforts to combat online extremism. The Alphabet-owned streaming service also said it would educate young users to recognize misinformation and manipulation tactics. Microsoft said it would provide schools and small organizations with a cheaper version of its tools for detecting and preventing violence, according to a report. Internet companies have faced government scrutiny since the January 6, 2021, attack on the US Capitol.
At a White House summit on combating hate-fueled violence, YouTube said it would remove content that promotes or glorifies acts of violence from its video streaming platform, even if the uploader is not a member of an extremist organization, according to a report by Reuters.
Meanwhile, the report said YouTube also announced that it will launch an effort to teach young users how to spot misinformation and manipulated content online.
Meta-owned Facebook has partnered with researchers at the Middlebury Institute of International Studies’ Center on Terrorism, Extremism, and Counterterrorism, while Microsoft will reportedly provide schools and small organizations with cheaper versions of the company’s AI and machine-learning tools for detecting and preventing violence.
Earlier this month, it was reported that Parler, a social media application popular among conservatives in the US, had returned to the Google Play Store a year after being removed for allegedly failing to moderate violent content. On January 6, 2021, supporters of Donald Trump attacked the US Capitol.