A video showing a decapitation was uploaded on YouTube and remained online for several hours, leading to inquiries as to why it was not removed earlier.

A disturbing video posted by a Pennsylvania man charged with decapitating his father spread widely on YouTube, underscoring once again the limits of social media companies' ability to stop violent content from circulating online.

On Wednesday, authorities announced that Justin Mohn, 32, had been arrested and charged with first-degree murder and desecration of a body. He had allegedly decapitated his father, Michael, in their home in Bucks County and shared the gruesome act in a 14-minute YouTube video that was accessible to anyone.

The incident was reminiscent of the beheading videos the Islamic State circulated online nearly a decade ago. It also came as the CEOs of Meta, TikTok and other social media companies testified before federal lawmakers frustrated by what they see as a lack of progress on child safety online. Google-owned YouTube, one of the most popular platforms among teenagers, did not take part in the hearing.

The disturbing video from Pennsylvania follows other horrific clips broadcast on social media in recent years, including domestic mass shootings livestreamed from Louisville, Kentucky; Memphis, Tennessee; and Buffalo, New York, as well as massacres filmed abroad in Christchurch, New Zealand, and the German city of Halle.

According to Middletown Township Police Capt. Pete Feeney, the video was posted around 10 p.m. Tuesday and remained online for roughly five hours. That lag raises questions about how effectively social media platforms moderate violent content, particularly amid the wars in Gaza and Ukraine and a heated U.S. presidential election.

Alix Fraser, director of the Council for Responsible Social Media at the advocacy organization Issue One, called the episode yet another instance of companies failing to protect users, and said she doubted their ability to fairly assess their own performance.

A YouTube spokesperson said the video was removed and Mohn's channel terminated, and that the company is tracking and removing any re-uploads. YouTube uses a combination of artificial intelligence and human moderators to police its platform, but it did not say how the video was detected or why it was not taken down sooner.

Large social media companies rely on increasingly sophisticated automated systems that can flag and remove much prohibited material before it ever reaches a human moderator. But those systems can falter when a video is new, or unusually violent and graphic, said Brian Fishman, co-founder of the trust and safety technology startup Cinder.

That is when human moderators become critical, he said. AI is improving, but it is not there yet.

Around 12:40 a.m. EST on Wednesday, the Global Internet Forum to Counter Terrorism (GIFCT) alerted its members to the video in an effort to limit its spread. The group, founded by tech companies, lets the platform hosting the original video submit a "hash," or digital fingerprint, of the footage; roughly 20 other member companies are then notified and can block the video from their own platforms.
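For readers curious how hash-based matching works in principle, the sketch below is a minimal, hypothetical illustration only. It is not GIFCT's actual system (the consortium's tools and APIs are not described in this article), and real deployments typically use perceptual hashes that still match re-encoded or lightly edited copies, whereas a plain cryptographic digest only catches byte-identical files. The function and variable names here are invented for the example.

```python
import hashlib
from pathlib import Path

# Hypothetical illustration of hash sharing: compute a digital fingerprint
# of a file, add it to a shared list, and check new uploads against it.
# Real hash-sharing databases generally use perceptual hashing so altered
# copies still match; SHA-256 is used here only to keep the sketch simple.

def fingerprint(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return a SHA-256 hex digest of the file at `path`."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# A shared set of fingerprints distributed to member platforms (hypothetical).
shared_blocklist: set[str] = set()

def report_video(path: Path) -> str:
    """Originating platform adds a video's fingerprint to the shared list."""
    h = fingerprint(path)
    shared_blocklist.add(h)
    return h

def should_block(path: Path) -> bool:
    """Receiving platform checks an upload against the shared list."""
    return fingerprint(path) in shared_blocklist
```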

By Wednesday morning, however, the footage had already spread to X, where a graphic clip of Mohn holding his father's head remained visible for at least seven hours and drew 20,000 views. The company, formerly known as Twitter, did not respond to a request for comment.

Specialists in radicalization say the spread of social media and the internet has made it easier for people to find their way into extremist groups and ideologies, giving those inclined toward violence a place to connect with like-minded individuals and have their views validated.

In the footage posted after the killing, Mohn described his father as a 20-year federal employee, espoused various conspiracy theories and railed against the government.

Most major social media platforms have rules against violent and extremist material. But those rules are not foolproof, and the rise of newer, more loosely moderated sites has given hateful ideas room to spread unchecked, said Michael Jensen, a senior researcher at the University of Maryland-based Consortium for the Study of Terrorism and Responses to Terrorism, known as START.

Social media companies need to step up their monitoring and removal of violent content, however difficult that may be, said Jacob Ware, a research fellow at the Council on Foreign Relations.

Social media has become a primary battleground for extremism and terrorism, Ware said, and countering it will demand dedicated, determined effort.

Nora Benavidez, an attorney with the media advocacy organization Free Press, said she wants to see reforms from tech companies, including greater transparency about which employees are affected by layoffs and deeper investment in trust and safety workers.

Google, the parent company of YouTube, recently laid off staff on its hardware, voice assistant and engineering teams. In early 2023, the company said it was eliminating 12,000 jobs across a range of roles, divisions and locations under the Alphabet umbrella, without giving further specifics.

___

Associated Press journalists Beatrice Dupuy and Mike Balsamo in New York and Mike Catalini in Levittown, Pennsylvania, contributed to this report.

___

The Associated Press receives support from several private foundations to enhance its explanatory reporting on elections and democracy. The AP is solely responsible for all content.
