Meta recently shut down thousands of fraudulent Facebook accounts that were designed to manipulate voters and create division in anticipation of the 2024 election.

According to Meta, someone in China created thousands of fake social media profiles designed to appear to be of American origin. These accounts were then used to spread divisive political content, apparently with the aim of deepening divisions in the United States ahead of next year's elections.

The tech company, which owns Facebook and Instagram, detected and removed a network of about 4,800 fake accounts that were trying to gain followers. These accounts used fake images, names, and locations to mimic the profiles of regular American Facebook users participating in political discussions.

Rather than spreading fake content, as other influence networks have done, these accounts reshared posts on X, formerly known as Twitter, that were originally created by politicians, news organizations, and other sources. The accounts shared content from both liberal and conservative perspectives, suggesting that their goal was not to favor one side, but rather to exacerbate political divides and intensify polarization.

The recently discovered network reveals how America’s enemies from other countries use technology platforms based in the U.S. to spread conflict and mistrust. It also highlights the significant dangers of online misinformation as national elections take place next year in the U.S., India, Mexico, Ukraine, Pakistan, Taiwan, and other countries.

Ben Nimmo, who leads investigations into inauthentic behavior on Meta's platforms, said these networks still struggle to attract viewers. They serve, however, as a cautionary reminder that foreign malicious actors are actively trying to reach users online ahead of upcoming elections, and that vigilance remains essential.

Meta Platforms Inc., based in Menlo Park, California, did not publicly link the Chinese network to the Chinese government, but it did determine the network originated in that country. The content spread by the accounts broadly complements other Chinese government propaganda and disinformation that has sought to inflate partisan and ideological divisions within the U.S.

To appear like regular Facebook profiles, the accounts would occasionally share content about fashion or animals. Recently, several of these profiles abruptly changed their usernames and profile pictures to ones suggesting they were located in India, then began sharing pro-Chinese posts about Tibet and India, demonstrating how fabricated networks can pivot to new subjects.

Meta frequently cites its actions against fabricated online communities as proof of its dedication to safeguarding the integrity of elections and democracy. However, detractors argue that the platform’s emphasis on counterfeit profiles diverts attention from its failure to address its role in allowing misinformation to spread on its platform, which has further fueled division and skepticism.

For instance, Meta accepts paid advertisements on its platform claiming the 2020 U.S. election was rigged or stolen, amplifying the lies of former President Donald Trump and other Republicans whose claims about election irregularities have been repeatedly debunked. Federal and state election officials and Trump's own attorney general have said there is no credible evidence that the presidential election, which Trump lost to Democrat Joe Biden, was tainted.

The company has stated that its ad policy is centered on upcoming elections rather than past ones. Any advertisements that spread baseless doubt about future elections will be rejected.

And while Meta has announced a new artificial intelligence policy that will require political ads to bear a disclaimer if they contain AI-generated content, the company has allowed other altered videos that were created using more conventional programs to remain on its platform, including a digitally edited video of Biden that claims he is a pedophile.

Zamaan Qureshi, a policy adviser at the Real Facebook Oversight Board, expressed doubts about the company's credibility and reliability. The organization, made up of civil rights leaders and tech experts, has been vocal in its criticism of Meta's handling of disinformation and hate speech. Qureshi advises paying attention to what Meta does rather than what it says.

On Wednesday, executives from Meta held a conference call with reporters to discuss the fake network and the company's policies for the upcoming election year, most of which were carried over from previous elections.

According to experts who study the relationship between social media and disinformation, 2024 will bring new obstacles. Along with numerous national elections, advances in AI technology are making it easier to produce realistic audio and video content that could deceive voters.

According to Jennifer Stromer-Galley, a professor at Syracuse University who specializes in digital media, platforms are not fully living up to their responsibilities in the public sphere.

Stromer-Galley called Meta’s election plans “modest” but noted it stands in stark contrast to the “Wild West” of X. Since buying the X platform, then called Twitter, Elon Musk has eliminated teams focused on content moderation, welcomed back many users previously banned for hate speech and used the site to spread conspiracy theories.

Both Democrats and Republicans have called for legislation addressing algorithmic recommendations, misinformation, deepfakes, and hate speech. However, substantial regulation is unlikely to pass before the 2024 election, leaving the responsibility for monitoring and policing these issues to the platforms themselves.

Kyle Morse, deputy executive director of the Tech Oversight Project, a nonprofit advocating for new federal regulations on social media, believes that Meta’s current efforts to safeguard the election are a poor indication of what we can expect in 2024. He urges Congress and the administration to take immediate action to prevent social media platforms like Meta, TikTok, Google, X, and Rumble from assisting foreign and domestic actors who are actively working against our democracy.

Several of the fake accounts Meta identified this week had nearly identical counterparts on X, some of which frequently amplified Musk's posts by reposting them.

Those accounts remain active on X. The platform did not respond to a message requesting comment.

On Wednesday, Meta published a report assessing the risk of foreign adversaries such as Iran, China, and Russia using social media to manipulate election outcomes. The report noted that Russia's recent disinformation efforts have targeted Ukraine rather than the United States, using state-controlled media and misleading content in an attempt to weaken support for the country under attack.

Nimmo, Meta's lead investigator, said that Russia's main goal in spreading disinformation during America's upcoming election will likely be to sway public opinion against Ukraine.

Looking ahead to the 2024 election, Nimmo stressed the importance of staying vigilant. He warned that as the conflict persists, Russia can be expected to intensify its efforts to manipulate election-related discussions and target candidates who advocate for Ukraine.