A former Meta engineer testified before Congress after witnessing his own child being harassed on Instagram.


Around the time whistleblower Frances Haugen was testifying to Congress in the autumn of 2021 about the harms Facebook and Instagram cause to children, a former engineering director at the company, who had returned as a consultant, sent an alarming email to Meta CEO Mark Zuckerberg about the same issue.

Arturo Béjar, known for his expertise in curbing online harassment, recounted to Zuckerberg his own daughter’s troubling experiences with Instagram. But he said his concerns and warnings went unheeded. And on Tuesday, it was Béjar’s turn to testify to Congress.

“I stand before you today as a father who has personally witnessed his child receiving unwanted sexual advances on Instagram,” he stated to a group of U.S. senators.

From 2009 to 2015, Béjar was an engineering director at Facebook, where he gained recognition for his work on combating cyberbullying. He believed the situation was improving. But by the time he returned to the company as a contractor in 2019, his own daughter had started using Instagram.

On Tuesday, he said that she and her friends soon began having awful experiences on the platform, including unwanted sexual advances and harassment. They reported the incidents to the company, but nothing was done.

In the 2021 email, as reported by The Wall Street Journal, Béjar described to Zuckerberg a critical gap between how the company approached harm and how the people who use its products, most notably young people, actually experience it.

In the email, Béjar recounted how, two weeks earlier, his 16-year-old daughter, who enjoys creating content on Instagram, had made a post about cars and received the comment “Get back to the kitchen.” The remark was deeply upsetting to her, he wrote, yet it did not violate any policy, and the available options of blocking or deleting the comment meant the person would likely go on spreading misogyny on other profiles. He added that he did not believe stricter policies or more content review were the solution.

On Tuesday, Béjar testified before a Senate subcommittee examining the impact of social media on teenagers’ mental health. He sought to underscore that Meta’s leadership, including Zuckerberg, knew about the harms Instagram was causing but failed to take meaningful action to address them.

Béjar wants Meta to change how it polices its platforms so that harassment, unwanted sexual advances and other harmful experiences are addressed even when they do not explicitly violate existing rules. Sending sexually inappropriate messages to minors, for example, does not necessarily break Instagram’s guidelines, but Béjar said teenagers should have a way to tell the platform they do not want to receive such messages.

Béjar said it is clear that Meta’s executives knew the harm teenagers were experiencing and that there were very feasible steps they could have taken, but they chose not to take them. That failure to act, he said, means they cannot be trusted to protect our children.

Opening the hearing on Tuesday, Senator Richard Blumenthal, a Democrat from Connecticut who chairs the Senate Judiciary subcommittee on privacy and technology, introduced Béjar as an engineer highly regarded and well-liked in the field who was brought back to help protect children, only to have his recommendations disregarded.

Missouri Senator Josh Hawley, the top Republican on the panel, said what Béjar presented to the committee was crucial information for all parents to know.

According to Béjar, user surveys conducted by the company showed that 13% of Instagram users ages 13 to 15 had received unwanted sexual advances on the platform within the previous seven days.

Béjar said the changes he is proposing would not meaningfully affect the revenue or profits of Meta and similar companies. The goal, he added, is not to punish the companies but to help teenagers.

The company may call it a complex problem, Béjar said, but it is not: letting teenagers indicate that they do not want to see certain content would provide feedback that could be used to train and improve Meta’s other systems.

Béjar’s testimony comes amid a bipartisan push in Congress to adopt regulations aimed at protecting children online.

In a statement, Meta said it works every day, internally and with outside experts, to keep young people safe on its platforms. The company said it has acted on the kinds of user-perception surveys Béjar cited, using them to build features such as anonymous notifications about potentially hurtful content and comment warnings. Working with parents and experts, Meta said, it has also introduced more than 30 tools to help teenagers and their families have safe, positive online experiences, and that work continues.

Meta also pointed to guidelines, in place since 2021, governing how Instagram handles content that does not violate its rules but may be problematic or low quality. Such content, including clickbait, misinformation that has been fact-checked, and borderline posts such as sexually suggestive photos or posts containing profanity, hate speech or gory images, automatically receives reduced distribution in users’ feeds.

In 2022, Meta also introduced “kindness reminders” prompting users to be respectful in their direct messages, though these apply only to users sending message requests to a creator, not to a regular user.

Tuesday’s testimony came after dozens of U.S. states sued Meta for harming young people and contributing to the youth mental health crisis. The lawsuits, filed in state and federal courts, allege that Meta knowingly and deliberately designs features on Instagram and Facebook that addict children to its platforms.

Béjar stressed that it is critical for Congress to pass bipartisan legislation that brings transparency to these harms and ensures teens can get access to help, guided by qualified professionals.

In his prepared remarks, Béjar argued that the most effective way to regulate social media companies is to require them to develop metrics that allow both the companies themselves and outsiders to track and evaluate instances of harm as users experience them. That approach, he wrote, plays to the companies’ strengths, because data is everything to them.

Source: wral.com