Legal records highlight that Meta has a history of hesitancy in safeguarding minors on Instagram.

Newly revealed documents from New Mexico’s lawsuit against Meta underscore what the complaint calls the company’s historical reluctance to keep children safe on its platforms.

In December, New Mexico Attorney General Raul Torrez sued Meta, the owner of Facebook and Instagram, alleging that the company failed to protect young users from exposure to child sexual abuse material and allowed adults to solicit explicit images from minors.

Sections of the case newly revealed on Wednesday include internal communications and presentations from 2020 and 2021 showing that Meta knew about problems such as adult strangers being able to contact children on Instagram, the sexualization of minors on the platform, and the dangers of its “people you may know” feature, which recommends connections between adults and children. Even so, according to those sections, Meta was slow to act on these concerns.

In 2021, Instagram began restricting adults from messaging minors. An internal document cited in the case shows that Meta scrambled in 2020 after an Apple executive’s 12-year-old was solicited on the platform; the document warned, “this is the type of behavior that angers Apple and could result in our removal from the App Store.” The complaint states that Meta knew adults soliciting minors was a problem on the platform but treated it as urgent only when it had to.

A July 2020 internal report titled “Child Safety – Current Status (7/20)” outlined product vulnerabilities that could endanger children, including the difficulty of reporting disappearing videos and the absence on Instagram of safeguards that existed on Facebook. The company’s stated reason was a desire not to hinder communication between parents and younger relatives on Facebook, according to the complaint. The report’s author dismissed that reasoning as inadequate and accused Meta of prioritizing growth over children’s safety. It was not until March 2021 that Instagram announced measures restricting adults over the age of 19 from messaging minors.

In a July 2020 internal chat cited in the lawsuit, one employee asked what the company was doing to prevent child grooming, a problem the employee noted was widespread on TikTok. A colleague replied that the child safety effort was minimal and not a priority for that half of the year.

According to the complaint, Instagram also failed to address the problem of inappropriate comments left on minors’ posts. That concern was echoed in recent testimony by Arturo Béjar, a former engineering director at Meta known for his expertise in curbing online harassment, who recounted his own daughter’s troubling experiences with Instagram.

Testifying before a panel of U.S. senators in November, Béjar said he appeared as a father with firsthand experience of a child who received unwanted sexual advances on Instagram, and described how his daughter and her friends had endured awful experiences, including repeated unwanted sexual advances and harassment.

A March 2021 child safety presentation noted that Meta had underinvested in combating the sexualization of minors on Instagram, as evidenced by sexualized comments on posts by minors. That failure, the presentation said, not only creates a terrible experience for creators and bystanders but also allows bad actors to discover and connect with one another. The lawsuit cites it as further evidence of the company’s historical reluctance to give Instagram the protections already in place on Facebook.

Meta, which is based in Menlo Park, California, has been updating its safeguards and tools for younger users as lawmakers pressure it on child safety, though critics say it has not done enough. Last week, the company announced it will start hiding inappropriate content, including posts about suicide, self-harm and eating disorders, from teenagers’ accounts on Instagram and Facebook.

New Mexico’s complaint follows a separate lawsuit filed by 33 states alleging that Meta harms young people and contributes to the youth mental health crisis by knowingly designing addictive features on Instagram and Facebook.

The chief executives of Meta, Snap, Discord, TikTok and X (formerly Twitter) are scheduled to testify before the U.S. Senate in late January about child safety.

Source: wral.com