Many states are implementing measures to restrict the amount of time children spend on social media.

A growing number of states are mandating that social media companies build child-friendly versions of their websites, while Washington grapples with how to protect children.

Connecticut in June updated a privacy law to require online platforms to conduct children’s safety assessments, make design changes to help kids avoid harmful material and limit who can contact minors using messaging tools.

Vermont introduced a similar bill earlier this month, and a lawmaker in Illinois plans to introduce one next week. Legislators in New Mexico, Maryland, and Minnesota are revising bills they introduced last year.

States are acting on the belief that social media is fueling the rise in mental health problems among minors, while Congress has stalled. Lawmakers in both parties on Capitol Hill agree more should be done, but they are divided over whether a national privacy standard should supersede state laws.

Vermont state Representative Monique Priestley, a Democrat who sponsored a child-safe design bill, said all children ultimately need protection at the national level, and that states are collaborating to fill the gap in the interim.

Some states, such as Utah and Arkansas, have taken a different tack: rather than requiring design changes, they have enacted laws requiring minors to obtain parental consent before using social media. Lawmakers in South Carolina and New York are weighing bills that would regulate the algorithms social media companies use to target content at minors.

In a different approach, 33 states in October sued Meta, the parent company of Facebook and Instagram, in federal court in San Francisco for allegedly violating children’s privacy. If the suit succeeds, it could force the company to change its websites.

The litigation is sprawling, and its outcome is uncertain. A 2022 California law requiring design changes is on hold while a technology trade group challenges it in federal court.

Tech companies, wary of having to comply with a different law in every state, are working to persuade state legislators that new regulations are unnecessary.

To make that case, companies are policing children’s content more strictly. Meta is introducing additional safeguards to shield children from potentially harmful content, including posts about violence, sex, and eating disorders.

The companies say they do not object to regulation, but they would rather have a single national standard than a patchwork of 50 state rules.

Varying state laws that hold apps to different standards will leave teenagers with inconsistent online experiences, said Liza Crenshaw, a public affairs manager at Meta.

Federal rules or federalism

Eighteen months ago, the House Energy and Commerce Committee voted 53-2 to advance the American Data Privacy and Protection Act, which would have given Americans greater control over their personal data and banned targeted advertising aimed at minors.

The bill would also have created a new division within the Federal Trade Commission charged with considering further regulations to protect children online.

But it stalled in the Senate, where Maria Cantwell (D-Wash.), chair of the committee with jurisdiction, declined to take it up. She argued that the bill lacked strong enough enforcement provisions and would have overridden stricter state laws, such as California’s.

Supporters of a national privacy law are still working to build their case, led by Energy and Commerce Committee Chair Cathy McMorris Rodgers (R-Wash.), who has held seven hearings on data privacy and opened an inquiry into how data brokers profit from personal data.

It remains unresolved, however, whether a federal law regulating social media would set a floor that states could build on or a ceiling that would bar them from going further.

Evidence of harm

Meanwhile, the pressure to act is growing. In the Centers for Disease Control and Prevention’s latest Youth Risk Behavior Survey, more than 40% of high school students reported in 2021 feeling so sad or hopeless for at least two weeks in a row that they stopped doing their usual activities. The survey also found that 30% of teenage girls had contemplated suicide, up from 19% a decade earlier.

Experts worry that social media companies are adding to the problem and profiting from it.

Testifying before a Senate Judiciary subcommittee in November, Arturo Béjar, a former director of engineering at Facebook, said company data showed that 20% of 13- to 15-year-olds had been bullied on the platform, 13% had received unwanted sexual advances, and 40% had engaged in negative social comparisons.

Supporters of the design regulations say they are meant to set safety standards for digital products aimed at children, much as such standards already exist for physical products. The bills are written broadly to close potential loopholes, and they require technology companies to assess their sites’ features and mitigate any risks they pose.

The tech industry has so far failed to make child safety a priority, said Minnesota state Representative Kristin Bahner, a Democrat working on one of the upcoming bills.

Rather than broadly restricting access to content, which could run afoul of the First Amendment, Priestley’s bill seeks to stop companies from exploiting children’s data to target them with potentially harmful content they never sought out.

The United Kingdom was the first to establish child-safe design standards, in 2021. According to the International Association of Privacy Professionals, a membership organization for privacy professionals, the rules have pushed social media companies to collect less data on children and curb certain features.

YouTube, for example, turned off autoplay for young users, a feature critics call addictive because it keeps videos running continuously. The Google-owned video platform also added a “take a break” option and bedtime reminders for children.

If enough states pass such laws, social media companies weary of running their platforms differently in each state could end up treating them as a de facto national standard, said Roy Wyman, a privacy attorney at Bass, Berry & Sims.

To comply with the U.K. rules, Google’s parent company Alphabet changed Google sites and YouTube worldwide to make them safer for children. In a policy blog post last year, the company said it supports age-appropriate design principles.

The tech industry’s two-pronged response

Social media companies are responding to the regulatory push with a two-pronged strategy.

Meta is proactively changing its websites to protect children, and it has pledged to support federal legislation setting rules.

After Béjar’s testimony, Meta released a blog post advocating for a law that mandates parental consent for children under 16 when downloading apps.

Meta has recently introduced additional features to restrict suggested content for teenagers, enhance privacy settings, and make it more difficult to access content related to self-harm and eating disorders.

But two industry trade groups, NetChoice and the Computer & Communications Industry Association, are lobbying against bills that would require design changes. NetChoice is also suing to block such laws from taking effect.

They argue the laws violate the First Amendment rights of both children and businesses.

The regulations put the onus on tech companies to decide what is appropriate for teenagers, said Carl Szabo, general counsel of NetChoice.

The New York Times filed a brief against California’s Age-Appropriate Design Code because the law is about restricting free speech online, he said.

A NetChoice lawsuit prompted a federal court in San Jose, California, to block enforcement of a child safety law in September.

Most recently, the group persuaded a federal judge in Columbus to stop Ohio’s new Parental Notification by Social Media Operators Act, which would require children to get parental consent to start an account, from going into effect while the judge considers its First Amendment argument.

Last month, California Attorney General Rob Bonta, a Democrat, appealed the ruling that temporarily blocked the state’s law.

The American Psychological Association and the American Academy of Pediatrics later filed a brief supporting Bonta’s position.

They state that the internet and social media pose special dangers to children.

Adolescents are “vulnerable to many of the manipulative design and privacy practices commonly employed by social media and digital platforms,” they said. “Broad protection across childhood and adolescence is needed.”

Source: politico.com