09/23/2024 | News release | Distributed by Public on 09/23/2024 07:55
In recent years, youth online safety and well-being have received considerable attention. Legislators have proposed different technical interventions to improve the internet's safety for young people, including age-appropriate design codes, expanded parental controls, and mandatory age verification. Some legislators have suggested banning young people from social media altogether. As the Open Technology Institute (OTI) has underscored in our report on age verification, youth online safety requires a holistic approach, since it is unclear whether any of the commonly proposed technical interventions can fully or directly address the challenges that young people face online.
Many of the proposed technical interventions present feasibility issues as well as constitutional and privacy concerns for users. For example, OTI, along with other civil society organizations and legislators, has raised concerns about the censorship risks of the Kids Online Safety Act (KOSA), especially for youth in marginalized communities. In August 2024, a federal appeals court upheld a partial block of California's Age-Appropriate Design Code due to similar fears over censorship and speech regulation. And the age verification mandates appearing across state legislatures raise data privacy and security concerns and can result in over-censoring access to content for all users. OTI recently joined other civil society organizations in filing an amicus brief in support of an appeal challenging the constitutionality of Texas's new age verification law, which will be heard by the Supreme Court.
Online spaces should be safer for youth. However, quick tech fixes that can cause more harm than good are inappropriate solutions. Instead, youth deserve a thoughtful, holistic approach to improving their online experiences and overall well-being. Recent reports from the U.S. Surgeon General and the White House Task Force for Kids Online Health and Safety discuss both the benefits and risks of social media use and highlight the need for more research to fully understand its impact on children and youth.
Improving young people's mental health and overall well-being requires a nuanced approach that acknowledges and addresses the complex socioeconomic, community, and tech-based contributing factors. Addressing youths' well-being also requires recognizing the vast breadth of social, developmental, and online contexts of the millions of children, teens, young adults, and families across the United States. While many stakeholders seek to better understand technology's impact on youth mental health and development, common-sense policies that center user privacy, safety, and rights can help improve youth experiences and mitigate known online risks.
Policy Recommendations for Policymakers, Industry, and Civil Society
Growing concerns over social media's impact on youth have prompted states, schools, and parents to try to limit technology's potential negative effects on young people. A 2023 Surgeon General advisory concluded that while social media can benefit youth, it can also pose a meaningful risk to their mental health. Legislators have often sought to tackle these challenges through technical interventions. However, some of the most commonly advanced interventions have concerning implications for users and may cause more harm than good.
Inspired by the U.K. Children's Code, there has been a recent campaign to pass age-appropriate design codes (AADCs) at the state level. The campaign proposes 15 standards for company practices based on developmental stages, including privacy-by-design and by-default and required data protection impact assessments.
Despite the laws' promising components, they face criticism for being overly broad and having larger implications for user access. As written, the AADCs charge online operators with protecting youth from "potentially harmful" material. However, without a clear definition of what that means, online operators may feel obligated to over-censor content, limiting access to protected speech and creating a chilling effect for all users. This may disproportionately impact marginalized communities and access to politicized content.
These concerns, in part, led a federal court to block California's AADC in 2023, a decision largely reaffirmed in a recent federal appellate ruling. Similar censorship concerns have surrounded the Kids Online Safety Act, which proposes a "duty of care" to implement design features to "prevent and mitigate" mental health disorders such as anxiety, depression, eating disorders, and substance use disorders. Such a broad mandate could have significant consequences for free speech online. Further, there is no clear way to determine what may influence the development of such disorders for any given individual.
Parental controls allow parents to filter content, set restrictions, monitor activity, enable permissions, and link their account with their child's. However, these tools place a high burden on parents, who may not have the capacity, willingness, or digital skills to use them effectively. In fact, despite their availability, few parents actually use these features.
Recent data shared by Discord and Snapchat shows that fewer than one percent of minors have parents who use monitoring tools. This aligns with a 2016 Pew study of parents of 13-17-year-olds that found parents are "relatively less likely to use technology-based tools to monitor, block or track their teen" than other measures.
In addition, these tools raise concerns for youth privacy and safety. Parental controls, when overly invasive and restrictive, can be particularly dangerous for youth who are already vulnerable, such as LGBTQ youth, those seeking access to or information about reproductive health care, or those experiencing child abuse and neglect at home.
Current age verification practices require users to provide a government-issued ID, credit card, or biometric data to verify their age. Such requirements can significantly challenge all users' right to access content, as millions of Americans, particularly those in marginalized communities and under the age of 16, do not own a valid government-issued photo ID or hold a credit card.
Even when users have appropriate ID, without proper safeguards, the process of verifying user ages can endanger their data privacy and security. Previous efforts to implement similar age verification requirements have been ruled unconstitutional. In July 2024, however, the Supreme Court agreed to hear an appeal concerning Texas's new law requiring age verification to access adult content. This ruling could shape future determinations of whether, and in which cases, age verification is constitutional. OTI's amicus brief to the court details the risks of such legislation.
Online spaces should be safer for youth. The challenges that online environments and activity can pose to youth safety and well-being are serious. Pressing concerns such as child sexual abuse material (CSAM), cyberbullying, and access to age-inappropriate content demand continued attention and action. And policymakers and public health officials' current focus on social media's impact on mental health and development deserves further exploration to deepen our understanding of exactly how social media and technology use impact youth across different stages.
At the same time, it is necessary to recognize that technological solutions can create additional challenges, such as potentially censoring access to content or putting user data at risk. And, on their own, technological solutions cannot adequately or fully address the challenges that youth face, many of which are rooted in offline issues. For example, today's generation of young people faces myriad challenges impacting their development and their outlook on life-only some of which may be amplified by online activity. Societal challenges, including increasing gun violence in communities, accelerating climate change, higher reports of sexual violence, increasing income inequality, and an ongoing loneliness epidemic, can all contribute to youth mental health outcomes and development. Simultaneously, the COVID-19 pandemic disrupted key events and life experiences while changing people's relationship with technology. Society is still trying to understand the pandemic's total impact on our collective physical and mental health, particularly for young people whose key developmental years were impacted.
While technological design can help mitigate some harms young people face online, it's important to be realistic about the limitations of technology's ability to address complex online safety issues comprehensively. Mental health and well-being are shaped by various complex socioeconomic, community, and tech-based factors. Because online spaces can amplify challenges youth face offline, strategies to improve online spaces for youth cannot be developed in isolation; rather, they must holistically consider all contributing risks and protective factors. Policies that promote common-sense protections can improve young people's experiences online by advancing privacy, security, transparency, and user agency. At the same time, these policies can reduce the potential for bad practices that amplify real-world challenges online and encourage excessive or problematic internet use.
Social media and other forms of online spaces can offer important benefits to youth, fostering a sense of connection and belonging as well as creating avenues for activism and advocacy. Rather than supporting policies that can exclude young people from online spaces, it's important to recognize the value of these spaces and work toward improving young people's experiences within them. There is a wide range of approaches, including investments in digital literacy and specific interventions to counter CSAM, that can improve online safety for youth. The following recommendations are not a comprehensive list of solutions to online safety challenges. Instead, they are foundational ways to center user privacy, safety, and rights that address some of the underlying concerns driving the technological interventions discussed above.
These recommendations collectively offer a clear starting point to improve experiences for all users online, not just young people. To more closely address discrete challenges, greater collaboration between stakeholders is needed. Tackling technology's role in youth well-being and negative outcomes requires greater exploration to differentiate which challenges can be addressed through technical design and features and which must be addressed through larger social initiatives. There is no easy or quick solution. Improving youth online safety requires a thoughtful, nuanced, and holistic approach. Rather than attempting to implement one-size-fits-all technical interventions that do not address root causes, policymakers, industry, and civil society should tackle known issues by advancing legislation and solutions that promote user agency and platform accountability in a rights- and privacy-respecting manner.