Balancing Act: Preserving Children’s Rights in the Digital Age

Key Points

  • The time has come for a national conversation about youth rights in the digital age.

  • Our nation’s youth deserve broad access to safe resources and information on the internet.

  • With the right guardrails in place, the internet, including social media, has the potential to be a positive place for children.

By: Teddy Hartman

Social media’s impact on young people’s well-being has become a top concern for lawmakers on both sides of the aisle. Amid growing concern over mental health, misinformation, and data privacy, policymakers have placed social media giants in their sights through legislation like the Kids Online Safety Act (KOSA) and through congressional hearings. A more recent and drastic attempt to address these concerns is a set of new laws in Utah that will require children under 18 to obtain parental consent to access social media platforms and will give parents access to their children’s accounts.

While many have already raised questions about how these laws could be effectively enforced and what unintended consequences might follow, approaching the issue through the lens of educational rights illuminates additional considerations. The United Nations has established that children have a fundamental right to access information and express themselves freely. Under COPPA, children also have the right to be free from online tracking and targeted advertising. As a former high school teacher and the current Head of Privacy & Data Policy at a company focused on creating safer digital learning environments for children, I understand and respect that thinking about children’s rights in the digital age is a true balancing act.

As policymakers increasingly wade into regulating children’s access to and use of online technologies, the time has come for a national conversation about youth rights in the digital age. Otherwise, we will find ourselves with a patchwork of state laws in which a teenager in one state has a different set of digital rights from a teenager in another. And when there is no national standard, inequities naturally arise.

Having this conversation at the national level would help us think critically about the intersection of technology, children’s rights, and equity. Should youth in different states have different rights to access information? What happens to youth with different family structures or custody situations? What happens to youth whose socio-economic status prevents them from accessing traditional news sources blocked by a paywall? What kind of “chilling effect” happens if LGBTQ+ teenagers are forced to provide parents access to their accounts — and those parents don’t accept their sexual orientation or gender expression? In other words: how could overly restrictive policies actually create more harm to youth in the name of protecting their privacy?


To be sure: these are not easy questions to answer.

In the edtech space, however, we have been trying to balance students’ privacy and safety with their right to access information for decades. When it enacted the Children’s Internet Protection Act (CIPA) in 2000, Congress saw both the enormous potential of the internet to enhance learning and the dangers of giving children unfettered access to it. Navigating this balance, Congress passed a law that requires schools and libraries to implement safeguards protecting children from harmful and explicit content in order to receive certain federal funding.

Fast-forward to today, and more than 80% of students use a school-issued device as part of their learning. The ubiquity of devices in children’s hands has further fueled the debate over what types of content children should and should not be exposed to. While parents and educators overwhelmingly agree that the internet is a useful learning tool that should be used in schools, 77% also agree that unrestricted internet access can be detrimental to student mental health. This is where products like GoGuardian come into play, letting schools easily add filters that protect students from harmful and explicit content while still allowing safe internet exploration.

Our nation’s youth deserve to have broad access to safe resources and information on the internet. With the right guardrails in place, the internet, including social media, has the potential to be a positive place for children. We put great effort into protecting students from harmful content in schools, so we should consider approaching social media in a similar way. Requiring thoughtful content filters and age-appropriate design that encourages digital citizenship will be far more effective in the long run than sweeping restrictions.

Even critics of social media companies agree with this. At a recent Senate Judiciary Committee hearing on kids’ online safety, all six witnesses indicated they would not be in favor of setting an age limit for social media. Instead, they advocated for policies that would make online spaces safe for young people when they do decide to join social media.

As we continue to forge ahead with new technologies like generative AI and the metaverse, our nation’s youth deserve a rational, common-sense approach to creating open, accessible, and safe online environments. We do ourselves a disservice as a nation if we treat youth access to technology as a political pawn. Rather, the time has come for all of us to have a national conversation about youth rights in the digital age.

Teddy Hartman is the Senior Director of Privacy & Data Policy at GoGuardian.

