
Q&A with Dr. Stephan Dreyer:

Why do you think initiatives like Meta’s Youth Experts Network are important? 

One of the main tasks of the Youth Experts Network is to support selected influencers who produce content on digital wellbeing and online safety. Influencers, with their high profile among relevant audiences and their ability to raise awareness of important topics, play a pivotal role in shaping youth digital culture. Their content promoting digital wellbeing and online safety is vital, in my opinion: Firstly, they represent the increasingly important voice of young people on online platforms - a voice that we as adults have not listened to enough in the past. Secondly, influencers can demystify complex risks and topics using accessible language, making risk awareness and solution finding more approachable for a younger audience. Thirdly, they serve as role models; their endorsement of safe online practices can encourage their fans and followers to adopt healthy and safe behaviors. In promoting good practice, they bridge the gap between authoritative guidance and youth culture. Whether participating influencers need information on a specific topic - be it pedagogical or legal - support in identifying relevant topics and best practices, or simply a sparring partner in their content creation process, approachable expert networks like this one can help such multipliers tell their story in the best possible way.

 

How do you see the intersection of youth wellbeing and the online world?

Youth wellbeing and online environments are strongly intertwined. On the one hand, digital platforms offer unprecedented opportunities for learning, socializing, and self-expression, which are integral to the developmental needs of young people. On the other hand, these platforms can expose youth to various risks, including cyberbullying, exposure to inappropriate content, and privacy issues. The challenge lies in supporting and strengthening the opportunities while minimizing the risks. An important task here is to reinforce the opportunities while using such platforms to build awareness and resilience, to inform about good practices, and to deepen knowledge about features, functionalities, and helpful coping strategies. Information-based measures that aim at preventing risk and harm are fundamental to digital wellbeing. Where risks materialize despite good awareness initiatives and preventive measures, the task is to mitigate any harm and actively help young people find guidance, counseling, and support. Promoting digital literacy, fostering resilience, and encouraging healthy online habits on online platforms are essential in ensuring that the digital world contributes positively to youth wellbeing.

 

Looking back on 2023, what is the main challenge you saw when it comes to youth online safety? 

Reflecting on 2023, a primary challenge in youth online safety has been the adoption of digital services by increasingly younger audiences, leading to more inexperienced users who are vulnerable to harmful content and exploitative practices such as privacy violations, cybergrooming, or disinformation. The ongoing shift in minors' online use has outpaced both parental understanding and regulatory responses when it comes to young children (i.e., under 13 years) who use services aimed at teens and adults on mobile devices, with or without the knowledge of their caretakers.

 

What is on your radar for youth online safety and youth wellbeing in 2024? 

While the phenomenon of generative AI (e.g., ChatGPT, Bard, Midjourney or DALL-E) was already a hot topic during 2023, we will see its impact on children's everyday life taking off in 2024 - both in good and in potentially harmful ways. When it comes to online safety, content generated by such technologies will make it more difficult for young people to recognize false information and propaganda; they will increasingly be affected by manipulative messages, deep fakes, impersonation, and infringements of their rights to their own image and voice. At the same time, the same technology will enable them to learn better (maybe even faster) and in more diverse ways, let them find new creative ways to express themselves and - maybe - make it easier to identify artificially produced or enhanced content.

 

What would be your advice for young people and their guardians when it comes to a safe and healthy use of social media? 

My advice for all families is to cultivate digital literacy and mindfulness in everyday life: Understand the trending platforms, their privacy settings, and be aware of the personal information you share. Engage critically with content, recognizing that not everything online is accurate or well-intentioned. 

For parents and caretakers, it's crucial to maintain open, ongoing conversations about online experiences: talk, reflect, discuss! Show genuine interest in the content your children like as well as in the interactions they are part of. Encourage a balanced online-offline life and set a good example with your own digital habits. Above all, create an environment where your children feel comfortable discussing their online experiences with you!

 

What role do you see policymakers playing in youth online safety? 

Policymakers play a crucial role in shaping the landscape of youth online safety today. Their power (and responsibility) lies in ensuring a digital environment that systematically keeps digital children's rights in mind, with a special focus on their three core dimensions of protection, empowerment, and participation. Modern online safety legislation incorporates these aspects into proactive, preventive, and educational measures, and ensures that the voices of young people are systematically included in the respective policy discussions. To close the legal gap - i.e., to ensure that rules keep up with fast-evolving digital environments - a sensible approach is to combine broad, principle-based legal duties with co-regulatory frameworks, allowing co-regulatory bodies to lay the groundwork for concrete measures and initiatives by service and content providers.

 

What role do you see for tech companies in continuing to better young people’s experience on social media?

Tech companies, as the actual architects of digital environments, have a significant responsibility in shaping young people's experiences online. For digital children's rights to thrive in social media, companies must systematically prioritize safety and wellbeing in their platform design ("positive platform governance"). This is a structural change in many companies' logic that needs continuous backing and active commitment. Here, age-appropriate design is a current approach to offering services and functionalities that match the evolving capacities of young users. However, to fully understand the skills, assets, needs, and expectations of children, it is crucial to have them participate in the product development cycle. Beyond designing their services, companies should also invest in educational initiatives that promote digital literacy and wellbeing - including initiatives that have an impact beyond their own services.

 

What’s your viewpoint on the DSA and its approach to harmonized youth regulation? How does it benefit and how does it pose a challenge to youth online safety?

The Digital Services Act (DSA) represents a significant step in harmonizing children's online safety regulations across online platforms. Its main benefits include setting a minimum standard for platform providers, who are obliged "to ensure a high level of privacy, safety, and security of minors" (Art. 28(1) DSA), with further obligations for very large online platforms to assess and mitigate systemic risks regarding, inter alia, children's rights (Art. 34, 35 DSA). Since the DSA incorporates Art. 24 of the EU Charter of Fundamental Rights, it is accepted that Art. 28(1) DSA obliges platform providers to apply a digital children's rights perspective when implementing these due diligence obligations.

However, the DSA also poses challenges, especially regarding its vagueness: As the minimum standards are not easy to operationalise, are context-dependent, and will also vary across different online platforms, legal uncertainty for platform providers will remain high in the first years after the DSA becomes applicable. Narrowing the platforms' duties of care down to something that can be objectively measured and supervised will be a hard task; this will be one of the challenging objectives of the special group on the EU Code of Conduct on Age-Appropriate Design. Another challenge will be the platforms' task of balancing the need for safety with the rights to empowerment and participation as well as the preservation of freedom of expression: There is a structural risk of overengineering due diligence measures aimed at protection, which could restrict access to beneficial online resources or stifle the options to participate in digital culture.

Finally, vague legal obligations and providers' ample leeway in implementing due diligence measures will lead to a wide array of different features, functionalities, and procedures; this makes it challenging for children and their caretakers to keep track of all the measures provided for their safety. Here, policymakers, platform providers, safer internet centers, counseling services, and schools will have to cooperate to support awareness building and to provide offers of assistance to children and their caretakers. This takes time and needs significant resources - something the DSA has not considered sufficiently.


ThinkYoung is a not-for-profit organisation aiming to make the world a better place for young people by involving them in decision-making processes and providing decision-makers with high-quality research on youth conditions. ThinkYoung conducts studies and surveys, runs advocacy campaigns, writes policy proposals, and develops education programmes: to date, ThinkYoung's projects have reached over 800,000 young people.


Meta is a tech company with apps that you may know, like Instagram or WhatsApp. We work hard to build online spaces where young people can learn, connect, create, and have fun. We want young people to enjoy our platforms and to be safe, so creating spaces for young people to have their say on the future of platforms like ours is crucial.
