Q&A with Vanessa Lalo:
Why do you think initiatives like the Youth Experts Network are important?
Meta's Youth Experts Network is important because it draws on the lived experience of young creators and their audiences. It is essential that we trust their online expertise and take into account what young people actually experience on platforms. This allows us to improve their wellbeing, to offer them experiences adapted to their practices and the values they promote, and to protect them from the content they report.
How do you see the intersection of youth wellbeing and the online world?
The online world is an extension of the tangible world, so young people's wellbeing in online spaces also depends on how they feel about life and their overall mental health.
Social networks are spaces where the issues of adolescence and of society are played out, so young people's experiences online are crucial. Social media allows young people to maintain social ties with their friends, to explore and build their identities, and to develop their creativity. It is also very important for teenagers to experiment together and to be part of groups and communities (which can be supportive, inspiring...), and online spaces, alongside other spaces, are where they can experience this.
Young people's online presence therefore needs to be supported, to ensure that social media is a safe space for them to communicate and that it provides them with the resources they need to build their own lives.
Social media can also help break down taboos surrounding mental health, enabling young people to better identify disorders of which they were previously unaware and thus giving them more opportunities to seek advice, help, guidance and appropriate treatment.
In cases where social media reinforces young people's pre-existing disorders, it is imperative to offer them other models of identification, provide professional help, and teach them how to protect themselves by reinforcing their psychosocial skills.
In all cases, it's essential to guide young people towards content adapted to their age, interests, and needs, so as to better respond to any problems they may encounter, and guide them towards "safe spaces".
From the research, how can we support parents/guardians to play a more active role in young people’s online experiences?
Scientific research shows that, despite many clichés, being a "good parent" doesn't mean focusing on your child's screen time, but rather supporting their digital practices.
Current (French) media and legislative discourse focuses on technical solutions at the expense of encouraging support for reasonable practices and quality content. Control and surveillance can only limit screen time; they cannot be the way to educate young people to navigate online worlds, as tools cannot replace education and dialogue.
Similarly, the prevailing discourse on screen time focuses on "how to deal with the digital world", rather than asking a much more pertinent question: how to work WITH digital worlds so that they can be a source of benefits?
Indeed, current debates tend to dramatize young people's digital practices rather than highlighting the technical and psychosocial skills they are developing with the platforms, and the gap is widening between the generations born with digital technology and their parents, who are increasingly worried, largely out of unfamiliarity.
The main concerns of parents, however, lie outside the realities experienced by young people, who are only asking to be listened to rather than judged on their practices, so that they can share and feel better understood and supported.
A majority of young people feel that adults aren't interested in their online experiences. This is because their parents tend to view platforms rather negatively, and feel rather anxious about their children's time spent online or about possible harmful experiences (notably harassment, predators, and inappropriate or shocking content). The aim is neither to ignore the possibility of negative experiences nor to shield children from every potential bad experience, but to accompany them in the digital world in the same way as in the real world, by integrating the reality of a society undergoing technological change into educational principles.
At the same time, it's worth noting that a majority of parents would like to be better supported in their digital parenting, as they find themselves at a loss as to how to educate their children about tools they neither grew up with nor were educated on, and so have no parental model on which to base limits and frameworks relevant to 21st-century society.
The perceptions of young people and their parents are therefore very different, and this is the most essential point in understanding how to propose more appropriate prevention, through dialogue and a better understanding of digital tools, so as to set frameworks better adapted to the realities of young people and to how they build themselves.
What advice do you have for parents or guardians and teens to build together safe and empowering experiences online?
Taking an interest in young people's practices and engaging in dialogue with them is the key to reducing potential risks and proposing appropriate frameworks and limits.
Let's not forget that a large proportion of the time young people spend online is devoted to messaging, and that it's all about socializing with their friends - a fundamental part of their development. So, it's important to distinguish between time devoted to friends and time devoted to entertainment.
Similarly, it's important that adults not stigmatize young people's online time as wasted time, when adults themselves also spend a considerable number of hours online, even if their practices and content are very different.
For me, the best way to support young people's practices and to understand how to find one's place as a parent is to pool everyone's experiences, to avoid the vicious circle where everyone in the family is alone on their screen and doesn't talk to anyone about their online experiences and the content they enjoy. To facilitate intergenerational dialogue, there's nothing like sitting down together and showing each other your favorite posts, videos or current events, so that everyone has a more objective view of the practices of other family members and can engage in appropriate dialogue, while at the same time promoting prevention that isn't based exclusively on negativity or fantasies. The advantage of being present side by side is that you can see how the young person is evolving online, so that you can discuss their practices, encourage critical thinking, and strengthen the bond of closeness and trust between parent and child.
Spending time together to better understand the world of young people also opens the door to discussions about the economic models of digital tools and the role models that young people consume, and thus plays an educational role in all areas of young people's lives.
Finally, a complementary approach is for parents to scour the platforms themselves for accounts to follow that they find educational. After all, young people will spend time online no matter what, so we might as well offer them quality content to follow (reliable explanations of current events, awareness-raising accounts on a variety of topics, cultural and artistic accounts, learning supplements in math, French, history, science, etc.).
The aim of these collective support initiatives is to benefit from the best connected experiences, while helping to empower young people towards civic life. It is important to respect young people's privacy, without surveilling them. What matters most is that parents and guardians forge a relationship of trust with their children that can protect them from inappropriate content, while offering a sufficiently attentive, non-punitive ear, so that young people can turn to their parents if they encounter a problem.
How can private companies/policy-makers/civil society support safe use of online platforms?
There is little scientific research providing reliable results, as platforms do not offer the open data and access to algorithms needed for sound methodologies and large-scale analysis. The recently implemented Digital Services Act is a step in this direction.
Another crucial element would be to oblige algorithms to offer positive content and suggestions for help when users' queries seem to lock them into negative bubbles, or when a user seems to be trying to put themselves in danger. In particular, alert systems for risks taken by minors should be considered.
Platforms can also work to promote only those influencers who publish quality content suitable for young audiences.
In addition, moderation is already an area in which everyone is heavily involved, but efforts are not yet sufficient. Young people are turning away from reporting procedures because there are too many reports to process and too often the feedback is inconclusive, even when the content is particularly offensive or impersonates other users. Moderation should be much more extensive and, above all, more responsive and more human, so that users are encouraged to report and see tangible results.
Peer-to-peer moderation could also be used to encourage positive online behavior and alert users who break the rules of use, or even collectively ban harmful behavior.
On the other hand, digital and media literacy is a field in which professionals are highly mobilized, but they lack the human and financial resources to raise broad awareness among young people and their parents. Information and resources on responsible practices, provided by the platforms and designed in partnership with digital experts, would be essential to making resources available to a wide audience and to developing both individuals' critical thinking and their psychosocial skills.
What's more, to empower everyone, video playback on platforms should not be automatic: users should have the choice to click to see the content, avoiding the cognitive biases that prevent them from being fully aware of their practices.
Finally, moving towards decentralized, more community-based platforms, where users feel responsible for one another, is an interesting avenue. Moreover, increasingly ethical digital environments seem to be in line with a growing desire among users. Social networks are at a turning point in their maturation, and all we have to do is join in.
ThinkYoung is a not-for-profit organisation aiming to make the world a better place for young people by involving them in decision-making processes and providing decision-makers with high-quality research on youth conditions. ThinkYoung conducts studies and surveys, runs advocacy campaigns, writes policy proposals, and develops education programmes: to date, ThinkYoung projects have reached over 800,000 young people.
Meta is a tech company with apps that you may know, like Instagram or WhatsApp. We work hard to build online spaces where young people can learn, connect, create, and have fun. We want young people to enjoy our platforms and to be safe, so creating spaces for young people to have their say on the future of platforms like ours is crucial.