
Q&A with Alessandro Niccolò Tirapani:

Why do you think initiatives like Meta's Youth Experts Network are important? 

 

First and foremost, to raise awareness of a crucial issue affecting a key audience of social media. Engaging actively with platforms is essential to understanding the opportunities and challenges that emerge in regulating these actors. The field is fast-evolving, so it is paramount to have a detailed, critical, and up-to-date discussion. The voice of these stakeholders must be heard loud and clear.

How do you see the intersection of youth wellbeing and the online world? 

 

Digital natives are immersed in social media and form their opinions about the world, their acquaintances, and themselves through these lenses. While this opens up incredible opportunities, it is also dangerous, because content is not mediated in the way cable TV or magazines used to be. Unrealistic body images, fake news, and even doomscrolling (overexposure to highly stimulating and cruel images, such as war, injustice, or trauma) have dire effects on the human psyche, especially during developmental years. What is more, compared to older generations like Millennials, Gen Z lacks an equivalent offline benchmark. Hence, they may even lack the tools to compare offline and online adverse experiences of trauma, exclusion, or low self-esteem.

 

Looking back on 2023, what is the main challenge you saw when it comes to youth online safety? 

 

The challenges are many, but the chief one is the normalisation, driven by the ease of production, of fake images, videos, and audio. This goes well beyond fake news: the very idea that media can prove something happened is being undermined. Without the ability to fact-check or compare against offline sources, young users of social media may eventually be unable to distinguish what is real. This can lead to the belief that nothing can be trusted, undermining any sense of certainty and the basis of learning.

 

What is on your radar for youth online safety and youth wellbeing in 2024? 

 

There is no credible way back from AI-generated content. It is here to stay and will continue to evolve quickly. While we wait for overarching solutions, the imperative is to develop quick tools that help young people apply critical thinking and avoid being negatively affected. Equally, the mass use of chatbots indistinguishable from humans has to be flagged, to avoid biases, harms, or unrealistic expectations in users. Finally, continuous exposure to new information hampers the ability to reflect on what has happened and process it (or even remember it). This is the challenge we face in the short term.

 

What would be your advice for young people and their guardians when it comes to a safe and healthy use of social media?

 

We do not have to reinvent the wheel. Parents taught youngsters how to deal with TV or mobile phones, so, for those who can, it is imperative to walk youngsters through the use of new technologies. Second, keep an open channel of communication: young people should know they can talk about traumatic experiences with parents or guardians in a safe space. For instance, one piece of advice is not to dismiss social media as something incomprehensible and mysterious to older people, but to treat it as a new medium in continuity with the past. Critical thinking, not taking information at face value, and not benchmarking oneself against unrealistic stereotypes are as important today as they were in the past.

 

What role do you see policymakers playing in youth online safety? 

 

This is complex, but it is also high time for regulation. We see a deluge of local, sectoral, and voluntary regulations, and this is not sufficient. Only the EU (for Europe) has the power, weight, and expertise to craft consequential laws, and the UN should step up as well. Too often, smaller attempts are aspirational but ineffective, or even have unintended consequences. Hence, policymakers should focus on targeted and concerted regulations that look to the future and cater to all the stakeholders involved.

 

What role do you see for tech companies in continuing to better young people's experience on social media?

 

My main advice is for tech companies to disclose as much information as possible in all areas where no opacity is needed. When a new product or feature is released, it is always hard to assess what could go wrong. But a strong monitoring system (which I think should be independent and externalised within a European legal framework) is the first step towards avoiding bad outcomes. For instance, users could signal with a button when content distresses them and why, and platforms should explain changes to algorithms promptly and in plain terms to all users.

What is more, social media in particular has to move away from the attention economy, that is, fighting tooth and nail for every split second of attention (for instance, how many milliseconds you look at a video). This race to the bottom in grabbing attention does not benefit quality or business innovation, as the incentive is to create more stimulating content, not better content. It is also highly damaging for young users, as it rewires their brains in an addictive way to compulsively seek new stimuli. A 'slow internet' is not only possible: it is a huge market, as saturation of attention and fatigue affect users, and driving towards endlessly stimulating yet shallow content alienates content creators.

 

What challenges and opportunities do you see for EU politicians in better understanding youth and leveraging that understanding in regulation?

 

The opportunities are many. Tech companies are bringing about radical innovations of immense value, and innovation should not be stopped. But without guardrails, the attention economy of big data (not least due to the emergence of large language models) will suffocate good ideas, leading to monopolies and unchecked bad outcomes or practices. So EU politicians must work with the main stakeholders and develop clear and consequential regulation within the limits of the single market and the Digital Markets Act. They must also engage with programmers and hear their voices.

A main challenge is to keep the power asymmetry in check: lawmakers should not end up resorting to private regulation because they cannot keep pace with the complexity of technology or the economic size of platforms. This is needed not just for social justice or youth wellbeing, but also to ensure a lively market for ideas and products, one not biased by incentives to produce quick-and-dirty services that kill more innovative or useful ideas in the crib. Overall, the opportunity for EU politicians is to ensure that Europe becomes a leader in safe and useful technologies; the threat is to crystallise oligopolies and allow a market for pernicious tech products because responsibilities and chains of causality proved impossible to assess.


ThinkYoung is a not-for-profit organisation that aims to make the world a better place for young people by involving them in decision-making processes and providing decision-makers with high-quality research on youth conditions. ThinkYoung conducts studies and surveys, runs advocacy campaigns, writes policy proposals, and develops education programmes: to date, ThinkYoung's projects have reached over 800,000 young people.


Meta is a tech company with apps that you may know, like Instagram or WhatsApp. We work hard to build online spaces where young people can learn, connect, create, and have fun. We want young people to enjoy our platforms and to be safe, so creating spaces for young people to have their say on the future of platforms like ours is crucial.
