A year ago, a group of citizens launched a petition calling for a ban on the use of social networks by minors under 16; it collected more than 12,000 signatures and was delivered to parliament, where it remains “under consideration”.

In December last year, a new petition emerged calling for the virtual world to be regulated and for children up to the age of 16 to be barred from accessing social media. The idea was to create “a safer and healthier environment for the growth and development” of young people and, in just a few months, it gained thousands of supporters. In April it arrived at the Assembly of the Republic (AR).

The petitioners ask for a model similar to “the one being implemented in Australia”, a country that in December is set to go down in history as the first in the world to ban the use of social networks by children under 16.

The measure is among the most comprehensive ever taken by an Australian government to address concerns about social media use among teenagers and children. The idea is that platforms such as TikTok, Facebook and Reddit will be required to prevent children under 16 from holding accounts.

A study commissioned by the Australian government revealed that 96% of children aged 10 to 15 use social media and that seven in 10 had already been exposed to harmful content or behavior, which can promote eating disorders and even suicide. Furthermore, one in seven children surveyed reported having been the target of harassment by adults or older children, and more than half said they had suffered cyberbullying.

“Algorithms create an environment of intense social comparison and teenagers are often exposed to images and stories that portray apparently perfect lives, which can lead to feelings of social inadequacy and low self-esteem”, warns the petition that is being analyzed by the Parliament’s Committee on Constitutional Affairs, Rights, Freedoms and Guarantees, according to the AR website.

The problem has gained increasing prominence, and the European Parliament is also expected to debate a report this Tuesday calling for faster enforcement of the European Union (EU) Digital Services Act and Artificial Intelligence Act, closing legal gaps and making services safer for minors. MEPs will vote on Wednesday on the report, which also sets 16 as the minimum age to access social media.

The idea echoes the Portuguese petition, which argues in its text that, just as it is prohibited to sell alcohol and tobacco to children “in the real world”, access to social networks should also be restricted.

The petitioners warn that excessive use of social networks is “associated with an increase in mental illnesses in adolescents and pre-adolescents, such as anxiety and depression”, and accuse social networks such as Facebook, Instagram and TikTok of lacking mechanisms to enforce the existing ban on use by children under 13.

Concern about the effects of social media use led the Ministry of Education to ban smartphones in schools up to the 6th year, with the possibility of extending the ban to older students on the table.

European Parliament wants to evaluate banning access to social networks for children under 13, with supervision up to 16

The European Parliament will discuss a report that recommends a ban on the use of social media by children under 13 and a minimum age limit of 16, even though this limitation is far from reaching consensus.

MEPs will discuss, during this Tuesday’s plenary session in Strasbourg, France, a report which finds that social networks are doing little to protect children from content that could be harmful to them.

Therefore, the report to be voted on Wednesday recommends that all European Union (EU) countries ban the use of social networks by children under 13.

For minors between 13 and 16, use would have to be subject to parental authorization and supervision.

The report, authored by Danish socialist MEP Christel Schaldemose and approved on October 16, finds that the main digital platforms have failed to protect minors, not only from the content they consume, but also from features that can create dependency, cause health problems and expose them to illegal content.

Calling for concrete enforcement of the legislation these platforms must comply with, the report also laments the lack of harmonization of digital rules across the 27 countries of the bloc.

However, the position is far from generating consensus among the political groups represented in the Brussels and Strasbourg hemicycles.

Socialists, for example, are in favor of banning social networks for children under 13 and tightening access rules for minors in general, while conservatives prefer to focus on prevention and on parents’ supervisory role.

Even within the political groups themselves, positions have yet to reach consensus.

At the beginning of October, the European Commission sent requests for information to the technology platforms Apple, Google, Snapchat and YouTube to check whether they are protecting minors on their online platforms in the EU, because “this does not always happen”.

“We sent requests for information to four online platforms — Snapchat, YouTube, App Store and Google Play — to evaluate the practices they are adopting to protect minors”, said the European Commissioner for Digital, Henna Virkkunen, on arrival at an informal ministerial meeting in Horsens, Denmark, under the Danish EU presidency.

“It is necessary to ensure that the content our children access on the internet is appropriate for their age”, which is also why the EU now has the Digital Services Act, which “clearly establishes that, when minors use online services, very high levels of privacy, security and protection must be ensured”, the Commissioner said.

“Unfortunately, this does not always happen, and that is why the European Commission is stepping up enforcement of its rules”, said Henna Virkkunen.

These actions do not yet constitute formal investigations, but rather requests for clarification from the platforms in question under the new legislation.

In August 2024, the EU became the first jurisdiction in the world with rules for digital platforms, which are now obliged to remove illegal and harmful content under the new Digital Services Act.

To this end, the European Commission designated as very large online platforms those with at least 45 million monthly active users, which must now comply with the new rules.

The law was created to protect the fundamental rights of online users in the EU and is unprecedented legislation for the digital space, holding platforms responsible for harmful content, including disinformation and inappropriate material.

Technology companies that fail to comply with these new laws may face fines proportional to their size, of up to 6% of their global annual turnover.

Last September, in her State of the Union speech, the president of the European Commission, Ursula von der Leyen, raised the possibility of banning social networks for children in the EU, with a study of such a limitation due by the end of the year.

“Just as in my time we, as a society, taught our children that they could not smoke, drink or watch adult content until a certain age, I believe it is time to consider doing the same with social media”, said Ursula von der Leyen, at a time when countries such as Denmark, Greece, France and Spain, among others, are evaluating how to limit such use by minors.
