Today’s children are digital natives, but that does not necessarily mean they know how to use information and communications technology (ICT; more commonly: IT) correctly and responsibly.
Having grown up surrounded by technological devices and everyday internet use, today’s children learn to operate IT tools without any difficulty. However, as pointed out in a guide put together in Spain by the Observatorio de la Infancia (Children’s Agency) and the National Cybersecurity Institute (Spanish only), children and adolescents do not know how to navigate the digital environment safely, just as they do not in many other areas of their lives, because they are neither aware of the potential dangers nor do they know how to surf the web responsibly.
As a result, the past several years have seen the development of legislation and various political measures dedicated to protecting the rights of minors in the digital environment, as well as a number of initiatives, from both the European Union and individual states, aimed at educating and raising awareness among children in this area.
At the European level
In 2021, the European Commission published the EU Strategy on the Rights of the Child (2021–2024), devoting one of its sections to the digital and information society. In this section, it specifies the measures that the EU, the member states and IT companies must take or promote in order to ensure that children can safely navigate the digital environment and harness its opportunities.
Within the European Union’s legal framework for protecting children’s rights in this area, we find more directives than regulations: the EU leaves the vast majority of policy decisions on this subject in the hands of the member states, merely defining the objectives to be achieved. Specifically, there are four EU directives that concern the rights of children in the digital environment, one framework decision (which has the same policy implications as a directive) and one regulation, which has general scope and is binding on all EU member states. Children’s rights on the internet are therefore protected by the following legal framework:
- Directive 2011/93/EU on combating the sexual abuse and sexual exploitation of children and child pornography. This directive sets the minimum standards for defining criminal offences in this area and also introduces provisions for improving the prevention of these crimes and the protection of their victims.
- Council Framework Decision 2008/913/JHA on combating certain forms and expressions of racism and xenophobia. This decision sets out the obligations of the member states, the conduct that can be prosecuted, and the vulnerable people and groups concerned.
- Directive (EU) 2018/1808 on audiovisual media services. This directive establishes and ensures the proper functioning of the market for audiovisual media services in the European Union and sets standards for the proper protection of consumers and minors.
- Directive (EU) 2019/882 on accessibility requirements for products and services. This directive provides some general requirements regarding the accessibility of certain products and services, including in the digital environment.
- Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (the General Data Protection Regulation, or GDPR). This regulation grants a set of “digital rights” to EU citizens, and requires the member states, as well as all companies, both European and foreign, that handle the data of EU residents, to ensure these rights and comply with the regulation.
- Directive 2005/29/EC concerning unfair commercial practices. This directive is aimed at protecting consumers from unfair commercial practices. In the case of children, for example, it requires them to be protected from potentially aggressive advertising.
In order to demonstrate its commitment to children’s rights in the digital environment, the European Commission teamed up with Insafe – a European network of awareness centres promoting a safer and better use of the internet – and Inhope – an NGO that fights against child sexual abuse content on the internet – to launch the Better Internet for Kids platform. This website contains information about the legislation of 30 European countries (i.e. the EU member states, Norway, Iceland and the United Kingdom) that is dedicated to protecting children in the digital environment.
United Nations Convention on the Rights of the Child
In 2021, the UN Committee on the Rights of the Child also stressed the importance of states “protecting minors from harmful and unreliable content by means of creating legislative and administrative measures that impede any sort of violence, abuse, maltreatment or exploitation”, as explained in this article (Spanish only) by María del Pilar Tintoré, a lawyer who specialises in children’s rights in the digital environment.
General Comment No. 25 of the Committee on the Rights of the Child aims to “provide guidance on relevant legislative, policy and other measures to ensure full compliance with their obligations under the Convention and the Optional Protocols thereto in the light of the opportunities, risks and challenges in promoting, respecting, protecting and fulfilling all children’s rights in the digital environment”.
This comment from the UN specifies that member states’ policies must make progress in five areas in order to protect children’s rights in the digital environment. States must first of all ensure that children have the right to justice and reparation if their rights have been violated; they must also prohibit the exploitation of children for commercial purposes, ensure children’s right to privacy, reinforce safety and security measures to protect them from digital violence, and promote the creation of age-appropriate content that allows them to find information that is useful to them on the internet.
The individual states
The third report by the Better Internet for Kids (BIK) platform, published in 2020, analyses how each of the EU member states, Iceland, Norway and the United Kingdom is pursuing the strategy promoted by the platform to make the internet a safer and better place for children.
According to the data from the BIK report, half (50%) of the thirty states included in the study have national-level legislation or regulation in place directed at creating a safe environment for children on the internet (Pillar 3 of the strategy). In five countries (16.6%) there are non-regulatory policies, i.e. the state does not prohibit or require anything by law, but it can enact other types of policies aimed at creating a safe environment for children on the internet. In eight countries (26.6%), the policies in this area are part of broader policies, i.e. there are no specific policies. And, finally, in two of the countries (6.6%), there is no policy that addresses this topic.
Some of the topics included in the report that are used to rank the countries’ policies are: the existence of age-appropriate privacy settings, parental controls for online platforms, age and content ratings, and controls for online advertising.
In recent years, policies aimed at making the internet a safer environment for children have gained stronger legal status; as a result, a high number of countries (50%) currently have regulations or rules in place. According to the same report, this is due above all to the fact that EU member states have had to adapt their national legislation to comply with the Audiovisual Media Services Directive (2018), which deals specifically with increasing the availability and use of parental controls.
If we look at the protective and regulatory measures – including but not limited to laws – adopted by European countries, we can determine whether these measures were put into place in the past year, prior to that, or whether they exist at all. In the UK, Spain and Portugal, for example, there were no mechanisms in place before 2020 for reporting behaviour potentially harmful to minors, such as grooming or cyberbullying.
In some countries, such as Sweden, Hungary, Spain, Croatia and Bulgaria, there are no self-regulatory measures on the industry’s part concerning age-appropriate privacy settings. By contrast, in other countries, such as France, Ireland and Norway, they do exist. Countries like Ireland, France, Portugal and Lithuania have had, since 2020, some sort of mechanisms for improving the cooperation between the reporting channels – the police – and the industry or digital platforms in order to take down material containing any form of child abuse. And, in general, most of the countries were already promoting age and content ratings in one way or another before 2020.
The case of Spain
While Spanish legislation does not have a specific regulation on viral internet challenges, there are a number of laws that mention children’s digital rights. Furthermore, with the new law on the protection of children and adolescents from violence (Spanish only) put forward in 2021, a few changes have been made to the regulation which could protect minors from some harmful viral challenges.
Rodolfo Tesone, a lawyer specialising in digital transformation issues and one of the 15 members of the group of experts formed to draft the Spanish Charter of Digital Rights (Spanish only), states that the regulation is vague and that there is no single body that regulates digital matters. The penal code provides the most protection in that regard, but it has few applications in the digital sphere, says Tesone. Nor is there comprehensive regulation on access to content, since it depends predominantly on the very digital platforms that spread the content. This “opens up the debate on freedom of expression versus filtering/censorship, although it is not up to these platforms – private companies – to defend or limit citizens’ liberties”, Tesone explains.
In this connection, the recent law on the protection of children from violence (8/2021 [Spanish only]) introduces a number of recommendations with the goal of creating “safe digital environments”. According to this law, public institutions need to foster their collaboration with the private sector in order to standardise the age-rating system and “smart labelling of digital content” (article 46).
In Spain, as in the rest of the European Union, a hybrid model of state regulation and companies’ self-regulation is used to monitor the content that gets published. However, as Tesone points out, one of the problems encountered in regulating digital platforms is that content is only reviewed after it has been published. So once content has gone viral, even if a platform later takes it down because it was identified as potentially harmful to children, the “harm” has already been done. “If we think about this from a child protection perspective, digital platforms also need to have mechanisms in place for checking content before it is published”, Tesone explains. But, he continues, companies do not always do so, whether due to a lack of resources, a lack of interest or simply because it is not their duty to protect individual rights.
Ultimately, children’s access to online content depends in most instances on parental supervision. And when parents are not digitally literate or aware of the risks, they may not be supervising their children in an appropriate or sufficiently deliberate way. Therefore, in the interest of preserving children’s fundamental rights, article 84 of the organic law on the protection of personal data and guarantee of digital rights (3/2018 [Spanish only]) calls on the parents or guardians of minors to ensure that their children use the internet and other digital services in a balanced and responsible manner. The same article also states that the Public Prosecutor’s Office (Spanish: Ministerio Fiscal) shall intervene if images of minors or minors’ personal data are used against their rights.
As for crimes committed in the digital environment that could affect minors, in 2015, as per Organic Law 13/2015, crimes committed via technological means were amended and added to the penal code. According to Pilar Tintoré, a lawyer specialising in children’s and adolescents’ rights, this update has helped broaden the scope of the law to include crimes that were difficult to pin down before. The recent law protecting minors from violence also incorporated a number of criminal offences into the penal code. For example, the new article 143 bis of the penal code states that the public dissemination or spread of content via the internet, telephone or any other technological means that could promote, foster or incite the suicide of minors shall be punishable by imprisonment for one to four years.
Underage children’s access to digital platforms: Spain
In Spain, the minimum age requirement to sign up for a social network is 14 years, according to the law on the protection of personal data and guarantee of digital rights (3/2018 [Spanish only]). The European regulation on the protection of personal data, meanwhile, states that the processing of a child’s personal data is only considered lawful if the child is 16 years or older, although it allows member states to lower that threshold by law to no less than 13 (article 8). Below the applicable age, signing up for a social network requires the consent of the child’s parents or guardians. So the two laws set different default thresholds and, on top of that, companies are subject to the law of the country where they are registered. All of that means the minimum age to join a social network can vary from platform to platform.
Instagram and Facebook require users to be at least 14 years old to sign up; in the case of TikTok it is age 13. Underage children need consent from their parent or guardian in order to sign up. But aside from the minimum age requirement, there is the issue of the verification method the platforms use to determine whether the people requesting to sign up are actually the age they claim to be. This issue is pointed out in a study by Laura Davara (Spanish only) published by the Spanish Data Protection Agency. Tesone supports this argument, stating that, since the consent is given online, “platforms do not have a substantive mechanism for properly verifying the users’ age”.
To tackle this problem, the Spanish social network Tuenti pioneered a double-verification method – both online and offline – for signing up minors. Users had to provide a copy of their ID card to validate their age and, thanks to this mechanism, the platform managed to ensure that users met its age and parental consent requirements. Davara’s study includes a statement from Tuenti’s own executives that over 90% of the users who had been asked to verify that they were 14 or older by providing their ID did not respond to the request and, therefore, had their accounts blocked. However, Tuenti no longer exists, and other online platforms have not adopted similar mechanisms.