Dr Rachel O’Connell
2016 is a pivotal year for the future of the Internet and its governance, as Internet-related issues are high on the political agenda, with the principal concerns being human rights, privacy, security and countering terrorism. This year brings opportunities to shape a model of Internet governance that promotes children’s rights; UNICEF Innocenti Research identifies one in three Internet users as children, with a higher proportion in developing countries, where most Internet growth is happening. This post examines how both the European Commission and the UK Government have proposed legislation designed to ensure that online businesses respect children’s rights online and limit access to content that may be injurious to a child’s wellbeing.
The first post in this series will examine both the UK’s proposed Digital Economy Bill and the revisions to the Audio Visual Media Services Directive (AVMSD) through the lens of recent technology and policy innovations. These include both the emergence of globally streamlined age classification systems and the capability to conduct pseudonymous, affordable, privacy-preserving, secure and reliable age-related eligibility checks in compliance with the provisions of the General Data Protection Regulation. Age-related eligibility checks enable a business to query, for example, “Is this person under 16 years of age?”, which elicits a Yes/No response. These developments remove many of the technical, legal and financial barriers that to date have impeded businesses’ efforts to better protect children from exposure to harmful content.
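A minimal sketch of such a check may make the idea concrete. The service below is hypothetical (the class name, token scheme and API are assumptions, not any real provider’s interface): it holds a verified date of birth against a pseudonymous token and answers only yes/no eligibility queries, so the relying party never learns the birth date itself.

```python
from datetime import date

# Hypothetical age-check service: holds verified dates of birth and answers
# only Yes/No eligibility queries, never revealing the date of birth itself.
class AgeCheckService:
    def __init__(self, verified_dobs):
        # pseudonymous token -> verified date of birth (held only by the provider)
        self._dobs = verified_dobs

    def is_under(self, token, age_limit, on=None):
        """Answer 'Is this person under <age_limit>?' with a bare boolean."""
        on = on or date.today()
        dob = self._dobs[token]
        # Subtract one if the birthday has not yet occurred this year.
        age = on.year - dob.year - ((on.month, on.day) < (dob.month, dob.day))
        return age < age_limit

service = AgeCheckService({"user-ab12": date(2004, 5, 17)})
# The relying party learns only the boolean, not the birth date.
print(service.is_under("user-ab12", 16, on=date(2016, 9, 1)))  # True: aged 12
```

The design point is data minimisation: the query returns a single attribute assertion (under 16 or not) rather than the identity data behind it, which is what lets such checks be both privacy-preserving and GDPR-compatible.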
Enforcement methods and legal remedies will be examined in Part 2 of this series, and Part 3 will explore the tax implications associated with the proposed revisions to the AVMSD.
Balancing human rights
To date, debates about the accessibility of both adult content and other types of content deemed injurious or harmful to children online have pivoted on John Stuart Mill’s views on the right to freedom of expression, which are reflected in Article 10 of the European Convention on Human Rights (ECHR). However, as the ECHR stipulates, the right to freedom of expression is not absolute. What we are witnessing in a raft of legislative measures, which include the new General Data Protection Regulation, the proposed UK Digital Economy Bill and the revisions to the Audio Visual Media Services Directive, is the recognition of the need to balance Article 10 of the ECHR against Article 17 of the UN Convention on the Rights of the Child (UNCRC), which states:
“States Parties recognize the important function performed by the mass media and… encourage the development of appropriate guidelines for the protection of the child from information and material injurious to his or her well-being.”
The UNCRC’s monitoring body, the Committee on the Rights of the Child, comprises a group of experts that expects to preside over a complaints mechanism, through which children and the adults responsible for their wellbeing can bring violations of their rights to international attention. This will be explored in more depth in Part 2.
Legislative changes designed to protect children online
Current EU legislation adheres to the country of origin (COO) principle, which states that each provider of audiovisual media services comes under the jurisdiction of only one member state. However, both the proposed UK Digital Economy Bill and the revised EU Audio Visual Media Services Directive (AVMSD) seek to create an exception to the COO principle for child protection purposes. The basic prohibition outlined in the UK’s Digital Economy Bill is:
“A person must not make pornographic material available on the Internet on a commercial basis to persons in the United Kingdom except in a way that secures, at any given time, the material is not normally accessible to persons under the age of 18.”
The Bill stipulates that this new regime extends to both domestic and extraterritorial online pornography providers that serve a UK audience. It requires adult content providers to conduct age-checks on UK users accessing content, or be deemed to be operating in contravention of UK law. The UK Government’s view is that bringing a criminal case against a platform provider would be disproportionate, and that a civil remedy, in concert with self-regulatory measures, will be equally effective. The date for MPs to discuss the Bill at a Second Reading is yet to be announced.
The UK Government’s approach aligns with that of the European Commission, which in its review of the Audio Visual Media Services Directive states that it intends to bring audio visual content services, including both EU and non-EU online platform providers, within the scope of EU regulation, as outlined in an explanatory memorandum:
“It is appropriate to ensure that the same rules apply to video-sharing platform providers which are not established in a Member State with a view to safeguarding the effectiveness of the measures to protect minors and citizens set out in this Directive and ensuring a level playing field in as much as possible.”
The exception to the COO rule would require content providers to comply with the age classifications that are applicable in the country of destination (COD), i.e. in the member state(s) where the content is consumed. The COD approach is controversial because it would extend EU law to non-EU based businesses that serve content to EU citizens, and it raises significant challenges in terms of enforcement. It is worth noting that the General Data Protection Regulation (GDPR) has set a significant precedent on the extraterritorial application of EU law.
The European Regulators Group for Audio Visual Media Services (ERGA), which is composed of representatives of independent national regulatory authorities (NRAs), reported that a proportion of NRAs (excluding Ofcom) favoured changes to the COO approach in one or more specific areas, including the protection of minors. The ERGA will facilitate further in-depth discussions on possible variations to the country of origin approach.
Globally streamlined age classification systems
As a result of cultural and legal specificities, countries apply differing age ratings to content; however, this is no longer necessarily a barrier to the effective operation of a Country of Destination approach. Administered by many of the world’s game rating authorities, the International Age Rating Coalition (IARC) provides a globally streamlined age classification process for digital games and mobile apps, helping to ensure that today’s digital consumers have consistent access to established and trusted age ratings across game devices.
IARC requires developers to complete a questionnaire that is programmed with unique algorithms that generate ratings reflecting each participating rating authority’s distinct standards, along with a generic rating for the rest of the world. IARC rating assignments also include content descriptors and interactive elements identifying apps that collect and share location or personal information, enable user interaction, share user-generated content, and/or offer in-app digital purchases.
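The questionnaire-to-rating process can be pictured as one set of answers mapped through each authority’s own rule table. The sketch below is purely illustrative: the categories, thresholds and rating labels are assumptions for demonstration, not IARC’s actual (proprietary) algorithms.

```python
# Illustrative only: the real IARC algorithms and thresholds belong to the
# participating rating authorities; these tables are invented for the sketch.
ANSWERS = {"violence": "mild", "gambling": False, "shares_location": True}

# Each authority maps the same questionnaire answers onto its own scale,
# plus a generic rating for the rest of the world.
AUTHORITY_RULES = {
    "PEGI":    {"none": "PEGI 3", "mild": "PEGI 7", "strong": "PEGI 18"},
    "ESRB":    {"none": "E",      "mild": "E10+",   "strong": "M"},
    "GENERIC": {"none": "3+",     "mild": "7+",     "strong": "18+"},
}

def rate(answers, rules):
    """Derive one rating per authority, plus interactive-element descriptors."""
    level = answers.get("violence", "none")
    ratings = {authority: table[level] for authority, table in rules.items()}
    # Interactive-element descriptors are flagged independently of the age rating.
    descriptors = [k for k in ("shares_location", "gambling") if answers.get(k)]
    return ratings, descriptors

ratings, descriptors = rate(ANSWERS, AUTHORITY_RULES)
print(ratings["PEGI"], descriptors)  # PEGI 7 ['shares_location']
```

The point of the design is that a developer answers the questionnaire once, while each participating authority’s distinct standards are applied automatically on top of the same answers.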
The IARC system currently includes European rating authorities and a proportion of the world’s other rating authorities, which collectively represent regions serving approximately 1.5 billion people, with more expected to participate in the future. Similarly, the BBFC and the Dutch regulator NICAM recently received a Comenius EduMedia Seal of Approval for ‘You Rate It’, an international tool for the classification of User Generated Content (UGC).
Secure eID and age-related eligibility checks
On 1 July, the new EU regulation for electronic identification and e-signatures (the “eIDAS Regulation”) came into effect, helping to pave the way for global adoption of secure eID and, by extension, age-related eligibility checking services. Secure eID is increasingly recognised as a key enabler of the “digital economy”, and electronic identification is defined in the eIDAS Regulation as:
“the process of using personal identification data in an electronic form uniquely representing either a natural or legal person or a natural person representing a legal person”
The technical architecture, standards, legal framework and policies that underpin secure eID also enable one or more attributes of a person’s identity to be checked, for example, an individual’s age-related eligibility, which is a right of access to goods and services based on age or age band. An age check service provider is an organisation responsible for all the processes associated with establishing and maintaining a subject’s identity attributes; it provides assertions of those attributes to individuals, other providers and relying parties.
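Such an attribute assertion can be sketched as a signed claim that a relying party can verify. The field names and the HMAC scheme below are illustrative assumptions (real eID schemes typically use public-key signatures and standardised assertion formats), but the shape of the exchange is the same: the provider asserts a single attribute, and the relying party checks the signature rather than the underlying identity data.

```python
import hashlib
import hmac
import json

# Illustrative only: a real provider would sign with a private key, not a
# shared secret, and would use a standardised assertion format.
SECRET = b"provider-signing-key"

def issue_assertion(subject_token, attribute, value):
    """Sign a bare attribute claim (e.g. over_18=True) for a relying party."""
    claim = {"sub": subject_token, "attr": attribute, "value": value}
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return claim, signature

def verify_assertion(claim, signature):
    """A relying party checks the signature; any tampering invalidates it."""
    payload = json.dumps(claim, sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature, expected)

claim, sig = issue_assertion("user-ab12", "over_18", False)
print(verify_assertion(claim, sig))  # True
```

Note that the claim carries a pseudonymous subject token and a single attribute value, which is what allows the check to be reliable for the relying party while remaining privacy-preserving for the individual.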
In effect, it is now possible to conduct pseudonymous, affordable, privacy-preserving, secure and reliable age-related eligibility checks online. Sponsored by the Digital Policy Alliance (DPA), the British Standards Institution is facilitating the development of Publicly Available Specification (PAS) 1296, an age checking code of practice. The PAS gives recommendations for public-facing service providers’ implementation of tools to check that the age-related eligibility data provided by age checking service providers for each online user is acceptable for the websites that they are accessing. It is highly relevant to online companies that handle European citizens’ data and wish to operate in compliance with the GDPR, the eIDAS Regulation and the AVMSD.
In a recent press release on the proposed revisions to the Audio Visual Media Services Directive, the Commission encourages content platform providers to explore the scope to leverage secure eID, to conduct age-checks, and thereby limit children’s access to harmful content online. Commissioner Oettinger announced the formation of a multi-stakeholder group entitled the Alliance for Child Protection that will examine how businesses can leverage secure eID to enhance the e-safety of children online.
Globally streamlined age classification systems and the emergence of age-checking services that adhere to internationally recognised technical standards and operate within a predictable regulatory environment create opportunities for the Commission, in concert with businesses, to realise a major strategic objective: creating a better Internet for children. Both the UK Government and the European Commission advocate self-regulatory mechanisms as the most appropriate form of regulating the Internet and mobile technologies. Those businesses that have not yet engaged in technical and policy discussions on secure eID should avail themselves of the opportunities to participate.
Businesses that do not pay sufficient attention to the rights of the child, or to the changing European regulatory, legal and technical environment, may be held to account in the UK Supreme Court, the European Court of Human Rights or the European Court of Justice.
Part 2 of this series explores the proposed enforcement methods outlined in the Digital Economy Bill.