In July, the Secretary of State for Digital, Culture, Media and Sport, Jeremy Wright, announced that the implementation of section 14(1) of the Digital Economy Act 2017, colloquially known as the ‘Porn Block Law’, would be delayed for at least six months, with the British Board of Film Classification acting as the designated regulator. The aim of s14(1) is to stop Britons under the age of 18 from accessing pornographic material online. It does this by requiring age-gate systems at the point of entry to a site, ditching the older system of age verification, which relied on the person accessing the site simply declaring that they were over 18. Under the new law, anyone wanting to access a site where the majority of content is sexual must present a form of identification.

This means that erotica and porn sites will be affected, but so will fan-fiction sites and art sites that host nude art. This should cause concern, as 56% of Britons watch porn and 46% read erotica, with three-quarters of all men watching porn and around two-thirds of all women reading erotica.

The law itself is an Orwellian nightmare and one of the largest potential security risks for those living in the UK. It requires that porn distributors create their own method of checking a person’s age, and as a result a booming industry of age-checking services has arisen. Large porn distributors like MindGeek, whose subsidiaries include PornHub and YouPorn, have created their own versions of this service. This has led to fears of the monopolisation of the porn/erotica/sexual content market by a few big distributors with the means to pay for, or build, their own age-verification services.

Two of the services, AgeID (owned by MindGeek) and AgePass, require that you send them your passport, driving licence, or credit card information, which is then used to confirm your age and allow entry. Some other services require multiple forms of identification, such as a passport and a credit card. This will create mass databases of Britons’ personal and credit card information that are ripe for hackers to access, and a breach of such information is compromising enough to ruin a person’s financial future. The law has been derided by civil liberty groups for not doing enough to ensure user protection and for facilitating the push towards monopolisation. Large databases like these are frequent targets of hacking, as the breach of Singapore’s health service servers showed, and the potential for blackmail of high-profile persons following a breach is staggering.

The implementation delay came about because the government had not notified the European Commission and had failed to complete the prerequisite paperwork surrounding the regulation of a service. While Britain is a member of the EU it is subject to rules about restricting the free movement of services, and personal information of this sort is protected by the GDPR. The law must therefore be reviewed by the Commission before it is valid under EU rules. Although there is a possibility of the law being challenged at the EU level while the UK is still a member, it is doubtful that it will be stopped: Brexit is due to occur before the six months the government estimated has elapsed, and the EU permits some restrictions on services on the grounds of ‘public morality’, pornography having a long precedent as a regulated service.

The block can be circumvented through the use of a VPN, or the free Tor Browser, either of which masks a user’s location so that they appear to be entering the site from another country, rendering the verification wholly ineffective at stopping unchecked access from within the UK.

This all being said, the issue with the law is not that it is inherently bad to create methods of content curation in an effort to hide lewd content from younger eyes. Content curation of the web by the government is on some level necessary to ensure that heinous material is not spread with ease. Ensuring that snuff films are not readily available is a perfectly acceptable policy goal.

The issue is that this law ties into a broader history of successive governments acting wantonly with the security and privacy of Britons’ personal information. Successive governments have expressed a desire to strip away end-to-end encryption for all non-financial communications in order to enable government oversight, under the auspices of national security. Both the May and Johnson governments desired to create back doors to the encryption in messaging apps like WhatsApp, Snapchat and Facebook Messenger, allowing GCHQ to monitor private communications in search of terror threats; a terrifying prospect.

Nearly all communications can already be monitored by the government, and encrypted messaging services like WhatsApp were a last vestige of privacy away from constant government oversight. This is terrifying, as one Celtic fan sending another a copy of a pro-IRA chant, although distasteful, could classify as an act of support for a “proscribed organisation” – a designated terrorist organisation, which the IRA is under the Terrorism Act 2000 – an offence under the Terrorism Act 2006, and could lead to one’s arrest and prosecution for something so innocuous. And once a backdoor exists, it will inevitably allow third parties to breach the encryption as soon as they work out how to enter through it.

In this broader context the law looks less like a way to stop children from watching porn and more like another attempt by the government to create mass databases of personal information, and a further chipping away of digital autonomy in favour of government oversight. Databases like these should scare people. They are everywhere: companies like Facebook, Google, and Twitter track their users’ personal data, even in incognito mode, to sell to advertisers. Even these are not safe, as hacks of mass databases are an almost monthly occurrence; they are high-reward treasure chests for hackers.

People’s personal information and autonomy should be protected at all costs, not recklessly endangered to stop young people from looking at porn.