Saturday, February 27, 2016

Consultation response to "Keeping Children Safe in Education: proposed changes"

Profs. Ian Brown and Douwe Korff, February 2016

Introduction
1.              We have only just learned of this consultation; we apologise for the lateness of our response. The comments below were, as a result, written quickly and reflect our main concerns.

2.              We were both co-authors (with others) of the 2006 report “Children’s Databases – Safety and Privacy”, written by the Foundation for Information Policy Research (FIPR) for the Information Commissioner and available here:

We believe the report – although written at a different time and in a different context – contains many observations that are relevant to this consultation, and we incorporate it here by reference.

3.              We will focus on the proposed duty of schools and other educational establishments for under-18s to monitor online activities by students, as set out in paragraph 75 of the Draft Statutory Guidelines, as follows:

As schools and colleges increasingly work online it is essential that children are safeguarded from potentially harmful and inappropriate online material. As such governing bodies and proprietors should ensure appropriate filters and appropriate monitoring systems are in place. Children should not be able to access harmful or inappropriate material from the school or colleges IT system. Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online. Guidance on e-security is available from the National Education Network- NEN. Guidance on procuring appropriate ICT is available at: Buying ICT advice for schools.

4.              We fear in particular that the above-mentioned duty of schools to have “systems ... in place that will identify children accessing or trying to access harmful and inappropriate content online” will be read by many school “governing bodies and proprietors” as requiring them to monitor the online activities of their students continuously and in detail. More specifically, we are concerned that schools will try to obtain filtering and monitoring software that will not only prevent children and young people from accessing important information, e.g. on sexual health and gender issues, or on religious or political matters – including contentious contemporary issues such as terrorism and jihadism – but that will also automatically detect and single out individual students deemed by the software – that is, by an algorithm – to be in some sense deviant.

Main issues
5.              The three most important general issues identified in the 2006 FIPR report were:
I.               Children have human rights – including a right to privacy and to seek, receive and impart information without undue interference;
II.             There are serious dangers in conflating “safeguarding” children with “promoting the welfare” of children that can lead to breaches of their rights; and
III.           There are serious dangers inherent in trying to predict and prevent “bad” outcomes for children, especially if this is done on the basis of profiling, data mining and what is now called “algorithmic decision-making”, that can lead to further breaches of their rights, without effective remedies.

Brief elaborations on the main issues

I.              Children have human rights – including a right to privacy and to seek, receive and impart information without undue interference

6.              The UN Convention on the Rights of the Child (CRC) was adopted as long ago as 1989 and has been in force since 1990; the UK signed it in that year and ratified it in December 1991. Although “the child, by reason of his physical and mental immaturity, needs special safeguards and care”, this should not lead to undue interference with the child’s privacy or other rights and freedoms: see in particular Articles 13–17 CRC. Measures that intrude on a child’s rights and freedoms must serve a legitimate aim and must be necessary and proportionate to the achievement of that aim.

7.              The assessments of “legitimate aim”, “necessity” and “proportionality” will vary depending on the nature of the aim, the intrusiveness of the interference and – in relation to children (defined in both the CRC and the Draft Statutory Guidelines as anyone under the age of 18) – the level of maturity of the child. Interference that may be legitimate and proportionate if applied to a 5- or 10-year-old may be unjustified and disproportionate if applied to a 16- or 17-year-old.

8.              In this, it should be borne in mind that ubiquitous monitoring of a person’s online activities constitutes a very serious interference with that person’s private life and with his or her data protection rights. As the Court of Justice of the EU (CJEU) put it in an important recent judgment:

legislation permitting the public authorities to have access on a generalised basis to the content of electronic communications must be regarded as compromising the essence of the fundamental right to respect for private life, as guaranteed by Article 7 of the [EU Charter of Fundamental Rights].
(CJEU judgment in Schrems, C‑362/14, para. 94, with reference to Digital Rights Ireland and Others, C‑293/12 and C‑594/12, para. 39; emphasis added).
The reference to such actions “compromising the essence of the fundamental right” means that such “generalised” access or surveillance can never be justified: monitoring of a person’s electronic communications – which include online browsing – must always be targeted, on the basis of objective criteria indicating a need for such intrusion (cf. para. 91 of the Schrems judgment).

9.              The above dictum is as true with regard to children as it is in relation to adults. It may well be easier to justify the monitoring of a child’s online activities than the surveillance of an adult; the threshold may be lower – but a threshold there must be. For the state to authorise – nay, to demand[1] – the ubiquitous, suspicionless, untargeted monitoring (possibly by automated means) of all the online activities of a child in all educational environments compromises the essence of the child’s rights to private life, (online) association and freedom of expression (which includes the right to seek, receive and impart information and ideas without interference by public authority and regardless of frontiers).

II.            There are serious dangers in conflating “safeguarding” children with “promoting the welfare” of children that can lead to breaches of their rights

10.           The 2006 FIPR report stressed that:

It is important to be clear about the distinction between the government’s broad policy goal of ‘safeguarding children’ and the narrower focus of ‘child protection’, since they pose different data protection issues.
‘Safeguarding’ covers all the problems of childhood and is defined by the government as:
The process of protecting children from abuse or neglect, preventing impairment of their health and development, and ensuring that they are growing up in circumstances consistent with the provision of safe and effective care which is undertaken so as to enable children to have optimum life chances and enter adulthood successfully.
This comes from a standard DfES reference, which was the subject of extensive consultation, and which also gives the following definition for child protection:
“The process of protecting individual children identified as either suffering, or at risk of suffering, significant harm as a result of abuse or neglect”

11.           The report accepted that intrusive measures such as broad data sharing are not just justified but essential for child protection in this narrow sense – but argued strongly that the same does not hold true when it comes to “preventing problems from developing” in a much looser sense. Exactly the same reasoning applies to intrusive, ubiquitous monitoring of young people’s online activities. If there are objective indications that a child or young person is at real risk of being drawn into crime, violence or “jihadism”, surveillance and interventions by educators, social workers and, in serious cases, the police may be justified. But that does not mean that children and young people should, without prior suspicion, be ubiquitously monitored for signs that they might be tempted into “extremism” or other bad behaviour (or even thoughts).

12.           We discern the same erroneous conflation of issues in the Draft Statutory Guidelines, which state that:

Protecting children from the risk of radicalisation should be seen as part of schools’ wider safeguarding duties, and is similar in nature to protecting children from other forms of harm and abuse. During the process of radicalisation it is possible to intervene to prevent vulnerable people being radicalised. (para. 51)

13.           In fact, there is a fundamental difference between noting signs of actual (physical or mental) harm or abuse in a child and trying to identify whether a child or young adult is “at risk” of becoming “radicalised”, especially when “radicalisation” and “extremism” are defined as broadly as this:

Radicalisation refers to the process by which a person comes to support terrorism and [other?] forms of extremism. (para. 52, emphasis added)
Extremism is vocal or active opposition to fundamental British values, including democracy, the rule of law, individual liberty and mutual respect and tolerance of different faiths and beliefs. We also include in our definition of extremism calls for the death of members of our armed forces, whether in this country or overseas. (footnote 13, emphasis added)

14.           Whatever exactly may be meant by “vocal or [note the ‘or’!] active opposition to fundamental British values” – it is clear that what is addressed here goes well beyond what is criminal under the law.

15.           In its seminal Handyside judgment, the European Court of Human Rights said, as long ago as 1976:

Freedom of expression constitutes one of the essential foundations of [a democratic] society, one of the basic conditions for its progress and for the development of every man. Subject to [the specified exceptions], it is applicable not only to "information" or "ideas" that are favourably received or regarded as inoffensive or as a matter of indifference, but also to those that offend, shock or disturb the State or any sector of the population. Such are the demands of that pluralism, tolerance and broadmindedness without which there is no "democratic society". This means, amongst other things, that every "formality", "condition", "restriction" or "penalty" imposed in this sphere must be proportionate to the legitimate aim pursued. (para. 49, emphasis added)

16.           As already noted, the imposition of certain interferences with freedom of expression and the freedom to seek, receive and impart information may be justified in relation to young children when they would not be justified in relation to adults. However, we believe this cannot be stretched to the extent that children – all children under the age of 18 – must be prevented from looking for, discussing or even disseminating anything that “opposes fundamental British values”. Schools and other educational establishments should be places of learning, discovery and exploration – including learning about, discovering and even exploring information and ideas that “offend, shock or disturb” the British State or British mainstream society.

17.           In our view, the preventative, ubiquitous monitoring of young people’s online behaviour – without clear prior evidence of serious danger to them, and in order to spot signs not of criminal conduct but of conduct that is merely societally frowned upon – is in fundamental breach of their human rights.

III.          There are serious dangers inherent in trying to predict and prevent “bad” outcomes for children, especially if this is done on the basis of profiling, data mining and what is now called “algorithmic decision-making”, that can lead to further breaches of their rights, without effective remedies.

18.           Using data mining/profiling software tools to seek out “possible” or “probable” targets from large datasets (such as the browsing records of all students at an establishment) is fraught with danger – in particular when the aim is to find rare targets, since such tools will then inevitably produce many “false positives” or “false negatives” (or, most likely, both); the base-rate arithmetic is illustrated in the short sketch following point (ii) below. We have both written about this in many publications. A quite detailed write-up of the issues is contained in a report one of us wrote with a French colleague in 2015.[2] Here, it may suffice to note two clear and present dangers:

(i)            Profiling tools are extremely likely to lead to “discrimination by computer”. The use of software to try to identify students supposedly “at risk” of becoming “radicalised” (in the sense of being deemed drawn to ideas “opposed to fundamental British values”) will undoubtedly lead to the singling out of many individual students who have committed no criminal offences and most probably would never go on to commit any – but who will forever be stigmatised by an official label of being “anti-British” or “extreme”.

(ii)          It is becoming increasingly difficult, if not impossible, to challenge the outcomes of such “algorithmic decision-making”, even when it is applied to more-or-less verifiable matters (such as whether a person actually attended a terrorist training camp).[3] When the label is as opaque as the definition of “extremism” used in the Draft Statutory Guidelines, matters become worse still. How can a child prove that she was only “exploring” notions and ideas that run counter to mainstream ideologies, rather than “supporting” them?
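
The base-rate problem referred to in paragraph 18 can be made concrete with a minimal sketch. The figures below are entirely hypothetical – they are not drawn from any actual monitoring product – and deliberately assume a system far more accurate than any we are aware of:

# Illustrative base-rate calculation (all figures hypothetical).
# Suppose 1 student in 10,000 is a genuine cause for concern, and the
# monitoring software flags 99% of genuine cases (sensitivity) while
# wrongly flagging only 1% of the remaining students (false positives).

base_rate = 1 / 10_000      # P(genuine case)
sensitivity = 0.99          # P(flagged | genuine case)
false_positive_rate = 0.01  # P(flagged | not a genuine case)

# Bayes' theorem: P(genuine case | flagged)
p_flagged = base_rate * sensitivity + (1 - base_rate) * false_positive_rate
p_genuine_given_flag = (base_rate * sensitivity) / p_flagged
print(f"P(genuine case | flagged) = {p_genuine_given_flag:.1%}")  # roughly 1.0%

Even under these implausibly favourable assumptions, about 99 of every 100 students flagged would be innocent; with more realistic error rates, the ratio worsens further.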

19.           We fear that the Draft Statutory Guidelines are a fundamentally flawed attempt to counter bad ideas, or even to prevent children from being attracted to them. The measures proposed – and the ubiquitous surveillance they imply – will on the contrary alienate those already disenchanted with our society, and may drive some of them not merely to bad thoughts but to bad actions.

20.           The Guidelines are also wide open to challenge in the European courts.


[1] As the Draft Statutory Guidance makes clear, schools and colleges must comply with it (unless [unspecified] exceptional circumstances arise) (p. 3).
[2] Marie Georges & Douwe Korff, Passenger Name Records, data mining & data protection: the need for strong safeguards, report prepared for the Consultative Committee of the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (T-PD) of the Council of Europe, available at:
See in particular section I.iii (p. 22ff) on “The dangers inherent in data mining and profiling”.
[3] See the report mentioned in the previous footnote, in particular the sub-section on “The increasing unchallengeability of profiles - and of decisions based on profiles”, p. 28ff (with references).
