
Is Facebook committing criminal offences?

17/4/2017

Last week, the Times carried a disturbing headline, “Facebook publishing child porn”, followed by a lengthy article illustrating the kinds of images that journalists found on Facebook. There are also references to Facebook groups exchanging child abuse imagery. Later in the same edition of the newspaper, there is a leading article entitled “Face Facts”. It makes the point that if a newspaper were to publish cartoons of children performing sex acts, it would be the subject of outrage and could well end up in court. The Times says that Facebook has apparently been hosting such images with impunity, even after those images have been identified and reported to it. In addition, it is alleged that the company has been hosting images and news put out by terrorists. Julian Knowles QC, a barrister at Matrix Chambers, has said that it is “very strongly arguable” that some of this content is illegal. What is suggested is that Facebook is at risk of committing criminal offences in circumstances where it has been made aware of such images on its website but has failed to remove them. Consequently, the company might be regarded as “assisting and encouraging” the publication of child pornography.

However, Facebook did remove the offending imagery after being contacted by the newspaper, and currently there are no investigations into the company itself. Plainly the police and the Crown Prosecution Service do not regard companies such as Facebook as committing any kind of criminal offence.

So, what is the law relating to child pornography, and is the charge against Facebook fair?

First of all, we need to remind ourselves that there are appropriate ways of describing this ghastly crime, one of which is child abuse imagery. The Internet Watch Foundation (of which Facebook is a member) exists to root out child sexual abuse on the internet. They put out the following statement on their website:- 
“We use the term child sexual abuse to reflect the gravity of the images and videos we deal with. Child pornography, child porn and kiddie porn are not acceptable descriptions. A child cannot consent to their own abuse.” 

The law for the protection of children in this area is a great deal older than might first be thought. The Obscene Publications Act 1857, and its successors the 1959 and 1964 Acts, may have had as their primary purpose the protection of the public from any material that might “deprave and corrupt”, but they could be, and still are, applied to child abuse imagery offences. The original 1959 and 1964 Acts have been updated to deal with internet offences and they are still used in prosecutions. See Section 168 of the Criminal Justice and Public Order Act 1994, which amends section 1(3) of the 1959 Act.

What precisely does “obscene” mean? Section 1(1) of the 1959 Act says:-
“(1)     For the purposes of this Act an article shall be deemed to be obscene if its effect or (where the article comprises two or more distinct items) the effect of any one of its items is, if taken as a whole, such as to tend to deprave and corrupt persons who are likely, having regard to all relevant circumstances, to read, see or hear the matter contained or embodied in it.” 

There is a “public good” defence – to the effect that legitimate works of art, news, documentaries and the like are not illegal merely because they contain obscene material. Section 2 of the Act makes the publication of an obscene article illegal, and according to section 1(3) a person publishes an article who “distributes, circulates, sells, lets on hire, gives, or lends it, or who offers it for sale or for letting on hire”. The Act has been updated to include electronic publication.

The first specific mention of child abuse imagery in statutory legislation appears in the Protection of Children Act 1978. Section 1(1) makes it an offence to take, make, distribute or show indecent photographs or pseudo-photographs of children, or to publish advertisements for them. This is now the main means by which persons making or possessing child abuse imagery are prosecuted. Downloading or viewing an indecent image of a child from the internet has been held to be an act of “making” under the 1978 Act.

The 1978 Act has been extended by subsequent statutes. Section 160(1) of the Criminal Justice Act 1988 outlaws the mere possession of indecent photographs or pseudo-photographs of a child. The Criminal Justice and Public Order Act 1994 extended the definition of “photograph” to include data stored on a computer disc or by other electronic means, and introduced “pseudo-photographs”, namely images, whether made by computer graphics or otherwise, which appear to be photographs.

The Sexual Offences Act 2003 created the offences of abusing children through, or exposing them to, pornography. It also contains an offence of “grooming”, aimed at the grooming of children over the internet by means of “chat rooms”. The relevant sections are Sections 12, 14, 15, 47-51 and 67 of the Sexual Offences Act 2003.

Section 63 of the Criminal Justice and Immigration Act 2008 prohibits the possession of an extreme pornographic image. Section 62 of the Coroners and Justice Act 2009 criminalises the possession of prohibited images of children, namely computer-generated child sexual abuse images, as well as cartoons and drawings involving child sexual abuse. Some of the images that appeared in the Times article last week would come under this definition.

So how could Facebook be said to be “assisting and encouraging” the publication of child abuse imagery? Facebook is simply accused of failing to act to remove the offending material in a timely fashion. Its primary defence would be that its internet platform is not covered by any of the above statutes. It is not taking, making or distributing indecent images, nor does it “possess” any such images. There is a raft of other defences available to it, including the defence under Section 2(5) of the Obscene Publications Act 1959, which exonerates the internet service provider provided it can prove that it did not examine the offending material and had no reason to suspect that the material was offensive in the first place. Section 50 of the Sexual Offences Act 2003 contains the offence of “arranging or facilitating” child pornography, but this offence has to be intentional.

The words “assisting and encouraging” come from Part 2 of the Serious Crime Act 2007. Section 44 contains the crime of “intentionally encouraging or assisting an offence”. Section 45 goes further – it defines an offence of “encouraging or assisting an offence believing it will be committed”. However, Section 50 of the 2007 Act contains the defence of acting reasonably in the circumstances.

Further, despite the acceptable-use policies in place for Internet Service Providers (“ISPs”), the EU Directive 2000/31/EC on electronic commerce (implemented into UK law by the Electronic Commerce (EC Directive) Regulations SI 2002/2013) exempts ISPs from liability when they unknowingly provide access to offensive material. This means that ISPs cannot be placed under a specific duty to monitor general content. They are simply the wall on which messages are posted. On the other hand, the Directive also defines the circumstances under which an “Information Society Service” will be liable for unlawful content communicated by a third party. Unlawful content would include obscene and terrorism-related content. Under Regulation 19, if an Information Society Service has actual knowledge of the content's unlawfulness, or hosts material that it knows, or it is apparent to it, is unlawful, then it must act expeditiously to remove or disable access to the content, or risk incurring liability.

However, in practical terms, it appears that there is no appetite within the police or the Crown Prosecution Service to mount a costly prosecution against Facebook. It would be peculiar if they did, since the police, Facebook and the Internet Watch Foundation all work together to remove this kind of content.

However, the Times has a valid point, particularly as the responsibility of social media platforms has become of increasing interest to the government. In November 2016, the health secretary, Jeremy Hunt, said that he wanted social media platforms to block explicit images from young users automatically, following a request from their parents. This year, the Times reported that ministers intended to summon Facebook, Twitter, Apple and others to Whitehall, to demand that they develop new technological solutions similar to those used to thwart child abusers and terrorists. The call will be backed by the threat of legislation, with a green paper promised in the summer. Theresa May will commit today to making Britain the safest place in the world for children to be online. The Children's Commissioner, Anne Longfield, has said that the “incomprehensible” terms and conditions of social networks mean that children have little idea what they are signing up to, and that young people are left to fend for themselves in the digital world. She has also said that schools should teach children “digital citizenship” from the age of four as part of the curriculum, and that children should have a digital ombudsman to help them remove content from social media companies.

Moreover, the pressure on companies such as Facebook is not merely political. Recently a 14-year-old girl brought a civil action against Facebook seeking damages for misuse of private information, negligence and breach of the Data Protection Act. She alleged that a naked photograph (obtained from her by blackmail) had appeared on the website on several occasions. She was also bringing a claim against the man who posted the photograph as a form of “revenge porn”. Facebook launched an application to halt her legal action, but the application was refused by a judge in Belfast. Facebook argued that the company always took down the picture when it was notified. It relied on European Directive 2000/31/EC, to which we referred above. The objective of this Directive was “to create a legal framework to ensure the free movement of information society services between Member States…”. Information society services include websites such as Facebook. Article 12 of the Directive directs Member States to exempt information society services from liability where they are a “mere conduit” for any offending information which is transmitted, such as a sexualised photograph. Articles 13 and 14 contain similar provisions in relation to “caching” and “hosting”. In another case from the High Court of Northern Ireland, MM v BC, RS and Facebook Ireland Limited [2016] NIQB 60 (15 June 2016), the Plaintiff was the victim of “revenge porn” which appeared on Facebook, who in turn were brought into the action in order to preserve information on their databases.

It might be said that governments have an agenda, which goes wider than curbing the appearance of child abuse imagery and terrorist material on sites such as Facebook. For instance, there is the concern about “fake news” and calls for children to be “educated” in what is fake and what is not. Governments all over the world have realised the power of social media, and the way in which it circumvents established media.

Facebook itself has a clear and published policy of removing content, disabling accounts and working with law enforcement when it believes that there is a genuine risk of physical harm or a direct threat to public safety. That includes content that threatens or promotes sexual violence or exploitation. It has developed a series of “Community Standards” which it uses to judge content. The corporation is also (like many other social media platforms and internet companies) a member and financial supporter of the Internet Watch Foundation. This is a charity, founded in 1996 by the internet industry in co-operation with the Home Office and the police. The Foundation works internationally with the global internet industry and the European Commission to make the internet safer by removing images of child sexual abuse and criminally obscene adult content. It relies on reports from the general public, but it also actively seeks out offending material, using the latest technology. More than 1,000 webpages are assessed and removed each week by its analysts. The Foundation claims that as a direct result of its work, the proportion of child sexual abuse content hosted in the UK has fallen from 18% in 1996 to below 1% today, and that content hosted in the UK is removed quickly – usually in less than two hours. It also works with local, national and international police to help them identify and rescue victims of child sexual abuse.

In addition, there is the UK Council for Child Internet Safety, a group of more than 200 organisations drawn from across the government, industry, law, academia and charity sectors that work in partnership to help keep children safe online. The Council was established in 2010 and discusses and takes action on topical issues concerning children's use of the internet. I could not find any mention of the work of the Foundation in the articles that appeared in the Times last week.

There is no question that Facebook has the will and the capacity to take down material, and it does just that on a regular basis. The complaint from the Times was that certain material had been there too long, despite Facebook being alerted, and that it was alarmingly easy for groups of people to set up illegal websites.

There are also concerns about the way in which the Internet Watch Foundation operates. I read a fascinating article published in the International Journal of Law and IT (Int J Law Info Tech (2012) 20 (4): 312) by Emily Laidlaw of the University of East Anglia. She describes the work of the Internet Watch Foundation, and points out that it is in effect a form of self-regulation introduced as an alternative to government legislation.

Self-regulation of industries is common in the United Kingdom. The way in which insurance companies compensate victims of uninsured and untraced drivers is handled by the Motor Insurers' Bureau, a company limited by guarantee set up and funded by the insurance industry itself, but overseen by the government and (at present) the European Community. Many professions are self-regulated.

Emily Laidlaw says that the UK's trade association for Internet Service Providers, the Internet Services Providers Association, defers to the Internet Watch Foundation with regard to the filtering of unlawful content. Members of the Association are bound by its Code of Practice, which states that membership of the Foundation is not mandatory. However, it makes clear that the Association co-operates with the Foundation and that the Foundation's procedures in this regard are mandatory for Association members. One problem is that this membership arrangement could be tightened up and made more transparent. Another is that the Foundation is carrying out a public function in a private capacity, which means that it may not be covered by human rights legislation, and a request under the Freedom of Information Act cannot be made against it. There are also concerns that by blocking entire websites containing both legal and illegal content, the Foundation may have had an adverse effect on freedom of speech.

My own view is that the debate is not just about illegal imagery and material; it is also about the ability of abusers and terrorists to groom vulnerable people via social media platforms. We also need to examine the use of the internet to perpetrate racist and hate-based crime, as well as other types of offences. What is needed is a comprehensive and transparent form of regulation. I don't pretend for a moment that such regulation would be anything less than a highly expensive and technical task.

Another idea would be to impose a form of statutory liability on social media platforms, to try to ensure that their sites are not used for the kinds of imagery involved in “revenge porn” or child abuse. These images cause enormous damage to their victims. The law could impose a duty on a social media platform along the lines of that contained in the Occupiers' Liability Act 1957. This decades-old statute imposes a duty on property owners “to take such care as in all the circumstances of the case is reasonable to see that the visitor will be reasonably safe in using the premises for the purposes for which he is invited or permitted by the occupier to be there.” A statutory duty of this kind could be framed as a duty on the social media platform to remove offensive material within a certain time.
Certainly, the pressure is on for social media platforms such as Facebook, and we can expect legislation in the future.

Monk accused of running "sex club" allowed to stay at Ampleforth

6/4/2017

The Times reports today on the case of a monk said to have run a weekly "sex club" for young boys, who was allowed to remain at Ampleforth College.

This follows on from a story on which I reported back in August 2016, concerning a teacher who was arrested on charges of sexual abuse against a pupil there in 1989.

http://childabuselawyer.blogspot.co.uk/2016/08/ampleforth-college-private-schools-duty.html

Originally he had been due to stand trial on charges of abusing several other pupils, but those charges were dropped, leaving only one witness to give evidence against him in court. At trial, the teacher was acquitted. He has denied any wrongdoing.

An investigation by the Times has apparently discovered that Ampleforth asked the teacher to leave in 1989 after pupils made allegations of inappropriate contact against him. The police were not informed. It is also alleged that the police failed to contact two other former pupils, who could have been witnesses at trial.

Section 218(6)(a) to (c) of the Education Reform Act 1988 enabled regulations to be made prohibiting or restricting the employment of teachers. That section applied to local education authorities (section 218(6)(c)), together with teachers employed otherwise than by LEAs (section 218(6)(b)) and further and higher education authorities (section 218(6)(a)).

However, private schools were not covered at that time.

The regulations introduced under Section 218(6)(a) were the Education (Teachers) Regulations 1989 No. 1319, which came into force on the 1st September 1989.

Regulation 7 stated that the regulations would apply in relation to the employment of persons— “(a) by a local education authority, as teachers (whether or not at a school or further education institution) or as workers with children or young persons;”

Regulation 10(2) allowed the Secretary of State to bar or restrict a person's employment. The grounds for exercising that power were set out in Regulation 10(1):-
“(a) on medical grounds;
(b) on grounds of a person’s misconduct (whether or not evidenced by his conviction of a criminal offence); or
(c) in relation only to employment as a teacher, on educational grounds.”

Regulation 11 stated:-
“Where a person is dismissed from relevant employment on grounds of his misconduct (whether or not he is convicted of a criminal offence) or his employers would have so dismissed him, or considered so dismissing him, but for his resignation, his employers shall report the facts of the case to the Secretary of State.”

The 1989 Regulations were replaced by the Education (Teachers) Regulations 1993 No. 443 which came into force on the 1st April 1993. These are broadly the same as the 1989 Regulations.

Later on, section 218(6)(d) of the 1988 Act enabled the Secretary of State to make regulations in relation to the restriction of the employment of teachers at private schools.

That requirement came into force on the 1st January 1994 under Section 290(3)(b) of the Education Act 1993 and was preserved by Schedule 38 of the Education Act 1996.

Section 49 of the Education Act 1997 also inserted a new section 218(6A) & (6B) into the 1988 Act.

The grounds for restriction/disqualification were further extended by section 5 of the Protection of Children Act 1999, which inserted subsection (6ZA) into section 218 of the 1988 Act.

The regulations for private schools were introduced by the Education (Teachers) (Amendment) Regulations 1994 No. 222, which came into force on the 1st March 1994. Regulation 3 amended Regulation 10 of the Education (Teachers) Regulations 1993 so that those regulations applied to private schools.
“Misconduct” was nowhere defined in the statutes and statutory instruments, but it was made clear that it was not dependent on a criminal conviction.

Anonymity for victims of abuse in New Zealand

4/4/2017

Sonja Cooper, a lawyer acting for child abuse victims in New Zealand, has sent me two reported decisions from that jurisdiction on anonymity for victims of abuse.

The first is Y v Attorney General [2016] NZCA 474. This was an appeal against a judge's decision not to grant anonymity to witnesses who were alleging non-sexual abuse. The court said that it would not grant anonymity to any class of witnesses; rather, each witness had to present his or her own evidentiary basis for anonymity.

The next case is X v Attorney General [2016] NZCA 475. This involved an application by an organisation providing care for young people for the suppression of its name. The organisation argued that (a) young people in its care could come to harm, (b) its staff would face difficulties, and (c) it would suffer irreparable damage to its reputation. The New Zealand Court of Appeal decided that the judge who originally granted the order was right to do so. The loss of anonymity would harm young people in the organisation's care, particularly as those young people had no interest in the proceedings.

At present in England and Wales, it is usually relatively straightforward to obtain an anonymity order in child abuse proceedings. These decisions, if taken into our law, might open the way for organisations to obtain anonymity for themselves, and for courts to deny anonymity to certain witnesses in child abuse cases.
