It is a dilemma that every IT manager will face at some point: what do you do if you discover illegal content on your system? The answer should be simple: report it and remove it. Unfortunately, in the real world things aren’t always so black and white. Office politics and guilt-trips aside, the answer will depend largely on what type of content we are talking about and on your perception of its illegality. That may sound a little strange, since illegality shouldn’t be a sliding scale (if it is against the law, it is against the law), but most of us tend to apply one anyway.
For instance, would you deal with a small stash of copyright-infringing MP3s as harshly as a directory full of cracked Office applications? If your company has a sound AUP (Acceptable Use Policy) covering use of company equipment for illegal purposes, the answer should be ‘yes’ to both and they should be dealt with according to the disciplinary route laid out there. The culprit should be punished for abusing company resources, regardless of the nature of the software. A secondary question then arises about informing the police of the crime. On paper, this is likely to be covered by the AUP, but common sense means it is highly unlikely anyone would get frog-marched to the nick for a stash of Girls Aloud music (criminal as that may be), nor even for a bunch of dodgy WaReZ.
However, everything changes if the illegal content is child pornography, and I mean everything. There is absolutely no doubt about the illegality or the seriousness of the crime, but there is considerable doubt about the legal position of the person who discovers and reports the material. Nobody wants to risk being labelled a paedophile, and no company wants to risk being associated with it either. For many companies and individuals who come across such material accidentally, the easiest option is to delete it and ignore it.
Recently, I had reason to research this issue while drawing up an AUP for a new client, a mid-sized company working in the ‘adult new media’ market whose directors were only too aware of the risk of exposure to such content. The Protection of Children Act 1978 (England and Wales) states quite clearly that it is an offence to ‘take, permit to be taken, make, possess, show, distribute or advertise indecent images of children in the United Kingdom.’ Yet section 46 of the Sexual Offences Act 2003 states that in proceedings for an offence under section 1(1)(a) of making an indecent photograph or pseudo-photograph, the defendant ‘is not guilty of the offence if he proves that it was necessary for him to make the photograph or pseudo-photograph for the purposes of the prevention, detection or investigation of crime, or for the purposes of criminal proceedings, in any part of the world’. These seemingly contradictory provisions have led to much confusion, particularly in recent years when there has been such a media furore over prosecutions for the possession of child pornography.
The fact is that many companies and individuals alike are running scared from reporting pornographic material, websites and emails, because they fear that they themselves will be arrested for ‘possession’ or ‘publication’ if they do. The Internet Watch Foundation (www.iwf.org.uk) has posted some telling research on its site that reveals how corporations perceive the problems of discovery, reporting and the law. When asked what action would be taken if potentially illegal indecent images of children were found on an employee’s PC, a mere 27 per cent said they would report it to the police (although 70 per cent would dismiss the employee), and only 13 per cent were aware of the recent provision in the Sexual Offences Act 2003 that permits the retention of such images to provide evidence for the police or the IWF during an investigation. The accompanying ‘Memorandum of Understanding’ does away with the previous problem, whereby someone might be prosecuted for looking at child pornography, even if the exposure was accidental or occurred, ironically, while policing a network. The IWF makes clear that it is still ‘against the law to actively seek out such images, and doing so in order to report to the IWF wouldn’t be a defence in court’, but it does perhaps demolish one of the barriers that’s been preventing people from reporting this filth – and reporting it is vital if those responsible for its production and distribution are to be brought to justice.