The launch of the government’s consultation document on Building Confidence in Secure Electronic Commerce (reference number URN 99/642) provides an opportunity for informed comments on the nature of the proposed regulation of this vitally important feature of the emerging digital economy. It also provides an opportunity both to view and to shape the government’s own thinking in this area. Indeed, as a general prefacing comment to the more detailed discussion below, it is worth reflecting on and applauding the openness of the procedure itself. Ministers and officials in the Home Office and in the Department of Trade and Industry have actively sought and – most importantly – acted upon the views of practitioners, professionals and interested parties throughout this period.
The welcome result is a readable, comprehensive and altogether well informed document, for which those concerned are to be congratulated.
The primary objective of the proposed legislation is also to be applauded. The government’s stated target of 25% for electronic interaction with government; the positioning of the UK as central to the emerging digital economy; the recognition of the role of law and regulation in making this work – and, of course, the concerns of the police and other investigators surrounding the use of encryption for illegal purposes: all of these elements have clearly helped to shape a document which will – subject to the few critical comments below – provide an ideal framework for ‘UK plc’ in the coming decade.
The comments presented below are informed by, but by no means represent, the views of a variety of organisations with which the author has an involvement, and of individuals active in the field. First of all, the company for which I work – Bull Information Systems – is a key supplier of many of the foundations of the information society, from encryption products themselves, through the computer and network foundations, to the services and expertise to make it all work in practice. As such, the company has a keen interest in ensuring that the national infrastructure is well-formed, well-protected and well-supported; the regulation and the legislation represented in the consultation document represents a potential digital society with which we must actively engage.
Beyond this, I have a personal involvement with a variety of interested parties in this arena. I have undertaken work as a computer expert witness for defence and for prosecution in cases involving Internet-distributed pornography, encrypted using powerful tools; I routinely lecture at the police staff college in Bramshill, teaching investigators how to recover computer evidence, and at Glamorgan University where I have been nominated as external Professor in this field; I sit on the advisory board for one of the new Trusted Third Party companies, anxious to ensure that they can conform to the new legislation as it emerges. I am a member of the ACPO Computer Crimes Group sub-group advising on the policing of the Internet, and a member of the CBI Information Security panel – both of these crucially important bodies involved in shaping and advising this framework for electronic commerce. And finally, I am a published author, with books, articles and research papers in this vitally important area.
Notwithstanding these influences, this commentary is a personal one and should be read as such.
There are three principal areas in which constructive criticism of the document should be made: the treatment of digital signatures and the claim of technological neutrality; the proposals for lawful access to encrypted material; and the proposed licensing regime for Trusted Third Parties.
In addition to these detailed areas, there is a general criticism that should be made in passing: the consultative document purports to be about the encouragement of electronic commerce, yet there is very little discussion of the ways in which the basic commercial practices of the digital age are to be encouraged. The document also purports to be in part about electronic government, though the focus is on the electronic delivery of services rather than on democratic and well-informed debate – and in any case, there is very little about the way in which this is to be encouraged.
Instead, the document is about the way in which central government departments will seek to constrain digital practices – a step which is perhaps slightly myopic, given that the emerging digital society is unlikely to feel itself as bound by geography as past societies have been, and as governments almost by definition must be. The result is a rather one-sided debate, in which the rules of play are clearly articulated, yet there is no clear statement of why one would wish to play the game according to those rules – and indeed, no clear statement as to why one would want to play the game on that particular field.
The consultation paper is clear-headed in its distinction between the use of encryption technology for confidentiality and its use for authentication purposes. Indeed, the distinction is a practical and a regulatory one: the management of encryption keys and certificates for proving authorship and responsibility is quite distinct from that of maintaining documents in a secure but recoverable state. The paper is also quite clear in the nature of safeguards that are required in each case, and in the fact that lawful access – covered in more detail below – is appropriate in the case of confidentiality but not in the case of authentication, though in practice this might not prove to be sufficient.
Where the paper has shortcomings is in the purported technological neutrality: self-evidently, the major thrust of the paper is concerned with digital signature practice embodied in the use of public-private key pairs. However, on-line signatures already exist – on facsimile transmissions, on telex and in a variety of other mechanisms. A signature is anything that the two parties to a transaction agree to recognise as such, although of course certain ‘weak’ signatures can be falsified: a typed name at the bottom of an e-mail message, for example. A signature in this regard is, in many ways, a forensic concept: can it be proved to have been falsified?
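The public-private key mechanism on which the paper concentrates can be sketched in a few lines. The following is a toy illustration only – tiny primes, no padding scheme, nothing remotely secure – but it shows the forensic property just described: anyone holding the public key can check a signature, while only the private-key holder can produce one.

```python
import hashlib

# Toy RSA key pair with tiny primes -- purely illustrative, NOT secure.
p, q = 61, 53
n = p * q                  # modulus: public
phi = (p - 1) * (q - 1)
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent (Python 3.8+ modular inverse)

def sign(message: bytes) -> int:
    """Signing uses the PRIVATE key: hash the message, then transform."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Verification needs only the PUBLIC key (e, n)."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

msg = b"I agree to these terms"
sig = sign(msg)
print(verify(msg, sig))   # True
# A tampered message or forged signature will, overwhelmingly, fail to verify.
```

A ‘weak’ signature – the typed name at the foot of an e-mail – offers no such check; the cryptographic form lets the rebuttal question be answered mathematically rather than by inference.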
The presumption that a signature is a valid one but that it can be rebutted if necessary is a powerful one in commerce, and indeed the practice of distance selling relies upon this as part of the introduction of trust factors into the transaction. But requiring that a digital signature be made in a particular way, and that it can have some degree of legal acceptability only if it is made in that way, is a somewhat clumsy step. It runs the risk of eroding the validity or acceptability of transactions that were previously acceptable.
Where the consultation paper is quite correct is in the proposals for extending the concept of ‘writing’ to include electronic writing – of any form. Moreover, to anticipate a comment below, the concept of ‘reading’ or of producing information in a readable form, should also be extended so as to specify that, if a particular transformation to the data is required for a reader to interpret it correctly, it is a part of the requirement for production. This is not simply a point for lawful access – though that aspect will be reiterated there – but also for the production of terms and conditions, operating addresses, owners and so forth, that form a part of a lawful contract between two parties. This is particularly a concern when one of those parties is a consumer, seeking protection on-line.
The other element of the use of digital signatures that should be considered is that of authentication of the signature production process. The digital signature can be used in complete confidence by both parties, but only where they can have the related confidence that the signature has been maintained in an appropriate way. The owner of the signature, for instance, might have unknowingly lost control of the system governing the signature: for example, through the use of an intrusive computer virus or hostile element of software. ‘Back Orifice’, ‘Netbus’ and a range of so-called ‘Hostile Applets’ have been produced and disseminated, allowing mischievous, malicious or criminal individuals to take control of the PC issuing the signature. Whilst the consultation paper discusses the requirements on the Trusted Third Parties involved (although these requirements are insufficient; see below), it makes no mention of the conditions that should be applied to the users of these signatures in terms of their management of the keys.
It is not sufficient merely to say that the signature holders are completely responsible for their own protection. They are responsible, but they should be offered systems that can adequately protect them – and this requirement should be reflected in the consultation document.
The question of lawful access to encrypted material has become – as the consultation paper reflects – one of the most wretched and vexed of the modern digital society, on both sides of the Atlantic.
It is certainly true that encryption which can protect legitimate business transactions can equally well protect illegal transactions, and it is also true that police investigations can be hampered by such encryption. The law enforcement proposals for access to encrypted material therefore centre on two aspects: firstly, on powers to require suspects to release their own keys or to decrypt seized material themselves (making it ‘legible’ to the investigators); and secondly, on powers to require trusted third parties to release keys to which they have access – and moreover, not to inform suspects that such a warrant has been served, referred to as ‘tipping-off’.
There are several areas of concern here. First of all, requiring trusted third parties not to advise their clients of the release of their key flies in the face of established security practice. Once a key has been compromised, it should be immediately revoked. In the context of a criminal investigation this is clearly undesirable, but one can easily envisage situations in which a suspect’s key could have been released – thereby compromised – but that the subsequent investigation has found no grounds for further investigations. Would the former suspect then be informed that the key should be changed, and thereby run the risk of compromising further investigations? This is as much a commercial as a legal concern, since it touches on the trust which an individual encryption user can place in their trusted third party. The consultation paper does not address this issue, but common-sense dictates that some reasonable response should be proposed.
In many ways, however, important though this issue is, the more crucial aspects of the law enforcement proposals centre on the access requirements to the keys themselves; that is, on the extension to police powers, and the concomitant erosion of civil liberties that are implied by such an extension. The fundamental question here is this: Should we restrict the civil liberties of presumed innocent individuals simply because police investigations would otherwise be hampered?
Perhaps it is best to approach this question by removing some of the moral or ethical considerations, in which the more emotive responses have tended to ground themselves. Instead, ask the question in terms of costs and benefits of such an extension of police powers: the cost to me – as an innocent, law-abiding citizen – is the removal of one of my rights (that of silence, at least with regard to a password/pass phrase/PIN); the benefit is the increased protection that I would thereby enjoy, resulting from increased police effectiveness against the other, criminal users of the technology.
But would I in fact enjoy increased protection as a result? Arguably not. There are several reasons to deduce that police effectiveness against many target criminals benefiting from the use of encryption would not be increased as a result of the powers.
First of all, criminals using encryption technology – PGP, for example, as used by paedophiles – are unlikely in the extreme to consider lodging their keys with an escrow agent, or indeed to use any form of key recovery service to which law enforcement could gain access. And of course, the use of technology such as PGP is, quite correctly, not outlawed in the proposed regulation. Police powers to induce a third party to release keys are therefore less than effective against the primary targets of the law enforcement effort in this area: paedophiles, drug runners, terrorists, etc. And those criminals are unlikely to release the keys even if the police have powers to coerce such a release: they can claim to have forgotten the passwords, most obviously, accepting the lesser penalty for such refusal rather than a more severe penalty associated with any material that might be uncovered through decryption.
There are also situations in which investigators would wish to decipher intercepted communications, rather than seized material. Again, however, the current state of technology available to criminals makes this improbable. The ‘Secure Sockets Layer’ (SSL) protocol, for instance, does not provide a mechanism whereby the session keys can be retained for later production. Intercepting Internet Relay Chat (IRC) sessions that have been encrypted would therefore not be materially helped by the proposals, and it is unrealistic to presume that local regulatory requirements in the UK will influence the broader international use of the technology.
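The point can be made concrete. Session encryption of this kind typically negotiates a fresh, throwaway key for each connection by a Diffie-Hellman-style exchange; the sketch below (a deliberately simplified model, with a far smaller group than any real protocol would use) shows why there is simply no stored key for a warrant to compel.

```python
import secrets

# Ephemeral Diffie-Hellman sketch: each party invents a throwaway secret,
# derives a shared session key, then discards the secret. Nothing is
# escrowed, so no key exists afterwards to hand to an investigator.
p = 2**127 - 1        # a Mersenne prime; real protocols use larger groups
g = 3

a = secrets.randbelow(p - 2) + 2    # Alice's ephemeral secret
b = secrets.randbelow(p - 2) + 2    # Bob's ephemeral secret

A = pow(g, a, p)      # exchanged in the clear
B = pow(g, b, p)      # exchanged in the clear

session_key_alice = pow(B, a, p)
session_key_bob = pow(A, b, p)
assert session_key_alice == session_key_bob   # both hold the same key

del a, b   # secrets discarded; the session key cannot be re-derived later
```

An interceptor who records the entire exchange holds only `g`, `p`, `A` and `B`; once the ephemeral secrets are deleted, neither party can reconstruct the key either.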
The proposals would in any case be somewhat insufficient even if they did provide significant assistance to the investigators. For example, the police do not seek powers to recover encryption keys used purely for authentication, but a scheme exists – referred to as ‘Winnowing’ – whereby the authentication keys can be used to disguise true transmissions amongst a variety of misleading ones. Such a scheme would effectively bypass even the proposed controls.
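The ‘Winnowing’ scheme referred to is Rivest’s ‘chaffing and winnowing’ (1998). A minimal sketch, assuming a shared HMAC key: the sender transmits every message bit both ways, attaching a genuine MAC only to the true value; the receiver discards (‘winnows’) whatever fails to verify, while an interceptor without the authentication key sees only indistinguishable chaff – and no confidentiality key ever exists to be seized.

```python
import hashlib
import hmac
import secrets

# Chaffing and winnowing (after Rivest, 1998): confidentiality achieved
# with an AUTHENTICATION key alone.
KEY = b"shared authentication key"

def mac(serial: int, bit: int) -> bytes:
    return hmac.new(KEY, f"{serial}:{bit}".encode(), hashlib.sha256).digest()

def chaff(serial: int, bit: int) -> list:
    """Emit both bit values; only the true one carries a genuine MAC."""
    pair = [(serial, bit, mac(serial, bit)),             # wheat
            (serial, 1 - bit, secrets.token_bytes(32))]  # chaff: random MAC
    secrets.SystemRandom().shuffle(pair)
    return pair

def winnow(packets: list) -> list:
    """Keep only the bits whose MAC verifies under the shared key."""
    return [bit for (serial, bit, tag) in packets
            if hmac.compare_digest(tag, mac(serial, bit))]

message = [1, 0, 1, 1]
stream = [pkt for i, bit in enumerate(message) for pkt in chaff(i, bit)]
assert winnow(stream) == message   # the key holder recovers the message
```

Since the sender only ever applies an authentication key – which the proposals deliberately exempt from recovery powers – the controls are bypassed exactly as described.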
Against determined criminals, the police proposals are therefore ineffective. Would they, however, provide assistance in the recovery of computer evidence from individuals whose activities were found to have been illegal, but which they had not thought to disguise effectively? Again, it is uncertain that they would in fact be required – and even the few examples given in the consultation paper show only that investigations were hampered rather than prevented, a fact that tallies with my own experience in this area. Police investigations rarely if ever rely wholly on computer-derived evidence, or even on intelligence derived from e-mail communications intercepted in clear. This is quite simply because the current state of computer forensic evidence is less than ideal: courts are poorly placed to appreciate such evidence, and the state of the law of computer evidence makes its reliable recovery and production problematic in the extreme.
Investigators therefore ensure that as much ‘real-world’ evidence as possible is in place, with the computer-derived evidence in a supporting role. This is equally the case with intelligence operations against terrorists or espionage agents, who have a long history of using ciphers: had investigators only ever relied on intercepting and decrypting such information – rather than on so-called ‘Human Intelligence’ and related practices – we would have been far less successful than we in fact are.
Because of these aspects, it is misleading to assert that, simply because investigations are hampered by encryption, they would be that much more effective with the extended powers. The proposed powers therefore, in my personal opinion, erode individual civil liberties, but without guaranteeing a compensatory improvement in police effectiveness.
This area should therefore be reconsidered, perhaps simply by providing support to the existing PACE powers: specifically, by defining the concept of ‘legibility’ in the production of evidence to encompass decrypted material, but without the associated requirements of encryption key production, tipping-off, etc. This would provide material assistance to the police – allowing them to read evidence already in their possession, for example – but without eroding liberties.
There are several aspects to the licensing regime proposed that one can criticise, both from the technical and the commercial perspectives – although it is important to reflect on the worthwhile aspects of this regime as well. The licensed TTP will be regulated by an official body, with clearly articulated requirements on the level of service provided; there is a willingness to examine the issue of limited liability for such a TTP, an effective balance between commercial reality and the protection for its users.
Where the proposed conditions fail, however, is in addressing one of the most fundamental of questions: Why would I, as a potential TTP, wish to become licensed? It will require a great deal of investment on my part, but for what level of reward? First, it is as yet unclear that users will feel more confident using licensed parties, especially for confidentiality services in the context of lawful access – although key recovery services for the individual are quite obviously attractive. Secondly, competition for the TTP business will, of course, include organisations from overseas – the very nature of the Internet makes this competition both inevitable and attractive. These competitors will have lower overheads – not having to become licensed – and will fall outside the regulated region, yet could still provide entirely adequate levels of service to the individual.
From a commercial perspective, therefore, the case for becoming licensed is far from established. The conditions for licensing are also – from a potential user’s perspective – insufficient. There are, in the consultation document, a series of somewhat weak statements about ITSEC or Common Criteria certification for the core technology, and about the vetting of staff and so forth; a provider is expected to have at least sought BS7799 accreditation – though it is unclear what happens if they don’t subsequently achieve accreditation. In summary, the controls and the reassurance that I would look for as a consumer of the service are insufficient to reassure me that trust can confidently be placed in the licensed organisation.
Of course, one could argue that a potential TTP might not be able to afford such certified products and such a level of technical competence and reliability: this is the position of many industry commentators on this issue. However, that is to miss the point; one should become a supplier of trust-related services only if one is able to induce that trust amongst potential clients. If that trust cannot be engendered, the potential supplier has no right to try and do the job ‘on the cheap’, and the licensing conditions should spell this out starkly and simply.
As a professional active in the several fields encompassed by the consultation paper, I welcome the government’s interest in the area – and the realistic, open approach it reflects is particularly refreshing. UK industry, financial services and infrastructure providers will benefit tremendously from the implementation of workable regulation in this area, which should be established as much with a view to controlling the illicit use of the technology as with a view to attracting inward investment to the country. Our national future is intimately bound with the future of the digital economy, a fact which the government has self-evidently grasped wholeheartedly.
Several aspects of the proposed regulation, unfortunately, are unrealistic, reflecting perhaps a relatively weak understanding of some of the core technologies, and particularly of the speed with which those technologies will develop and evolve. Given the openness of the consultation process to date, however, I have every confidence that the many comments that will inevitably be generated by the industry and by interested parties will inform subsequent considerations, and that what will result will be attractive, sensible and workable.
I look forward to working with government to establish the digital economy – and in particular, I look forward to working in that digital economy itself.
Dr Neil Barrett CEng MBCS – Bull Fellow – 26th March 1999.
Last Revised: April 16 1999