Response to Export Control Consultation by FIPR

The Foundation for Information Policy Research is an independent body that studies the interaction between information technology and society. Its goal is to identify technical developments with significant social impact, commission research into public policy alternatives, and promote public understanding and dialogue between technologists and policy-makers.

We lobbied for the research exemption in the Export Control Act (section 8) because of our concern that the extension of export controls to intangibles would otherwise infringe basic freedoms and significantly hinder innovation. Innovation nowadays often involves many people exchanging ideas, work-in-progress and fragments of software in fluid, rapidly evolving ad-hoc collaborations. A requirement that all software development work impinging on dual-use technology be subject to export licensing would seriously obstruct these established ways of working. There will be specific, and serious, negative effects both on research and on the free and open-source software communities.

Section 8 of the Act applies only to control powers granted under section 2(1), not section 2(5), which deals with orders made to implement EU regulations. We had anticipated that the principles accepted in section 8 after the debate in the Lords, and clarified in the discussions that Henry Miller and I subsequently held with Lord Sainsbury, would be followed when implementing the EU dual-use regulation. We are deeply disappointed that the new Act Team at the DTI has reverted to the old view of the world and is adopting a maximalist interpretation of the European regulations, in conflict with the clearly expressed will of Parliament.

The EU regulations have exemptions for material in the public domain, and basic scientific research, but not for applied research. The difference between the two is hard to draw, especially as the funding councils run by the DTI (such as EPSRC) more or less require that all research emphasise applied aspects in order to get funded. The `relevance to beneficiaries' section in the EPSRC grant application form naturally supplies the case for the prosecution should the grant holder discuss her work with a foreign colleague, or with a UK colleague who happens to open the relevant email while at a cyber-cafe overseas. Certainly much work on cryptography (the proximate cause of the dispute) will be considered to be `applied' and will thus fall within the net of the dual-use regulations.

One mechanism used to mitigate the effect of the dual-use (and other) regulations is the open general export licence. Some fields of research have these, but many more do not. For example, everything to do with cryptanalysis is licensable, and cryptanalysis is broadly defined. It appears to include power analysis; so if in future I exchange fragments of code for differential power analysis with scientific colleagues, every single such exchange will be subject to prior licensing. This is a significant nuisance, since vulnerability to power analysis is an important factor in the design and programming of smartcards and other access tokens that are very widely used. For example, I was recently a principal investigator in a 30-month, EUR 2.4m project to develop a smartcard CPU resistant to power analysis. This was one of the flagship projects of the EU Fifth Framework; under the proposed regulations it would be very difficult for UK researchers to participate in such projects in future without acquiring an individual licence. We see no evidence that provision has been made, in the impact assessment, for the costs of dealing with a large number of additional applications from persons who are not traditional users of the export licensing system. If it is indeed the intention of the DTI that academics should not have to apply for such licences, then this should be clearly stated in the regulations.

The restrictions on sharing information about cryptanalysis, as defined, may also restrict public discussion of information security vulnerabilities in common software products. This would lock out the UK information security community from participation in the process of incremental improvement of security products; it would place UK software firms at a serious disadvantage; and it might result in computer scientists visiting the UK becoming unknowingly liable to criminal penalties. Surely this was not the intention of the drafters.

The open general export licences make unrealistic assumptions. For example, the cryptographic development licence assumes that the collaborators have established relationships, and that they are developing commercial cryptographic products. Neither is typically the case for academic cryptography researchers, who tend to produce new algorithms and protocols out of scientific curiosity, and whose work usually does not take place within a formal relationship such as an EU Fifth or Sixth Framework collaboration.

The same problem arises with developers of free and open source software. There is a further stipulation that the item must be used only by the exporter or its collaborator; however, in both academic and free software collaborations it is common for the software eventually to be made available to many others, and quite possibly to the public. The effect will be to place quite unreasonable burdens on free and open source software developers, who are often individual volunteers - perhaps students with no significant financial or organisational resources. Requiring such people to submit to the licensing regime will likely drive many of them out of open source software development altogether. This is a particularly perverse outcome, given the increasing reliance placed on such software by the public sector.

Another example comes from the open general export licence for computers. The dual-use list covers many computers in common use, such as those employing error-correcting memory, and much software, such as that designed to create a more reliable system out of less reliable components. This latter provision appears to catch peer-to-peer software. The OGEL allows the export of computers generally available at retail, but its multiprocessor clause is restricted to capacity handling only (and even that is limited to 190,000 MTOPS, so would be exceeded by even relatively small distributed applications such as SETI@home). Peer-to-peer research, like cryptography research, would appear to be squarely caught - contrary to the principles conceded by the DTI when the Act passed Parliament.

A further problem with these (indeed most) OGELs is that they require separate registration. During the discussions on the Export Act, we pointed out the administrative and political consequences of requiring thousands of academics to register in a system that was designed for a few large companies. It was agreed that this was neither desirable nor feasible. It is particularly odious to impose registration and record-keeping requirements in such circumstances while allowing an employee of a French company selling front-line weapon systems at the Farnborough Air Show to keep no records whatsoever of his trading activity in the UK.

It is argued that universities should be almost unaffected, as the controls (other than in the WMD case) apply only to information that is not in the public domain. This argument is badly flawed. It is particularly unsustainable given that it took a defeat in the House of Lords to compel the DTI even to consider the requirements of research, and that the DTI has since sought to vitiate the exemption inserted by Parliament by aggressive drafting of the dual-use implementation.

Only a small proportion of the information communicated between research workers ends up in the public domain. For example, during the development of the encryption algorithm `Serpent' by researchers in the UK, Israel and Norway, over 600 emails were exchanged (at least, that is the number retained; less important emails were not saved). The published outcome of this work was a specification and a handful of research papers. The unpublished emails included fragments of encryption code, fragments of cryptanalysis code, ideas, hypotheses, test data, sketches of proofs, and all sorts of other material which under the proposed regulations would have extremely variable export status. Some of it would escape licensing under the `pure research' exemption (though most would not); some of it, at the other extreme, would be caught as cryptanalysis software and would be unconditionally licensable. The effort involved in tracking all exchanges, documenting them and applying for licences would make such research impractical in future.

The net effects on research are unacceptable, and the Act Team must go back and start afresh. There must be a minimalist implementation of the EU regulations insofar as they apply to collaborative research and development work, and any residual effects on research in science, technology and medicine must be eliminated by means of an Open General Export Licence.

There are many other issues that the Act Team must consider during the redrafting:

  1. The regulations are unacceptably complex. Determining whether a particular activity is lawful requires multiple cross-references through schedules to UK and European legislation. For people who are not practitioners of export control law, the task is fraught. Such rights as there are get restricted by country in very complex and confusing ways. Some major trading countries, such as China, are on some quite serious blacklists; finding out precisely which list applies to which product is very hard. For example, Nicholas Bohm and the present writer - who had both been involved in the Act's passage through Parliament, advising UUK and AUT as well as FIPR - spent some two hours trying to determine whether emailing crypto software to the USA would be an offence. We concluded that it would be; then, the following day, that it probably would not, as the USA is on a list of `good' countries. The two of us are perhaps as close to being experts as anyone who is not actually a practitioner of export control law.

    Intuition and common sense do not appear to help; it is not likely to be obvious to the average software developer that sending gas meter software to China requires a licence. This state of affairs may have been tolerable so long as export controls affected only a few hundred large companies who sold weapons, and products such as industrial chemicals that could be used as weapon precursors, and who had perforce to cooperate with the DTI on export licences. Its effect was to keep these exporters in a state of dependency on DTI guidance. However, if export controls are to apply to most academics working in science, technology and medicine, as well as to all developers of free and open source software and to tens of thousands of SMEs, such deliberate opacity of legislation is unacceptable. Government must devise a better structure for the regulations and the schedules so that ordinary professionals can understand clearly whether a particular activity requires a licence.

  2. During the lobbying that led up to the Export Control Act, it was pointed out by us, by UUK and by the DMA that the regulations implementing the Dual-Use Directive in 2000 had criminalised the everyday activities of large numbers of small businesses. (More precisely, the regulations would have criminalised such behaviour had they themselves been lawful - which was disputed, on the grounds that a new criminal offence of speaking to foreigners could not be created by secondary legislation.) This point was repeatedly evaded by officials, who appeared determined to continue under the assumption that only their traditional `customers' would be affected.

    This consultation therefore materially breached the Cabinet Office guidelines on consultations, as it has not been `effectively drawn to the attention of interested parties' (principle 4). Indeed, the DTI appears to have been at some pains to achieve the opposite effect - by playing down the quite predictable impact of these regulations on large numbers of previously unaffected companies, especially in the software sector.

    We therefore request that Government extend the consultation period, and undertake a major publicity campaign aimed at informing all firms that exchange software or other information relating to dual-use technology electronically of the potential implications for their business, so that they and their trade associations can make informed representations. It is certainly not appropriate to sneak in draconian regulations of which most affected businesses are unaware and then subsequently tighten up enforcement to bring these businesses into the control system. This is not how the business of government should be conducted.

  3. The DTI has not paid attention to the US experience in managing intangible export controls. For example, if one of my students sends me some power analysis code in an email, and I happen to read it while sitting in an Internet cafe in Belgium, then under the proposed regulations he becomes a serious criminal. (Even the Americans have an exemption for such cases.)

  4. Given such complexity, the huge uncertainty about the number of affected parties, and the poor quality of the implementation, it is in any event totally unacceptable to create strict-liability offences - particularly offences that are likely to be committed inadvertently by harmless individuals. Many innocent people will be criminalised, such as schoolkids playing with software that contains cryptographic functions (which is most software of any size these days). Large numbers of academics and business people will similarly be exposed.

  5. There are very onerous registration, reporting and record-keeping requirements. It will be foolish for an organisation not to keep copies of all emails (which will be in tension with the desire of many large companies to destroy all internal emails after 30 days, to prevent them being subpoenaed in litigation). Even keeping all emails might not be enough; it seems people might have to document the context of relevant emails for future reference. In this context we would like to protest specifically at the last sentence of section 4(3) of the regulations, which imposes a significant escalation and will greatly push up the cost of due diligence.

  6. The regulations are even more severe on hobbyists than on professionals. For example, people who write software in their spare time for fun and end up tinkering with code that contains cryptography will have to comply with tiresome registration and reporting provisions - which are worse than those imposed on professionals such as businesses producing cryptographic products and university staff. This could also have a serious adverse impact on volunteers who write or maintain free and open source software. Again, this is in direct conflict with other Government policies and priorities.

  7. The complexity and nondeterminism of the regulations mean that compliance costs will be astronomical. We understand that the Defence Manufacturers' Association will be making representations to this effect on behalf of industry. We would like to point out that similar costs may be imposed on nonprofit organisations such as universities. Universities' science and technology departments make extensive use of high-tech equipment of the kind that tends to end up on the dual-use list.

    In the old days of purely material controls, this was a tolerable burden. A university that purchased a sensitive item of semiconductor test equipment, such as a focussed ion beam workstation, had to apply for a licence when it was purchased, and again when it was thrown into a skip seven years later. It only had to train a couple of people in the purchasing department about export controls. Now, however, all software written for the machine becomes subject to licence, and everyone in the departments of physics, chemistry, materials science and computer science is a potential proliferator. Any of them could unwittingly send a script abroad by email and thus commit an offence. The costs of legal advice and staff training may be substantial. In addition, the controls are likely to get seriously in the way of international research collaboration. It is only because UK scientists share their techniques with researchers overseas, prior to formal publication, that those researchers share theirs with us. If we have to wait to read about the latest techniques in learned journals we shall be placed at a serious disadvantage.

  8. We were assured that the restrictions on telephone transfer of information would apply only to weapons of mass destruction. Now, however, it appears that discussions of military technology will also be thus restricted. This might place an unacceptable burden on academics and NGOs discussing legitimate political subjects in international relations and politics generally. Even restrictions on WMD may have some effects, for example on anti-nuclear campaigners. There needs to be clarity that restrictions on military technologies of various levels will not impinge on political free speech.

  9. There are many technical aspects of the proposed regulations and schedules that are unsatisfactory. For example, schedule 4 part 1 in annex F contains a mass-market exception that allows the export of crypto to non-wicked countries if it is available for sale to the public, its functionality cannot easily be changed, and it requires no further support from the supplier. This may deal with the SSL/TLS functionality shipped with Internet Explorer, but it does not offer much comfort to the many applications of embedded cryptography - such as the key fobs used for remote locking of motor vehicles. These contain cryptography, but are not `sold to the public without restriction'. On the contrary, it is in their nature that they are supplied only under considerable restriction, namely through an authorised dealer who must be satisfied that the customer is the registered keeper of the vehicle to which the fob relates.

    Other examples include prepayment electricity and gas meters, goods vehicle tachographs and perhaps even the authentication chip in your mobile phone battery. It is clearly undesirable that an elderly lady who donates a Motorola mobile phone to a Bosnian charity should thereby become a criminal because of the cryptographic authentication device in the battery.

    The mass-market exception may also not suffice for crypto shipped with free and open source software (although one might argue that changing GnuPG internals is not easy for the average user). People taking such wares through customs will be committing offences if they fail to register the transaction within 30 days (yet another case in which the use of strict liability is unacceptable). Lesser examples abound; public-domain crypto software can be exported in physical form (article 7(8)) - even for use in weapons of mass destruction - but not in electronic form (article 8). And the question of whether floppy disks are `electronic' while CDs are `physical' is far from clear.

  10. The severe treatment of activities such as cryptanalysis will have wider and unpleasant effects. It will make the possession of software such as DeCSS legally risky, and thus undermine attempts to develop free software for playing DVDs on Linux PCs. It will therefore most probably come into direct conflict with the EU Software Directive and it will certainly have a harmful effect on competition policy. DeCSS is also widely seen as a free speech issue; subjecting it to a requirement for prior export licensing will be seen by many as a direct assault on free speech.

  11. The anti-circumvention provisions are grossly excessive. Being knowingly concerned in the transfer of software with intent to evade the controls will be punishable by 2-10 years in prison. The apparent intent is to suppress previously lawful political protest, such as the book of PGP source code published by MIT Press in protest at the USA's intangible export controls in the mid-1990s. Its likely effect in the present climate may be to criminalise (on a strict-liability basis) the wearing of a DeCSS T-shirt. (There has been some discussion of this on the ukcrypto mailing list, but such is the opacity of the regulations that list members - who include a number of practising lawyers - have been unable to determine whether such T-shirts are in fact prohibited.)

  12. The powers granted to officials to enter premises at all reasonable times may be felt to be appropriate for the inspection of the arsenals of professional weapons dealers, but they raise grave concerns where the premises concerned are private homes. The prospects of DTI officials breaking down doors while people are at work and perhaps harming pet dogs who try to defend their owners' homes, or of a group of male officials confronting a woman aggressively in her home while her husband is away, are most unwelcome. The proposed powers must be circumscribed in relation to private homes.

In short, these regulations will impose a substantial burden of cost and regulation not just on industry but also on universities, NGOs, free software developers and the unsuspecting public. Kids wearing T-shirts expressing anti-Hollywood sentiments may become liable to thousand-pound fines, and kids who tinker with free software are very likely to commit offences unless they know that they must register with the DTI within 30 days. Free software developers will find it much harder to counteract the anti-competitive tactics of proprietary system vendors, to the detriment of innovation and economic growth. Someone who takes a computer abroad as a present for a relative (or to install in their cottage in Brittany) will become a criminal, while someone who gives a PC or a used mobile phone to a school in Bosnia will become a serious criminal.

It may be that these regulations will have little public impact, in that they will criminalise tens of thousands of people - but with so little publicity and enforcement (outside, perhaps, the arms industry) that the new criminals will remain blissfully unaware. That is undesirable, both as a matter of general policy and because the unwitting criminalisation of many academics and software developers would give future, perhaps less benign, Governments a tool with which to exact vengeance on their critics. If, on the other hand, the regulations are passed in their current form and actually enforced, then a huge bureaucracy will be needed to administer them.

I believe it is fair to say that these regulations, if pressed to a vote in Parliament, will make the Government look both oppressive and stupid. A resounding defeat in the Lords is devoutly to be hoped for.

Ministers must have them withdrawn and redrafted.

Ross Anderson
Chair, Foundation for Information Policy Research

Copyright © FIPR 2003. This document may be copied freely in whole or in part provided attribution is given.
