How To Get A Systems Vendor To Ship Cryptographic Software


Abstract

Treat cryptography as a localization issue. Systems are already localized (made specific to a locale) in order to sell overseas: error messages and prompts are translated into local languages, and so on. There is no reason cryptographic software cannot be handled the same way: localized for whatever the prevailing laws allow in each country to which the software will be sold. This allows U.S. vendors to deliver to their domestic customers the strongest computationally secure cryptographic systems possible (and thereby differentiate themselves in the commodity market that communications & networking standards create), and modified versions, per other countries' laws, for export.

If you are a consumer of software, it is important for you to insist that your vendor behave this way.


The Expanded Version of the Argument

Cryptography is good, because it protects your privacy. If you encrypt your files and your correspondence, no one other than you or your intended recipient can read them.

However, insofar as the U.S. Federal Government is concerned, cryptography is a munition (i.e. a weapon of war, like a gun or a bomb). That's how they choose to regulate it.

The existing International Traffic in Arms Regulations (ITAR) prohibit American companies from exporting cryptographic software outside the United States without a license to do so. The Federal Government has set policy such that a license will only be granted if the keys used in exportable cryptographic systems are short enough that the Federal Government can break the code at will, quickly and cheaply. Currently, cryptographic systems whose key lengths exceed 40 bits do not get export licenses.

40 bits is not nearly long enough. It's too cheap to break, and getting cheaper all the time.

In August 1995, French graduate student Damien Doligez answered a public challenge by breaking a Netscape 40-bit SSL-encrypted transaction in eight days, using only the computers he had at his disposal at his university, with no special permission or arrangements. A team in the U.S.A. also cracked the same transaction in a similar amount of time, using computers put at their disposal from around the Internet.
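The eight-day figure is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes a purely hypothetical aggregate trial rate of 1.6 million keys per second, chosen for illustration and not taken from any account of the actual attack:

```python
# Back-of-the-envelope cost of brute-forcing a 40-bit key.
# KEYS_PER_SECOND is a hypothetical aggregate rate across many
# machines, chosen for illustration only.
KEY_BITS = 40
KEYS_PER_SECOND = 1.6e6

keyspace = 2 ** KEY_BITS                       # 1,099,511,627,776 candidate keys
full_sweep_seconds = keyspace / KEYS_PER_SECOND
full_sweep_days = full_sweep_seconds / 86400   # 86400 seconds per day

print(f"keyspace:          {keyspace:,} keys")
print(f"full sweep:        {full_sweep_days:.1f} days")   # ~8.0 days
print(f"expected (half):   {full_sweep_days / 2:.1f} days")
```

On average an attacker finds the key after searching half the space, so even a modest pool of mid-1990s machines lands in the range of days, not years.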

If an individual can do this, so can your competitors, and so can foreign governments. It is always best, where cryptography is concerned, to assume that your adversary has at least the same capabilities that you do.

Add to this Moore's Law: "The number of transistors that you can pack into an area of silicon doubles every 18 months." Loosely interpreted, this means that computing power at a given price doubles every 18 months. So, in principle, eighteen months after that break, it will be possible to crack the same 40-bit encrypted transaction in the same time with half as many computers, or twice as fast with the same number of computers. Moore's Law has held true for over two decades, so far.
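Loosely applying that doubling rule, the erosion of a fixed key length can be sketched as follows. The eight-day starting point and the 18-month doubling period are illustrative assumptions, not predictions:

```python
def attack_days(initial_days, months_elapsed, doubling_months=18):
    """Days to search the same keyspace after attacker hardware has
    doubled in speed every `doubling_months` (Moore's Law, loosely)."""
    doublings = months_elapsed / doubling_months
    return initial_days / (2 ** doublings)

# Each extra key bit doubles the keyspace, cancelling one hardware
# doubling: adding one bit every 18 months holds attack cost constant.
for months in (0, 18, 36, 54):
    print(f"after {months:2d} months: {attack_days(8.0, months):.1f} days")
# prints 8.0, 4.0, 2.0, 1.0 days
```

The flip side of the same arithmetic is the defender's consolation: one additional key bit per 18 months is all it takes to stand still against this trend.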

So it's very important to use encryption technology and key lengths that are as strong as is economically reasonable. Fortunately, it's relatively cheap to make the keys longer. The catch is that once the keys are longer, you can't export the cryptographic software from the U.S.A. any more, because the Federal Government is more interested in being able to spy on everyone than in protecting its own citizens' privacy.
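To put "relatively cheap" in perspective, consider the work-factor arithmetic of moving from a 40-bit key to, say, a 128-bit key. The 128-bit figure is an assumption chosen for illustration; the point is how fast the exponent grows:

```python
# Why longer keys win: the attacker's work grows exponentially in key
# length, while the defender's cost of using a longer key grows only
# modestly. The 128-bit target is illustrative.
short_bits, long_bits = 40, 128
extra_bits = long_bits - short_bits
work_factor = 2 ** extra_bits       # attacker must do 2**88 times more work

# Under an 18-month doubling of attacker hardware, each doubling
# recovers one bit, so 88 extra bits buy 88 * 1.5 = 132 years of margin.
years_of_margin = extra_bits * 18 / 12

print(f"extra work factor:  2**{extra_bits} ~= {work_factor:.2e}")
print(f"Moore's-Law margin: {years_of_margin:.0f} years")
```

That is, a key-length change that costs the legitimate user a few extra bytes and cycles costs the attacker twenty-six orders of magnitude more work.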

This puts American companies into a bind. If they develop cryptographic software, they must do two versions: one for domestic use (and for export to those countries we like or trust, e.g. Canada) with strong algorithms and long keys, and one for export to elsewhere, with crippled capabilities.

Of course, this situation puts American software companies at a competitive disadvantage when their products are compared against similar software from countries that have no such export restrictions. Who overseas will buy American software or systems whose cryptography is set up such that the U.S. Federal Government can break the code whenever it feels like doing so?

What's worse is that, with some exceptions, American companies are using ITAR export restrictions to justify building products for domestic distribution that are weak enough for export (i.e. crippled in just the same way, just as easily crackable by individuals or groups with enough computing power).

The systems companies (e.g. Apple, Sun, SGI, Microsoft, etc) argue that producing two versions of a system for customers (i.e. one with cryptographic protection of privacy and security, and one without) is "too expensive."

However, their actions give the lie to this argument; each one of these companies performs "localization" of its systems for particular markets. That is, they deliver an English-language system to English-speaking countries, a Spanish-language system to Spanish-speaking countries, etc. This is, in effect, maintaining multiple versions of their products. They do it because otherwise their products would not sell in those markets.

The rational thing to do, then, in a world where cryptography is required for privacy, but the law prevents the export of strong cryptography, is to deliver strong cryptography systems where you can, deliver legally-required weak systems where you must, and document the sales lost to this policy to the extent possible.

Domestically, American software companies should deliver to their customers the strongest cryptography now known, so as to protect their privacy, and to differentiate themselves from one another in an otherwise commodity communications market.

Finally, software consumers and the general public have a role to play. Cryptography is for privacy protection. If you value your privacy, let your software vendors know that you insist on buying only products with quality encryption systems in them, and let your Senator and Representative know that the Federal Government's predilection for being able to spy on everyone all the time, at the expense of the individual's right to privacy, is unacceptable and must cease to be the policy of the United States of America.


Erik Fair <fair@clock.org>
October 14, 1996