[0738] daemon@ATHENA.MIT.EDU (RISKS Forum) RISKS Forum 10/28/91 17:04 (607 lines)
Subject: RISKS DIGEST 12.57
From: RISKS Forum
Date: Mon, 28 Oct 91 14:02:41 PST
Reply-To: risks@csl.sri.com
To: RISKS-LIST:;@csl.sri.com

RISKS-LIST: RISKS-FORUM Digest  Monday 28 October 1991  Volume 12 : Issue 57

   FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  DSA/DSS -- Digital Signatures (Ron Rivest)
  Porn-Sabotage in Italian newspaper (Enrico Musio)
  Re: MCI Friends & Family (Allan Meers)
  Do floor vibrations damage disks? (Magnus Redin)
  Re: Software migration at Johnson Space Center (Doug Burke)
  A New Twist on "Speed Controlled by Radar" (Andrew C. Green)
  Call for Papers ESORICS-92 (Yves Deswarte)

  The RISKS Forum is moderated. Contributions should be relevant, sound, in
  good taste, objective, coherent, concise, and nonrepetitious. Diversity is
  welcome. CONTRIBUTIONS to RISKS@CSL.SRI.COM, with relevant, substantive
  "Subject:" line. Others may be ignored! Contributions will not be ACKed.
  The load is too great. REQUESTS please to RISKS-Request@CSL.SRI.COM.
  For vol i issue j, type "FTP CRVAX.SRI.COM<CR>login anonymous<CR>AnyNonNullPW<CR>
  CD RISKS:<CR>GET RISKS-i.j" (where i=1 to 12, j always TWO digits). Vol i
  summaries in j=00; "dir risks-*.*" gives directory; "bye" logs out. The
  COLON in "CD RISKS:" is essential. "CRVAX.SRI.COM" = "128.18.10.1".
  <CR>=CarriageReturn; FTPs may differ; UNIX prompts for username, password.
  ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
  Relevant contributions may appear in the RISKS section of regular issues
  of ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.
----------------------------------------------------------------------

Date: Sat, 26 Oct 91 23:12:33 EDT
From: rivest@theory.lcs.mit.edu (Ron Rivest)
Subject: DSA/DSS -- Digital Signatures

Director, Computer Systems Laboratory
ATTN: Proposed FIPS for DSS
Technology Building, Room B-154
National Institute of Standards and Technology
Gaithersburg, MD 20899

Dear Director,

I'm writing to comment on the public-key digital signature algorithm, which
you call ``DSA'', that you have proposed to become a standard. This letter is
in response to your request for comments on this proposal.

Although I have other comments about this algorithm (to be made in another
letter), and I believe that an RSA-based standard would serve the country much
better, I will restrict my comments here to just a discussion of your proposed
key size of 512 bits for the DSA. I believe that a national standard based on
such a fixed small key size would serve our country very poorly---you are
unnecessarily risking catastrophic failure of the integrity of our financial,
industrial, and governmental information-processing systems.

To begin with, I question the rationale for having a fixed key size at all as
part of your proposal. Clearly, one needs a minimum key size to prevent users
from choosing key sizes that are too short to be secure. And one might want a
maximum key size so one can design efficient yet fully compatible
implementations. Yet there is no obvious reason why the minimum required key
size and the maximum allowed key size should be the same. Indeed, your ``one
size fits all'' proposal is a very poor match to the engineering and security
needs of most public-key applications.

Typically, there are many users, each of whom has a public key used by others
to verify his digital signatures. Such applications are invariably based on
the use of ``certificates,'' so verifiers can assure themselves that they are
verifying signatures with the appropriate public key.
A certificate is a message associating a user's name with his public key; this
message is itself signed by a ``certifying authority'' whose public key is
widely known. A typical signature verification then normally consists of two
verifications: one to verify the certifying authority's signature on the
user's certificate, and then another to actually verify the user's signature.
(As a side note, I observe that this typical structure contradicts your claim
that signing is done more often than verification. In my experience,
verification is typically done at least twice as often as signing.)

A certificate-based application thus incorporates two basic kinds of
signatures: ordinary user signatures and signatures by a ``certifying
authority.'' The latter kind form the backbone of the integrity of the entire
application, since the ability to forge certificates would give an attacker
essentially unlimited power to corrupt the system.

I consider it essential in secure public-key system design to have certifying
authorities use the maximum allowable key size. This size is typically much
larger than what an ordinary user might select for his own public-key size. As
an example, in an RSA-based scheme, certifying authorities might use keys of
1024 bits, whereas ordinary users might choose key sizes of 500--800 bits.

In a typical application, the trade-off between security and performance
mandates the use of different key sizes in different parts of the system.
Certifying authorities, or users with very valuable data, must use very long
keys to achieve the highest possible security level. Other users, with reduced
security requirements and/or more stringent performance requirements, will use
shorter keys. Trying to make ``one size fit all'' results either in
unacceptably low security for all users (because all certificates will be
suspect) or unacceptably poor performance for some users.
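The two-step structure described above can be sketched in a few lines. The `toy_sign`/`toy_verify` functions below are hypothetical keyed-hash stand-ins invented for illustration (not a real public-key signature scheme, and not secure); the sketch only shows that checking one user signature entails two verifications, hence verification outnumbering signing:

```python
import hashlib

# Toy stand-ins for a real signature scheme (NOT secure -- a keyed hash,
# used here only to illustrate the certificate structure).
def toy_sign(key: bytes, message: bytes) -> bytes:
    return hashlib.sha256(key + message).digest()

def toy_verify(key: bytes, message: bytes, signature: bytes) -> bool:
    return toy_sign(key, message) == signature

ca_key = b"certifying-authority-key"   # hypothetical CA key
alice_key = b"alice-key"               # hypothetical user key

# The certificate binds the user's name to her key, signed by the CA.
certificate = b"name=alice;key=alice-key"
cert_signature = toy_sign(ca_key, certificate)

# The user signs a document once...
document = b"I agree to the contract."
doc_signature = toy_sign(alice_key, document)

# ...but a verifier performs TWO verifications: the CA's signature on
# the certificate, then the user's signature on the document.
ok = (toy_verify(ca_key, certificate, cert_signature)
      and toy_verify(alice_key, document, doc_signature))
print(ok)  # True
```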
In a public-key system based on number theory, there is no valid technical
reason for requiring a fixed key size. The underlying number-theoretic
algorithms can support arbitrary key sizes. Users and certifying authorities
should be able to choose key sizes according to their requirements.

I now turn to a discussion of the particular key size you have chosen: 512
bits. (By key size, here, I refer to the size of the prime modulus p.) I argue
here that if you are going to insist on having a fixed key size, then 512 bits
is far too short. I note that you provide no rationale for the choice of your
key size. While it is my belief that you have been co-opted by the NSA (who
fears the use of widely distributed public keys as a basis for encryption
algorithms), I will restrict my discussion to technical matters rather than
political speculation.

In order to estimate the key size necessary, one needs to understand the
computational resources available to an imagined potential attacker, and the
computational difficulty of the underlying cryptanalytic problem. Let me
address each of these issues in turn.

How much computational power can an imagined attacker bring to bear to
``break'' the system? This depends on the time period we are talking about
(since technology is rapidly evolving) and the financial resources of the
attacker (to purchase the necessary computing power).

It is necessary to know the expected lifetime of the proposed standard in
order to know what level of security to aim for. A scheme that is considered
``secure'' today may not be secure in the year 2000, and a scheme considered
secure in the year 2000 may not be secure in the year 2010. Computer
technology is evolving at an incredible pace, and is likely to continue to do
so for the next few decades. The security of cryptographic schemes thus tends
to ``erode'' steadily over time, and the design of cryptographic systems must
envision and plan for such erosion.
I would suggest that a digital signature standard should be designed with a
minimum expected lifetime of at least 25 years. That is, one should design so
that a system adopted in the year 1992 will still be secure in the year 2017.
It should not be possible for an attacker in 2017 to forge a signature, using
the computers available then.

Where does ``25 years'' come from? Consider the only available precedent for
the lifetime of a NIST cryptographic standard: the DES was adopted in 1976 and
seems likely to still be in widespread use in 1996, twenty years later. After
a cryptographic signature standard has been terminated, one needs an
additional period of time during which the validity of signatures can still be
assured. For example, it is not uncommon to require that signed documents be
retained and considered legally binding for seven years. A signature produced
in the year 2010 should still be verifiable in the year 2017, with an
understood assurance that it wasn't just recently forged. I consider a 25-year
expected lifetime a minimum reasonable requirement for a digital signature
standard.

What kind of computational power will be available to an attacker in the year
2017? It is widely asserted that computational power (per dollar spent) is
increasing at approximately 40% per year. Some of my colleagues assert that
45% is a better estimate, but I'll stick to the more conservative figure. This
means an approximate doubling of computer power (per dollar) every two years,
and an approximate increase by a factor of 4500 after twenty-five years. Let's
round this off to 5000 for our back-of-the-envelope calculations
(corresponding to 25.3 years at 40% growth per year). In the year 2017, I
expect computer power will be about 5000 times cheaper than it is now.

How big an attack should one prepare for?
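Before answering that, the growth arithmetic above is easy to check directly; this minimal sketch uses only the letter's own figures (40% per year, a factor of 5000):

```python
import math

# 40%/year growth in computing power per dollar (the letter's figure).
rate = 1.40
doubling_time = math.log(2) / math.log(rate)      # ~2.06 years: "every two years"
factor_25_years = rate ** 25                      # ~4500x after 25 years
years_to_5000x = math.log(5000) / math.log(rate)  # ~25.3 years, as stated

print(f"Doubling time: {doubling_time:.2f} years")
print(f"Factor after 25 years: {factor_25_years:.0f}x")
print(f"Years to a factor of 5000: {years_to_5000x:.1f}")
```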
Let me suggest that a national digital signature standard should, at a
minimum, be able to withstand an attack costing an attacker $25 million. This
amount of money is easily available to large corporations, drug dealers, and
third-world countries. There is no reason that our national security, in terms
of the integrity of our electronic business, financial, and governmental
information-processing systems, should be vulnerable to an attack costing only
$25 million. Indeed, it is easy to make an argument for a much higher
threshold; it is not hard to imagine scenarios in which the benefit of a
successful attack exceeds $25 million. However, I'll continue our
back-of-the-envelope calculation with the $25 million figure.

How much computing power can one buy for $25 million? Today, a workstation
with 100 MIPS (million instructions per second) can probably be purchased in
quantity for about $5,000. An attacker wouldn't need all of the peripherals
(screen, floppy disk, mouse, nice cabinet, etc.), and could economize by
sharing power supplies, fans, etc. He is basically interested in having many
processors, each with a reasonable amount of memory. Let me estimate that such
a ``stripped-down'' 100-MIPS processor would have an amortized cost today of
$1,000.

A convenient unit of computation is a ``MIPS-year''---the amount of
computation performed by a one-MIPS processor running for a year. A MIPS-year
thus corresponds to about 32 trillion basic operations. If we assume that a
100-MIPS processor lasts for about 10 years, we obtain an amortized cost
estimate for today of $100 per MIPS-year of computation. (Here we are buying
``computation by the yard''; our yard is one MIPS-year, and it should cost
about $100 in quantity. The details of buying computational power in 2017 I
leave to your imagination; a simple cost-effective way might be to spend
considerably more than $25 million to purchase hardware, and then to resell
the hardware after the computation is done.)
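The MIPS-year unit can be checked with one line of arithmetic (one million instructions per second, sustained for a year):

```python
# A MIPS-year: one million instructions/second sustained for one year.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
ops_per_mips_year = 1_000_000 * SECONDS_PER_YEAR
print(f"{ops_per_mips_year:.1e} operations")  # ~3.2e13, i.e. about 32 trillion
```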
We can therefore estimate that an attacker with $25 million to spend today
could purchase about 250,000 MIPS-years of computation. In the year 2017, he
will be able to purchase about 5000 times as much, or 1.25 billion MIPS-years.
I believe that a digital signature standard adopted today should, at a
minimum, be able to withstand an attack of 1.25 billion MIPS-years. (This
sounds like a lot of computation, but you can see from my arguments above that
this is in fact a rather conservative estimate of the security requirement for
such a standard.)

How large a key size is needed to withstand an attack of 1.25 billion
MIPS-years? This depends, of course, on the cryptanalytic problem to be
solved. In the case of your proposed DSA, the basic cryptanalytic problem is
the ``discrete logarithm problem'': computing x, given g, p, and g^x mod p.
Using the best-known algorithms for this problem, the number of operations
required is approximately

	L(p) = e^{sqrt(ln p * ln ln p)}.

The state of the art in algorithms for the discrete logarithm problem is still
evolving, but the above formula certainly seems like a very conservative
estimate of what will be possible in 2017, since it represents what is
possible today. For example, with a 512-bit prime p (as in your proposal), we
see that only L(2^512) = 6.7 x 10^19 operations = 2.1 million MIPS-years of
computation are required to ``break'' a 512-bit problem. Thus, we see that
your proposed DSA is over 500 times weaker than a conservative analysis
suggests is required.

Another way of stating the above result is that the DSA, as proposed, has a
maximum expected secure lifetime of approximately six years, since
250,000 x 1.4^6.33 is greater than 2.1 million.

Setting L(p) equal to 1.25 billion MIPS-years and solving for p, we find that
the DSA should be using keys of at least 640 bits, minimum. This is, as noted,
a conservative estimate.
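The whole calculation above can be reproduced numerically. This sketch takes the letter's figures as given ($25 million at $100 per MIPS-year today, a factor of 5000 by 2017, and the L(p) work-factor formula):

```python
import math

# The letter's unit: one MIPS-year ~ 3.15e13 basic operations.
OPS_PER_MIPS_YEAR = 1e6 * 365.25 * 24 * 3600

def L(key_bits):
    """Work factor L(p) = exp(sqrt(ln p * ln ln p)), in MIPS-years, for a
    prime modulus p of the given bit length (the letter's formula for the
    best-known discrete-logarithm algorithms)."""
    ln_p = key_bits * math.log(2)
    ops = math.exp(math.sqrt(ln_p * math.log(ln_p)))
    return ops / OPS_PER_MIPS_YEAR

# The letter's figures: $25 million at $100 per MIPS-year today,
# multiplied by ~5000 for the year 2017.
attack_today = 25_000_000 / 100    # 250,000 MIPS-years
attack_2017 = attack_today * 5000  # 1.25 billion MIPS-years

print(f"Breaking a 512-bit modulus: {L(512):.2e} MIPS-years")  # ~2.1e6
print(f"Shortfall factor: {attack_2017 / L(512):.0f}")         # ~590: "over 500"

# Smallest modulus size whose work factor exceeds the 2017 attack budget.
size = 512
while L(size) < attack_2017:
    size += 1
print(f"Minimum key size: {size} bits")  # 641, i.e. "at least 640 bits"
```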
This estimate doesn't plan for improvements in the state of the art of
algorithms for solving the discrete logarithm problem, which can have a
dramatic effect on the key size required. It has no margin built in for
faster-than-expected improvements in hardware, longer-than-expected use of the
DSA, or richer-than-expected adversaries. The ability to harness for free the
``unused'' processor cycles of large networks of workstations could also
dramatically increase the computational ability of an adversary, altering the
equation further in his favor.

For these reasons, and as a matter of sound conservative design, I feel that a
substantial ``margin of safety'' should be built into the standard. Most
importantly, certifying authorities should be allowed generous key sizes. I
would strongly recommend allowing key sizes of at least 1024 bits for
certifying authorities, and at least 800 bits for users, in any digital
signature standard. I feel that anything less is short-sighted and risky,
possibly verging on the irresponsible. In cryptographic systems, responsible
design is conservative design; generous allowance must be made for unforeseen
developments.

For all the above reasons, I feel very strongly that your DSA proposal, with
its proposed 512-bit keys, is not sufficiently secure to be acceptable as a
national signature standard.

Sincerely,

Ronald L. Rivest
Professor of Computer Science
MIT, Cambridge, Mass. 02139

------------------------------