The F9 factoring result

 Date: Wed, 27 Jun 90 22:19:44 -0400

From: Theodore Lee <lee@TIS.COM>

Subject: The F9 factoring result


MESSAGE FROM RON RIVEST VIA JIM BIDZOS VIA STEVE KENT VIA STEVE CROCKER:

Thanks to Robert Silverman for keeping many people honest. As an additional effort to that end, I attach an analysis of the recent factoring effort, done by Ron Rivest. The early reports of RSA's demise have been greatly exaggerated...

Note: Be sure to read the end of Rivest's note.

Jim Bidzos, RSA Data Security


To: Whom It May Interest (Feel free to distribute further...)

From: Ronald L. Rivest

Date: June 21, 1990

Re: Recent Factoring Achievement

(Preliminary draft; may contain typos or other inaccuracies.
Please send corrections to rivest@theory.lcs.mit.edu)


This note is in response to the numerous inquiries I've received regarding the recent factoring of a 155-digit number by A. Lenstra, M. Manasse, and others. (See the New York Times article of 6/20/1990 by G. Kolata.) In particular, it attempts to correct some misimpressions that may arise from reading such popular-press articles.


Using an ingenious new algorithm, Lenstra, Manasse, and others have factored the 155-digit number known as "F9", the ninth Fermat number:

F9 = 2^(2^9) + 1 = 2^(512) + 1 .

In binary, this number has the form

100000....000000001

where there are 511 zeros altogether. (F9 is a 513-bit number.) This is a fascinating development, and the researchers involved are to be congratulated for this accomplishment.
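
For readers who want to verify the bit pattern, a short Python check (not part of Rivest's analysis; added here for convenience) confirms the claim:

    # F9 = 2^512 + 1: a 513-bit number whose binary form is a 1,
    # then 511 zeros, then a final 1.
    f9 = 2**512 + 1
    bits = bin(f9)[2:]          # strip the '0b' prefix
    print(len(bits))            # 513
    print(bits.count('0'))      # 511
    print(bits[0], bits[-1])    # 1 1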


The algorithm used is known as the "number field sieve", or "NFS" (not to be confused with a network protocol of the same acronym!). The NFS algorithm is described in the Proceedings of the 1990 ACM STOC Conference; it is based on an idea due to Pollard, as developed further by Arjen Lenstra, Hendrik W. Lenstra, and Mark S. Manasse.


The NFS algorithm is specifically designed to factor numbers that, like F9, have a very simple structure: they are of the form

a^b + c

where c is relatively small. (For F9, we have a=2, b=512, and c=1.) Some simple extensions of this algorithm are also possible, to handle numbers whose binary representation has many zeros, and related kinds of numbers (ternary, etc.). Numbers with such a special structure are extremely rare and are unlikely to be encountered by chance. That is, the NFS algorithm does not apply to the kind of "ordinary" numbers that arise in practical cryptography, such as RSA moduli; it applies only to numbers with "sparse" representations having few nonzero components. (Let us call such numbers "rarefied".)
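
To make "rarefied" concrete in the binary case, here is a small illustrative sketch (an editorial addition, not from the note; the cutoff of 8 one-bits is an arbitrary choice for demonstration):

    import random

    # Illustrative heuristic only: treat a number as "rarefied" if its
    # binary representation has very few 1-bits.
    def looks_rarefied(n, max_one_bits=8):
        return bin(n).count('1') <= max_one_bits

    print(looks_rarefied(2**512 + 1))                # True: two 1-bits
    print(looks_rarefied(random.getrandbits(512)))   # almost surely False

A randomly chosen 512-bit number has about 256 one-bits, which is why such numbers are essentially never rarefied by chance.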


When working on a rarefied number, the NFS algorithm has an estimated running time of the form (for an input number n):

exp(1.56 (ln n)^(1/3) (ln ln n)^(2/3))                        (1)

For n = F9, this evaluates to

4.1 x 10^15 operations,

which, at 3.15 x 10^13 operations/year for a 1 MIP/sec machine (i.e., a MIP-year), gives a workload estimate of

130 MIP-years,

off by only a factor of two from the actual work of 275 MIP-years. (That is, formula (1) may be roughly a factor of two too low.)


It is instructive to see the effect of doubling the size of the number being dealt with. A 1024-bit (309-digit) rarefied number requires an estimated

1.54 x 10^21 operations = 4.9 x 10^7 MIP-years,

a dramatic increase in difficulty. The NFS algorithm is not a "polynomial-time" algorithm; the difficulty of factoring still grows **exponentially** with a polynomial function of the length of the input.
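
The arithmetic behind these estimates is easy to reproduce. The sketch below (an editorial addition; it simply evaluates formula (1) with the constants given above, approximating ln n by the bit length times ln 2) recovers both figures:

    from math import exp, log

    MIP_YEAR = 3.15e13   # operations per year at 1 MIP/sec

    def nfs_ops(n_bits, c=1.56):
        # Formula (1): exp(c * (ln n)^(1/3) * (ln ln n)^(2/3))
        ln_n = n_bits * log(2)
        return exp(c * ln_n**(1/3) * log(ln_n)**(2/3))

    print(nfs_ops(513))              # ~4.1e15 operations for F9
    print(nfs_ops(513) / MIP_YEAR)   # ~130 MIP-years
    print(nfs_ops(1024) / MIP_YEAR)  # ~5e7 MIP-years for 1024 bits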


What has this to do with RSA and cryptography? I think there are three basic points:

-- This development indicates that the status of factoring is still subject to further developments, and it is wise to be conservative in one's choice of key-length.

-- The NFS algorithm may yet be generalized to handle "ordinary" numbers, and the potential impact of this should be considered.

-- Factoring is still a very hard problem, despite everyone's best efforts to master it.


Regarding the further extensions of NFS to handle ordinary numbers: this is judged to be a reasonable possibility by those working on NFS, so it is helpful to consider what impact this may have.


It is conjectured (see the ACM STOC paper referenced above) that a successful extension of the NFS algorithm to ordinary numbers would have a running time of the form:

exp(2.08 (ln n)^(1/3) (ln ln n)^(2/3))                        (2)

This is the same as formula (1) except that the constant 1.56 is replaced by 2.08. Note that a practical version of such an extension does NOT yet exist (to the best of my knowledge), but even granting its plausibility we arrive at an estimate of the time required to factor a 512-bit number of

6.5 x 10^20 operations = 2 x 10^7 MIP-years,

which (in my opinion) is a substantial degree of security. It is interesting to note that this work factor is actually GREATER than that required by the ``standard'' factoring algorithms (e.g., the quadratic sieve), which have a running time of

exp((ln n)^(1/2) (ln ln n)^(1/2));

for a 512-bit number, this gives a work-factor estimate of only

6.7 x 10^19 operations.
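
Those two figures can be checked the same way. This sketch (again an editorial addition for convenience) evaluates the conjectured extension, formula (2), and the quadratic-sieve running time for a 512-bit modulus:

    from math import exp, log

    def nfs_ops(n_bits, c=2.08):
        # Formula (2): exp(2.08 * (ln n)^(1/3) * (ln ln n)^(2/3))
        ln_n = n_bits * log(2)
        return exp(c * ln_n**(1/3) * log(ln_n)**(2/3))

    def qs_ops(n_bits):
        # Quadratic sieve: exp((ln n)^(1/2) * (ln ln n)^(1/2))
        ln_n = n_bits * log(2)
        return exp((ln_n * log(ln_n))**0.5)

    print(nfs_ops(512))   # ~6e20 operations (the 6.5 x 10^20 above)
    print(qs_ops(512))    # ~6.7e19 operations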

Indeed, the NFS algorithm (when extended) will be asymptotically superior to the quadratic sieve algorithm, but will be slower for numbers of less than about 200 digits. That is, assuming that (2) is indeed the correct running-time estimate for any extension of NFS, NFS will not affect the security of any numbers of less than about 215 digits. So any "standards" that have been considered using 512-bit RSA moduli are not likely to be affected by any NFS extensions. (At most, one could imagine that the RSA key-generation process might be extended to check that the resulting modulus n is not a rarefied number.)
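
The crossover point can be recovered numerically by scanning digit counts for the first size at which formula (2) beats the quadratic sieve; a small search (an editorial sketch) lands in the neighborhood quoted above:

    from math import exp, log

    def nfs(ln_n):
        return exp(2.08 * ln_n**(1/3) * log(ln_n)**(2/3))

    def qs(ln_n):
        return exp((ln_n * log(ln_n))**0.5)

    # Find the smallest digit count where extended NFS is faster.
    for digits in range(150, 301):
        ln_n = digits * log(10)
        if nfs(ln_n) < qs(ln_n):
            print(digits)   # prints 219, i.e. "about 215 digits"
            break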


In the truly worst-case scenario, an extension of NFS would be found that allows ordinary numbers to be factored with a work-factor governed by equation (1); in this case one would need to adjust the sizes of moduli used by RSA upwards by a factor of less than two to more than offset the new algorithm. A factor of two in size affects the running time of public-key encryption (or signature verification) by a factor of four, and the running time of private-key encryption (or signature generation) by a factor of eight. Noting that the speed of workstations has increased by a factor of over 100 in the last decade (indeed, such factors have been the technological advance that made the successful implementation of NFS possible!), such performance penalties, if necessary, seem easily absorbed by expected technological advances in the speeds of the underlying RSA implementation technologies. That is, NFS-like factoring algorithms do not, even in this worst-case scenario, prevent successful implementations of the RSA cryptosystem.
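
The factors of four and eight are not derived in the note, but they follow from a standard cost assumption (mine, not stated above): with schoolbook modular arithmetic, a public-key operation with a small fixed exponent costs on the order of k^2 bit operations for a k-bit modulus, while a private-key exponentiation costs on the order of k^3. Doubling k then multiplies these costs by four and eight respectively:

    # Assumed cost model (not stated in the note): public-key ops
    # scale as k^2, private-key ops as k^3, for a k-bit modulus.
    k = 512
    print((2 * k)**2 / k**2)   # 4.0 -> public-key ops slow down 4x
    print((2 * k)**3 / k**3)   # 8.0 -> private-key ops slow down 8x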


As a cryptographer, I am actually very happy with all the effort that is being spent trying to determine the exact level of difficulty of factoring. Achievements such as the recent development of NFS help to pin down the best-possible rate of growth of the difficulty of factoring, so that users of cryptographic schemes can pick key sizes with an increased degree of confidence that unforeseen developments are unlikely to occur. The best way to ensure confidence in a cryptographic system is to have it attacked vigorously and continuously (but unsuccessfully) by well-qualified attackers. If, despite their best efforts, the difficulty of cracking the system remains intrinsically exponential, then one can have a reasonably high degree of confidence that the system is actually secure. This is the process we have been seeing at work in the recent factoring results. The results of the attacks can be used to guide the selection of the necessary key size for a desired level of security (with an appropriate margin of safety built in, of course).


(As a closing note, here's a prediction: I expect the 129-digit ``challenge RSA cipher'' published in the August 1977 issue of Scientific American to be cracked (probably by the quadratic sieve algorithm or a variant, not NFS) during the next 1-3 years. This accomplishment will require substantially more computer time than the 275 MIP-years required to factor F9.)
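
For a rough sense of "substantially more": plugging a 129-digit number into the quadratic-sieve formula quoted earlier (a back-of-the-envelope editorial check; the asymptotic formula ignores lower-order terms, so it is only a ballpark) gives:

    from math import exp, log

    ln_n = 129 * log(10)                 # a 129-digit number
    ops = exp((ln_n * log(ln_n))**0.5)   # quadratic-sieve estimate
    print(ops)             # ~7e17 operations
    print(ops / 3.15e13)   # ~2e4 MIP-years, far above F9's 275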

