
It's quite usual to read:

"the time needed to factor n by NFS algorithm is exp(...formula with ln(n)...)"

But I can't find out which unit of time this formula refers to. Is it really time? Or a number of operations?

Thanks

Andrew
  • Sure this should be tagged "Network File System"? It wouldn't hurt to spell out NFS at least once. – greybeard May 18 '17 at 13:11
  • Andrew, what is the NFS algorithm? What does it process, what kind of data? Is it GNFS https://en.wikipedia.org/wiki/General_number_field_sieve or SNFS https://en.wikipedia.org/wiki/Special_number_field_sieve ? – osgx May 19 '17 at 02:38
  • I'm not 100% sure this should've been closed as a duplicate, but the formula you're referring to [does appear to](https://en.wikipedia.org/wiki/General_number_field_sieve) be the *complexity*, in which case the duplicate post is very much relevant. – Bernhard Barker May 19 '17 at 03:42

2 Answers


Normally, running times of algorithms are given as a number of operations.

Most of the time, however, when theorists talk about the complexity of an algorithm, they use O-notation (see https://en.wikipedia.org/wiki/Big_O_notation). In O-notation constants are ignored, so a running time of O(1000n) is the same as O(n), and the notation only describes the running time of the algorithm as the input size n gets arbitrarily large.

For a more precise definition of what counts as an operation, complexity theorists use the Turing machine model (https://en.wikipedia.org/wiki/Turing_machine).
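
As a rough illustration of the difference between the two units, here is a minimal sketch (a toy linear-time loop, not a factoring algorithm): it counts its own basic operations and also measures wall-clock time, and the ratio between the two is approximately the machine-dependent constant that O-notation ignores.

```python
import time

def count_ops(n):
    """Toy linear-time routine that counts its own basic operations."""
    ops = 0
    total = 0
    for i in range(n):
        total += i   # count this addition as one operation
        ops += 1
    return ops

for n in (10**5, 10**6, 10**7):
    start = time.perf_counter()
    ops = count_ops(n)
    seconds = time.perf_counter() - start
    # ops / seconds is roughly constant: the machine's speed in operations
    # per second, which is exactly the constant factor O-notation drops.
    print(f"n={n}: {ops} ops, {seconds:.4f} s, ~{ops / seconds:.2e} ops/s")
```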

blacktemplar
  • OK, thanks. Number of operations sounds more logical. What is puzzling is that I always found the word "time"... – Andrew May 18 '17 at 12:43
  • The thing is, in `O`-notation it doesn't really make a difference. The difference between operations and time is only a constant scaling factor, so in this sense it doesn't matter which unit you use. – blacktemplar May 18 '17 at 13:12
  • Yes, there is an o-notation in the formula exp(...bla-bla...); but in my understanding, it means that the formula without the "o" is true when n -> infinity; meanwhile, the real time in seconds will depend on the computer's performance. True? – Andrew May 18 '17 at 13:28
  • Not really; as I said, constants are irrelevant for the `O`-notation. So let's say one computer performs 1,000,000 operations per second and the other one 1,000,000,000 per second; then the difference is only a constant factor of 1,000. There are different (equivalent) definitions of the `O`-notation, but one of them is the following: a function `f(n)` is in `O(g(n))` if there exist an `n_0` and a constant `c` such that `f(n) <= c * g(n)` for all `n > n_0`. Here you see the constant `c` can be arbitrarily large (but constant ;)). See also the sketch after this comment thread. – blacktemplar May 18 '17 at 13:55
  • Well, thanks for these explanations, but I have to investigate further on my own, because something is bothering me. For me, if one computer performs 1,000,000 operations per second and takes 1000 s to factor n, then the one performing 1,000,000,000 per second will factor n in 1000/1000 = 1 s. – Andrew May 18 '17 at 14:42
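
To make the constant-factor point from this exchange concrete, here is a minimal numeric sketch (the quadratic operation count f(n) = n^2 and the two machine speeds are made-up illustration values, not real factoring costs):

```python
# Toy cost model: suppose an algorithm performs f(n) = n**2 operations.
def ops(n):
    return n ** 2

# Two hypothetical machines that differ only in speed (operations per second).
slow_speed, fast_speed = 1_000_000, 1_000_000_000

for n in (100, 1_000, 10_000):
    slow_time = ops(n) / slow_speed   # seconds on the slow machine
    fast_time = ops(n) / fast_speed   # seconds on the fast machine
    # The two times always differ by the same constant factor (1000 here),
    # so both are in O(n**2): pick c = 1/slow_speed or c = 1/fast_speed
    # in the definition time(n) <= c * n**2.
    print(n, slow_time, fast_time, slow_time / fast_time)
```

Both running times grow like n^2, so both are in O(n^2); the 1000x speed difference only changes the constant c. That is why complexity statements are given in operations (or, equivalently, in time up to a constant factor) rather than in seconds on a particular machine.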

In the case of NFS as "number field sieve" (a method for factoring some huge integers into prime factors), the n is usually the number to be factored. For example:

Heuristically, its complexity for factoring an integer n (consisting of ⌊log2 n⌋ + 1 bits) is of the form

  • exp( ((64/9)^(1/3) + o(1)) * (ln(n))^(1/3) * (ln(ln(n)))^(2/3) )

... Note that log_2(n) is the number of bits in the binary representation of n, that is, the size of the input to the algorithm.

Heuristically, its complexity for factoring an integer n is of the form:

  • exp( (1 + o(1)) * ((32/9) * ln(n))^(1/3) * (ln(ln(n)))^(2/3) )

Using this complexity formula, one may take the time needed to factor some huge N with GNFS/SNFS and use it to estimate how long it will take to factor a much bigger M which uses 2 times more bits than N, or an even bigger L with 4 times more bits.
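
For instance, here is a rough sketch of such an estimate using the GNFS formula above (the o(1) term is dropped, so only the ratio between two sizes is meaningful; this is a heuristic operation-count estimate, not a prediction of real wall-clock time):

```python
import math

def gnfs_work(bits):
    """Heuristic GNFS work estimate for an n of the given bit length,
    with the o(1) term dropped; only ratios between sizes are meaningful."""
    ln_n = bits * math.log(2)          # ln(n) recovered from log2(n) = bits
    c = (64 / 9) ** (1 / 3)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

for small, big in [(512, 768), (768, 1024)]:
    ratio = gnfs_work(big) / gnfs_work(small)
    print(f"{big}-bit vs {small}-bit: roughly {ratio:.0f} times more work")
```

This gives roughly a factor of a thousand between 1024 and 768 bits, and several thousand between 768 and 512 bits, which matches the ratios quoted from the paper below.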

For example, there is the RSA Factoring Challenge, which was able to factor RSA-768, a number so huge that 768 bits (binary digits) are needed to write it (232 decimal digits). It was factored by many computers over 2 years, and the total computation time is equivalent to 2000 years of work on a single 2.2 GHz "Vendor 2" PC. There are RSA numbers in use now of 1024 bits (and 2048 bits too), and you may use 768 and 1024 as the value of log2(n) in the formulas (the binary logarithm of n is, up to 1, the number of binary digits needed to write n; ln(n) = log2(n) * ln(2)). I did not do the calculation, but the people who factored the 768-bit number say that 1024 bits are much more secure:

https://eprint.iacr.org/2010/006.pdf Factorization of a 768-bit RSA modulus. version 1.4, February 18, 2010

On December 12, 2009, we factored the 768-bit, 232-digit number RSA-768 by the number field sieve (NFS, [20]) ... Factoring a 1024-bit RSA modulus would be about a thousand times harder, and a 768-bit RSA modulus is several thousands times harder to factor than a 512-bit one.

osgx