T.R | Title | User | Personal Name | Date | Lines |
--------------------------------------------------------------------------------
51.1 | | ORPHAN::BRETT | | Tue Mar 27 1984 23:44 | 4 |
|
Does anyone know why these tend to have such huge factors?
/Bevin
|
51.2 | | ULTRA::HERBISON | | Wed Mar 28 1984 17:52 | 4 |
| These results are not for random numbers (or even random numbers of
the form 2^n-1), these are "interesting" numbers - numbers which are
known not to have small factors (or not many small factors).
B.J.
|
51.3 | | AURORA::HALLYB | | Fri Mar 30 1984 20:55 | 8 |
| To put it a different way: all the "easy" factorizations were done long ago.
What's left are those numbers known to be composite but which are difficult
to factor.
Does anybody have (or know about) an up-to-date "status list" of Mersenne
numbers and factors? I'd hate to reinvent the wheel.
John
|
51.4 | | HARE::STAN | | Sat Mar 31 1984 19:40 | 2 |
| I just recently ordered a book from the AMS that has all
the latest results. I should get it within the next month.
|
51.5 | | HARE::STAN | | Mon Apr 09 1984 15:23 | 11 |
| The book is entitled
Factorizations of b^n ± 1, b = 2,3,5,6,7,10,11,12 up to High Powers
and is by Brillhart, Lehmer, Selfridge, Tuckerman, and Wagstaff.
It is volume 22 in the AMS's Contemporary Mathematics series and
is in paperback (fairly cheap). I now have a copy; I'll leave it in
my office for a while for those of you in ZK who want to drop by
and see it.
|
51.6 | | HARE::STAN | | Mon Apr 09 1984 15:35 | 27 |
| The ten "most wanted" factorizations:
1) 2, 211 - C60
2) 2, 251 - C69
3) 2, 212 + C54
4) 10, 64 + C55
5) 10, 67 - C61
6) 10, 71 - C71
7) 3, 124 + C58
8) 3, 128 + C53
9) 11, 64 + C67
10) 5, 79 - C55
Explanation:
k, m + means k^m+1
k, m - means k^m-1
Cnn means the number has a known composite cofactor of nn digits.
This cofactor is the number whose factorization is desired.
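To make the shorthand concrete, here is a small Python sketch (the helper name is mine, not from the book) that expands an entry into the integer it denotes:

```python
# Expand the "k, m +" / "k, m -" shorthand: k^m + 1 or k^m - 1.
def expand(k, m, sign):
    """Return k^m + 1 when sign is '+', k^m - 1 when sign is '-'."""
    return k**m + 1 if sign == '+' else k**m - 1

# Entry 1), "2, 211 -", denotes the Mersenne number 2^211 - 1.
n = expand(2, 211, '-')
print(len(str(n)))  # 2^211 - 1 has 64 decimal digits
```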
Note also that Prof. Wagstaff at Purdue has been set up as a clearinghouse
for factorization results. He has been getting approximately 1 new
factorization result per day; so there is a good chance some of the above
numbers have already been factored. They were the ten "most wanted"
as of the date that the book was published (1983).
|
51.7 | | HARE::STAN | | Tue Jul 31 1984 15:43 | 8 |
| The ten most wanted factorizations have been found by J. A. Davis and
D. B. Holdridge at Sandia National Laboratories. They used the quadratic
sieve algorithm on a CRAY 1S computer.
Reference
Abstracts of papers presented to the American Mathematical Society,
5(1984)298, abstract 813-10-86.
|
51.8 | | SPRITE::OSMAN | | Tue Feb 19 1985 12:22 | 11 |
| Two questions:
1) If the ten most-wanted factorizations have now been done, what
are the NEXT ten most-wanted ones ? (I haven't been to the post
office lately to check the bulletin board)
2) I've heard several mentions of "numbers known to be composite for
which no factors are known". Can someone please explain how
a number can be known composite without knowing a factor ?
I realize that this must be a staple for real mathochists, but
for us layfolk it sounds interesting.
|
51.9 | | HARE::STAN | | Tue Feb 19 1985 13:03 | 7 |
| There are various theorems of the form
"If n satisfies a certain property then n is composite."
which do not produce any explicit divisors. I can't think
of any at the moment, although I think Wilson's Theorem
might be in this class.
|
51.10 | | RANI::LEICHTERJ | | Sun Feb 24 1985 18:02 | 24 |
| Any of the "pseudo-primality" tests will give you this kind of information.
They usually start off from Fermat's theorem: Let p be a prime. Then for
all 0<a<p, a^(p-1) = 1 mod p. One need only find a single a for which this
congruence doesn't hold to prove that p is composite. (Unfortunately, there
are numbers - known as the Carmichael numbers - which are NOT prime but
which have the property that Fermat's congruence holds for all relevant a.
They are quite rare. One can strengthen Fermat's theorem by adding additional
easy-to-compute steps - in more than one way - so as to end up in a situation
where, for every non-prime n and at least half the a with 0<a<n, Fermat's congruence
or one of the additional tests fails. This can be used to build a probabilistic
primality tester: To test n, choose k numbers at random (without replacement)
between 0 and n, and apply the conditions. If any of them fails, n is not
prime, and one is in possession of a proof - though not, in any way anyone
knows today, of a factor. If all pass, in some sense the chance that n is not
prime is 1 in 2^k. This is the technique proposed to find large primes for
various cryptographic protocols.
It has also been shown that, if the Extended Riemann Hypothesis is true, one
form of this test can be made deterministic: That is, there is a small, easily
computable set of numbers such that, if the test with these numbers says n is
"probably prime", it IS, in fact, prime. Even this form of the result gives
no information on factoring n if it happens to be composite, as far as anyone
knows.)
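For the curious, a minimal sketch of the plain Fermat version of such a tester (the function name and trial count are my own choices, not from any particular source):

```python
import random

def fermat_witness(n, k=20):
    """Sample k distinct bases a with 1 < a < n-1.  Return an a with
    a^(n-1) != 1 (mod n) - a proof that n is composite, though it
    yields no factor - or None if every sampled base passes."""
    for a in random.sample(range(2, n - 1), k):
        if pow(a, n - 1, n) != 1:
            return a
    return None   # "probably prime" (Carmichael numbers can still fool this)

print(fermat_witness(101))   # None: 101 is prime, so no witness exists
w = fermat_witness(91)       # 91 = 7*13; roughly half the bases are witnesses
```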
-- Jerry
|
51.11 | | SPRITE::OSMAN | | Mon Mar 04 1985 16:35 | 14 |
| Although technically you've demonstrated a way of detecting that a number
is composite without really finding a factor, I don't feel you've addressed
the SPIRIT of my question. You claim that sufficient evidence of n being
composite is to find a 0<a<n such that
a^(n-1) does not equal 1 (mod n).
It seems to me that the effort of "trying" an a to see if it demonstrates
compositeness of n is as hard as (if not harder than) trying an a to see if it
divides evenly into n. Hence we haven't gotten anywhere.
What I'm really wanting is to know if there are ways of proving that n
is composite that are less expensive than dividing numbers into n looking
for one that "guzinta" n.
|
51.12 | | EIFFEL::BRETT | | Mon Mar 04 1985 17:09 | 4 |
| There is always Wilson's Theorem, which states (P-1)!+1 is divisible by P if
and only if P is a prime!
/Bevin
|
51.13 | | TURTLE::GILBERT | | Mon Mar 04 1985 18:02 | 5 |
| re .11
The description in note .10 mentions probabilities. The probability
of a random "a" illustrating that a composite "n" is, indeed, composite
is MUCH greater than the probability that a random "b" divides "n".
This makes the cost of computing a power mod n more than worth it.
|
51.14 | | HARE::STAN | | Mon Mar 04 1985 18:48 | 4 |
| Brett's argument is more cogent. If you have a number n and
have computed (n-1)! (mod n) and find that the result is not
congruent to -1 modulo n, then you can be sure that n is composite,
but you have no idea what the factors might be.
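A sketch of that computation, reducing mod n at every step so the intermediate products stay small (it proves compositeness without exhibiting a factor, at the cost of about n multiplications):

```python
def wilson_is_prime(n):
    """Wilson's theorem: n > 1 is prime iff (n-1)! == -1 (mod n).
    Intermediate products are reduced mod n so they stay small,
    but this still costs about n multiplications."""
    fact = 1
    for k in range(2, n):
        fact = (fact * k) % n
    return fact == n - 1   # i.e. fact == -1 (mod n)

print(wilson_is_prime(13))   # True
print(wilson_is_prime(91))   # False: composite, yet no factor is exhibited
```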
|
51.15 | | EIFFEL::BRETT | | Mon Mar 04 1985 20:59 | 4 |
| Bevin, please. Calling me Brett makes me think I'm in an English public
school!
/Bevin
|
51.16 | | TAV02::NITSAN | | Wed Mar 06 1985 01:01 | 2 |
| But to compute (n-1)! you need MANY MANY MANY more operations than to test
powers of numbers of order n ?!?
|
51.17 | | ZFC::DERAMO | Daniel V. D'Eramo | Wed Dec 30 1987 20:49 | 82 |
| Re .11:
>> What I'm really wanting is to know if there are ways of proving that n
>> is composite that are less expensive than dividing numbers into n looking
>> for one that "guzinta" n.
Re .16:
>> But to compute (n-1)! you need MANY MANY MANY more operations than to test
>> powers of numbers of order n ?!?
I'm surprised that nothing more has been added here, because
the objections appear valid. Let's look at how expensive it
is to test if a large number is composite. Suppose that the
number to be tested is
N = 288425819807475326653230130304209
To use Wilson's test, it is necessary to compute (N-1)!,
which has got to have more digits in it than your computer
can hold (I approximate about 9 * 10^33 digits). But wait
-- the computation need only be done modulo N -- as was
stated in .14:
>> If you have a number n and
>> have computed (n-1)! (mod n) and find that the result is not
>> congruent to -1 modulo n, then you can be sure that n is composite,
>> but you have no ideas what the factors might be.
Don't wait until the end to reduce (N-1)! mod N; instead
reduce the intermediate products mod N as you go along.
That way you can bound the number of bits in the numbers you
are manipulating to around twice the number of bits in N.
So, compute (N-1)! mod N by multiplying 2 * 3 * ... * (N-1)
where the multiplications are done modulo N. That is still
more multiplications than you have time to wait for!
Compare this to the results below and you see that the
objection in .16 is correct.
Now try to factor N. Each trial division is of the number N
by a smaller number, which is probably less work than the
modulo N multiplications for Wilson's test. But if you try
the odd numbers less than the square root of N, that is
still 8491551923634176 trials. This appears better than the
Wilson's theorem test, but still too long to wait for. The
objection in .11 obviously hasn't been answered yet.
Well, then let's look at that other test. Select A so that
0 < A < N, and compute A^(N-1) mod N. By reducing the
intermediate products modulo N (as before) you keep down the
sizes of the numbers. But you have to do about N
multiplications (as before) where N = 288425819807475326653230130304209.
Or do you? This point may be what prompted the questioning
in .8 and .11 way back when.
Actually, the number of modulo N multiplications needed to
compute A^(N-1) will be roughly from one to two times the
number of bits in N [How?]. For the 108 bit number N given here
that is not much at all. You will do approx. 100-200
operations of multiplying two 100+ bit numbers into a 200+
bit product, and taking the remainder modulo a 100+ bit
number. That's not that overwhelming. And usually such A
are not that hard to come by, so you don't have to repeat
the test that often.
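The bit-count bound comes from binary (square-and-multiply) exponentiation; here is a sketch with a multiplication counter (Python's built-in pow(a, e, n) does this for you):

```python
def powmod_counted(a, e, n):
    """Left-to-right binary exponentiation: one squaring per bit of e,
    plus one multiplication per 1-bit.  Returns (a^e mod n, count)."""
    result, mults = 1, 0
    for bit in bin(e)[2:]:           # bits of e, most significant first
        result = (result * result) % n
        mults += 1
        if bit == '1':
            result = (result * a) % n
            mults += 1
    return result, mults

N = 288425819807475326653230130304209
r, m = powmod_counted(3, N - 1, N)
assert r == pow(3, N - 1, N)   # agrees with the built-in
print(m)                       # between 108 and 216, since N has 108 bits
```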
Try it out; I generated the number N, so I think I know
whether it is prime (and what its factors are if it isn't).
Dan
P.S. Somewhere in Knuth Volume 2 (Seminumerical Algorithms)
he gives a better test than "guessing A" and a proof (in the
exercises) that each random test can fail (i.e., miss a composite N) at most
1/4 of the time.  When the test succeeds, you are left with an A such
that either the congruence A^(N-1) = 1 (mod N) fails, or an A such that
GCD(N,A+1) and/or GCD(N,A-1) give not-necessarily-prime
factors of N.  When a test fails to find such an A, the probability that
a composite N slipped through is less than 1/4.  If 50 random tests fail,
that probability is less than (1/4)^50, etc.
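That stronger test is the strong pseudoprime (Miller-Rabin style) test; here is a sketch of one round, including the GCD factor extraction when a nontrivial square root of 1 turns up (this is my reading of the idea, not a transcription of Knuth):

```python
import math

def strong_test(n, a):
    """One round of the strong pseudoprime test, base a, for odd n > 2.
    Returns ('composite', factor-or-None) or ('probably prime', None)."""
    d, s = n - 1, 0
    while d % 2 == 0:                # write n - 1 = d * 2^s, with d odd
        d //= 2
        s += 1
    x = pow(a, d, n)
    if x == 1 or x == n - 1:
        return ('probably prime', None)
    for _ in range(s - 1):
        y = (x * x) % n
        if y == 1:                   # x is a nontrivial square root of 1 mod n,
            return ('composite', math.gcd(x - 1, n))   # so this GCD is a proper factor
        if y == n - 1:
            return ('probably prime', None)
        x = y
    return ('composite', None)       # compositeness proved, no factor revealed

print(strong_test(561, 2))   # ('composite', 33) - even the Carmichael number 561 falls
```

Note that 33 really does divide 561 = 3*11*17, so here the test hands over a (non-prime) factor as a bonus.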
|