Small n factoring



vjs
02-25-2005, 02:20 PM
I noticed a few people are now doing small n factoring...

I was wondering if anyone was going to keep track or how we should keep track.

If anyone does decide to do this type of factoring, I suggest you use very, very large bounds, or your work will certainly be repeated.

The hardest pair to crack right now is k=24737 n=991

It's been ECM'ed with 1800+ curves, P-1'ed to very deep bounds (B1=B2=24G??), and had very limited P+1.

Joe O
02-25-2005, 04:32 PM
Actually ECM for 40 digits has been completed


[Sun Feb 20 18:08:42 2005]
24737*2^991+1 completed 5100 ECM curves, B1=3000000, B2=640000000

and ECM for 45 digits has been started.

Mystwalker posted in the Mersenne forum (http://www.mersenneforum.org/showthread.php?t=3449):


I have done 100 curves with B1=11M (using Prime95) - stage1 only, so far.

Before putting gmp-ecm on stage2, I will wait until version 6 comes out, which is said to be faster especially in stage2.


and I've done a dozen or so curves myself.

Mystwalker
02-27-2005, 03:49 PM
Currently, I run ECM curves for 25 digits (and have started 30 digits on some where no factor was found) on all numbers with n < 2000.
So far, I have found 7 or 8 factors; 13 numbers[1] are left.

Current status:


digits: 25 30

21181,1148 ok ok
21181,1172 ok ok
10223,1181 ok
21181,1268 ok
27653,1257 ok
33661,1320 ok
10223,1517 ok
10223,1529 ok
24737,1543 ok
24737,1567 ok
55459,1666 ok ok
55459,1894 ok ok
55459,1966 ok ok


If somebody wants to do work here, I'd propose the 30 digit range (B1=250k, 700 curves with B2=100*B1) for those numbers that are already tested up to 25 digits. As the 25 digit range takes roughly 30 minutes on my Duron, there should be enough 30 digit work for everyone. :)
Please state when you do some work.

Note: "res" = reserved

I'd advise using Prime95/mprime ver. 24.6, which can quickly do ECM curves on these kinds of numbers. The syntax is:
ECM2=21181,2,1148,1,250000,25000000,700 for 700 curves with B1=250000 and B2=25000000 on 21181*2^1148+1.
On Pentium-M machines, turning off SSE2 can speed up computation.
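
For anyone scripting this: here is a minimal Python sketch (nothing official - the field layout is simply copied from the example line above, and B2=100*B1 follows the suggestion in this post) that prints ECM2 entries for a list of k,n pairs:

# Hedged sketch: build Prime95 ECM2 worktodo lines for k*2^n+1 candidates.
# Field layout taken from the example above; the trailing ',1' is the +1.
def ecm2_line(k, n, b1, curves, b2=None):
    if b2 is None:
        b2 = 100 * b1          # B2 = 100*B1, as proposed in this thread
    return "ECM2=%d,2,%d,1,%d,%d,%d" % (k, n, b1, b2, curves)

for k, n in [(21181, 1148), (21181, 1172), (10223, 1181)]:   # pairs from the table above
    print(ecm2_line(k, n, b1=250000, curves=700))            # 30 digit level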

I was inspired when I saw the list on MikeH's stats pages - and just noticed that it was only recently added, like the new features from today/yesterday.

For 24737*2^991+1, I already have ~400 curves at the 45 digit level, stage1 only. Stage2 with B2=100*B1 takes approx. 1 minute, but with the improved performance of gmp-ecm6 and the feature to see how many curves are needed, I'm going to find a more optimal B2 value...

[1]I just noticed that some new numbers have been added as unfactored. :confused:

MikeH
02-27-2005, 05:36 PM
[1]I just noticed that some new numbers have been added as unfactored. Which ones? I've probably messed something up...:blush:

As a separate exercise I'll look at doing a 'low n' sob.dat so anyone that wants to join the party can do so relatively easily.

BTW, anyone that finds any factors that are too big to submit in the usual way, just send them to me (and post them here if you like), and let me know the SoB user id you'd like them credited to. Thx.

Mystwalker
02-27-2005, 06:03 PM
Assuming I didn't overlook them, the following have appeared after I initially made up the worktodo file:

21181*2^1172+1
21181*2^1268+1
19249*2^1346+1
24737*2^1543+1
24737*2^1567+1
21181*2^1772+1

For these numbers, it would be no big deal, as my testing would very quickly rediscover any factors that have already been found. After all, sieving is at ~15 digits, whereas I test for 25 digits...
It could be a problem when other (bigger) numbers are affected as well, though.

edit:
605094683116556462184679 | 19249*2^1346+1

It took almost 200 curves, so I guess there is no smaller factor. This one seems to be legitimate...

21181*2^1172+1 and 21181*2^1268+1 most likely don't have a factor < 25 digits.

btw.:
One entry seems to be false:


10223 509 Mon 21-Feb-2005 Mystwalker

I'm pretty sure I didn't test this lately. Maybe it's a lot older?

MikeH
02-27-2005, 06:18 PM
21181*2^1172+1
21181*2^1268+1
19249*2^1346+1
24737*2^1543+1
24737*2^1567+1
21181*2^1772+1

Just been back through my backups, and these have been there on every page (or at least the 03:00 versions, since those are the only ones I keep) since I first started showing this data (22 Feb).


One entry seems to be false:

I agree, it's much older. This is the date when I finally incorporated it into my database. Better late than never.:)

Mystwalker
02-27-2005, 06:57 PM
Originally posted by MikeH
Just been back through my backups, and these have been there on every page (or at least the 03:00 versions, since that's the only ones I keep) since I first started showing this data (22 Feb).

Hm, ok. Must have been my mistake. Never mind...


I agree, it's much older. This is the date when I finally incorporated it into my database. Better late than never.:)

In addition, I can't remember ever finding a 38 digit factor. It doesn't look like a P-1 factor to me...


As a separate exercise I'll look at doing a 'low n' sob.dat so anyone that wants to join the party can do so relatively easily.

I think you can start with n=2016, as it is highly unlikely that sieving will ever get to 25 digits...

All in all, maybe it's even completely unwise to sieve these low n's - depending on the number of candidates. It should take only a few seconds per number to test them up to 15 digits with ECM. Maybe a minute or two for 20 digits. So, the top 100 for each k can be tested to 20 digits in 33 hours.

These figures are for a 850 MHz system...

edit:
I forgot to consider that ECM curves take longer for higher n's. It would thus take more time - I don't know how much longer, though...

Joe O
02-27-2005, 09:11 PM
Originally posted by Mystwalker


My factorer gave me this:





Using B1=3000000, B2=4016636514, polynomial Dickson(12), sigma=1948938696
Step 1 took 150878ms
Step 2 took 105525ms
********** Factor found in step 2: 21133297510706620304719126766866230127
Found probable prime factor of 38 digits: 21133297510706620304719126766866230127
Probable prime cofactor 81073493380742712605879137929073445260114635803005
03681635553856576597751904480764074787326160157437
20104914903574795151 has 120 digits




I tried to factor 10223*2^509+1 - can someone verify this result?
The sieve submission pages don't work for n<1000...


I almost wiped out this factor, as I was just about to power down the computer. I thought the program did "I don't know what" because it had been terminated too early. Well, now I know why...

Mystwalker,
Check the 11th post and following in this thread. (http://www.free-dc.org/forum/showthread.php?s=&threadid=6913&highlight=509)

So it's not a false entry, you really did factor this k n pair. I captured the information and sent it to Mike.

Joe O
02-27-2005, 09:30 PM
I did this back in September:

sbfactor12.55.exe 24737 991 sbECM25.cfg
sbfactor12.55.exe 21181 1148 sbECM25.cfg
sbfactor12.55.exe 19249 1166 sbECM25.cfg
;sbfactor12.55.exe 21181 1172 sbECM25.cfg
sbfactor12.55.exe 10223 1181 sbECM25.cfg
sbfactor12.55.exe 27653 1257 sbECM25.cfg
;sbfactor12.55.exe 21181 1268 sbECM25.cfg
sbfactor12.55.exe 55459 1306 sbECM25.cfg
sbfactor12.55.exe 33661 1320 sbECM25.cfg
;sbfactor12.55.exe 19249 1346 sbECM25.cfg
;sbfactor12.55.exe 24737 1471 sbECM25.cfg
Pause
sbfactor12.55.exe 55459 1498 sbECM25.cfg
sbfactor12.55.exe 10223 1517 sbECM25.cfg
sbfactor12.55.exe 10223 1529 sbECM25.cfg
sbfactor12.55.exe 24737 1543 sbECM25.cfg
sbfactor12.55.exe 24737 1567 sbECM25.cfg
sbfactor12.55.exe 33661 1608 sbECM25.cfg
sbfactor12.55.exe 55459 1666 sbECM25.cfg
sbfactor12.55.exe 21181 1772 sbECM25.cfg
sbfactor12.55.exe 55459 1894 sbECM25.cfg
sbfactor12.55.exe 55459 1966 sbECM25.cfg
sbfactor12.55.exe 33661 2016 sbECM25.cfg
Pause
sbfactor12.55.exe 67607 2051 sbECM25.cfg
sbfactor12.55.exe 24737 2071 sbECM25.cfg
sbfactor12.55.exe 55459 2134 sbECM25.cfg
sbfactor12.55.exe 19249 2138 sbECM25.cfg
sbfactor12.55.exe 24737 2191 sbECM25.cfg
sbfactor12.55.exe 4847 2247 sbECM25.cfg
sbfactor12.55.exe 55459 2290 sbECM25.cfg
sbfactor12.55.exe 67607 2411 sbECM25.cfg
sbfactor12.55.exe 28433 2473 sbECM25.cfg
sbfactor12.55.exe 10223 2537 sbECM25.cfg
sbfactor12.55.exe 22699 2638 sbECM25.cfg
Pause
sbfactor12.55.exe 55459 2674 sbECM25.cfg
sbfactor12.55.exe 55459 2686 sbECM25.cfg
sbfactor12.55.exe 4847 2751 sbECM25.cfg
sbfactor12.55.exe 10223 2789 sbECM25.cfg
sbfactor12.55.exe 33661 2808 sbECM25.cfg
sbfactor12.55.exe 4847 2847 sbECM25.cfg
sbfactor12.55.exe 24737 2863 sbECM25.cfg
sbfactor12.55.exe 27653 2877 sbECM25.cfg
sbfactor12.55.exe 27653 2913 sbECM25.cfg
sbfactor12.55.exe 27653 2985 sbECM25.cfg
sbfactor12.55.exe 22699 2998 sbECM25.cfg

And sbECM25.cfg has

# or using manual bounds
bounds = manual
B1 = 50000
B2 = 15000000

# only relevant for ECM
curves = 400

So all n less than 3000 have had enough ECM done on them to find 25 digit factors.

You'll notice that there are 4 numbers with semicolons on the line. I skipped these because I had reason to believe that they were already factored. I'll have to check my notes to see why I thought that. One of them Mystwalker just factored.

EDIT:
Turns out I did find in sbfactor.log that I had also run ECM on those 4 k n pairs, yet did not find a factor. ECM is a probabilistic algorithm, so success is not guaranteed. So more work on these k n pairs may be justified. But use Prime95 as Mystwalker suggests.
Oh yes, I had also run
bounds = manual
B1 = 2000
B2 = 600000

# only relevant for ECM
curves = 40

and

bounds = manual
B1 = 11000
B2 = 2000000

# only relevant for ECM
curves = 120

on all these k n pairs.

Mystwalker
02-27-2005, 09:53 PM
Originally posted by Joe O
Mystwalker,
Check the 11th post and following in this thread. (http://www.free-dc.org/forum/showthread.php?s=&threadid=6913&highlight=509)

So it's not a false entry, you really did factor this k n pair. I captured the information and sent it to Mike.

Ah, ok. I just couldn't remember. Thanks!


So all n less than 3000 have had enough ECM done on them to find 25 digit factors.

You'll notice that there are 4 numbers with semicolons on the line. I skipped these because I had reason to believe that they were already factored. I'll have to check my notes to see why I thought that. One of them Mystwalker just factored.

I found some others with the 25 digit bounds:

2816749407631533354401 | 19249*2^1166+1
17719414861774257915917 | 55459*2^1306+1
26783435724241223 | 24737*2^1471+1 (also excluded)
4192341180658301266763 | 33661*2^1608+1
830578932421150082798981 | 55459*2^1498+1
724805244916016081371 | 21181*2^1772+1

Of course, this is not unusual, as the chance that a factor of n digits is left after running the appropriate curves is e^(-1), AFAIK. So, there is a chance of roughly a third that a factor has not been found so far. The levels only tell when it's optimal to switch to higher bounds...

Joe O
02-27-2005, 10:40 PM
30% eh
I didn't remember that it was that high. Yes, the bounds are only the optimal point to switch to the next level. 7 out of 44, unless you found some more. Don't forget to send them to MikeH.

Oh, and the low sob.dat is not for sieving. It makes factoring easier. The select script, or now the program, will create worktodo.ini entries from the sob.dat file and results.txt file. But be careful: Prime95 uses results.txt as an output file unless you tell it to use a different name. Because MikeH's dat files are so up to date, we probably need a script and/or program that does not need a results.txt file. This would minimize the possibility of downloading over Prime95's results file.

You know we could use a script/program to create ECM2 lines as well as PFACTOR lines.

Joe O
02-27-2005, 10:52 PM
MikeH,

Instead of a "low" sob.dat file, how about 2 files?

The first would have PFACTOR lines such as:

PFACTOR=27653,2,2985,1,40.2,3.9

that people could edit as they wished before using.

The second would have ECM2 lines such as:

ECM2=24737,2,991,1,11000000,2500000000,530

that also could be edited as needed before using.

If bandwidth/filespace is a problem, one file with
=27653,2,2985,1,$
=24737,2,991,1,$
would suffice. People could do global edits to put PFACTOR or ECM2 in front of the = sign and replace the $ with the other needed operands.

MikeH
02-28-2005, 08:49 AM
Instead of a "low" sob.dat file, how about 2 files?

Or....how about if I update the makewtd tool so that it can generate a suitable ECM format worktodo file given a desired factor length? - i.e. it will determine suitable B1, B2 and loops.
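
Just to illustrate the kind of lookup such a tool might do (only a rough sketch, not the real makewtd): the B1 values and curve counts below are the Prime95-style figures quoted in this thread, with B2 taken as 100*B1.

# Hypothetical sketch of a makewtd-style lookup; numbers are the ones quoted
# in this thread, not an authoritative table.
LEVELS = {
    25: (50000, 400),        # Joe O's sbECM25.cfg
    30: (250000, 700),       # Mystwalker's 30 digit proposal
    40: (3000000, 5100),     # curves reported completed for 24737*2^991+1
    50: (43000000, 9743),    # Prime95 counts quoted later in the thread
    55: (110000000, 58080),
}

def worktodo_lines(pairs, digits):
    b1, curves = LEVELS[digits]
    b2 = min(100 * b1, 4290000000)   # Prime95's B2 ceiling (~2^32), see later in the thread
    return ["ECM2=%d,2,%d,1,%d,%d,%d" % (k, n, b1, b2, curves) for k, n in pairs]

print("\n".join(worktodo_lines([(24737, 991)], 50)))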

Joe O
02-28-2005, 05:29 PM
Originally posted by MikeH
Or....how about if I update the makewtd tool so that it can generate a suitable ECM format worktodo file given a desired factor length? - i.e. it will determine suitable B1, B2 and loops.

Even better! Thanks.

MikeH
03-05-2005, 11:50 AM
A low N sob.dat especially for this low N factoring effort is available here (http://www.aooq73.dsl.pipex.com/sobdat/SobDat_LowN.zip). This is all the unfactored candidates below the double check PRP line. It'll be updated daily.

An updated version of the windows utility to create worktodo.ini files (for Prime95) is here (http://www.aooq73.dsl.pipex.com/utils/makewtd.zip), again with source.

Updates include:

1) Can now produce P-1 or ECM work files
2) Can pick one or more specific k

The attached .bat file should make it clear enough how to use, but if not, you know where to find me.

Enjoy. :|party|:


BTW (just in case you lost the links): Latest versions of Prime95 can be found here (http://www.mersenne.org/gimps/), that's p95 for windows, mprime for linux.

MikeH
03-07-2005, 01:33 PM
P-1 found a factor in stage #2, B1=4625000, B2=166500000.
24737*2^6631+1 has a factor: 668139145160251763062608380347404413431
39 digits, too big to submit. I've sent it to myself :D

Glad I did some serious P-1 in that range before moving on to ECM :)

vjs
03-07-2005, 02:20 PM
Is anyone interested in factoring the larger numbers for the 13.6M+ account???

Since the numbers are larger it should be easier to find smooth factors...

smh
03-07-2005, 04:21 PM
Originally posted by MikeH

P-1 found a factor in stage #2, B1=4625000, B2=166500000.
24737*2^6631+1 has a factor: 668139145160251763062608380347404413431
39 digits, too big to submit. I've sent it to myself :D

Glad I did some serious P-1 in that range before moving on to ECM :)

You could have factored this composite into its prime factors, which would have made it easier to submit.

668139145160251763062608380347404413431
PRIME FACTOR 1489391276707081441
PRIME FACTOR 448598803826386770391

MikeH
03-08-2005, 01:18 PM
P-1 found a factor in stage #2, B1=4060000, B2=139055000.
4847*2^8367+1 has a factor: 1110687092604384736290030992727448303

37 digits, and it seems to be a prime factor this time.:rolleyes:

Thanks smh. It seems the largest three factors for low n I've submitted recently have been composite. I've been thinking about tidying up the stats pages so that somewhere we display the largest prime factors found in this project. But it's good to be aware that we do have composite factors, so I need to sort that aspect first.

Jwb52z
03-08-2005, 05:45 PM
MikeH, I sympathize with your job around here because it seems like it gets increasingly tedious all the time. :) Of course, it may really not be; it may be easier than it sounds. I don't know.

vjs
03-09-2005, 11:52 AM
Mike, I feel your pain about those large factors that are actually composite. It might be a good idea if everyone who finds a factor through P-1 tries factoring the factor.

This is a pretty good applet for doing so...

http://www.alpertron.com.ar/ECM.HTM

BTW nice speed on those small factors and I like the idea of the top 100 and top 10.

Is there any chance of expanding your stats further to 50M or 100M when the time comes? Would this be difficult to do etc...

MikeH
03-09-2005, 06:09 PM
Is there any chance of expanding your stats further to 50M or 100M when the time comes? Would this be difficult to do etc...

Yep, that bit should be easy enough. I already use a n<50M p=1G sob.dat file, and just disable the bits above 20M that would otherwise make it a bit of a mess right now.

Mystwalker
03-09-2005, 06:51 PM
I'm doing some more ECMing on the 1k<n<2k numbers:

Current status:


digits: 25 30

21181,1148 ok ok
21181,1172 ok ok
10223,1181 ok ok
21181,1268 ok ok
27653,1257 ok ok
33661,1320 ok res
10223,1517 ok res
10223,1529 ok res
24737,1543 ok res
24737,1567 ok res
55459,1666 ok ok
55459,1894 ok ok
55459,1966 ok ok


Note: "res" = reserved

Using gmp-ecm6 for stage2, I can lower the time for the 30 digit level from 337 to 280 minutes, a speed increase of 20%. :)

vjs
03-09-2005, 07:01 PM
Any chance of a secret stats page for the 991<n<50M dat, like the 2005 stats?

If it would be easy to do a pull from the n<50M dat it would save me time from doing it manually using excel and Joe_O's outs.

vjs
03-10-2005, 06:21 PM
I posted stats on the whole 991<n<100M range I'm talking about in the sieve section. It also shows your low-n factoring effort, so I thought the factorers would also like to know about them.

Mystwalker
03-14-2005, 02:37 PM
Almost through with 1000 < n < 2000 for 30 digits:


25 30 hex (hex digits)

21181,1148 ok ok N=0x52BD000 (292)
21181,1172 ok ok N=0x52BD000 (298)
10223,1181 ok ok N=0x4FDE000 (302)
27653,1257 ok ok N=0xD80A000 (319)
21181,1268 ok ok N=0x52BD000 (322)
33661,1320 ok ok N=0x837D000 (335)
10223,1517 ok ok N=0x4FDE000 (384)
10223,1529 ok ok N=0x4FDE000 (387)
24737,1543 ok ok N=0x3050800 (391)
24737,1567 ok stage1 complete N=0x3050800 (397)
55459,1666 ok ok N=0x3628C00 (422)
55459,1894 ok ok N=0x3628C00 (479)
55459,1966 ok ok N=0x3628C00 (497)

No new factors at that level. :(

I found out something strange:
As you can see at the end of each line, the numbers in hex always start the same for the same k; they only differ in the (hex) digit count.
That means that the n's of these numbers differ by a multiple of 4.
Is that

a) coincidence,
b) natural (due to some obvious implication) or
c) a phenomenon unknown so far?

vjs
03-14-2005, 03:01 PM
Yes, they also differ by a multiple of 2... ?
And they exactly differ by

55459,1666
55459,1894 2^(1894-1666)
55459,1966 2^(1966-1894)

Perhaps I'm missing your point?

Mystwalker
03-14-2005, 03:17 PM
Well, taking those k's, how big is the chance that every still-unfactored number of a k differs by a multiple of 16 (of 2^4) from the other numbers of that k?

I've just found out that the n's even all differ by a multiple of 12!
I don't think that this is just coincidence.

I *guess* that only numbers of this special form "survive" sieving by the very small primes (< 100). But I don't know, hence I'm asking...

Joe O
03-14-2005, 03:25 PM
b) natural (due to some obvious implication)

Remember we are dealing with k*2^n+1

The n's for a given k are all even or all odd so they differ by an even number.

n' = n + 2*j for some positive integer j

k*2^n' + 1 = k*2^(n+2j) + 1 = k*(2^n)*(2^(2j)) + 1 = k*(2^n)*4^j + 1 = (4^j)*(k*2^n) + 1

Joe O
03-14-2005, 03:28 PM
Originally posted by Mystwalker

I've just found out that the n's even all differ by a multiple of 12!
I don't think that this is just coincidence.

I *guess* that only numbers of this special form "survive" sieving by very small numbers

Yes, the others would be divisible by 3, 5 or 7.
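
To see this concretely, here is a small Python sketch (my own illustration, nothing from the project tools) that lists which residues of n mod 12 leave k*2^n+1 free of the factors 3, 5 and 7. The period 12 comes from the orders of 2: ord_3(2)=2, ord_5(2)=4, ord_7(2)=3, and lcm(2,4,3)=12.

# Which n (mod 12) give k*2^n+1 not divisible by 3, 5 or 7?
def surviving_residues(k, small_primes=(3, 5, 7), modulus=12):
    return [r for r in range(modulus)
            if all((k * pow(2, r, p) + 1) % p != 0 for p in small_primes)]

for k in (10223, 21181, 24737, 27653, 33661, 55459):
    print(k, surviving_residues(k))
# e.g. for k=55459 only n mod 12 in {6, 10} survive, and the three unfactored
# n's listed above (1666, 1894, 1966) are all 10 mod 12.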

Joe O
03-14-2005, 03:51 PM
Check out this thread! (http://www.free-dc.org/forum/showthread.php?s=&threadid=2994&highlight=cells) which contains the following:

Originally posted by wblipp
This is easier to understand in table form. Here is a table of all the possible combinations of k and n, and which combinations are divisible by 3 and/or 5. All the remaining k values come from the four rows. Each of these rows has only one cell that is not divisible by 3 or 5, so all remaining primes must come from these four cells (the cells marked with j=). Note that for these four cells, the k values are all 2^j and the n values are all 4-j.

vjs
03-14-2005, 05:23 PM
I think the major point here is that eventually we will find a few of these very small k/n's that will be a product of two large primes. But the chances of them being close together???

It may be too soon to tell on some of these low n... we will eliminate quite a few with trial factoring and the 991<n<50M sieve, but what will be left???

I think it's starting to get to a point now where these small n are quite interesting.

Mystwalker
03-17-2005, 06:07 PM
Thanks for that insight, Joe! :cheers:

btw.:
I reached the 30 digit level of the last composite. No new factor...
If somebody else wants them, they are all free. I'm currently concentrating on 24737*2^991+1.

MikeH
03-29-2005, 12:19 PM
It's obvious from the stats pages that I've been doing some work in this area, but never got round to posting what. Here's my progress to date


Complete (P-1)
[Mon 21-Feb-2005 20:38:28] P-1: Range 900-2000 ( 1100), 49, 1000000.0, 23 candidates in worktodo.ini
[Sun 06-Mar-2005 15:12:57] P-1: Range 2000-5000 ( 3000), 49, 200000.0, 66 candidates in worktodo.ini
[Sun 06-Mar-2005 23:02:51] P-1: Range 5000-8000 ( 3000), 49, 100000.0, 84 candidates in worktodo.ini
[Sun 06-Mar-2005 23:04:11] P-1: Range 8000-11500 ( 3500), 49, 70000.0, 92 candidates in worktodo.ini
[Mon 21-Feb-2005 16:19:55] P-1: Range 11500-12500 ( 1000), 49, 50000.0, 27 candidates in worktodo.ini
[Mon 21-Feb-2005 12:48:55] P-1: Range 12500-13500 ( 1000), 49, 30000.0, 26 candidates in worktodo.ini
[Mon 21-Feb-2005 09:38:40] P-1: Range 13500-14500 ( 1000), 49, 20000.0, 26 candidates in worktodo.ini
[Mon 21-Feb-2005 08:22:34] P-1: Range 14500-15000 ( 500), 49, 20000.0, 14 candidates in worktodo.ini
[Sat 19-Feb-2005 18:54:28] P-1: Range 15000-20000 ( 5000), 49, 10000.0, 146 candidates in worktodo.ini
[Sat 19-Feb-2005 18:40:46] P-1: Range 20000-25000 ( 5000), 49, 10000.0, 146 candidates in worktodo.ini
[Sun 20-Feb-2005 20:49:19] P-1: Range 25000-25500 ( 500), 49, 10000.0, 18 candidates in worktodo.ini
[Sun 20-Feb-2005 23:29:14] P-1: Range 25500-27000 ( 1500), 49, 10000.0, 45 candidates in worktodo.ini
[Sun 20-Feb-2005 23:43:29] P-1: Range 27000-27200 ( 200), 49, 10000.0, 3 candidates in worktodo.ini
[Sun 20-Feb-2005 23:44:59] P-1: Range 27200-27400 ( 200), 49, 10000.0, 5 candidates in worktodo.ini
[Mon 21-Feb-2005 17:35:27] P-1: Range 27400-40000 (12600), 49, 10000.0, 360 candidates in worktodo.ini
[Sun 27-Feb-2005 13:03:26] P-1: Range 40000-41000 ( 1000), 49, 2000.0, 38 candidates in worktodo.ini
[Sun 27-Feb-2005 16:46:31] P-1: Range 41000-45000 ( 4000), 49, 2000.0, 104 candidates in worktodo.ini
[Tue 01-Mar-2005 22:48:28] P-1: Range 45000-50000 ( 5000), 49, 1000.0, 131 candidates in worktodo.ini
[Tue 01-Mar-2005 23:00:22] P-1: Range 50000-60000 (10000), 49, 1000.0, 248 candidates in worktodo.ini
[Wed 02-Mar-2005 20:09:23] P-1: Range 60000-70000 (10000), 49, 500.0, 307 candidates in worktodo.ini
[Wed 02-Mar-2005 23:31:35] P-1: Range 70000-80000 (10000), 49, 500.0, 263 candidates in worktodo.ini
[Fri 04-Mar-2005 19:30:59] P-1: Range 80000-90000 (10000), 49, 500.0, 293 candidates in worktodo.ini
[Sat 05-Mar-2005 13:02:15] P-1: Range 90000-100000 (10000), 49, 500.0, 286 candidates in worktodo.ini

Complete (ECM)
[Sun 20-Mar-2005 21:08:45] ECM: Range 2000-3000 ( 1000), 30, 18 candidates in worktodo.ini
[Sat 05-Mar-2005 16:29:27] ECM: Range 3000-4000 ( 1000), 25, 22 candidates in worktodo.ini
[Sun 06-Mar-2005 23:17:50] ECM: Range 4000-5000 ( 1000), 25, 29 candidates in worktodo.ini
[Wed 09-Mar-2005 22:53:46] ECM: Range 5000-6000 ( 1000), 25, 23 candidates in worktodo.ini
[Thu 10-Mar-2005 20:30:51] ECM: Range 6000-7000 ( 1000), 22, 31 candidates in worktodo.ini
[Fri 11-Mar-2005 15:08:36] ECM: Range 7000-8000 ( 1000), 21, 24 candidates in worktodo.ini
[Sat 12-Mar-2005 00:17:05] ECM: Range 8000-9000 ( 1000), 21, 22 candidates in worktodo.ini
[Sat 12-Mar-2005 08:51:51] ECM: Range 9000-10000 ( 1000), 20, 22 candidates in worktodo.ini
[Sat 12-Mar-2005 14:11:21] ECM: Range 10000-11000 ( 1000), 20, 22 candidates in worktodo.ini
[Sat 12-Mar-2005 18:26:17] ECM: Range 11000-14000 ( 3000), 20, 76 candidates in worktodo.ini
[Sun 13-Mar-2005 23:04:49] ECM: Range 14000-18000 ( 4000), 20, 109 candidates in worktodo.ini
[Tue 15-Mar-2005 18:03:26] ECM: Range 18000-20000 ( 2000), 20, 45 candidates in worktodo.ini
[Wed 16-Mar-2005 22:21:41] ECM: Range 20000-25000 ( 5000), 20, 126 candidates in worktodo.ini

In progress (ECM)
[Mon 28-Mar-2005 12:20:04] ECM: Range 3000-4000 ( 1000), 30, 15 candidates in worktodo.ini

So everything n<100K has been P-1ed to a 'value' of at least 500. Everything n<25K has been ECMed to at least 20 digits.

I'm currently ECMing 3K<n<4K to 30 digits. When that's done I'll probably extend the range beyond 25K before trying anything deeper on n>4K.

...and before anyone says it ....why?..... Well it's something different.:umm:

vjs
03-29-2005, 12:48 PM
Great work Mike,

I've been trying to track your progress thus far through the user info, but it's tough to tell what one is doing.. until they have submitted factors of course.

Joe, I will probably send you some more factors soon, p<25T. I basically found 21 factors less than 100k and 411 less than 20M. Not sure how many are unique or ones that you haven't found, but they are all in my fact.txt, so we will see.

Any chance of doing some 991<n<50M stats for us sievers? Should I send you what I'm doing for stats thus far?

MikeH
04-02-2005, 09:11 AM
Very quick update. Two factors found in 3000-4000 range.


Complete (ECM)
[Sun 20-Mar-2005 21:08:45] ECM: Range 2000-3000 ( 1000), 30, 18 candidates in worktodo.ini
[Mon 28-Mar-2005 12:20:04] ECM: Range 3000-4000 ( 1000), 30, 15 candidates in worktodo.ini
[Sun 06-Mar-2005 23:17:50] ECM: Range 4000-5000 ( 1000), 25, 29 candidates in worktodo.ini
[Wed 09-Mar-2005 22:53:46] ECM: Range 5000-6000 ( 1000), 25, 23 candidates in worktodo.ini
[Thu 10-Mar-2005 20:30:51] ECM: Range 6000-7000 ( 1000), 22, 31 candidates in worktodo.ini
[Fri 11-Mar-2005 15:08:36] ECM: Range 7000-8000 ( 1000), 21, 24 candidates in worktodo.ini
[Sat 12-Mar-2005 00:17:05] ECM: Range 8000-9000 ( 1000), 21, 22 candidates in worktodo.ini
[Sat 12-Mar-2005 08:51:51] ECM: Range 9000-10000 ( 1000), 20, 22 candidates in worktodo.ini
[Sat 12-Mar-2005 14:11:21] ECM: Range 10000-11000 ( 1000), 20, 22 candidates in worktodo.ini
[Sat 12-Mar-2005 18:26:17] ECM: Range 11000-14000 ( 3000), 20, 76 candidates in worktodo.ini
[Sun 13-Mar-2005 23:04:49] ECM: Range 14000-18000 ( 4000), 20, 109 candidates in worktodo.ini
[Tue 15-Mar-2005 18:03:26] ECM: Range 18000-20000 ( 2000), 20, 45 candidates in worktodo.ini
[Wed 16-Mar-2005 22:21:41] ECM: Range 20000-25000 ( 5000), 20, 126 candidates in worktodo.ini

In progress (ECM)
[Sat 02-Apr-2005 15:06:22] ECM: Range 4000-5000 ( 1000), 30, 18 candidates in worktodo.ini
[Sat 02-Apr-2005 15:07:09] ECM: Range 25000-30000 ( 5000), 20, 126 candidates in worktodo.ini

Keroberts1
04-27-2005, 02:59 AM
What is currently being done with 24737*2^991+1? Is anyone still trying to factor this? If so, is there any way I can help?

Mystwalker
04-27-2005, 06:37 AM
I'm currently finishing the last few ECM curves for the 45 digit level.

You can see the progress here (http://www.mersenneforum.org/showthread.php?t=3449).

If you want, you can start the 50 digit effort. I will take a rest, though...

Keroberts1
04-27-2005, 04:13 PM
What do I need to help? Just get me started and I'm on board.

Mystwalker
04-27-2005, 07:10 PM
Depends on your personal ease/performance preferences.

The easy way:
Get Prime95/mprime v24.6 (http://www.mersenne.org/gimps) (24.11 currently contains a bug possibly affecting ECM).

You then have to do as described here (http://rieselsieve.com/forum/viewtopic.php?t=367&start=21) (the top post should be enough).


The performance way:
Read the next ~20(?) postings. ;)

Keroberts1
04-28-2005, 01:02 AM
So is ECM2=24737,2,991,1,11000000,1100000000,100 the input I should use? Are you past that yet?

Mystwalker
04-28-2005, 05:57 AM
You should use
ECM2=24737,2,991,1,43000000,4290000000,100

This is for the 50 digit level. You can also try to replace 4290000000 by 0 - I *think* that works as well. But I don't know for sure...

Keroberts1
04-28-2005, 03:31 PM
I didn't think the bounds looked high enough, so I started it off using ECM2=24737,2,991,1,110000000,11000000000,100
Is that going to not work, or is it just going to take longer? I've got 14 curves done so far.

How will I know if something is found? Where is the output placed? Should I see anything before all of the curves are done?

vjs
04-28-2005, 04:32 PM
I've been told before that there are good reasons for starting at the specified bounds for each curve and increasing accordingly.

I think the best answer is because it does take longer.

Mystwalker
04-29-2005, 09:02 AM
1. Prime95 is not suited for B1=110M (a.k.a. "the 55 digit level"). This is because the max. overall bound for Prime95 is 4290M, but with B2=100*B1 (which is smaller than optimal), B2 would be 11000M...
Using max. bounds, a Prime95 curve only counts as ~1/3 of a standard curve, IIRC.
For B1 > 43M, you should definitely use gmp-ecm. Below that, I'm not sure which one is faster, as stage1 is faster with Prime95, whereas stage2 is faster with gmp-ecm.

2. Output will be in "results.txt" - either when a factor is found or when all curves have been done.

3. A factor can be found after the completion of each curve (after each stage of each curve, to be specific). ECM is similar to P-1 factoring. The big difference is that ECM tests (with different sigma values) search for different group orders.
As a massive (and mathematically wrong, but descriptive) oversimplification, just imagine ECM as "P-sigma factoring".

4. Searching for increasing digit counts of factors is AFAIK proven to be most efficient. So, the 50 digit level (B1=43M) should be done next.

Keroberts1
05-02-2005, 12:33 PM
24737*2^991+1 completed 100 ECM curves, B1=110000000, B2=2410065408

no factors

I don't know what that means for the 50 digit search

Keroberts1
05-02-2005, 03:21 PM
I found GMP-ECM and I was going to do a few hundred curves for the 60 digit search, but I can't find references for running the program. I managed to get it installed because of the very useful readme file, but it doesn't say where to place the data for the values to be tested and the bounds I would like. Anyone familiar with GMP-ECM - am I just overlooking something obvious?

Mystwalker
05-02-2005, 03:25 PM
You'll need 9743 of those curves for the 50 digit level and 58080 curves for the 55 digit level.

For the 55 digit level, you should definitely try gmp-ecm. Stage1 will take longer (you *can* take Prime95 here, it's only a bit more administrative work), stage2 maybe as well, but you need way less curves:

50 digit level: 3155 curves
55 digit level: 17899 curves

If you tell me what system you use (P4, AthlonXP, Athlon64, ...), I can send you an optimized Windows binary.

edit:
Ah, I see you've found it yourself. :)
I'll write an explanation now.

Mystwalker
05-02-2005, 03:52 PM
First, I'll do some preliminaries:

1. Make sure you've got gmp-ecm6. Version 5 is definitely slower, especially at these high bounds.

2. Make sure you've got a binary that's optimized for the system you're running it on. An Athlon64-optimized version on a A64 is maybe 50% faster than a P4-optimized one on that A64.

3. For the 60 digit level, I'd suggest that you have at least 1 GB RAM. You can do it with less, but it will take ages. 2 GB are better, more RAM is optimal, but not necessary.

4. I'd say that in order to familiarize oneself with the program, it's best to try smaller factorizations first - so less time is wasted when something goes wrong. OTOH, gmp-ecm is quite easy to handle, and it's hard to make errors that stay unnoticed.

Having said that, if you want to use gmp-ecm for both stages (1 and 2), I'd suggest putting the number to factor (5176935237493495062168201257538528278878412270929645477995322921311987099937020480881689120461693055176232309367482850198174690025814604003221689364557004783690027403238808630592990673934138182356151418045310754915407548459602408459200743370978677838237259516963515500027657280187323622210780402559222177) into a text file called SB.txt in a folder with ecm.exe (or whatever yours is called).
Then, simply call:


ecm -c 100 -n -one 11e7 < SB.txt >> SBresult.txt

This should do 100 curves with B1=11e7=110,000,000 on the number, the output will go into SBresult.txt
The ">>" means that it will append the file and not overwrite it.

"-c 100" are the 100 curves
"-n" lets it run at lower priority
"-one" tells the program to stop once a factor is found. This has the nice effect that as long as the program is running, you know no factor has been found so far.
"11e7" is the B1 limit for 55 digits

You can get all parameters with "ecm --help".


Some things to consider:

1. When you stop the program, the currently processed curve is lost. :(

2. You'll need more than 1 GB RAM to do such a curve without running out of memory. You can reduce memory needs with two parameters:

"-k <n>" where <n> is a number > 2. I don't remember the exact numbers, but IIRC, when you multiply the value by 4 (default value is 2), you use half the memory. Of course, curves take longer this way. You have to figure out what k value is still okay for your computer. With 1 GB RAM, I use "-k 4", but only in conjunction with the other parameter:

"-treefile <treefile name>" where <treefile name> is an arbitrary name. With this parameter, gmp-ecm puts some files into the directory to lower memory needs. Although the hard disk naturally is some orders of magnitude slower than system memory, the performance impact is really low, as these files are only seldomly needed. In fact, I think accesses only occur at the end of a curve (and they get created at the beginning).

3. You can optimize performance by experimenting with the B2 value. By default, it's 680270182898 for B1=11e7. Just put other values as a parameter behind the B1 value (e.g. "ecm [...] 11e7 560e9[...]"). With the parameter "-v", you'll get an output at the beginning of stage2, telling you how many curves are needed. All you have to do is find the B2 value where "needed curves * time for one curve" is minimal...
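
As a toy illustration of that last point (purely a sketch with made-up timing numbers; the real curves-needed figures come from gmp-ecm's -v output and the per-curve time from your own machine):

# Pick the B2 that minimizes (expected curves needed) * (seconds per curve).
measurements = [
    # (B2, curves_needed, secs_per_curve) - placeholder values for illustration
    (280e9, 20000, 900.0),
    (560e9, 18500, 1000.0),
    (680e9, 17899, 1100.0),   # 17899 is the gmp-ecm 55 digit count quoted later
]
best = min(measurements, key=lambda m: m[1] * m[2])
print("best B2 = %.0f, expected total effort = %.1f hours" % (best[0], best[1] * best[2] / 3600.0))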

Keroberts1
05-02-2005, 05:06 PM
OK, I have gotten it to run, but it isn't displaying any sort of status update. I have an Athlon 2000 - how long should a curve take? Is there any way to make it display updates, or anything? Also, I only have 256 MB - is my machine any good for factoring? Is it even worth doing?

vjs
05-03-2005, 04:52 PM
It's been a while since I worked on this number but I have been thinking about it lately, here is the question...

24737*2^991+1 is basically 303 digits long. I know I finished some of the curves off for a 40-digit factor, and Joe and Mystwalker did everything for a 45-digit factor.

So this implies that for this number all factors are greater than 45 digits! So it's only possible to have 6 x 50-digit factors at best ... 3 x 100-digit factors ... 2 x 151-digit factors ... or some mix such as 150-digit x 80-digit x 70-digit, etc...

--------------------------------------

I've been thinking about the way p-1 works and the previously undocumented b2-b1 method...

If one uses very, very large B1 and B2 bounds, the factor doesn't have to be very smooth at all. If it had a 45+ digit factor, for example, and we could run B1=B2=23 digits, we would find that factor for sure.

First question....

What is the largest B1 possible, which program would we use, and how much memory would be required?

(I think the max I was able to use was B1=B2=9007199254740995 (16 digits) from my bats, but I'm not sure if it worked.)


Start with the largest B1 bound possible, i.e. B1=B2 - what is the largest possible B1 for stage1?

If the largest B1 were around 16 digits (perhaps the new ecm6 can do P-1 with a larger B1), we would essentially re-eliminate all 31-digit factors.

(Factor - 1) would have to be less smooth than B1 (16 digits) x B1 (16 digits) = ~31 digits.

Factor - 1 > 16 digits x 16 digits

Next

Then we could distribute the stage1 save files to interested people and try various B2-B1 ranges, going to ever-increasing B2's.

For example, B1=16 digits (done with stage1), B2=X2-X1; the next B2 would be X3-X2,

where X1, X2, X3 would be distributed among ourselves.


Perhaps this is the way NFS works etc.; I'm just trying to think of other possibilities than doing more ECM. Also I'm not sure how much P+1 was done on this number nor how successful it would be.

I assume factors can also be P+1 smooth, and a factor could be smoother on the P+1 side than on the P-1 side...

Thanks just trying to learn something.

Also FYI, I'd be doing this on a Barton 2200 MHz with 1G of dual channel; Fry's also has 2G of dual channel for ~$220, which looks very tempting right now.

Nuri
05-03-2005, 07:49 PM
I know this is a silly question, but to be honest I have not looked into the logic much deeply. That being said, here's my question.

With ecm, if you run the calculated number of curves for a digit level, does it guarantee that there is definitely no factor for that number with digits less than or equal to that digit level?

From Mystwalker's post, for example (and assuming the calculations are correct - not that I doubt them, btw), does running

3155 curves at the 50 digit level guarantee that there are definitely no factors with less than 50 digits? Or the same for the 55 digit level with its 17899 curves?

Or, is there still a chance (although slight) that there might be a factor with fewer digits?

And, another way of looking at it: let's assume it's theoretically and practically possible to continue the search up to the 155 digit level. Then could one say that, with the calculated number of curves, ECM would definitely find "the factor"?

And one last way of looking at it: if it were to find no factors up to 155 digits, would it be possible to assume that 24737*2^991+1 is in fact prime?


PS: I'm not suggesting anything. Just trying to understand the logic.

ShoeLace
05-03-2005, 09:21 PM
this is my understanding of ECM.

ECM is a probabilistic algorithm.

the standard listed number of curves, when run, leaves a chance of a missed factor of the given size of 37% (it's like 1/e where e=2.71828183... )

(http://www.loria.fr/~zimmerma/records/ecm/params.html)

I also had thought (but may be mistaken) that it only works if EXACTLY 1 factor is within B1 and B2. E.g. if there are factors 123001 and 124001, they would only be found if B2 is between 123001 and 124001..
i.e. B1=100000,B2=200000 would fail
B1=100000,B2=123500 would succeed

but I couldn't find a reference that supports this claim.
edit: found it http://www.mersenneforum.org/showthread.php?t=194
this page goes on to say ECM (and P-1) find factors that are (B1,B2)-smooth (1 factor between B1 and B2)


--
Shoe Lace

wblipp
05-03-2005, 11:40 PM
With ECM, there is always a chance that a factor has been missed. "3155 curves" really means that if there is a factor of that size, then each curve has a probability of 1/3155 of finding it. The probability of not finding it is (3154/3155)^3155. It's a well-known limit that ((n-1)/n)^n approaches 1/e. Hence the surprisingly large probability of about 37% that the factor has been missed. However, it has a much higher probability of turning up with the next set of parameters, so in practice the missed numbers tend to show up quickly at the next level.

With ECM, you find the factor if a number near the factor is smooth enough - which number depends on which "sigma" is used for the starting point. So multiple factors can come out at one time, but it's more common for them to come out in different curves. With P-1, you will always get the product of all the factors that are sufficiently smooth. P+1 has an additional twist that depends (50/50, I think) on the starting point, so you might get none, one, or a product of the sufficiently smooth factors in any one trial.
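
A quick sanity check of that limit, for anyone who wants to see the arithmetic:

import math
curves = 3155                       # the gmp-ecm 50 digit curve count quoted above
p_miss = (1 - 1.0 / curves) ** curves
print(p_miss, math.exp(-1))         # both are about 0.368, i.e. the ~37% miss chance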

vjs
05-03-2005, 11:45 PM
Perhaps I was a little too affirmative with my post regarding the digit level.

As ShoeLace pointed out and Nuri alluded to.

Edit: Realised wblipp replied while I was responding; his explanation is obviously superior.


Running ECM to the 50-digit level after finishing all curves for the 45-digit level doesn't mean with 100% certainty that a 45-digit factor does not exist.

As ShoeLace said, there is only a 63% possibility that no factor < 45 digits exists.
What these B1, B2 bounds and the number of curves run at a particular level represent is the point at which one should switch to higher bounds. In other words, searching with bounds for 45-digit factors becomes less optimal than switching to 50-digit bounds.

The best answer on this topic I ever received was "there are good reasons for doing all the curves required for lesser-digit factors first, then continuing with higher digit bounds" (ECM is probabilistic).

However, when you do curves in the order 20, 25, 30, 35, 40, 45, 50 digits, then as you're doing curves for the 45-digit level the probability of missed factors at the lower levels (30, 35, 40 digits) decreases.

A good example is running the alpertron applet

http://www.alpertron.com.ar/ECM.HTM

Try a few curves on 99^100+100^99; as the number of curves increases, the possibility of smaller 15-digit factors decreases. We can pretty much say that there is a near-zero possibility that a 25-digit factor exists at this time. Not sure if we can say this for 35 digits, and certainly not for 40.

------------- P-1 -----------------------------------------


I also had thought (but may be mistaken) that it only works if EXACTLY 1 factor is within B1 and B2.

I don't think that is correct, but I may be mistaken as well. I know something similar is true for P-1, however.

Let's take this factor; this sort of goes with what I was talking about for P-1 with high B1 bounds.

32176897563079 | 4847*2^29601063+1

32176897563079 - 1 = 32176897563078 (this is for P-1 factoring)

32176897563078 factors into 2 x 3^2 x 67 x 2063 x 12932951

So if you wanted to find this factor using P-1 factoring (I found it with the sieve, of course), the most efficient B1 and B2 bounds would have been:

B1=2064
B2=12932952

Of course you could have used B1=B2=15,000,000 and that would have actually found the factor in stage 1, but it would have been a waste of time. (But how do you know what the optimal bounds are for finding the factor before you find it??? It's impossible.)

But if no factor was found, what does this imply?

If using B1=B2=15,000,000 we had not found any factors, we would have known for certain that no factor of 14 digits or fewer exists.

factor > ( B1 x B2 ) + 1
factor > ( 15,000,000 x 15,000,000 ) + 1
factor > 225,000,000,000,001 (15 digits)

Of course a factor > 15 digits could still be found if it's smooth, i.e. other numbers < B1 could be multiplied into the P-1, for example the 2 x 3^2 x 67 portion.

As in the above case, B1=2064 x B2=12932952 = 26693612928
but
26693612928 < 32176897563079

So the factor was found because it was somewhat smooth (it had the 2 x 3^2 x 67 portion; all these numbers were less than B1).

___________________

Now for P+1 (I'm uncertain about this, but I think the principle is the same):

Even using B1=B2=15,000,000 with P+1 we wouldn't have found this factor...

32176897563079 + 1 = 32176897563080

32176897563080 = 2^3 x 5 x 563 x 1428814279

B1=B2=15,000,000

1428814279 > B1=B2

so it wouldn't have been found using P+1 with the same bounds.

Should I continue???
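
For anyone who wants to check the arithmetic above mechanically, here is a small sketch (assuming sympy is available for factorint; not part of any project tool):

# Redo the smoothness check for p = 32176897563079 (the sieve-found factor above).
from sympy import factorint

p = 32176897563079
B1, B2 = 2064, 12932952             # the "most efficient" bounds quoted above

def pm1_finds(p, B1, B2):
    # P-1 finds p when every prime factor of p-1 is <= B1, except at most one <= B2.
    primes = [q for q, e in factorint(p - 1).items() for _ in range(e)]
    big = [q for q in primes if q > B1]
    return len(big) == 0 or (len(big) == 1 and big[0] <= B2)

print(factorint(p - 1))             # expect {2: 1, 3: 2, 67: 1, 2063: 1, 12932951: 1}, as in the post
print(pm1_finds(p, B1, B2))         # True with the quoted bounds
print(max(factorint(p + 1)))        # largest prime of p+1; per the post it is 1428814279,
                                    # which is > 15,000,000, so P+1 with B1=B2=15M misses it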

wblipp
05-04-2005, 10:34 AM
Originally posted by vjs
Running ECM to the 50-digit level after finishing all curves for the 45-digit level doesn't mean with 100% certainty that a 45-digit factor does not exist.

As ShoeLace said, there is only a 63% possibility that no factor < 45 digits exists.

We say it this way, but it's sloppy. The 63% is the probability that we would have found such a factor IF IT EXISTS. However, it's not very likely that any such factor exists, so the probability that there is no such factor is higher.

Based on this heuristic:

http://elevensmooth.com/MathFAQ.html#HowMany

the probability of any factor between 10^40 and 10^45 is about 12%, and we would have found about 2/3 of any that exist, so the probability there remains an unfound 45 digit factor is only about 4%.
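
For what it's worth, the numbers in that heuristic are easy to reproduce with a rough Mertens-style estimate (my own arithmetic, following the linked FAQ; not an exact calculation):

import math
p_exists = math.log(45.0 / 40.0)    # ~0.118: chance of a prime factor between 10^40 and 10^45
p_missed = math.exp(-1)             # ~0.37: chance a full ECM level misses a factor of that size
print(p_exists, p_exists * p_missed)  # ~0.12 and ~0.04, matching the "about 12%" and "about 4%" above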

vjs
05-04-2005, 10:40 AM
Wblipp,

Thanks for the explanation and for clearing up my mistakes. Then, considering we have tried up to the 45-digit level... is there any reason to try P+1 with very high bounds?

Nuri
05-04-2005, 06:57 PM
thx for the replies.

wblipp
05-04-2005, 10:05 PM
Originally posted by vjs
considering we have tried up to the 45-digit level... Is there any reason to try P+1 with very high bounds?

I don't know. People differ on this subject. I've set up ECM server for OddPerfect.org so that it does one P+1 curve at each level, and works three levels above the ECM level. I like working ahead because P+1 and P-1 are faster than ECM and have about the same chance of finding a factor. You need multiple curves for P+1 because for each factor, half the start points won't find that factor. The traditional way to get this is to run three P+1 curves at each level, but I figure three curves at successively higher levels gets me similar coverage for the lowest level and a shot at the upper levels. It feels like the right tradeoff of effort for chance, but I don't have an analysis to back that up.

William

Keroberts1
05-05-2005, 04:21 PM
Is there a way to make GMP-ECM report progress intermittently? Does it only report anything when it finishes a curve? Finds a factor? How long will one curve with a B1 of 11e6 take on an Athlon 2000 at 1.67 GHz with 256 MB? (I only want to use 200 MB at most - what should I use as the command line for this?)

Keroberts1
05-06-2005, 06:42 PM
OK, well, I was playing around and tried to have it run with a B1 value of 77e7, and it finished stage one but crashed because I didn't have enough memory for stage two. Is there any way to recover stage 1? I used the treefile option, and because it crashed suddenly I still have the treefiles. Do these contain the data for stage 1? I could also provide the sigma value I used.

Keroberts1
06-06-2005, 09:55 PM
I'd like to try some P-1 again for 991, and I was wondering if anyone has stage one work for it. Using gmp-ecm I'd like to try stage one to perhaps 10^16. I don't know how long that would take, but I assume that stage two could then be broken up and distributed to computers with more memory than I have, and perhaps we could finally put this one to rest.

This was mentioned before, but I don't know how much work was actually done on it.

vjs
06-06-2005, 10:04 PM
Keroberts,

I'm not sure if B1=10^16 is possible as a P-1 B1 bound. I know some time back I used ecm5.0 for P-1 with a very high (I think maximum) B1 value, and it took quite a few days.

I think you're using an AthlonXP with 256MB, if I remember?

You might want to try ecm6.0 P-1 (-pm1 flag) with an extremely large B1; the program will prompt you that the max B1 value was exceeded - note the value.

Then take that value, divide it by 1000 and run stage one with it. It will complete in a reasonable time. Multiply this time by 1000 and see if you're willing to dedicate that much time without a reboot.

Also make sure to save the file.

edit....

O.K. I checked the max B1 with ecm6.0 for K7 processors.
This would be your command line if you first save the number as 991.txt and save stage 1 as stage1.sav:

k7 -pm1 -save stage1.sav 9007199254740996 9007199254740996 <991.txt

Also, P-1 is not 100%; the factor has to be smooth. By running this you would really only prove that no factor < 16 digits exists for this number.

Keroberts1
06-07-2005, 05:20 AM
But it can also find prime factors that are one more than a number with lots of factors of 16 digits or less. There is almost no limit to the size of this potential factor.

Mystwalker
06-07-2005, 06:16 AM
AFAIK, it's much better to use Prime95 for numbers of the form k*2^n +/- 1 than gmp-ecm.

Joe O
06-07-2005, 08:37 AM
Originally posted by Keroberts1
But it can also find prime factors that are one more than a number with lots of factors of 16 digits or less. There is almost no limit to the size of this potential factor.

True!
But, based on the amount of ECM that Mystwalker and I have done so far, there is only a 27.738% chance that a factor of 45 digits or less exists. I think that you have also done some ECM work on this number. Please post your B1 and B2 values and the number of curves for those values so that I can include them in my tables.

Some time ago I ran P-1 on this number to some very high B1 B2 values, and have saved the file.
I'll email it to your AOL account in the next day or so.

vjs
06-07-2005, 09:49 AM
O.K. I must be doing something wrong. Is it possible that the max B1 value of stage one for this number would take 1500 years????

Mystwalker
06-07-2005, 10:29 AM
I haven't tested, but it will definitely take a lot of time. 1500 years could be correct - if there was enough memory in your PC (probably a Terabyte or two?)...

You could give 430,000,000 a try. That would be an optimal choice for pre-50-digit ECM testing...

In addition, you could run 3 P+1 curves at B1=215,000,000

After that, ECM should take over again. I don't know the exact cause, but the larger the factor, the more unlikely it is that it gets found with P-1 compared to ECM...

IIRC, everything over 40-50 digits shouldn't be tried much with P-1.

vjs
06-07-2005, 12:43 PM
Mystwalker, this is correct of course, but is there any harm in doing...

k7 -pp1 -v -save 991pp1.sav 10000000000 10000000000 <991num.txt

b1=b2=10,000,000,000

It would be quite a bit better than 430M; also, 3 curves at B1=B2=430M would only take a few hours.

I just did a B1=B2=100,000,000 run; it took 950 seconds on a 2200 MHz Barton and did not consume much memory at all.

If extrapolation holds true, it would take 100x longer, or a little more than 1 day per curve, for B1=B2=10,000,000,000 - if you don't run out of memory.

Also, it would make some sense to have B2 = X*B1 where X is small, considering he only has 256 MB.

Also, I'm not sure - can you actually continue a stage 2 from a B1=B2 P+1 run???

I've always been interested in P+1 but don't truly understand it. It doesn't make sense to me that you have to run it three times... I'll probably look into the math involved when I get a chance.

Note:

When you say B1=???, are you also assuming that we run B2=100*B1???

vjs
06-07-2005, 02:47 PM
Also for those interested here are the P+1 records...



48 884764954216571039925598516362554326397028807829 L(1849) A. Kruppa 29.03.2003 (*) 10^8 10^10
47 79382035150980920346405340690307261392830949801 10^100+15 Martin 11.08.2004 10^8 10^11
45 173357946863134423299822098041421951472072119 13*2^738-1 P. Zimmermann 24.02.2005 10^9 10^13
42 514102379852404115560097604967948090456409 86124+1 P. Zimmermann 23.02.2005 10^6 10^11
42 264237347008596079071617575175788166361473 13*2^973+1 P. Zimmermann 19.02.2005 10^8 10^10
39 134368561962115712052394154476370507609 162*11162+1 P. Leyland 2002 (*) 10^7 10^9
38 36542278409946587188439197532609203387 L(1994) A. Kruppa 30.03.2003 10^8 10^9
38 14783171388883747638481280920502006539 109^17+17^109 N. Daminelli 25.03.2003 10^7 10^9
37 9535226150337134522266549694936148673 10596+1 D. Miller 16.04.2003 10^7 10^8
37 4190453151940208656715582382315221647 45123+1 P. Montgomery 1994 (*) 10^7 10^9


So looking at the records a B1=B2=10^10 might give a 40-50 digit factor for this number...

Regardless I think these are the type of B1=B2 values we need to consider...

After I've completed a bit more low-p 991<n<50M sieve I might try a curve or two.

Keroberts, are you interested in trying stage1 with B1=B2=10^10? I wouldn't do anything less. If you do, and you can save/continue with a stage2, I might run stage2 up from yours, or try B1=B2=10^11 if possible.

FYI the records:

884764954216571039925598516362554326397028807830
= 2 x 5 x 19 x 2141 x 30983 x 32443 x 35963 x 117833 x 3063121 x 80105797 (8 digits) x 2080952771 (10 digits)

So his bounds were enough but close!

79382035150980920346405340690307261392830949802 =
2 x 11 x 409 x 701 x 1063 x 2971 x 3181 x 347747 x 9056417 x 12073627 x 32945877451

vjs
06-07-2005, 04:26 PM
O.K. one more post... I did one curve with

B1=100000000
B2=100000000000

Stage 2 took about 480MB of memory...

I'm starting to think that B1=10^10 and B2=10^11 is possible; I can, at least for a short time, upgrade one of my machines to 3.5G.

If someone wants to run three B1=10^10 curves, it would take less than a week. If we don't find a P+1 factor with those bounds, I'm pretty sure I could do the stage 2 with 10^11 or greater next week.

Also, if a factor is found, ecm6.0 will simply print it to the screen, correct?

Keroberts1
06-07-2005, 11:17 PM
I'd be willing to dedicate my Athlon 2000 to P-1 for a month or so if it could get to a depth of say 10^11. Then the B2 search can be broken up into separate parts, right? Several people could search, say, 10^11 to 10^12 and 10^12 to 2*10^12 and so on. Is this correct? Can this be done? Perhaps we could then push B2 up to 10^14 or so and have a very good chance of finding a factor. (Hopefully; ideally we would have a much larger B1 value with a B2 that big, but if B2 can be distributed, why not push it farther?)

vjs
06-08-2005, 10:07 AM
keroberts,

Are you interested in P-1 or P+1? IMHO we haven't done enough P+1. I know I did a very serious P-1 on this number some time back; Joe may have the file with the B1 and B2 bounds.

If you want to do P-1 I could rerun the calculations and suggest a B1 value with a time estimate.

Both P-1 and P+1 can be continued from a stage 1 B1=B2 run with any B2 value. Let me know - personally I'm more interested in P+1 now, especially since I can't remember the P-1 values I used.

Let me know either way; whichever one is done, I'd suggest a very, very large B1 value. If we overshoot, great - at least it's been factored.

vjs
06-08-2005, 12:02 PM
keroberts and others,

I believe this is the most optimized version of P+1 or ECM currently...

Xyxxz's post

http://www.mersenneforum.org/showthread.php?t=3766&page=3

I always rename the ecm6-K7.exe version to K7.exe

To run the program, create a bat file with the command in it, as opposed to typing the command at the DOS prompt. Second, only start the bat from the command line; otherwise, when it finishes, the window will close and you won't see the factor. I haven't figured out how to save the factor to a file; I believe you use the redirection >fact.txt.

Here is an example of the bat I use.

k7 -pp1 -v -save stage1.sav B1 B2 <number.txt

k7 - this is the name of the program

followed by the switches...

-v is verbose mode

-pp1 is for doing P+1 factoring
-pm1 is for P-1 factoring.
if you don't include either of the above switches it will run ecm.

B1 and B2 are the bounds of course

<number.txt - this tells the program to use the number contained in the file number.txt (you can name it whatever you wish).

I always expand the number using dario's applet

http://www.alpertron.com.ar/ECM.HTM

enter 24737*2^991+1 into the applet; it produces the number

cut and paste this number into a .txt file with notepad, remove the spaces and hard returns so that it's all on one line, and save.

When you run K7 it should show a 303 digit number

517...922177

example of bat again.

k7 -pp1 -v -save 991pp1.sav 10000000000 10000000000 <991num.txt

vjs
06-08-2005, 02:10 PM
I ran three P+1 curves at

B1=10000000
B2=1000000000

since it didn't take too long I then ran three P+1 curves at

B1=100000000
B2=10000000000

The difference in time from increasing the bounds by 10x: stage 1 took 10x longer, stage 2 4x longer.

1 curve of stage1 with B1=B2=10000000000 would take about 2.5 days on my 2.2 GHz Barton system. And you can continue stage two from the stage1 save file, but note that each stage 1 started with a different x0= value.

vjs
06-08-2005, 04:51 PM
O.K. Joe just sent me the B1 B2 values either he or I used in the past for P-1.

B1=64100675
B2=6205033714
The machine I did it on before was a dual P3-866 with a 1G of ram and ecm-5.03-P3

However I just ran another P-1 with slightly larger B1 bounds and a huge B2 on a dual barton with 1G of ram.

B1=100000000 (10^8)
B2=1000000000000 (10^12)

It didn't take long for the B1; I was trying to see how large a B2 I could use. I could potentially use a larger B2, but it would be more beneficial for someone else to do the stage1. The above run consumed a max of 1.8G of memory during stage 2, but stage one didn't take much at all.

I have a fast 15K SCSI drive on an Adaptec card, so swapping a little doesn't hurt as much as one would think. I may also be able to bump the machine up to 2G of memory for the stage2.

If there are any takers on the stage1 P-1 or stage1 P+1, I'd suggest a B1 value of
B1=10000000000 (10^10) minimum.

If we are going to try a distributed approach to P-1, I'd even like to see a higher B1; it's sort of the base for the effort. Any takers???

vjs
06-24-2005, 09:17 PM
O.K.

P-1 done to B1=4.2G, no stage 2.

B1=2^32-1 is the limit of Prime95 24.12.

Mystwalker
07-30-2005, 06:10 PM
1057517737707584916346232353319 | 55459*2^1966+1 :|party|:

I currently factor all composites with 1000 < n < 2000 to 35 digits using ECM. Let's see what else stumbles. :D

Mystwalker
08-05-2005, 04:32 PM
10223*2^1529+1 has a factor: 14826855978213563035553602013 :|party|:

2 down, 10 to go (of which 3 have resisted the 35 digit level).

What's really special:
Prime95 found this one in stage1 (B1=1,000,000)! :shocked:

Mystwalker
09-05-2005, 06:23 AM
250264328954609482327059485767739|24737*2^1543 :|party|:

That concludes the 35 digit range:


25 30 35 40

21181,1148 ok ok ok reserved
21181,1172 ok ok ok
10223,1181 ok ok ok
21181,1268 ok ok ok
10223,1517 ok ok ok
24737,1567 ok ok ok
55459,1666 ok ok ok
55459,1894 ok ok ok reserved

4 have fallen, 8 to go...
I will keep putting some effort into this, though. If someone wants to factor some of these candidates, please drop a line.

Greenbank
09-05-2005, 09:27 AM
I'll extend my database of all factors to include computed P+1 B1 and B2 values along with the existing P-1 values.

Greenbank
09-07-2005, 10:16 AM
The P+1 figures are interesting.

I'm in the middle of computing the B1 and B2 values for P+1 (to go along with the B1, B2 for P-1). 1/3 of the way so far.

Both methods find roughly the same number of factors (where p < 2^50), and the overlap (where a specific B1,B2 would have found a factor using either method) is relatively small.

(Sticking with B2=B1*100)

228541 factors < 2^50. B1=1000 B2=100000
P-1 finds 3516
P+1 finds 3423
BOTH: 44

229376 factors < 2^50. B1=40000 B2=4000000
P-1 finds 47113
p+1 finds 45112
BOTH: 9005

However, I'm having trouble with ecm 6.0.1 which I'm using for P+1. It segfaults in step 2 whenever I use an n > 1M. Will look into this further.

More stats later when I've factored all P+1's.

vjs
09-07-2005, 10:25 AM
One of the major problems with P+1 is the group order (from the Mersenne forum): basically there is only a 50% probability that you actually do a P+1 and not a P-1.

What most people do is run 3 curves at bounds three levels higher than they would typically ECM.

Not sure why it's segfaulting for you. Remember there are many different ecm builds optimized for different platforms; I'm sure you're using the correct one, but you might just want to check for a new version.

Greenbank
09-07-2005, 10:40 AM
Interesting. Got any links to where I can read up on P+1 and the group orders?

[EDIT] http://magma.maths.usyd.edu.au/magma/htmlhelp/text529.htm

Googled for it. "p+1" is not that easy to search for as google strips out the +, even when it is in double quotes.

Quoth the page: "A base x_0 is used, and not all bases will succeed: only half of the bases work (namely those where the Jacobi symbol of x_0^2 - 4 and p is -1.) Unfortunately, since p is usually not known in advance, there is no way to ensure that this holds. However, if the base is chosen randomly, there is a probability of about 1/2 that it will give a Jacobi symbol of -1 (so that the factor p would be found assuming that p + 1 is smooth enough). A rule of thumb is to run pPlus1 three times with different random bases."

Thank you for pointing me in the right direction.

As for the segfaulting, I think it's a general GMP problem. I had the same thing when implementing my own P-1 code with GMP; it kept coring in mpz_powm(). I've compiled the whole thing myself (GMP and ecm), so there can't be any compatibility problems.

vjs
09-07-2005, 11:24 AM
You might just want to browse mersenneforums.org under the factoring section.

Since you have the data and bounds (presumably), you should be able to double-check whether P+1 works for a particular k/n value (i.e. gets it in the first or second shot). This could prove interesting.

If so, it may be worth P+1'ing all values (??? we can probably make a guess from your 48+ bit data) ???? 20K/300K before they enter PRP, in addition to P-1, assuming we get enough factoring computer power.

I personally believe more people will start to factor these numbers in the future as tests continue to take longer and longer.

I think the only thing we can really do is experiment, I prefer experimentation to theory.

Joe O
09-07-2005, 07:19 PM
Originally posted by Greenbank
However, I'm having trouble with ecm 6.0.1 which I'm using

for P+1. It segfaults in step 2 whenever I use an n > 1M. Will

look into this further.



Probably stack overflow. There is an option and/or a dll to not use the stack. Check the documentation. It's probably libgmp vs libgmpa but I'm not sure.

Mystwalker
10-04-2005, 10:14 AM
New status:


25 30 35 40

21181,1148 ok ok ok ok
21181,1172 ok ok ok reserved
10223,1181 ok ok ok
21181,1268 ok ok ok
10223,1517 ok ok ok
24737,1567 ok ok ok
55459,1666 ok ok ok
55459,1894 ok ok ok reserved

Mystwalker
10-16-2005, 07:12 PM
New status:


25 30 35 40

21181,1148 ok ok ok ok
21181,1172 ok ok ok reserved
10223,1181 ok ok ok
21181,1268 ok ok ok
10223,1517 ok ok ok
24737,1567 ok ok ok reserved (13.2%)
55459,1666 ok ok ok
55459,1894 ok ok ok reserved (32.6%)

Keroberts1
10-16-2005, 09:41 PM
I tried to do some curves a while ago, but I was unable to do so because 991 required too much memory at the level it was at. I would like to help out with the other numbers, though. Is there any way someone can tell me what I need to place in the different files and what command line I should use to work on, say,
10223,1181 ok ok ok
for the 40 digit level?

Mystwalker
10-17-2005, 06:35 AM
Maybe this (http://www.free-dc.org/forum/showthread.php?s=&threadid=8571&perpage=48&highlight=&pagenumber=2) posting of the current thread helps.

If not, just ask or search the forum - there should be a better instruction somewhere...

For the 40 digit level, you'll need ~200 MB for optimal performance, IIRC.

Keroberts1
10-17-2005, 11:48 AM
OK, well, where are the decimal expansions of the remaining candidates, and is this the correct command line to call:

C:\gnu\msys\home\sKetch\ecm-6.0.1\ecmfactor.exe ecm -c 100 -n -one 11e7 < SB.txt >> SBresult.txt

It doesn't seem to work.

vjs
10-17-2005, 12:14 PM
I'd suggest using Prime95 for stage 1; it's faster anyway...

The other thing you could do is create the decimal expansion with Dario's applet.

Google "ECM applet dario".

Greenbank
10-17-2005, 12:31 PM
I thought ECM-6.0.1 was able to take non-expanded input?

Yup, just checked. My work files have:-

METHOD=ECM; SIGMA=2568500344; B1=100000; N=10223*2^4169+1; X=...

You can check this with (finds a low B1 B2 value factor)...

# echo 33661*2^1918512+1 | ./ecm -pm1 300 400
GMP-ECM 6.0.1 [powered by GMP 4.1.4] [P-1]
Input number is 33661*2^1918512+1 (577535 digits)

No expansion required.

Mystwalker
10-17-2005, 01:15 PM
For Windows systems, the number probably has to be entered differently:

http://www.mersenneforum.org/showthread.php?t=4536 (especially starting from #11)


But I'd also use prime95 for stage1 (using GmpEcmHook=1 in prime.ini). That way, you don't need to do anything about the number, as it is entered into the results file.


C:\gnu\msys\home\sKetch\ecm-6.0.1\ecmfactor.exe ecm -c 100 -n -one 11e7 < SB.txt >> SBresult.txt

If you use ecm only, try

echo 10223*2^^^^1181+1 | ecm.exe -n -one -c 2440 3e6 >> SBresult.txt

- not ecmfactor.exe
- maybe use 1 or 2 "^"
- on a fast PC, this should be doable in about a week
- it's faster when you use Prime95 for stage1, but more complicated (which will hopefully change in a couple of weeks)
- when you close the program, you'll only lose the current curve (approx. 5-10 minutes lost)

Keroberts1
10-18-2005, 02:24 AM
Well, nothing seems to be working right now... I'm looking to know exactly what to type in for the correct number of curves to be done for any k and n value. I want to start it directly from the start menu: simply click Run and enter a command line. I'm not an expert in dealing with computers, so please make it simple. If someone uses AIM and wants to give me a little one-on-one advice it'd be appreciated, but it isn't that important. After all, this isn't even a productive part of the project, just a fun one. If anyone wants to assist me you can send me a PM and I'll get back to you within a day or so.

vjs
10-18-2005, 10:48 AM
If you're using Windows:

C:\gnu\msys\home\sKetch\ecm-6.0.1\ecmfactor.exe ecm -c 100 -n -one 11e7 < SB.txt >> SBresult.txt

almost correct

C:\gnu\msys\home\sKetch\ecm-6.0.1\ecmfactor -n -one -c 100 11e7 7e11 <SB.txt >>SBresult.txt

where SB.txt contains the decimal-expanded number. Try that and see if it works.

Also, your ecm binary is called ecmfactor???
If you tell me what OS, processor and RAM you have, I could e-mail you a complete zipped folder where all you have to do is click.

Keroberts1
10-18-2005, 02:08 PM
I have Windows XP and an AMD Athlon 2000+ with 256 MB of memory. I am planning to get more memory sticks soon. Interestingly enough, I have no problem installing new hardware on my machine, but for some reason I just can't friggin' figure out how to do simple things with Windows. I hate Microsoft.

Mystwalker
11-07-2005, 07:33 PM
New status:


25 30 35 40

21181,1148 ok ok ok ok
21181,1172 ok ok ok ok
10223,1181 ok ok ok Keroberts1
21181,1268 ok ok ok reserved
10223,1517 ok ok ok
24737,1567 ok ok ok reserved (26.5%)
55459,1666 ok ok ok
55459,1894 ok ok ok reserved (66.2%)

Keroberts:
Did you make gmp-ecm run your number?

Keroberts1
11-08-2005, 11:06 PM
Not really; I kinda gave up after I failed 50 times and never got any responses in the forum. I mean, I have sbfactor working fine, and at some point in the past I had no problem running GMP-ECM, but I forgot what to do and can't find where I read how to do it. My main problem is I can't remember what formatting to use when calling the program, and I don't remember the Windows syntax for calling programs from the start menu - whether I need to separate commands with - or < or " or just spaces. I also don't remember which order the commands should be in. I forgot all of this a while ago when I dropped all of my programming courses. I love the math but hate the computers. 'Tis a shame; they seemed like they would have fun together.

Mystwalker
11-09-2005, 08:06 AM
Can you navigate to the directory that contains ecm.exe?
You can also use "command prompt here (http://download.microsoft.com/download/whistler/Install/2/WXP/EN-US/CmdHerePowertoySetup.exe)" (this one probably only works for WinXP SP2, but it should be available for other Windows versions as well). Just install it, then right-click on the folder in the Explorer and select the resp. command from the pop-up menu.

Now, first try something like
echo 547398 | ecm.exe 100 1000

Does this work?

Keroberts1
11-10-2005, 02:24 AM
Seemed to work; it found a factor of 2 immediately. Where do I go from here?

Mystwalker
11-14-2005, 07:14 AM
Sorry for the late reply... :(

Now, try
echo 10223*2^^^^1181+1 | ecm.exe -n -one -c 2440 3e6 >> SBresult.txt

The expected behavior:
- Nothing will be in the window
- a few seconds later, a file named "SBresult.txt" will appear in the directory. Its contents:

GMP-ECM 6.0.1 [powered by GMP 4.1.4] [ECM]
Input number is 10223*2^1181+1 (360 digits)
Using B1=3000000, B2=4016636513, polynomial Dickson(6), sigma=1083634601
(The sigma value will be different, though...)

- every now and then (depending on your PC's performance; maybe every 1 or 2 minutes), the text file should grow

If this doesn't work, try this one instead:

echo 10223*2^^1181+1 | ecm.exe -n -one -c 2440 3e6 >> SBresult.txt

Keroberts1
11-15-2005, 03:08 AM
I've got it running now. How long does a single curve take? How long should I expect it to take on an Athlon 2000+ to finish?

Edit: never mind, answered my own question; it should be done in 15 days.

How much memory should I have to even attempt to help with the 991 factoring? I would really love to see that one fall, and it would be cool to find a record factor.

I was reviewing some old posts and I saw someone mentioning an obvious fact: there can be at most six 50-digit factors, or five 60-digit factors, and so on, for something adding up to ~300 digits. This means that if there are going to be multiple 50 or 60 digit factors, then perhaps fewer curves should be performed, because there are likely to be more than one factor in a certain digit range.

Has the 50 digit range been finished? 55? I just upgraded my system's RAM to 512 MB of DDR; would this be enough to help out? I am usually just a siever and I like to play with P-1, but I don't find enough factors in either of these to really make me feel like I'm helping the project much, so I'd just like to try something a little more interesting for a while. Does anyone know the statistical odds of the factors being distributed in different fashions? We know that the value is not prime, so the absolute worst scenario would be two prime factors around 150 digits. This is not extremely likely, though.

Would ECM work at all if a factor at, say, the 70 digit level were composed of 2 primes, both of 35 digits, hence that factor = P1+P2+1?

How often do 2 primes +1 add up to equal another prime... I believe there was a special name for this type of number, but I am not up to date on my number theory. I am hoping to take a few more courses on it next semester (I have some empty blocks in my schedule). I would really love to be able to understand a little better how exactly all of this works.

Mystwalker
11-15-2005, 07:53 AM
Originally posted by Keroberts1
How much memory should I have to even attempt to help with the 991 factoring? I would really love to see that one fall, and it would be cool to find a record factor.

Has the 50 digit range been finished? 55? I just upgraded my system's RAM to 512 MB of DDR; would this be enough to help out?

512 MB should be enough for the 50 digit level. For 55 digits, 1 GB would be good and 2 GB even better.

The 50 digit range is still underway. I guess we're approx. half-way through, so there's still a lot of work to be done.


I was reviewing some old posts and I saw someone mentioning an obvious fact: there can be at most six 50-digit factors, or five 60-digit factors, and so on, for something adding up to ~300 digits. This means that if there are going to be multiple 50 or 60 digit factors, then perhaps fewer curves should be performed, because there are likely to be more than one factor in a certain digit range.

This is possible, but not very likely. According to [url=http://home.earthlink.net/~elevensmooth/MathFAQ.html#NoFactors]William Lipp[/url], the probability of a factor between 10^a and 10^b is 1 - a/b.
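For example, taking that heuristic at face value, the chance of a factor between 10^45 and 10^60 would be about 1 - 45/60 = 25%.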


Would ECM work at all if a factor at, say, the 70 digit level were composed of 2 primes, both of 35 digits, hence that factor = P1+P2+1?

This is where ECM differs from P-1/P+1 factoring. You don't need a smooth P-1/P+1 (which rely on "P_1 * P_2 * ... * P_n +/- 1", by the way).
I haven't completely figured it out myself, but I think the following is a nice (but definitely wrong) way of putting it (call it a lie - but a useful lie ;) ):

Each curve uses a random sigma value.
Just imagine that a curve tries to find a smooth P+sigma.
Again, this is not how it works, but it can be understood easily.

Keroberts1
11-15-2005, 02:02 PM
I was thinking that such a formula would not work at this point because of the extensive factoring already done for this value. I believe it is more likely that there are a few factors just above what we are currently searching through. This would mean that fewer curves should be necessary for us to find one of those factors, and hence we'd want to switch to higher bounds sooner, perhaps a lot sooner. I'm planning to work out the stats tonight, but I probably won't have time because of my workload for school. I just wish I had paid more attention in stats class last year so I'd remember how to do this quickly without having to go through the book again. What I'd really like is a probability graph showing the likelihood of multiple factors in the different digit ranges. It is likely that there is a range around 70 digits that is more likely to have several factors within it and would therefore require fewer curves to find a factor.

Keroberts1
11-17-2005, 06:48 AM
So where does the 991 factoring effort currently stand? When I finish my current factoring range and my sieve range, I would like to help. What are the necessary bounds? How many curves are needed?

Greenbank
11-17-2005, 07:50 AM
Originally posted by Keroberts1
So where does the 991 factoring effort currently stand? When I finish my current factoring range and my sieve range, I would like to help. What are the necessary bounds? How many curves are needed?

My personal opinion is that people have plowed far too much time into trying to factor this number already. Even my rubbish proth_gmp binary can PRP it in about 1/100th of a second and prove that it is composite.

However, you are free to contribute to the project in whatever way you want. My opinions are my own ;-)

You should find this thread useful:-

http://www.free-dc.org/forum/showthread.php?s=&threadid=9595

It explains the bounds and number of curves that people have run in the past, along with a few expected execution times per curve for certain hardware, plus the bounds for different factor lengths.

Mystwalker
11-21-2005, 08:11 AM
Originally posted by Keroberts1
So where does the 991 factoring effort currently stand? When I finish my current factoring range and my sieve range, I would like to help. What are the necessary bounds? How many curves are needed?

Joe posted his Excel spreadsheet here (http://www.mersenneforum.org/showthread.php?t=3449).
It's almost half a year old.
In the meantime, I did approx. 10% of the required curves, Joe probably did some more.
Thus we're at 30%+...

Keroberts1
11-23-2005, 01:58 PM
My computer crashed after 1490 curves for 1181, and I was hoping I could just start it again and finish the last 950, but I'm worried it's going to start off with the same sigma values again. How should I go about continuing this?

vjs
11-23-2005, 08:24 PM
No, every time the sigma value is chosen at random. Also, several sigma values will fit what's required for a factor, if there is one.

Mystwalker
11-28-2005, 08:58 AM
Originally posted by vjs
No, every time the sigma value is chosen at random.

That would be surprising for me...
Actually, each sigma value should be random, in the sense that it is highly unlikely to get the same value twice in such a short time.

Even if it happens, it's not that bad. That one curve is useless then, but the next curve should have a new (and unused) sigma value.

Your 1490 curves are completely valid, assuming you still have the output and/or checked that no factor has been found.

vjs
11-28-2005, 10:45 AM
OK, then I'm not understanding: where are the sigma values of the previous curves stored?

I know there are functions to start with a particular sigma value and then incrementally increase it. But I wouldn't know which sigma to start at or what increment to use.

It would make sense that if one wanted to conduct "all" of the 45-digit curves etc., there would be a minimum sigma to start with and a sigma increment, etc.

Greenbank
11-28-2005, 11:02 AM
They aren't stored. There are 2^32 (4294967296) possible sigma values it could pick at random.

Let's have a look at the odds.

If N=2^32 (number of possible sigma values):-

For one sigma value to be unique (which it obviously will be):

N/N = 1

For two sigma values:-

N*(N-1)/ N^2.

For three sigma values:-

N*(N-1)*(N-2) / N^3

Or, in general: N! / ((N-c)! * N^c) (where c is the number of curves)

Continuing this up to 5000 sigma values (and knocking up a quick GMP program to calculate this):

The chance of picking 5000 unique sigma values is 99.70944%, so there is about a 1 in 344 chance that you won't have picked 5000 unique sigma values.
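For reference, a quick sketch in Python (not the GMP program above; it just assumes sigma really is drawn uniformly at random from the 2^32 possibilities) that reproduces the same birthday-style estimate:

N = 2**32   # possible sigma values
c = 5000    # curves run

p_all_unique = 1.0
for i in range(c):
    p_all_unique *= (N - i) / N   # i values already used, so N-i "safe" choices left

print(p_all_unique)                # ~0.997094 -> all 5000 sigmas distinct
print(1.0 / (1.0 - p_all_unique))  # ~344      -> the "1 in 344" chance of a repeat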

Keroberts1
11-28-2005, 03:22 PM
Either way, all of the curves are completed and no factors found (where would the factor be output?). No fact.txt file or other new files that I can see.

Mystwalker
11-28-2005, 04:12 PM
Do you have the ">> SBresult.txt" at the end of your command?
If yes, all output will be there. Thus, just open it and search for "factor".
If you haven't specified an output file, then you either have to use the parameter "-one" (which stops factoring once a factor is found; nevertheless, the output is only on screen by default) or search through all screen output.

vjs
11-28-2005, 04:56 PM
Mystwalker is correct of course...

The other option is to yet again use more switches; take the following command line, for example:

k7 -one -c 900 3e6 <991.num >>results.txt

This command line would run 900 curves with B1=3e6 (the 40-digit level) and save all the output to results.txt.

The two options are >>results.txt and >results.txt:
>results.txt basically means write the output for the curve to a text file results.txt
>>results.txt basically means append the output for the curve to the end of the text file results.txt

The differences are subtle, but in the first case only the last curve will appear in the file (the one with the factor). In the second case all curves are written to the file, with the factor-producing curve at the end.

Using >> is the better choice, since you can see the progress of your ECM run by checking the txt file size. I think a 900 curve file is about ???88k??? when completed???

Here is the important part

-one

When using this switch, the ecm program will stop once a factor is found; therefore the last curve written is the curve containing the factor.

Here is an example of the output from a successful factor using the above command line (note B1=11e6):

GMP-ECM 6.0 [powered by GMP 4.1.4] [ECM]
Input number is 23662067593456609584384961743962476043848040046936144618920032553354301839685447396600028018827792228876627301931 (113 digits)
Using B1=11000000, B2=50000000000, polynomial Dickson(12), sigma=1539336453
Step 1 took 121896ms
Step 2 took 102287ms
Using B1=11000000, B2=50000000000, polynomial Dickson(12), sigma=2897668340
Step 1 took 121628ms
Step 2 took 102263ms
Using B1=11000000, B2=50000000000, polynomial Dickson(12), sigma=1080738149
Step 1 took 121945ms
Step 2 took 102188ms
********** Factor found in step 2: 109152212865049578528408617691239
Found probable prime factor of 33 digits: 109152212865049578528408617691239
Probable prime cofactor 216780466216577989739156624879506765269560601796002467513388819212013895038704029 has 81 digits

The bolded portion is the only curve which produced a factor. If you don't use the -one switch, the factor could be buried somewhere in the file; if you use -one and only use >, only the factoring curve is written.

I think I made this more complicated than it needs to be. Simply do a text search for the words factor or prime, or just search for ***, as Mystwalker suggested.

On another note it will take quite some effort for us to find a factor for this number. It's quite possible that there are no factors less than 100 digits in size. But then again we may find a 45-digit factor for it in the next curve. We just don't know.

Mystwalker
11-28-2005, 07:11 PM
Originally posted by vjs

>results.txt basically means write the output for the curve to a text file results.txt
>>results.txt basically means append the output for the curve to the end of the text file results.txt

The differences are subtle, but in the first case only the last curve will appear in the file (the one with the factor). In the second case all curves are written to the file, with the factor-producing curve at the end.

Correct me if I'm wrong, but I'd think that with ">", all curves of one run of the application are written to the resp. file.
In other words, the file is only overwritten when you start gmp-ecm another time, not when another curve is calculated.

Apart from that, I completely concur with you. :)

Keroberts1
11-28-2005, 08:03 PM
OK, well, I did use -one and there were no factors at the end of the file, so I guess no luck this time. What bounds should I use for 991? I am thinking I should do 10 or 20 curves just to see if I can push that one along a little more. What optional commands should I use to save memory?

Mystwalker
11-29-2005, 09:12 AM
B1 bounds is 44M at the moment.

For B2, it's generally sensible to use the default value when performing stage1 with gmp-ecm as well.
When you have little memory, there are 3 options (a combined example follows after this list):

1. use smaller B2 bounds
Curves will be completed faster and need less memory, but more are needed for a certain digit level.
For 49e9, I need up to 340 MB.

2. use the parameter "-k <x>" where <x> is a number
By default, x = 4. It's basically the number of pieces the B2 "space" gets divided into.
Increasing this value saves memory, but curves take a bit longer.
Quadrupling the value halves the memory consumption.

3. use the parameter "-treefile <x>" where <x> is a name
Now, some tables are written to disk instead of being stored in memory. This considerably reduces memory requirements and of course lessens performance, though not as much as one might expect.
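Putting options 2 and 3 together, a sketch of what the command line might look like (switch names are from the gmp-ecm readme; the -k and -treefile values here are only illustrative, and adjust the number of ^ characters as discussed earlier in the thread):

echo 24737*2^^^^991+1 | ecm.exe -one -c 10 -k 16 -treefile tree991 44e6 >> 991result.txt

That would run 10 curves at B1=44M with the default B2, splitting stage 2 into more blocks and keeping the tables on disk to hold memory down.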

vjs
11-29-2005, 10:22 AM
Mystwalker,

Hummm... > or >>

Now you have me thinking; it doesn't make sense that if you decided to run 400 curves with >, only the first curve would be reported, since if a factor were found on the 249th curve the client would stop but you wouldn't know the factor. (Not saying you're wrong, but... I always thought the txt was simply overwritten and the >> was an append?)

Well, let's just agree that one shouldn't use > but the >> option instead.

Keroberts,

Mystwalker is giving you great advice; I'd stick with his guidance.

Mystwalker
11-29-2005, 11:29 AM
Originally posted by vjs
I always thought the txt was simply overwritten and the >> was an append?

That's correct - but it all depends on the granularity. The overwrite event only takes place at the start of gmp-ecm, not at the start of a curve (when using the -c parameter - without, it's the same, of course).
Thus, the information of curve results within a single gmp-ecm run gets appended to the output file, whereas a new start of gmp-ecm naturally erases all data when using > instead of >>.

vjs
11-29-2005, 11:54 AM
O.K. so using > instead of >>

Basically, using > and the -c curves option reports the last curve done - either containing the factor, if it stopped early, or the final curve of a run. However, you have no idea how many curves were completed in the event of a crash, etc.

Thanks for clearing this up.

Mystwalker
11-29-2005, 01:20 PM
Actually, it seems like I didn't express myself clearly. :(

According to my tests, > and -c saves all output of a single gmp-ecm run.

Example:
echo 2^^^^1061-1 | ecm -c 5 1e4 > results.txt

Contents of results.txt
GMP-ECM 6.0.1 [powered by GMP 4.1.4] [ECM]
Input number is 2^1061-1 (320 digits)
Using B1=10000, B2=1186831, polynomial x^1, sigma=2999931604
Step 1 took 783ms
Step 2 took 844ms
Run 2 out of 5:
Using B1=10000, B2=1186831, polynomial x^1, sigma=3364805730
Step 1 took 901ms
Step 2 took 876ms
Run 3 out of 5:
Using B1=10000, B2=1186831, polynomial x^1, sigma=4143695723
Step 1 took 1020ms
Step 2 took 841ms
Run 4 out of 5:
Using B1=10000, B2=1186831, polynomial x^1, sigma=2441763206
Step 1 took 885ms
Step 2 took 879ms
Run 5 out of 5:
Using B1=10000, B2=1186831, polynomial x^1, sigma=523117962
Step 1 took 870ms
Step 2 took 811ms

When you restart gmp-ecm, the contents are lost, of course.

Vato
11-29-2005, 08:57 PM
That's because > explicitly means 'redirect stdout to this filename after truncating it to zero length', as opposed to >> which means 'redirect stdout and append to this filename'. Neither of these is an option/flag to the program itself. They are instructions to the shell (which is used to launch the program) to connect certain file descriptors to the file rather than to your tty/terminal/screen/whatever.

Keroberts1
12-05-2005, 02:22 PM
I have started experimenting with P95, but on the first factoring run it chose B1 bounds so large that it'll take my Athlon 2000+ over 13 days to complete stage 1, and B2 equals stage 1's value. Is there something I can do to set the bounds manually?

hhh
12-06-2005, 01:28 PM
The way to choose the bounds manually is:
Pminus1=K,2,N,1,B1,B2,0

There is a maximum possible B1=B2 value documented somewhere at mersenneforum, but I don't remember it.
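For instance (values purely illustrative, not a reservation), a line like

Pminus1=10223,2,1517,1,1000000,50000000,0

should run P-1 on 10223*2^1517+1 with B1=1M and B2=50M.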

But there are reasons (don't ask me) to use ECM for such large factors. And I wonder if somebody hasn't already run P-1 for these small n, with quite impressive bounds.
H.

Keroberts1
12-06-2005, 03:43 PM
When doing stage 1 with P95 and P-1, how do you have save files created and stored so that you can use greater bounds later?

hhh
12-07-2005, 06:35 AM
Originally posted by Keroberts1
When doing stage 1 with P95 and P-1, how do you have save files created and stored so that you can use greater bounds later?

With Pminus1=...

Doing this, the save file is not deleted. But you have to keep track of the bounds, because continuing from a save file with the wrong bounds works but gives bogus results, I got the impression.
I don't know either if you can extend B1 from say 1000000 to 2000000. There are commands like
Pminus1=K,2,N,1,B1old-B1new, B2,0

or

Pminus1=K,2,N,1,B1new-B1old, B2,0
I don't know. The same seems to work with B2.
Try it out and keep us informed, please.
Have fun, H.

vjs
12-07-2005, 08:24 AM
From what I know/remember...

First, the max B1 is something like 4G (2^32 or thereabouts, anyway).

I'm pretty sure you can extend B1 easily: simply don't erase the intermediate file, run the same worktodo.ini, and specify a larger bound. Don't get confused by the B2new-B2old notation - increasing B1 on a second run actually extends the existing file, whereas running B2'-B2 simply checks a different range of B2 between B2 and B2'. I hope this makes sense.

The idea behind running Prime95 for stage 1 is that it's faster; however, it does start to take quite a bit of disk space if you want to do 1000 curves, for example.

Keroberts1
12-10-2005, 07:12 AM
Where does P95 output factors? Can I have them placed into a specific file?

hhh
12-10-2005, 09:18 AM
Originally posted by Keroberts1
Where does P95 output factors? Can I have them placed into a specific file?
Isn't it results.txt?
And you can put e.g. the line

results.txt=fact.txt

into your prime.ini.
There you get residues and eventual factors. H.

Keroberts1
12-11-2005, 12:31 AM
OK, I have an interesting error. I have set up P95 to start on every 67607 test through 15-16 million. For starters I set it to P-1 them all with bounds of B1=1000 and no stage 2. I just checked it, and it has found a factor for almost every one of the tests, and the factor is 1. For some of the tests it says it has found a factor, and then it says error: factor doesn't divide N. Is that because I used such low bounds, or is there something else going on? The lines in my prime.ini are like this.

Pminus1=67607,2,15001691,1,1000,1000

Is that wrong?

I already have my worktodo.ini populated so that it will go through all of the tests with these bounds and then do them all again up to 5000, and then 20000, and then 50000. Will this work? Will it not work? Any ideas? Just to explain: some people have been working on getting 9-10 million down to fewer than 1000 tests for 67607, and I'm just trying to see if I can do it with 14-15 million first. Races are always fun, ya know.

hhh
12-11-2005, 07:59 AM
Originally posted by Keroberts1
the lines in my prime.ini are like this.

Pminus1=67607,2,15001691,1,1000,1000

Is that wrong?


PRIME.INI??? You mean worktodo.ini.

And can you try to put the line

Pminus1=67607,2,15001691,1,1000,1000,0

with a final useless parameter 0? That's the way I did it back then.
Don't know if this helps...
H.

Keroberts1
12-11-2005, 02:29 PM
Yes, I meant worktodo.ini, and I have tried adding the 0; I will let you know how it works.

Keroberts1
12-11-2005, 11:51 PM
OK, well, it worked fine for several hours and then went back to giving the same errors and 1 as a factor. It's a mighty weird error, even if it is hardware. My guess is it has something to do with such low bounds or a syntax error somewhere in my command line.

hhh
12-12-2005, 03:00 AM
You say it worked fine for a few hours. Does this mean that in your results.txt (or whatever it is called) you've got residues and all? With such small bounds, there must be dozens.
And then it starts to mess up again, right?

If you think it's a hardware error, you can run a stress test, a memory test ( you find information on this forum) and stuff, you know all this I guess.
Also, you can post at mersenneforum.org. There you will find qualified support, probably more helpful than me.
It could have to do with a FFT length related bug, too, who knows?
I would like to try it out but don't have time. Perhaps this evening.
Meanwhile, you can post more specific output, like from results.txt, screen (I think you need to make a screenshot) and so on.
Not for me, but others might recognize something.
Good luck, Yours H.

Mystwalker
12-12-2005, 08:32 AM
New status:


25 30 35 40

21181,1148 ok ok ok ok
21181,1172 ok ok ok ok
10223,1181 ok ok ok ok
21181,1268 ok ok ok reserved (24.6%)
10223,1517 ok ok ok
24737,1567 ok ok ok reserved (26.5%)
55459,1666 ok ok ok
55459,1894 ok ok ok reserved (69.8%)

Nuri
12-12-2005, 06:29 PM
Keroberts, is it working properly now?

If not, would you consider a copy-paste of the contents of your prime.ini file, some sample rows of the working pairs from the worktodo.ini and fact.txt, and some sample rows of the non working pairs from the worktodo.ini and fact.txt files?

Maybe somebody can guess what's going wrong (if it's not hardware-related, of course).

BTW, you should note that if you're using only one or two PCs, expect it to take at least a year to slim down 15m-16m to 999.

Keroberts1
12-13-2005, 12:08 AM
[Mon Dec 12 21:50:04 2005]
67607*2^15051627+1 stage 1 is 34.69% complete.
[Mon Dec 12 22:15:00 2005]
67607*2^15051627+1 stage 1 is 69.39% complete.
[Mon Dec 12 22:38:25 2005]
P-1 found a factor in stage #1, B1=20000.
ERROR: Factor doesn't divide N!
67607*2^15051627+1 completed P-1, B1=20000, B2=20005, Wc1: 8573DA75
[Mon Dec 12 23:06:32 2005]
67607*2^15052451+1 stage 1 is 34.69% complete.
[Mon Dec 12 23:35:26 2005]
67607*2^15052451+1 stage 1 is 69.39% complete.
[Tue Dec 13 00:03:36 2005]
P-1 found a factor in stage #1, B1=20000.
67607*2^15052451+1 has a factor: 1
67607*2^15052451+1 completed P-1, B1=20000, B2=20005, Wc1: 8592DA6A

Nuri
12-13-2005, 12:36 PM
here's what I get...

[Tue Dec 13 10:02:14 2005]
67607*2^15051627+1 completed P-1, B1=20000, B2=20005, Wc1: 8573DA75
[Tue Dec 13 11:13:55 2005]
67607*2^15052451+1 completed P-1, B1=20000, B2=20005, Wc1: 8592DA6A

Greenbank
12-13-2005, 01:07 PM
I had similar 'problems' when I was using sbfactor on one of my P4s.

More than likely you have a duff bit of memory; mine kept on reporting that 2 (or some other very small number) was a factor of a specific number (which is impossible).

If you can, run the Prime95 binary in the stress test mode to see if it fails. Or you could run some sort of memory testing program.

Nuri
12-13-2005, 01:14 PM
... with prime95, version 24.14.1.0

Other than that, I would also suggest stress testing your hardware.

Mystwalker
01-07-2006, 06:52 PM
New status:


25 30 35 40

21181,1148 ok ok ok ok
21181,1172 ok ok ok ok
10223,1181 ok ok ok ok
21181,1268 ok ok ok ok
10223,1517 ok ok ok reserved
24737,1567 ok ok ok reserved (44.8%)
55459,1666 ok ok ok reserved
55459,1894 ok ok ok reserved (69.8%)

vjs
01-23-2006, 11:49 AM
I see MikeH found a bunch of small n factors between 30K and 60K...

Looks good, Mike. Were those P-1, and what bounds were you using? Just wondering.

MikeH
01-23-2006, 03:58 PM
Looks good, Mike. Were those P-1, and what bounds were you using? Just wondering.

Most of those recent successes were with ECM (using B1, B2 and curve counts which equate to 18 digits).

I had been P-1ing, but I seem to have hit a bit of a drought, thus the switch to ECM. I'll probably be back on P-1 again soon.

Mystwalker
01-23-2006, 05:33 PM
Do you have a status update of your ECM work (like here (http://www.free-dc.org/forum/showpost.php?p=81856&postcount=36))?

MikeH
02-02-2006, 05:14 PM
Do you have a status update of your ECM work (like here (http://www.free-dc.org/forum/showpost.php?p=81856&postcount=36))?

Complete
[Mon 02-Jan-2006 22:10:14] ECM: Range 30000-50000 (20000), 18, 375 candidates
[Sat 21-Jan-2006 14:56:17] ECM: Range 50000-70000 (20000), 18, 393 candidates

In progress
[Sun 22-Jan-2006 17:54:07] ECM: Range 100000-200000 (100000), 18, K=67607, 124 candidates
[Wed 01-Feb-2006 22:19:55] ECM: Range 70000-80000 (10000), 18, 189 candidates

Greenbank
02-06-2006, 06:01 AM
What B1 and B2 are you using for 18 digits? At a guess I'd say:-

B1 = 10000 B2 = 1500000 ?

And how many curves max?

What about B1 and B2 for P-1? The same?

MikeH
02-06-2006, 12:40 PM
66 ECM curves, B1=7400, B2=740000

Need to review where I'm at with the P-1 stuff, but I've been working with B1=B2, then when I get bored I'll do a B2=B1*100 then call it a day (probably).:)

EDIT:

Current status of P-1 testing is:

Complete:


P-1: Range 991- 5000 B1=100000000, B2=100000000
P-1: Range 5000- 20000 B1=15000000, B2=15000000
P-1: Range 20000- 100000 B1=1000000, B2=1000000
P-1: Range 100000-1000000 B1=600000, B2=600000, K=67607

Greenbank
02-08-2006, 06:48 AM
Well, I'm doing 200000 to 210000 for all k with B1=600000, and B2 will probably go to 100*B1 (just getting B1 there on all of them first).

Found 5 so far, most notably:-

111430207063079 | 2^202471+1

That's only 111T for a very low n! Was this previously missed?

p-1 = 2 * 17 * 19 * 360953 * 477881

Mystwalker
02-08-2006, 07:46 AM
We don't have k=1...

Greenbank
02-08-2006, 08:49 AM
Sorry, cut and paste error.

111430207063079 | 24737*2^202471+1

A sorted and cleaned results_duplicates_excluded_marked.txt has:-

111371229538703 21181 11665988 3259 0
111381585663619 19249 3462386 3259 0
111388239468763 10223 10494725 3259 0
111457721737709 21181 16796180 3259 0
111460132600447 10223 8997581 3259 0
111461408850647 24737 15021727 3259 0
111469919309137 10223 5272421 3259 0

I'm checking the gap between 111388239468763 and 111457721737709 now.

I know how easy it is to forget to submit factors. I failed to submit a couple before I started using Sobistrator. It's quite easy to miss off the first digit when cutting and pasting into the factor submission box.

When I went back to collect all the information to send to factrange@yahoo.com I checked each and every factor had been submitted and that's where I noticed the two I'd missed.

Speaking of which, I have about 8T worth of ranges to send to factrange@yahoo.com now that the emails aren't bouncing :-).

vjs
02-08-2006, 09:34 AM
Most n<300K factors have been missed, since the original sieve dat was 300K<n<3M, followed by 1M<n<20M, then finally 991<n<50M.

Greenbank
02-08-2006, 10:03 AM
Aha, that explains it.

Nuri
02-10-2006, 06:59 AM
I'm planning to throw some ECM cycles for my range at k=67607, 9m<n<10m to see if I can get faster results when compared to P-1 alone. If I stick to P-1 only, it looks like it'll take at least two more years before I reach 999 (compared to initial projection of 18 months).

I've scanned through a couple of pages and got confused about how to run the program. It'd be very helpful if anyone could provide some help. Thanks in advance.

Keroberts1
02-10-2006, 05:43 PM
It was my understanding that with such high n values it is much more efficient to P-1. I could be wrong, however. I don't know that much about ECM.

Mystwalker
02-10-2006, 07:20 PM
Well, running an ECM with a comparable chance of success takes longer than P-1 factoring. Hence, as long as there are enough candidates, it's best to stay at P-1. ECM comes in handy when a particular candidate has to be factored, especially when looking for a 40+ digit factor. In this range, P-1 factoring gets less efficient...

If you want to try ECM factoring nevertheless, there are basically two (well, three) options to do it:

1a. Use Prime95
This should be the easiest one. Just use the command "ECM2=" in the worktodo.ini (a concrete sketch follows after these options). The syntax is
ECM2=k,b,n,c,B1,B2,curves_to_do[,specific_sigma]

1b. Use gmp-ecm
I haven't tested this program with numbers that high, though. And I think that it will be slower, because base 2 factoring attempts are pretty optimized in Prime95. Maybe the better stage2 implementation can gain some performance points again, but this usually levels the playing field at high digit searches (guesstimate: 35+ digits) only.
Syntax:
ecm <number> <B1> [<B2>]

You can also add parameters such as -c <curveCount>.

2. Mix: Prime95 for stage1, gmp-ecm for stage2
The most efficient, but also most complex way. I don't go into further details, as knowing how to do 1a and 1b is recommended...
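As a concrete sketch of 1a and 1b (the exponent and bounds below are purely illustrative for the k=67607, 9M<n<10M range - check the -v output or Prime95's defaults for sensible values; on Windows, mind the ^ escaping discussed earlier):

ECM2=67607,2,9000001,1,50000,5000000,3

echo 67607*2^9000001+1 | ecm -c 3 50e3 5e6 >> results.txt

Both would run 3 ECM curves on 67607*2^9000001+1 with B1=50000 and B2=5000000.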

Keroberts1
02-10-2006, 11:02 PM
Any news on the factoring of 991? Anyone still trying this?

Mystwalker
02-11-2006, 11:45 AM
Any news on the factoring of 991? Anyone still trying this?
I do some work from time to time. If Joe O has done no new curves since June 15th, we're at ~20% now.

MikeH
02-11-2006, 12:15 PM
Another update on ECM

In progress
[Thu 09-Feb-2006 23:11:03] ECM: Range 6000-25000 (19000), 25, 294 candidates
[Wed 01-Feb-2006 22:19:55] ECM: Range 70000-80000 (10000), 18, 189 candidates
[Sat 04-Feb-2006 16:02:40] ECM: Range 80000-100000 (20000), 18, 415 candidates
[Sun 22-Jan-2006 17:54:07] ECM: Range 100000-200000 (100000), 18, K=67607, 124 candidates

Nuri
02-11-2006, 04:04 PM
Thanks, Mystwalker. I'll report back in a couple of weeks when I have sufficient data.

BTW, as I've already tested all of the candidates to B1=100000, B2=1000000, it's not about the efficiency of P-1 vs. ECM in general. It's more like the marginal efficiency of ECM vs. P-1 at finding the remaining factors, now that the easier ones have already been found.

Greenbank
02-16-2006, 11:11 AM
4593042024525125134486811 | 24737*2^6127+1

The 8th smallest unfactored n for k=24737.

Found using GMP-ECM 6.0.1 on an Apple Mac Mini (1.42GHz) with suggested bounds for 25 digits. It was the 6th curve out of the suggested 206!

Using B1=50000, B2=14000000, polynomial x^2, sigma=2555558430
Step 1 took 232468ms
Step 2 took 59168ms
********** Factor found in step 2: 4593042024525125134486811
Found probable prime factor of 25 digits: 4593042024525125134486811
Composite cofactor (24737*2^6127+1)/4593042024525125134486811 has 1825 digits

Keroberts1
02-22-2006, 10:04 PM
New status:


25 30 35 40 45

21181,1148 ok ok ok ok reserved (kroberts5)
21181,1172 ok ok ok ok reserved (kroberts5)
10223,1181 ok ok ok ok reserved (kroberts5)
21181,1268 ok ok ok ok
10223,1517 ok ok ok reserved
24737,1567 ok ok ok reserved (44.8%)
55459,1666 ok ok ok reserved
55459,1894 ok ok ok reserved (69.8%)


As long as no one has a problem with this or has started these already.

Mystwalker
04-26-2006, 11:57 AM
New status:


25 30 35 40 45

21181,1148 ok ok ok ok reserved (kroberts5)
21181,1172 ok ok ok ok reserved (kroberts5)
10223,1181 ok ok ok ok reserved (kroberts5)
21181,1268 ok ok ok ok
10223,1517 ok ok ok ok
24737,1567 ok ok ok reserved (66.0%)
55459,1666 ok ok ok reserved (45.2%)
55459,1894 ok ok ok ok

No new factor found so far. :(

Mystwalker
06-04-2006, 07:17 AM
New status:


25 30 35 40 45

21181,1148 ok ok ok ok reserved (kroberts5)
21181,1172 ok ok ok ok reserved (kroberts5)
10223,1181 ok ok ok ok reserved (kroberts5)
21181,1268 ok ok ok ok
10223,1517 ok ok ok ok
24737,1567 ok ok ok reserved (89.5%)
55459,1666 ok ok ok ok
55459,1894 ok ok ok ok

Factors still refuse to be discovered. :(
In a week or two, the 40 digit level should be complete.

What's your status, kroberts5?

Keroberts1
06-07-2006, 10:51 PM
I actually have no idea; my computer got fried a month ago, and I recently acquired a new system with an AMD64, and I'm still waiting to get it running on gmp-ecm. I don't remember how to get it running, though. I don't even have the program, and I don't remember how to get the .tar file to work. I haven't really played much with any computers since I started the thing running a few months ago. I'm more of a point-and-click kind of guy. Some help, though, and I would be back on track in no time.

Mystwalker
06-10-2006, 09:38 AM
This (http://www.geocities.com/omboohankvald/compileGMP.html) guide should help you.
The versions of the programs have changed to GMP 4.2.1 (http://www.swox.com/gmp/#DOWNLOAD) and ECM 6.1 (http://gforge.inria.fr/projects/ecm), though. In addition, ECM 6.1.1 will be out "soon", so maybe you want to wait. On the other hand, it's not that hard to compile a new version once you're familiar with the process...

Unfortunately, I have no AMD64-equipped PC, so I can't provide you with optimized binaries...

Keroberts1
06-11-2006, 05:10 PM
OK, I have it installed, but what do I do when running it? I selected Run from the start menu and found the directory to run it from, but I can't remember the format to give the inputs in. Once again, any help is appreciated.

Keroberts1
06-23-2006, 01:32 AM
Anyone? All I need is the command format for calling the program. I know the first number is the B1 bound; that's all I'm sure about.

Mystwalker
06-23-2006, 02:31 PM
Sorry for my late answer - I must have overlooked your question.
It should be somewhere in this thread.
Try

echo 21181*2^^^^1148+1 | ecm.exe -c 100 -n 11e6 > result.txt

This should run 100 curves with B1=11M and the default B2 on 21181*2^1148+1.

Keroberts1
06-23-2006, 06:54 PM
Yeah, I figured it out last night, but how many curves do I need?

Mystwalker
06-25-2006, 11:24 AM
When you add the "-v" parameter, you can see it. Just look at the 45 digit value.

hhh
11-29-2006, 07:10 AM
I just encountered the following strange behaviour with mprime:

I put into the worktodo.ini the following line:

Pminus1=168451,2,1116,1,4294967295,4294967295,0

This should P-1 the number 168451*2^1116+1 with the maximal bounds of B1=B2=4294967295=2^32-1. (This is from PSP.)

I gave mprime 300 or 400MB RAM.
The program starts factoring, but once at 100% it immediately restarts at 0% with the same number, without doing the GCD or whatever it is called.

Not enough memory? Other ideas?

H.

vjs
11-29-2006, 09:27 AM
It's possible that your computer is not stable enough to compute a stage 1 for that many days in a row without error.

I'm not saying this is the case but it's one possibility.

Also you can try,

168451,2,1116,1,4294967295,1,0

I think this works as well (running stage one only).

Your other option is to break the P-1 into stages.

Not sure of the exact way to do it but...

168451,2,1116,1,1000000000,1,0

168451,2,1116,1,2000000000-1000000000,1,0
etc...

hhh
11-29-2006, 01:13 PM
The whole test took only a few hours (to my surprise). It's because of the low n, I think.
(P4, 3GHz)
H.

hhh
11-30-2006, 10:15 AM
Yet again, the same thing with
Pminus1=168451,2,1116,1,4294967295,1,0

I had to stop it. This time 400 MB were assigned for sure.

vjs
11-30-2006, 06:39 PM
How many is "a few hours" - fewer than 10?

I think the 991 number, which is roughly the same size, took me something like 3 days on a 2.4 GHz P4...

hhh
12-01-2006, 01:27 PM
Yes, about 5 hours.
Today I tried the factorisation with B1=2G; it worked, though the GCD took 0 seconds. Then I extended to 3G, but had to leave before the end.

Normally one gets a residue, doesn't one? I think I got none.
We'll see on Monday.

SlicerAce
10-15-2007, 03:30 AM
Is anyone still doing ECM factoring?

jasong
10-16-2007, 09:58 PM
I'm assuming you're interested in participating, so my response to
Is anyone still doing ECM factoring?
is:

Who cares? If you want to get involved in ecm factoring of Sierpinski numbers, go for it.

My advice is to go to the gmp-ecm forum at http://www.mersenneforum.org/ and tell them your intentions. Or you could simply surf that sub-forum and probably be able to figure things out on your own, at which point you would come back here to get some numbers.

On second thought, your first stop should be the User Guides in the 'Information and Answers' Forum at that same website I listed.

Good luck. :)

Kman1293
12-14-2007, 11:35 PM
There may not be much of a point to calculating the lower n values, but I still think it's kinda fun to get rid of them. Anyway, I believe I may be the first person to have found a factor for 10223*2^1181+1. It came up on the 2260th curve I was calculating.

2869295942753555058435842630879466239475749080003 | 10223*2^1181+1

:|party|:

sturle
01-06-2009, 07:03 PM
Is anyone still doing ECM factoring?

I have been playing a bit with gmp-ecm lately, and have run a few more rounds on 24737*2^991+1.

an unreasonable amount of rounds with B1 < current levels.
ca 22000 rounds of B1=1.08e9, B2 ~ 22e12 (54558 recommended for 65 digit factors)
ca 7000 rounds of B1=2.52e9, B2 ~ 78e12 (118242 recommended for 70 digit factors)

No luck. A smallest factor with fewer than 60 digits is now very unlikely. I will probably give up soon. This number may break with SNFS some day, but it is much too large for me.

hhh
01-06-2009, 08:31 PM
an unreasonable amount of rounds with B1 < current levels.

Wow. You meant B1> current levels, though, right? I thought about attacking a bit of the 55 digit level with my limited horsepower, but I will not mess with sturle, of course. Please post your progress when you are definitely fed up.


EDIT: How did you come up with your B1/B2-bounds? The readme states:

digits D   optimal B1   default B2   expected curves N(B1,B2,D)
                                     -power 1    default poly
  45        11e6         3.5e10         4949        4480  [D(12)]
  50        43e6         2.4e11         8266        7553  [D(12)]
  55        11e7         7.8e11        20158       17769  [D(30)]
  60        26e7         3.2e12        47173       42017  [D(30)]
  65        85e7         1.6e13        77666       69408  [D(30)]

Yours seem way too high...

BTW, Kman1293, how many curves did you run, on which numbers? Not to duplicate work, if someone wants to continue. Did you submit your factor? n=1181 is still in the .dat file...

H.

sturle
01-07-2009, 01:12 PM
Wow. You meant B1> current levels, though, right?
B1 less than 1.08e9.

How did you come up with your B1/B2-bounds?
I did something like this on the particular CPU / memory combination I am mainly using:


for i in `seq 1 300`; do
./ecm-time -v -inp 24737.991.inp -maxmem 4000 ${i}e7 >> timings-4g.txt
done

And similar for 8000 MiB RAM. Among the output, gmp-ecm reports the following interesting lines:


Using B1=2520000000, B2=77978684123358, polynomial Dickson(30), sigma=1683006743
dF=1048576, k=6, d=11741730, d2=19, i0=196
Expected number of curves to find a factor of n digits:
40 45 50 55 60 65 70 75 80 85
30 96 338 1303 5454 24566 118242 608303 3288378 1.9e+07
[...]
Expected time to find a factor of n digits:
40 45 50 55 60 65 70 75 80 85
18.06d 57.24d 201.29d 2.13y 8.91y 40.12y 193.09y 993.37y 5370y 30569y

I selected the B1 levels and default B2 levels which gave the lowest expected time to find a factor of a given size. For the machines I have been using, the optimal B1 is 1080000000 for 65 digit factors using up to 4 GB RAM for stage 2, and 2520000000 for 70 digit factors using up to 8 GB RAM for stage 2. This test is using a modified gmp-ecm to get expected times for factors > 65 digits.

vjs
01-08-2009, 11:15 PM
Sturle,

Wow you bring a tear to my eye with this one.

I tried 24737*2^991+1 quite a bit, quite a few years back now.

I'm pretty sure I ran a P-1 out to B1=B2=4G, or whatever the maximum value is for B1 with version 24.???. It took quite a few days, if not weeks...

I also ran quite a few ECM curves as well. Our numbers look similar, except my B2 was smaller; I only had 4 GB to work with.

Sorry, I can't say that I remember how many, but it wasn't much.

I also ran at least 2 P+1 curves at high values as well.

I think you're right that SNFS is probably the way to go, but I don't think we are there yet with NFS.