Page 2 of 5
Results 41 to 80 of 185

Thread: Small n factoring

  1. #41
    So is ECM2=24737,2,991,1,11000000,1100000000,100 the input I should use? Are you past that yet?

  2. #42
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    You should use
    Code:
    ECM2=24737,2,991,1,43000000,4290000000,100
    This is for the 50 digit level. You can also try to replace 4290000000 by 0 - I *think* that works as well. But I don't know for sure...
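
For reference, that worktodo line can be decoded as follows. The field meaning (k, b, n, c for k*b^n+c, then B1, B2, and curve count) is inferred from context in this thread, so treat it as an assumption rather than official documentation:

```python
# Decode a Prime95 ECM2 worktodo line (field meaning assumed from context).
line = "ECM2=24737,2,991,1,43000000,4290000000,100"
k, b, n, c, B1, B2, curves = map(int, line.split("=", 1)[1].split(","))
N = k * b**n + c                      # 24737*2^991+1, the number being tested
print(len(str(N)), B1, B2, curves)    # 303 43000000 4290000000 100
```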

  3. #43
    I didn't think the bounds looked high enough, so I started it off using ECM2=24737,2,991,1,110000000,11000000000,100.
    Is that going to not work, or is it just going to take longer? I've got 14 curves done so far.

    How will I know if something is found? Where is the output placed? Should I see anything before all of the curves are done?

  4. #44
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    I've been told before that there are good reasons for starting at the specified bounds for each curve and increasing accordingly.

    I think the best answer is that it simply takes longer.

  5. #45
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    1. Prime95 is not suited for B1=110M (a.k.a. "the 55 digit level"). This is because the max. overall bound for Prime95 is 4290M, but with B2=100*B1 (which is smaller than optimal), B2 would be 11000M...
    Using max. bounds, a Prime95 curve only counts as ~1/3 of a standard curve, IIRC.
    For B1 > 43M, you should definitely use gmp-ecm. Below that, I'm not sure which one is faster, as stage 1 is faster with Prime95, whereas stage 2 is faster with gmp-ecm.

    2. Output will be in "results.txt" - either when a factor is found or when all curves have been done.

    3. A factor can be found after the completion of each curve (after each stage of each curve, to be specific). ECM is similar to P-1 factoring. The big difference is that ECM tests (with different sigma values) search for different group orders.
    As a massive (and mathematically wrong, but descriptive) oversimplification, just imagine ECM as "P-sigma factoring".

    4. Searching for increasing digit counts of factors is AFAIK proven to be most efficient. So, the 50 digit level (B1=43M) should be done next.

  6. #46
    24737*2^991+1 completed 100 ECM curves, B1=110000000, B2=2410065408

    no factors

    I don't know what that means for the 50 digit search

  7. #47
    I found GMP-ECM, and I was going to do a few hundred curves for the 60 digit search, but I can't find references for running the program. I managed to get it installed thanks to the very useful readme file, but it doesn't say where to place the number to be tested or the bounds I would like. Anyone familiar with GMP-ECM: am I just overlooking something obvious?

  8. #48
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    You'll need 9743 of those curves for the 50 digit level and 58080 curves for the 55 digit level.

    For the 55 digit level, you should definitely try gmp-ecm. Stage 1 will take longer (you *can* use Prime95 here, it's only a bit more administrative work), stage 2 maybe as well, but you need far fewer curves:

    50 digit level: 3155 curves
    55 digit level: 17899 curves

    If you tell me what system you use (P4, AthlonXP, Athlon64, ...), I can send you an optimized Windows binary.

    edit:
    Ah, I see you've found it yourself.
    I'll write an explanation now.
    Last edited by Mystwalker; 05-02-2005 at 04:53 PM.

  9. #49
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    First, I'll do some preliminaries:

    1. Make sure you've got gmp-ecm6. Version 5 is definitely slower, especially at these high bounds.

    2. Make sure you've got a binary that's optimized for the system you're running it on. An Athlon64-optimized version on a A64 is maybe 50% faster than a P4-optimized one on that A64.

    3. For the 60 digit level, I'd suggest that you have at least 1 GB RAM. You can do it with less, but it will take ages. 2 GB are better, more RAM is optimal, but not necessary.

    4. I'd say that in order to familiarize oneself with the program, it's best to try smaller factorizations first - so less time is wasted when something goes wrong. OTOH, gmp-ecm is quite easy to handle, and it's hard to make errors that go unnoticed.

    Having said that, if you want to use gmp-ecm for both stages (1 and 2), I'd suggest putting the number to factor (517693523749349506216820125753852827887841227092964547799532292131198709993702048088168912046169305517623230936748285019817469002581460400322168936455700478369002740323880863059299067393413818235615141804531075491540754845960240845920074337097867783823725951696351550002765728018732362221078040255922177) into a text file called SB.txt, in a folder with ecm.exe (or whatever yours is called).
    Then, simply call:

    Code:
    ecm -c 100 -n -one 11e7 < SB.txt >> SBresult.txt
    This should do 100 curves with B1=11e7=110,000,000 on the number; the output will go into SBresult.txt.
    The ">>" means that it will append to the file rather than overwrite it.

    "-c 100" are the 100 curves
    "-n" lets it run at lower priority
    "-one" tells the program to stop once a factor is found. This has the nice effect that as long as the program is running, you know no factor has been found so far.
    "11e7" is the B1 limit for 55 digits

    You can get all parameters with "ecm --help".


    Some things to consider:

    1. When you stop the program, the currently processed curve is lost.

    2. You'll need more than 1 GB RAM to do such a curve without running out of memory. You can reduce memory needs with two parameters:

    "-k <n>" where <n> is a number > 2. I don't remember the exact numbers, but IIRC, when you multiply the value by 4 (default value is 2), you use half the memory. Of course, curves take longer this way. You have to figure out what k value is still okay for your computer. With 1 GB RAM, I use "-k 4", but only in conjunction with the other parameter:

    "-treefile <treefile name>" where <treefile name> is an arbitrary name. With this parameter, gmp-ecm puts some files into the directory to lower memory needs. Although the hard disk naturally is some orders of magnitude slower than system memory, the performance impact is really low, as these files are only seldomly needed. In fact, I think accesses only occur at the end of a curve (and they get created at the beginning).

    3. You can optimize performance by experimenting with the B2 value. By default, it's 680270182898 for B1=11e7. Just put other values as a parameter behind the B1 value (e.g. "ecm [...] 11e7 560e9[...]"). With the parameter "-v", you'll get an output at the beginning of stage2, telling you how many curves are needed. All you have to do is find the B2 value where "needed curves * time for one curve" is minimal...
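
That tuning rule can be sketched numerically. The candidate B2 values, curve counts, and per-curve timings below are invented placeholders; real numbers come from your own timing runs and the curve count that "ecm -v" prints at the start of stage 2:

```python
# Pick the B2 that minimizes (needed curves) * (time per curve).
# All numbers here are hypothetical placeholders, not measured values.
candidates = [
    # (B2, curves_needed_from_ecm_v, seconds_per_curve_measured)
    (340e9, 19000, 5400),
    (560e9, 17899, 6100),
    (680e9, 17000, 6600),
]
best_b2, curves, secs = min(candidates, key=lambda c: c[1] * c[2])
print("best B2:", best_b2, "- total CPU-seconds:", curves * secs)
```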

  10. #50
    OK, I have gotten it to run, but it isn't displaying any sort of status update. I have an Athlon 2000. How long should a curve take? Is there any way to make it display updates, or anything? I also only have 256 MB. Is my machine any good for factoring? Is it even worth doing?

  11. #51
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    It's been a while since I worked on this number but I have been thinking about it lately, here is the question...

    24737*2^991+1 is basically 303 digits long, I know I finished some of the curves off for a 40-digit factor, Joe and mystwalker did everything for a 45-digit factor.

    So this implies that all factors of this number are greater than 45 digits! So it's only possible to have 6x50-digit factors at best ... 3x100-digit factors ... 2x151-digit factors ... or some mix such as 150-digit x 80-digit x 70-digit, etc...

    --------------------------------------

    I've been thinking about the way P-1 works and the previously undocumented B2-B1 method...

    If one uses very, very large B1 and B2 bounds, the factor doesn't have to be very smooth at all. If there were a 45+ digit factor, for example, and we could run B1=B2=23 digits, we would find that factor for sure.

    First question....

    What is the largest B1 possible, which program would we use, and how much memory would be required?

    (I think the max I was able to use was B1=B2=9007199254740995 (16 digits) from my bats, but I'm not sure if it worked.)


    Start with the largest B1 bound possible, i.e. B1=B2 - what is the largest possible B1 for stage 1?

    If the largest B1 were around 16 digits (perhaps the new ecm6 can do P-1 with a larger B1), we would essentially re-eliminate all 31-digit factors.

    (Factor - 1) would have to be less smooth than B1 (16 digits) x B1 (16 digits) = 31 digits.

    Factor - 1 > 16 digits x 16 digits

    Next

    Then we could distribute the stage 1 save files to people interested and try various B2-B1 ranges, going to ever-increasing B2's.

    For example: B1 = 16 digits (done in stage 1), B2 = X2-X1; the next B2 range would be X3-X2,

    where X1, X2, X3 would be distributed among ourselves.


    Perhaps this is the way NFS works, etc. I'm just trying to think of other possibilities besides doing more ECM. Also, I'm not sure how much P+1 was done on this number, nor how successful it would be.

    I assume factors can also be P+1 smooth? And a factor could be more P+1 smooth than P-1 smooth...

    Thanks, just trying to learn something.

    Also, FYI, I'd be doing this on a Barton 2200 MHz with 1G of dual channel; Fry's also has 2G of dual channel for ~$220, which looks very tempting right now.

  12. #52
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    I know this is a silly question, but to be honest I have not looked into the logic very deeply. That being said, here's my question.

    With ecm, if you run the calculated number of curves for a digit level, does it guarantee that there is definitely no factor for that number with digits less than or equal to that digit level?

    From Mystwalker's post, for example (and assuming the calculations are correct - not that I doubt them, btw): does running 3155 curves at the 50 digit level guarantee that there are definitely no factors with fewer than 50 digits? Or the same for the 55 digit level with 17899 curves.

    Or is there still a chance (although slight) that there might be a factor with fewer digits?

    And, another way of looking at it: let's assume it's theoretically and practically possible to continue the search up to the 155 digit level. Then could one say that, with the calculated number of curves, ECM would definitely find "the factor"?

    And one last way of looking at it: if it were to find no factors up to 155 digits, would it be possible to conclude that 24737*2^991+1 is, in fact, prime?


    PS: I'm not suggesting anything. Just trying to understand the logic.

  13. #53
    This is my understanding of ECM.

    ECM is a probabilistic algorithm.

    The standard listed number of curves, when run, has a 37% chance of missing a factor of the given size (it's 1/e, where e=2.71828183...).

    (http://www.loria.fr/~zimmerma/records/ecm/params.html)

    I also had thought (but may be mistaken) that it only works if EXACTLY 1 factor is within B1 and B2. E.g., factors 123001 and 124001 would only be found if B2 is between 123001 and 124001,
    i.e. B1=100000, B2=200000 would fail;
    B1=100000, B2=123500 would succeed.

    But I couldn't find a reference that supports this claim.
    Edit: found it: http://www.mersenneforum.org/showthread.php?t=194
    This page goes on to say that ECM (and P-1) find factors that are (B1,B2)-smooth (1 factor between B1 and B2).


    --
    Shoe Lace
    Last edited by ShoeLace; 05-03-2005 at 11:45 PM.

  14. #54
    With ECM, there is always a chance that a factor has been missed. "3155 curves" really means that if there is a factor of that size, then each curve has a probability of 1/3155 of finding it. The probability of not finding it is (3154/3155)^3155. It's a well-known limit that ((n-1)/n)^n approaches 1/e. Hence the surprisingly large probability of about 37% that the factor has been missed. However, it has a much higher probability of turning up with the next set of parameters, so in practice the missed factors tend to show up quickly at the next level.
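
That limit is easy to check numerically (a quick sketch, not part of the original post):

```python
import math

# Miss probability after 3155 curves, each with success chance 1/3155:
miss = (3154 / 3155) ** 3155
print(miss, math.exp(-1))   # both are about 0.3679
```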

    With ECM, you find the factor if a number near the factor is smooth enough - which number depends on which "sigma" is used for the starting point. So multiple factors can come out at one time, but it's more common for them to come out in different curves. With P-1, you will always get the product of all the factors that are sufficiently smooth. P+1 has an additional twist that depends (50/50, I think) on the starting point, so you might get none, one, or a product of sufficiently smooth factors in any one trial.
    Poohbah of the search for Odd Perfect Numbers
    http://OddPerfect.org

  15. #55
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Perhaps I was a little too affirmative in my post regarding the digit level.

    As Shoelace pointed out and nuri alluded to.

    Edit: Realised Wblipp replied while I was responding; his explanation is obviously superior.


    Running ECM to the 50-digit level and finishing all curves for the 45-digit level doesn't mean with 100% certainty that a 45-digit factor does not exist.

    As Shoelace said, there is only a 63% possibility that no factor <45 digits exists.
    What these B1, B2 bounds and the number of curves run at a particular level represent is the point at which one should switch to higher bounds. In other words, searching with bounds for 45-digit factors becomes less optimal than switching to 50-digit bounds.

    The best answer on this topic I ever received was: "there are good reasons for doing all the curves required for lesser-digit factors first, then continuing with higher digit bounds" (ECM is probabilistic).

    However, when you do curves in the order 20, 25, 30, 35, 40, 45, 50 digits, then as you're doing curves for the 45-digit level, the probability of missed factors at the lower levels (30, 35, 40 digits) decreases.

    A good example is running the alpertron applet

    http://www.alpertron.com.ar/ECM.HTM

    Try a few curves for 99^100+100^99; as the curve count increases, the possibility of smaller 15-digit factors decreases. We can pretty much say that there is a near-zero possibility that a 25-digit factor exists at this time. Not sure if we can say this for 35 digits, and certainly not for 40.

    ------------- P-1 -----------------------------------------

    I also had thought (but may be mistaken) that it only works if EXACTLY 1 factor is within B1 and B2.
    I don't think that is correct, but I may be mistaken as well; I know something similar is true for P-1, however.

    Let's take this factor; this sort of goes with what I was talking about for P-1 with high B1 bounds.

    32176897563079 | 4847*2^29601063+1

    32176897563079 - 1 = 32176897563078 (this is for P-1 factoring)

    32176897563078 factors into 2 x 3^2 x 67 x 2063 x 12932951

    So if you wanted to find this factor using P-1 factoring (I used sieve, of course), the most efficient B1 and B2 bounds would have been:

    B1=2064
    B2=12932952

    Of course you could have used B1=B2=15000000, and that would have actually found the factor in stage 1, but it would have been a waste of time. (But how do you know the optimal bounds for finding the factor before you find it? It's impossible.)

    But if no factor had been found, what would that imply?

    If B1=B2=15000000 had not found any factors, we would have known for certain that no factor of 14 digits or fewer exists:

    factor > (B1 x B2) + 1
    factor > (15000000 x 15000000) + 1
    factor > 225000000000001 (15 digits)

    Of course a factor >15 digits could still be found if it's smooth, i.e. other numbers <B1 could be multiplied to form the P-1 - for example, the 2 x 3^2 x 67 portion.

    As in the above case, B1 x B2 = 2064 x 12932952 = 26693612928,
    but
    26693612928 < 32176897563079

    So the factor was found because it was somewhat smooth (it had the 2 x 3^2 x 67 portion; all these numbers were less than B1).

    ___________________

    Now for P+1 (I'm uncertain about this, but I think the principle is the same).

    Even using B1=B2=15000000 with P+1, we wouldn't have found this factor...

    32176897563079 + 1 = 32176897563080

    32176897563080 = 2^3 x 5 x 563 x 1428814279

    B1=B2=15000000

    1428814279 > B1=B2

    so it wouldn't have been found using P+1 with the same bounds.
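
The two worked examples above can be double-checked in a few lines. This is a sketch under the usual simplification (stage 1 covers primes up to B1, stage 2 adds at most one prime up to B2, prime-power details ignored):

```python
def factorize(n):
    """Plain trial division; fast enough for numbers around 3e13."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def stage12_finds(order, B1, B2):
    """True if 'order' is B1-smooth apart from at most one prime <= B2."""
    big = [q for q in factorize(order) if q > B1]
    return not big or (len(big) == 1 and big[0] <= B2)

p = 32176897563079                                   # the factor from the post
print(factorize(p - 1))                              # the 2 x 3^2 x 67 x 2063 x 12932951 split
print(stage12_finds(p - 1, 2064, 12932952))          # True: P-1 with the quoted bounds
print(stage12_finds(p + 1, 15_000_000, 15_000_000))  # False: P+1 misses, 1428814279 is too big
```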

    Should I continue?
    Last edited by vjs; 05-04-2005 at 11:38 AM.

  16. #56
    Originally posted by vjs
    Running ECM to the 50-digit level and finishing all curves for the 45-digit level doesn't mean with 100% certainty that a 45-digit factor does not exist.

    As Shoelace said, there is only a 63% possibility that no factor <45 digits exists.
    We say it this way, but it's sloppy. The 63% is the probability that we would have found such a factor IF IT EXISTS. However, it's not very likely that any such factor exists, so the probability that there is no such factor is higher.

    Based on this heuristic:

    http://elevensmooth.com/MathFAQ.html#HowMany

    the probability of any factor between 10^40 and 10^45 is about 12%, and we would have found about 2/3 of any that exist, so the probability that there remains an unfound 45-digit factor is only about 4%.
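
That estimate, spelled out (numbers taken from the paragraph above):

```python
p_exists = 0.12      # heuristic chance of a factor between 10^40 and 10^45
p_found = 2 / 3      # fraction of such factors the completed curves would catch
p_unfound = p_exists * (1 - p_found)
print(round(p_unfound, 2))   # 0.04
```
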
    Poohbah of the search for Odd Perfect Numbers
    http://OddPerfect.org

  17. #57
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Wblipp,

    Thanks for the explanation and for clearing up my mistakes. Considering we have tried up to the 45-digit level... is there any reason to try P+1 with very high bounds?

  18. #58
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Thanks for the replies.

  19. #59
    Originally posted by vjs
    considering we have tried up to the 45-digit level... Is there any reason to try P+1 with very high bounds?
    I don't know. People differ on this subject. I've set up the ECM server for OddPerfect.org so that it does one P+1 curve at each level and works three levels above the ECM level. I like working ahead because P+1 and P-1 are faster than ECM and have about the same chance of finding a factor. You need multiple curves for P+1 because, for each factor, half the start points won't find that factor. The traditional way to get this is to run three P+1 curves at each level, but I figure three curves at successively higher levels gets me similar coverage for the lowest level and a shot at the upper levels. It feels like the right tradeoff of effort for chance, but I don't have an analysis to back that up.
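
The "half the start points" argument, sketched (the 1/2-per-curve figure comes from the post above; the rest is plain arithmetic):

```python
# Chance that k independent P+1 curves all pick a bad start point for a factor:
for k in (1, 2, 3):
    print(k, "curve(s): miss probability", 0.5 ** k)
```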

    William
    Poohbah of the search for Odd Perfect Numbers
    http://OddPerfect.org

  20. #60
    Is there a way to make GMP-ECM report progress intermittently? Does it only report anything when it finishes a curve or finds a factor? How long will one curve with a B1 of 11e6 take on an Athlon 2000 at 1.67 GHz with 256 MB? (I only want to use 200 MB at most - what should I use as the command line for this?)

  21. #61
    OK, well, I was playing around and tried to have it run with a B1 value of 77e7, and it finished stage 1 but crashed because I didn't have enough memory for stage 2. Is there any way to recover stage 1? I used the treefile option, and because it crashed suddenly I still have the treefiles. Do these contain the data for stage 1? I could also provide the sigma value I used.

  22. #62
    I'd like to try some P-1 again for 991, and I was wondering if anyone has stage 1 work for it. Using gmp-ecm, I'd like to try stage 1 to perhaps 10^16. I don't know how long that would take, but I assume that stage 2 could then be broken up and distributed to computers with more memory than I have, and perhaps we could finally put this one to rest.

    This was mentioned before, but I don't know how much work was actually done on it.

  23. #63
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Keroberts,

    I'm not sure if B1=10^16 is possible as a P-1 B1 bound. I know some time back I used ecm5.0 for P-1 with a very high (I think maximum) B1 value, and it took quite a few days.

    I think you're using an AthlonXP with 256MB, if I remember?

    You might want to try ecm6.0 P-1 (the -pm1 flag) with an extremely large B1; the program will warn you that the max B1 value was exceeded - note that value.

    Then take that value, divide it by 1000, and run stage 1 with it. It will complete in a reasonable time. Multiply this time by 1000 and see if you're willing to dedicate that much time without a reboot.

    Also make sure to save the file.

    edit....

    O.K., I checked the max B1 with ecm6.0 for K7 processors.
    This would be your command line, if you first save the number as 991.txt and save stage 1 as stage1.sav:

    k7 -pm1 -save stage1.sav 9007199254740996 9007199254740996 <991.txt

    Also, P-1 is not 100% - the factor has to be smooth. By running this you would really only prove that no factor <16 digits exists for this number.
    Last edited by vjs; 06-06-2005 at 11:28 PM.

  24. #64
    But it can also find prime factors that are one more than a number with lots of 16-digit-or-less factors. There is almost no limit to the size of this potential factor.

  25. #65
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    AFAIK, it's much better to use Prime95 for numbers of the form k*2^n +/- 1 than gmp-ecm.

  26. #66
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    Originally posted by Keroberts1
    But it can also find prime factors that are one more than a number with lots of 16-digit-or-less factors. There is almost no limit to the size of this potential factor.
    True!
    But, based on the amount of ECM that Mystwalker and I have done so far, there is only a 27.738% chance that a factor of 45 digits or less exists. I think that you also have done some ECM work on this number. Please post your B1 and B2 values and the number of curves for those values so that I can include them in my tables.

    Some time ago I ran P-1 on this number to some very high B1 B2 values, and have saved the file.
    I'll email it to your AOL account in the next day or so.
    Joe O

  27. #67
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    O.K., I must be doing something wrong - is it possible that stage 1 at the max B1 value for this number would take 1500 years????

  28. #68
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    I haven't tested it, but it will definitely take a lot of time. 1500 years could be correct - provided there was enough memory in your PC (probably a terabyte or two?)...

    You could give 430,000,000 a try. That would be an optimal choice for pre-50-digit-ECM testing...

    In addition, you could run 3 P+1 curves at B1=215,000,000.

    After that, ECM should take over again. I don't know the exact cause, but the larger the factor, the less likely it is to be found with P-1 compared to ECM...

    IIRC, everything over 40-50 digits shouldn't be tried much with P-1.

  29. #69
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Mystwalker, this is correct of course, but is there any harm in doing...

    k7 -pp1 -v -save 991pp1.sav 10000000000 10000000000 <991num.txt

    B1=B2=10,000,000,000

    It would be quite a bit better than 430M; also, 3 curves at B1=B2=430M would only take a few hours.

    I just did a B1=B2=100,000,000; it took 950 seconds on a 2200 MHz Barton and did not consume much memory at all.

    If extrapolation holds true, it would take 100x longer, or a little more than 1 day per curve, for B1=B2=10,000,000,000 - if you don't run out of memory.

    Also, it would make some sense to have B2=X*B1 where X is small, considering he only has 256MB.

    Also, I'm not sure - can you actually continue a stage 2 from a B1=B2 P+1 run???

    I've always been interested in P+1 but don't truly understand it. It doesn't make sense to me that you have to run it three times... I'll probably look into the math involved when I get a chance.

    Note:

    when you say B1=???, are you also assuming that we run B2=100*B1?

  30. #70
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Also for those interested here are the P+1 records...

    Code:
    digits  factor                                             number       who            date            B1    B2
    48      884764954216571039925598516362554326397028807829   L(1849)      A. Kruppa      29.03.2003 (*)  10^8  10^10
    47      79382035150980920346405340690307261392830949801    10100+15     Martin         11.08.2004      10^8  10^11
    45      173357946863134423299822098041421951472072119      13*2738-1    P. Zimmermann  24.02.2005      10^9  10^13
    42      514102379852404115560097604967948090456409         86124+1      P. Zimmermann  23.02.2005      10^6  10^11
    42      264237347008596079071617575175788166361473         13*2973+1    P. Zimmermann  19.02.2005      10^8  10^10
    39      134368561962115712052394154476370507609            162*11162+1  P. Leyland     2002 (*)        10^7  10^9
    38      36542278409946587188439197532609203387             L(1994)      A. Kruppa      30.03.2003      10^8  10^9
    38      14783171388883747638481280920502006539             10917+17109  N. Daminelli   25.03.2003      10^7  10^9
    37      9535226150337134522266549694936148673              10596+1      D. Miller      16.04.2003      10^7  10^8
    37      4190453151940208656715582382315221647              45123+1      P. Montgomery  1994 (*)        10^7  10^9
    So looking at the records, a B1=B2=10^10 might give a 40-50 digit factor for this number...

    Regardless I think these are the type of B1=B2 values we need to consider...

    After I've completed a bit more low-p 991<n<50M sieve I might try a curve or two.

    Keroberts, are you interested in trying stage 1 with B1=B2=10^10? I wouldn't do anything less. If you do, and you can save/continue with a stage 2, I might run stage 2 up from yours, or try B1=B2=10^11 if possible.

    FYI the records:

    884764954216571039925598516362554326397028807830
    = 2 x 5 x 19 x 2141 x 30983 x 32443 x 35963 x 117833 x 3063121 x 80105797 (8 digits) x 2080952771 (10 digits)

    So his bounds were enough, but close!

    79382035150980920346405340690307261392830949802 =
    2 x 11 x 409 x 701 x 1063 x 2971 x 3181 x 347747 x 9056417 x 12073627 x 32945877451
    Last edited by vjs; 06-07-2005 at 04:11 PM.

  31. #71
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    O.K. one more post... I did one curve with

    B1=100000000
    B2=100000000000

    Stage 2 took about 480MB of memory...

    I'm starting to think that B1=10^10 and B2=10^11 is possible; I can at least temporarily upgrade one of my machines to 3.5G.

    If someone wants to run three B1=10^10 curves, it would take less than a week. If we don't find a P+1 factor with those bounds, I'm pretty sure I could do the stage 2 with 10^11 or greater next week.

    Also, if a factor is found, ecm6.0 will simply print it to the screen, correct?
    Last edited by vjs; 06-07-2005 at 05:37 PM.

  32. #72
    I'd be willing to dedicate my Athlon 2000 to P-1 for a month or so if it could get to a depth of, say, 10^11. Then the B2 search can be broken up into separate parts, right? Several people could search, say, 10^11 to 10^12, 10^12 to 2*10^12, and so on. Is this correct? Can this be done? Perhaps we could then push B2 up to 10^14 or so and have a very good chance of finding a factor. (Hopefully; ideally we would have a much larger B1 value with a B2 that big, but if B2 can be distributed, why not push it farther?)
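
That splitting idea can be sketched like this. It assumes gmp-ecm's "B2min-B2max" range syntax and its -resume option, and a hypothetical save file name pm1_stage1.sav; check "ecm --help" on your build before relying on either:

```python
# Print one stage-2 command per volunteer, each covering a slice of the B2 range.
B1 = 10**11
edges = [10**11, 10**12, 2 * 10**12, 3 * 10**12]
for lo, hi in zip(edges, edges[1:]):
    print(f"ecm -pm1 -resume pm1_stage1.sav {B1} {lo}-{hi}")
```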

  33. #73
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Keroberts,

    Are you interested in P-1 or P+1? IMHO we haven't done enough P+1. I know I did a very serious P-1 on this number some time back; Joe may have the file with the B1 and B2 bounds.

    If you want to do P-1, I could rerun the calculations and suggest a B1 value along with a time estimate.

    Both P-1 and P+1 can be continued from a stage 1 B1=B2 run with any B2 value. Let me know - personally, I'm more interested in P+1 now, especially since I can't remember the P-1 values I used.

    Let me know either way. Whichever one is done, I'd suggest a very, very large B1 value. If we overshoot - great, at least it's been factored.

  34. #74
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    keroberts and others,

    I believe this is the most optimized version of P+1 or ECM currently...

    Xyxxz's post

    http://www.mersenneforum.org/showthr...?t=3766&page=3

    I always rename the ecm6-K7.exe version to K7.exe

    To run the program, create a bat file with the command in it, as opposed to typing the command at the DOS prompt. Also, only start the bat from the command line; otherwise, when it finishes, the window will close and you won't see the factor. I haven't figured out how to save the factor to a file; I believe you use the redirection >fact.txt.

    Here is an example of the bat I use:

    k7 -pp1 -v -save stage1.sav B1 B2 <number.txt

    k7 - this is the name of the program

    followed by the switches...

    -v is verbose mode

    -pp1 is for doing P+1 factoring
    -pm1 is for P-1 factoring.
    if you don't include either of the above switches it will run ecm.

    B1 and B2 are the bounds of course

    <number.txt - this tells the program to use the number contained in the file number.txt (you can name it what you wish).

    I always expand the number using dario's applet

    http://www.alpertron.com.ar/ECM.HTM

    Enter 24737*2^991+1 into the applet; it produces the number.

    Cut and paste this number into a .txt file with Notepad, remove the spaces and hard returns so that it's all on one line, and save.

    When you run K7, it should show a 303-digit number:

    517...922177

    example of bat again.

    k7 -pp1 -v -save 991pp1.sav 10000000000 10000000000 <991num.txt
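
Dario's applet works fine, but the number file can also be generated and sanity-checked directly (a sketch; 991num.txt matches the file name used in the bat above):

```python
# Expand 24737*2^991+1 and write it on a single line, as the program expects.
N = 24737 * 2**991 + 1
digits = str(N)
print(len(digits), digits[:3], digits[-6:])   # 303 517 922177
with open("991num.txt", "w") as f:
    f.write(digits + "\n")
```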

  35. #75
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    I ran three P+1 curves at

    B1=10000000
    B2=1000000000

    since it didn't take too long I then ran three P+1 curves at

    B1=100000000
    B2=10000000000

    The difference in time from increasing the bounds by 10x: stage 1 took 10x longer, stage 2 4x longer.

    1 curve of stage 1 with B1=B2=10000000000 would take about 2.5 days on my 2.2G Barton system. And you can continue stage 2 from the stage 1 save file, but note that each stage 1 started with a different x0=.

  36. #76
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    O.K., Joe just sent me the B1 and B2 values either he or I used in the past for P-1:

    B1=64100675
    B2=6205033714
    The machine I did it on before was a dual P3-866 with 1G of RAM, using ecm-5.03-P3.

    However, I just ran another P-1 with a slightly larger B1 bound and a huge B2 on a dual Barton with 1G of RAM:

    B1=100000000 (10^8)
    B2=1000000000000 (10^12)

    It didn't take long to do the B1; I was trying to see how large a B2 I could use. I could potentially use a larger B2, but it would be more beneficial for someone else to do stage 1. The above run consumed a max of 1.8G of memory during stage 2, but stage 1 didn't take much at all.

    I have a fast 15K SCSI drive on an Adaptec card, so swapping a little doesn't hurt as much as one would think. I may also be able to bump the machine up to 2G of memory for stage 2.

    If there are any takers on stage 1 P-1 or stage 1 P+1, I'd suggest a B1 value of
    B1=10000000000 (10^10) minimum.

    If we are going to try a distributed approach to P-1, I'd even like to see a higher B1; it's sort of the base for the effort. Any takers?
    Last edited by vjs; 06-08-2005 at 11:16 PM.

  37. #77
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    O.K.

    P-1 done to B1=4.2G, no stage 2.

    B1=2^32-1 is the limit of Prime95 24.12.

  38. #78
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    1057517737707584916346232353319 | 55459*2^1966+1

    I'm currently factoring all composites with 1000 < n < 2000 to 35 digits using ECM. Let's see what else stumbles.

  39. #79
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    10223*2^1529+1 has a factor: 14826855978213563035553602013

    2 down, 10 to go (of which 3 resisted the 35 digit level).

    What's really special:
    Prime95 found this one in stage1 (B1=1,000,000)! :shocked:

  40. #80
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    250264328954609482327059485767739 | 24737*2^1543

    That concludes the 35 digit range:

    Code:
    		25	30	35	40			
    
    21181,1148	ok	ok	ok	reserved
    21181,1172	ok	ok	ok
    10223,1181	ok	ok	ok
    21181,1268	ok	ok	ok
    10223,1517	ok	ok	ok
    24737,1567	ok	ok	ok
    55459,1666	ok	ok	ok
    55459,1894	ok	ok	ok	reserved
    4 have fallen, 8 to go...
    I will ease off my effort on this, though. If someone wants to factor some of these candidates, please drop a line.
    Last edited by Mystwalker; 09-05-2005 at 11:23 AM.

