Page 6 of 10
Results 201 to 240 of 386

Thread: P-1 factorer

  1. #201
    Senior Member
    Join Date
    Dec 2002
    Location
    Madrid, Spain
    Posts
    132
    Two questions that have been discussed, but still are unclear for me:

    - What is currently the best factor value? 1.5? I'm using 2.0, which takes almost twice as much computing time as 1.5.

    - I have tried several mem values over 256 MB (300, 310, 330 MB), but it always allocates ~252000 kB for stage 2. I think this is because memory is taken in some kind of "chunks". Couldn't this be improved?

    Thanks!

  2. #202
    Member
    Join Date
    Feb 2003
    Location
    Lucerne, Switzerland
    Posts
    30
    A factor value of 1.5 seems quite good, even if I don't have much statistical material to analyse.

    At the moment it seems as if primality testing is catching up with P-1 factoring. There is a huge gap of not-even-reserved candidates above 4188000, which mainstream testing will reach in a few days.
    Would it not be better to decrease the factor value a bit to prevent leaving un-(P-1)ed candidates?

    biwema

  3. #203
    Senior Member
    Join Date
    Feb 2003
    Location
    Sweden
    Posts
    158
    Yeah, it's probably better to lower the factor value a bit for now. We need more p-1 firepower! Desperately.

  4. #204
    Yes, you will likely need to add more power to P-1. I had a lot of P4s working on it for a while, but I won't be able to use them for about a month, so someone else will have to replace the 50 GHz+ I was putting on it.

    Everyone will have to drop to a factor level of 1 or add about 20 more computers at 1.5 to keep up.

    Also, if you want to see where you should be: the next test the server is going to distribute is always in http://www.seventeenorbust.com/sieve/next.txt It is only updated a few times a day and can sometimes get (temporarily) smaller due to expiring tests, but it will give you a ballpark idea of where tests are being assigned, so you know where to be to stay ahead of prp testing.

    -Louie

  5. #205
    I'm trying to figure out how to use this application; I can't get it to work. I downloaded it and installed it into the same folder as SB 1.10, but after clicking on it, it just opens for a second and then immediately closes. Is there something simple I'm forgetting to do? I'd like to help out here, because obviously you need more processors.

  6. #206
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    Originally posted by Keroberts1
    I'm trying to figure out how to use this application; I can't get it to work. I downloaded it and installed it into the same folder as SB 1.10, but after clicking on it, it just opens for a second and then immediately closes. Is there something simple I'm forgetting to do? I'd like to help out here, because obviously you need more processors.
    If you create a bat file with sbfactor.exe help as the only line and execute it, you will get something similar to:
    Code:
    C:\#GMP-ECM50\sbfactor10>sbfactor11.exe help
    SBFactor v1.1
    P-1 and ECM factoring for number of the form k*2^n+1.
    Adapted from GIMPS v23.4 by George Woltman and Louis Helm
    Intel Pentium III or Pentium III Xeon processor detected.
    Factors are printed and dumped to fact.txt
    P-1 factoring a single number with set bounds:
    C:\#GMP-E~1\SBFACT~3\SBFACT~2.EXE <k> <n> <B1> <B2> <mem>
    P-1 factoring a single number with optimal bounds:
    C:\#GMP-E~1\SBFACT~3\SBFACT~2.EXE <k> <n> <factor depth> <factor value> <mem>
    P-1 factoring a range of numbers with optimal bounds:
    C:\#GMP-E~1\SBFACT~3\SBFACT~2.EXE <n low> <n high> <factor depth> <factor value>
     [[cpu #] [total cpus]] <mem>
    ECM factoring a single number with set bounds and # of curves:
    C:\#GMP-E~1\SBFACT~3\SBFACT~2.EXE <k> <n> <B1> <B2> <curves to run> <mem>
    
    <factor depth>: how much the number is factored expressed as a power of 2
                    45 means the number has been factored to 2^45 = 35 trillion
    <factor value>: how many prp tests a factor would be worth
                    values of 1.2 to 1.5 are recommended
    presuming you are running in Windows. If you want to reserve a range, go to the coordination thread and reserve a small range, e.g. 4191000-4191100, then try run.bat 4191000 4191100 to see what happens

    You should get:
    Code:
    C:\#GMP-ECM50\sbfactor10>sbfactor11.exe 4191000 4191100 45 1.3 256
    SBFactor v1.1
    P-1 and ECM factoring for number of the form k*2^n+1.
    Adapted from GIMPS v23.4 by George Woltman and Louis Helm
    Intel Pentium III or Pentium III Xeon processor detected.
    256MB of memory avilable for stage 2
    Finished parsing SoB.dat
    4 numbers between 4191000 =< n < 4191100
    Searching for known factors in results.txt...Done.
    Searching for known factors in lowresults.txt...Done.
    Removed 0 numbers using the factor files
    Testing 4 numbers between 4191000 =< n < 4191100
    Reordering array by n value...Done.
    Estimating for k=27653  n=4191009
    Estimating for k=10223  n=4191017
    Estimating for k=19249  n=4191026
    Estimating for k=21181  n=4191092
    Expected number of factors for entire range: 0.042070
    B1=20000 B2=190000 Success=0.010517 Squarings=46121
    P-1 on 27653*2^4191009+1 with B1=20000, B2=190000
    initializing test
    sieve finished<<this really means that it started
                          there will be status lines after this
                          when it's really done it will tell you how long it took
                          then it will go on to the next k n pair
    27653*2^4191009+1 stage 1 is 0.867 complete.
    27653*2^4191009+1 stage 1 is 1.735 complete.
    27653*2^4191009+1 stage 1 is 2.602 complete.
    27653*2^4191009+1 stage 1 is 3.470 complete.
    27653*2^4191009+1 stage 1 is 4.337 complete.
    then edit your reservation to read:
    4191000 4191100 keroberts 4 0.042070 ? [reserved]
    when you are done running the range post a line with the ? changed to the number of factors you have found and the [reserved] changed to [completed]

    There will be a file 276534191009 created after 10 minutes and updated every 10 minutes after that. This will be used to restart from if you stop the run. It can also be used to rerun with other parameters i.e. change the 1.3 to 1.5 or even run an individual pair with B1 B2 of your choosing. If you do not plan to do any more runs with this K N pair, then you may delete this file.

    If you find a factor, it will be written to the end of fact.txt (i.e. appended, not replaced). Submit it at the usual location.
    If the factor is greater than 2^64 (~18,446,744,073,709,551,616), then submit it at the large sieve page.
    If the factor is greater than 2^128 (~340,282,366,920,938,463,463,374,607,431,768,211,456), then submit it to Louie via email.
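    The size thresholds above can be expressed as a quick check. A minimal sketch in Python (the function name submission_route is mine, for illustration only):

    ```python
    def submission_route(factor: int) -> str:
        """Pick where to submit a found factor, per the thresholds above."""
        if factor > 2**128:            # beyond even the large sieve page
            return "email Louie"
        if factor > 2**64:             # too large for the standard form
            return "large sieve page"
        return "standard submission page"

    # A typical P-1 factor is well below 2^64:
    print(submission_route(38372614033613527))   # → standard submission page
    ```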

    Periodically, you will want to refresh your results.txt file. The latest version can be found here in BZIP2 format.

    If you want to know what the next K N pair to be handed out is, look here.

    Thanks to Mikael and JMBlazek for the additional ideas.
    Last edited by Joe O; 07-25-2003 at 01:04 AM.
    Joe O

  7. #207
    Junior Member
    Join Date
    Jun 2003
    Location
    Poughkeepsie, NY
    Posts
    19
    Originally posted by Joe O
    ...then edit your reservation to read:
    4191000 4191100 keroberts 4 0.042070 ? [reserved]
    when you are done running the range post a line with the ? changed to the number of factors you have found and the [reserved] changed to [completed]

    There will be a file 276534191009 created after 10 minutes and updated every 10 minutes after that. This will be used to restart from if you stop the run. It can also be used to rerun with other parameters i.e. change the 1.3 to 1.5 or even run an individual pair with B1 B2 of your choosing. [/B]
    Thanks Joe O for the additional explanation... I needed it as well. I have a few additional questions:

    - I have 20+ ########## files and more are added every couple of hours. After a full run is complete, is it ok to delete them?
    - will fact.txt be overwritten or amended between runs?
    - If prp testing catches up to my range, should I continue to process or stop and select a higher range?
    - I'm currently processing at a 1.5 factor and 500 megs on an athlon 1800. If I increase the megs, will it go faster or should I just lower the factor?
    - Do I need to be updating the results.txt, lowresults.txt, or SoB.dat files? If so, where do I get newer files?

    Thanks for your help!

    BTW... my ranges should read:

    4182500 4183000 jmblazek 15 0.187007 1 [completed]
    4186000 4188000 jmblazek 66 0.822831 ? [reserved]

  8. #208
    Senior Member
    Join Date
    Feb 2003
    Location
    Sweden
    Posts
    158
    jmblazek,
    some answers:
    >- I have 20+ ########## files and more are added every couple of hours. After a full run is complete, is it ok to delete them?

    Yes, unless you're planning on doing more testing of the same numbers with higher bounds. You probably don't want to do that, so the answer is yes.

    - will fact.txt be overwritten or amended between runs?

    new factors are added to the end of fact.txt

    - If prp testing catches up to my range, should I continue to process or stop and select a higher range?

    If the prp test limit exceeds the n value you're testing, it's probably better to skip some n until you're above the limit again. Best is probably to plan a bit ahead and lower the factor value so you can stay ahead. Also keep in mind that it's the time of your factor submission that counts: if you submit factors only after a full range is completed, and the current prp limit at that time is in the middle of your n range, then all factors below the limit will be worth significantly less.

    - I'm currently processing at a 1.5 factor and 500 megs on an athlon 1800. If I increase the megs, will it go faster or should I just lower the factor?

    It will only be very marginally faster if you give it more mem than that, and only in some cases (generally the bounds seem to get set higher when sbfactor is given more memory). Lower the factor instead, or do both if you've got memory to spare.

    - Do I need to be updating the results.txt, lowresults.txt, or SoB.dat files? If so, where do I get newer files?

    Only results.txt needs updating every now and then, to skip candidates eliminated via sieving or some other method. You can find it at http://www.seventeenorbust.com/sieve/results.txt.bz2
    It's bzip2 compressed.

    Just my thoughts,
    Mikael

  9. #209
    Junior Member
    Join Date
    Jun 2003
    Location
    Poughkeepsie, NY
    Posts
    19
    ...
    Just my thoughts,
    Mikael
    Thanks Mikael!

    Does anyone know if the p-1 coordination thread will be cleaned up? Also, it'd be nice to add these links to the thread...for quick reference:

    http://www.seventeenorbust.com/sieve/next.txt
    http://www.seventeenorbust.com/sieve/results.txt.bz2

  10. #210
    Member
    Join Date
    Feb 2003
    Location
    Lucerne, Switzerland
    Posts
    30
    by the way,

    be careful that you don't reserve a range which is too near the prp effort.
    If your factors leave the active range before you submit them, you will only score ~0.4 instead of 2100.

    Or am I mistaken?

    Another solution to this problem is to increase the size of the active window by 100000 or so below the prp border.

    biwema

  11. #211
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    Or am I mistaken?

    Another solution to this problem is to increase the size of the active window by 100000 or so below the prp border.
    You are not mistaken.

    I don't really want to extend the window down, because anything that is submitted below the window has not saved a PRP test, and that's where the scoring is now biased.

    Mike.

  12. #212
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    It's true that these factors will score 0.4 instead of 2100. But, they will eventually score 1260 (=2100*0.6) when (if?) double check PRP reaches that number. So, you still hold your stock options. Of course, the score will not change to 1260 if we find a prime for that k before double check PRP reaches that point.

  13. #213
    Member
    Join Date
    Feb 2003
    Location
    Lucerne, Switzerland
    Posts
    30
    When doublechecking reaches that point, first-time checking will probably be beyond 10M, scoring 12500 per factor. In that case those 1260 points are not that important anymore, compared to the factors at 10M.

    Actually, I can understand Mike not lowering the first-time range, in order to motivate people to factor exponents beyond the borderline.

    biwema

  14. #214
    Senior Member Frodo42's Avatar
    Join Date
    Nov 2002
    Location
    Jutland, Denmark
    Posts
    299
    From the coordination thread
    The best solution would be to "merge" P-1 factoring into the main (PRP) client.
    Well, stage 1 could perhaps be merged, but stage 2 is memory demanding, and therefore I don't think that would be a good idea.
    I myself check that candidates have not been P-1 factored before I let the client run them, but implementing stage 2 into the client would destroy one of its main features: that it only uses "free" resources.

  15. #215
    Well, stage 1 could perhaps be merged, but stage 2 is memory demanding, and therefore I don't think that would be a good idea.
    This could be done the same way as in the GIMPS client. Let the user specify the amount of memory stage 2 can use. If there's not enough memory to run stage 2, just skip it and continue testing the number. The more memory is available, the higher the second bound can be.

    It would be even better if daytime and nighttime settings were also implemented.

  16. #216
    the P-1 will eventually be in the client, but it won't be in the short term (aka this year) for a number of reasons.

    First off, the P-1 code is still in flux. I'd like the time estimation code to be better. I'd like to see people test the development version I posted (with newer gw-code) that no one even tried.

    Second, I'd like to wait until there is actually a sizable amount of work for factoring. Right now, the need for factoring is so small that I don't see why it can't be done outside the confines of the normal prp client.

    Third, it's ideal to have P-1 done only by those willing and able to use 256MB+ of mem. I realize that skipping stage 2 is OK when it has to be done, but stage 2 is where the majority of factors are found. If someone wants to go through the results.txt file and analyze the factors likely found with P-1, that would be interesting. But I know from when I was running a cluster of 20 P4 2.26GHz machines on it for a week: I only found 1 factor out of 10 with B1 factoring alone.

    As another example of how important B2 is, look what happens if you tell the client you only have 24MB of mem for B2.

    C:\sbfactor>sbfactor 4250000 4250100 45 1.0 24
    SBFactor v1.1
    P-1 and ECM factoring for number of the form k*2^n+1.
    Adapted from GIMPS v23.4 by George Woltman and Louis Helm
    Intel(R) Pentium(R) III processor detected.
    24MB of memory avilable for stage 2
    Finished parsing SoB.dat
    5 numbers between 4250000 =< n < 4250100
    Searching for known factors in results.txt...Done.
    Searching for known factors in lowresults.txt...Done.
    Removed 0 numbers using the factor files
    Testing 5 numbers between 4250000 =< n < 4250100
    Reordering array by n value...Done.
    Estimating for k=24737 n=4250023
    Estimating for k=27653 n=4250049
    Estimating for k=10223 n=4250057
    Estimating for k=21181 n=4250060
    Estimating for k=19249 n=4250066
    Expected number of factors for entire range: 0.000000
    B1=0 B2=0 Success=0.000000 Squarings=0
    P-1 factoring doesn't make sense for this input.
    B1=0 B2=0 Success=0.000000 Squarings=0
    P-1 factoring doesn't make sense for this input.
    B1=0 B2=0 Success=0.000000 Squarings=0
    P-1 factoring doesn't make sense for this input.
    B1=0 B2=0 Success=0.000000 Squarings=0
    P-1 factoring doesn't make sense for this input.
    B1=0 B2=0 Success=0.000000 Squarings=0
    P-1 factoring doesn't make sense for this input.

    The code determines that without sufficient memory, it's not even worth testing. That's how close we are to the boundary right now. Even the program itself recognizes that without B2, it's more efficient to just prp the number.
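    To see why stage 2 matters so much: P-1 finds a prime p dividing N when p-1 is B1-smooth (stage 1), and stage 2 cheaply extends that to p-1 = (B1-smooth part) × (one extra prime up to B2). This is a toy sketch of the idea I'm adding for illustration, not SBFactor's actual implementation:

    ```python
    from math import gcd

    def small_primes(limit):
        """Trial-division prime list (fine for toy bounds)."""
        return [n for n in range(2, limit + 1)
                if all(n % d for d in range(2, int(n**0.5) + 1))]

    def p_minus_1(N, B1, B2, a=2):
        """Toy P-1: stage 1 with bound B1, naive stage 2 up to B2."""
        # Stage 1: raise a to every prime power <= B1.
        x = a
        for p in small_primes(B1):
            pe = p
            while pe * p <= B1:
                pe *= p
            x = pow(x, pe, N)
        g = gcd(x - 1, N)
        if 1 < g < N:
            return g, "stage 1"
        # Stage 2: try one extra prime q with B1 < q <= B2.
        for q in small_primes(B2):
            if q <= B1:
                continue
            g = gcd(pow(x, q, N) - 1, N)
            if 1 < g < N:
                return g, "stage 2"
        return None, "no factor"

    # 2623 = 61*43, and 61-1 = 2^2*3*5 is 5-smooth: stage 1 suffices.
    print(p_minus_1(2623, 5, 13))    # → (61, 'stage 1')
    # 16799 = 157*107, and 157-1 = 2^2*3*13: needs stage 2 with B2 >= 13.
    print(p_minus_1(16799, 5, 13))   # → (157, 'stage 2')
    ```

    The real stage 2 batches many primes per gcd and needs large tables of precomputed powers, which is exactly why it is so memory-hungry.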

    Another major reason that P-1 can't be in the client immediately is that it would require a change in the protocol, the client, and the server. It may seem trivial on its face, but I assure you it's not. The server code would be the most difficult, and I'd rather wait until it was more necessary to do it.

    Even if there were no justification for P-1 to be outside the client, I still wouldn't put it in right now. That's because I am personally unavailable to code for the foreseeable future:

    August: In two days, I leave for California. I will then be in Florida for the second half of August. I will not touch computers after tomorrow.
    September-December: Soul-crushing final semester @ UM.
    Jan 2004: (Hopefully) start full-time job

    That also means that I will be unable to finish the new Linux client this summer. I'll try to finish it this fall, but other development will probably wait till '04. Speaking of which, if someone wants to manage the sieve/factor coordination threads, email me soon. Joe and ceselb seem to be busy and I will soon be gone.

    -Louie

  17. #217
    Senior Member
    Join Date
    Dec 2002
    Location
    Madrid, Spain
    Posts
    132
    First off, the P-1 code is still in flux. I'd like the time estimation code to be better. I'd like to see people test the development version I posted (with newer gw-code) that no one even tried.
    I wouldn't use it for regular testing until it's a bit more polished, but if you want to speed test it you can have it.
    Currently, I have neither the time nor the CPU power to do such tests.

    second, i'd like to wait until there was actually a sizable amount of work for factoring. right now, the need for factoring is so small that i don't see why it can't be done outside the confines of the normal prp client.
    If the need for factoring is so small, should I stop factoring? AFAIK, I'm the only one who's factoring ranges ahead of PRP testing.

  18. #218
    Originally posted by Troodon
    If the need for factoring is so small, should I stop factoring? AFAIK, I'm the only one who's factoring ranges ahead of PRP testing.
    If factoring is producing results, I'd say you should continue if you enjoy saving SB resources. It may be a small need but it takes a lot of manual work to do it. I appreciate all you do.

    Perhaps a few others will follow your lead and join in the factoring work.

    -Louie

  19. #219
    Senior Member
    Join Date
    Dec 2002
    Location
    Madrid, Spain
    Posts
    132
    Thanks Louie! At this moment, my poor Tualatin (now running @1350 MHz because my room's ambient temperature is 40ºC!) has factored 192 k/n pairs (many of them using 2.0 - now using 1.3 to gain speed) and 0 factors have been found. Maybe I'm just unlucky.

    With 2.0, it needs ~145 minutes to factor a k/n pair. That equals ~8 pairs/day (I turn it off while I sleep). If we consider that ~250 PRP tests are finished per day, we would need 32 machines like mine to keep up factoring ahead of PRP testing!
    With 1.0, it needs ~50 minutes per pair, and we would need 12 Tualatins @1350 MHz.
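    Troodon's back-of-the-envelope arithmetic can be sketched like this (the ~19 h/day uptime and the function name are my assumptions, chosen to match the "~8 pairs/day" figure above):

    ```python
    from math import ceil

    def machines_needed(prp_tests_per_day, minutes_per_pair, uptime_hours=19):
        """How many such machines must run P-1 to keep pace with PRP testing."""
        pairs_per_day = uptime_hours * 60 / minutes_per_pair
        return ceil(prp_tests_per_day / pairs_per_day)

    # Factor value 2.0: ~145 min/pair → ~8 pairs/day → 32 machines.
    print(machines_needed(250, 145))   # → 32
    # Factor value 1.0: ~50 min/pair → roughly a dozen machines.
    print(machines_needed(250, 50))
    ```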

    Also, I've observed the program takes memory only in very large chunks - 128 MB/chunk, I think. Even if I enter 380 MB, it only takes 256 MB.

  20. #220
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    I'll join factoring again once I have my new PC...

  21. #221
    Question:

    I'm P-1 factoring the range 4297000-4298000 and I found my first P-1 factor EVAR! However... the factor is not new... and it puzzles me as to why it wasn't excluded from the results.txt file.

    3 | 21181*2^4297244+1

    What's up with that?

  22. #222
    Senior Member
    Join Date
    Feb 2003
    Location
    Sweden
    Posts
    158
    b2uc:
    3 does not divide that number. What's up with that?

    Anyway, perhaps you have a fairly old results.txt and your factor (relatively small? somewhere around 40T?) was found by a siever just recently? I'm assuming you got the "not new"-message when trying to submit the factor.

    Mikael
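    Whether a claimed factor actually divides k*2^n+1 can be checked directly with modular exponentiation. A quick sketch (the function name is mine):

    ```python
    def divides(p, k, n):
        """True if p divides k*2^n + 1."""
        return (k * pow(2, n, p) + 1) % p == 0

    # As Mikael says, 3 really doesn't divide 21181*2^4297244+1:
    print(divides(3, 21181, 4297244))   # → False
    # Sanity check on a tiny true case: 5*2^2+1 = 21 = 3*7.
    print(divides(7, 5, 2))             # → True
    ```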

  23. #223
    My computer was abruptly shut down about the time that fact.txt was written... should I re-run P-1 on that pair? Just to be safe?

    Also... I'm using 1.11 or whatever version was the last before the test one... and I just downloaded it yesterday... so hopefully the results.txt file is up to date.

  24. #224
    Senior Member
    Join Date
    Feb 2003
    Location
    Sweden
    Posts
    158
    And that line with "3 |..." was written to fact.txt? Sounds weird. I could understand if the last part of the line was chopped off, but the first part? If something was written to fact.txt, it certainly seems as if a factor was found... Yeah, doing a new P-1 on that pair sounds wise.

    The results.txt in the v11 zip is just over a month old now, so it might be a good idea to get a new one.
    http://www.seventeenorbust.com/sieve/results.txt.bz2

    Mikael

  25. #225
    Senior Member
    Join Date
    Dec 2002
    Location
    Madrid, Spain
    Posts
    132
    Wow! It works! After factoring more than 260 k/n pairs, I've found 38372614033613527 | 22699*2^4252294+1 - although it has been found too late!

  26. #226
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    although it has been found too late!
    Troodon, I really feel sorry for you; it's hard enough to find these factors, let alone to have missed the window.

    Can I suggest that P-1 factorers pick ranges that are a little further from the PRP 'wave'. In the scoring we have a 500K window, which represents about 2 months of PRP work. So I would suggest picking a range that is at least 200K from the 'wave'.

    As I write this, the next.txt file indicates 4.269M. This means the coordination thread is a little confusing, because it shows ranges that are now being PRPed as [available].

    Can we please update the coordination thread to offer some advice to new P-1 factorers, like "before reserving a range, check the next.txt file, add 200K to the n value, then pick an available range above that point".

    Mike.

  27. #227
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    Does anyone have any metrics on exactly how much difference the memory allocated to P-1 makes? I know there's a post from Louie showing that PCs with very small quantities of RAM just won't work, but has anyone experimented with huge quantities of RAM?

    When I've run P-1, I've allocated 256M, where the PC has 512M of physical RAM. Does anyone know how much difference it would make if I upgraded to (say) 2G? If the gain is really small, like 2%, then it's clearly a waste of time, but if it's 10%, now that's worth considering...

  28. #228
    Senior Member
    Join Date
    Dec 2002
    Location
    Madrid, Spain
    Posts
    132
    Please note that when you increase the amount of RAM available to sbfactor, it also increases the optimal bounds!
    Yesterday I did a little test with 4847*2^5000007+1, by entering sbfactor 5000002 5000010 45 1.0 [mem]

    ***********
    With mem=32
    ***********
    Success: 0.004617
    B1=10000
    B2=40000
    Squarings=22062
    Memory used (kB)=38000
    Memory allocated (kB)=42000
    Stage 1 transforms=28892
    Stage 2 transforms=11958
    Time used for stage 1 (s)=1617
    Time used for stage 2 (s)=2549
    Total factoring time:43 min

    ************
    With mem=512
    ************
    Success: 0.008419
    B1=15000
    B2=138750
    Squarings=34402
    Memory used (kB)=464708
    Memory allocated (kB)=465400
    Stage 1 transforms=21500
    Stage 2 transforms=43254
    Time used for stage 1 (s)=2420
    Time used for stage 2 (s)=4077
    Total factoring time:70 min


    When I have some time, I'll perform more tests with the same bounds.
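    One way to compare the two runs above is success probability per unit of time. A quick sketch using Troodon's numbers (the function name is mine):

    ```python
    def factors_per_hour(success_prob, minutes):
        """Expected factors found per hour of factoring at these settings."""
        return success_prob / (minutes / 60)

    low_mem  = factors_per_hour(0.004617, 43)   # mem=32
    high_mem = factors_per_hour(0.008419, 70)   # mem=512
    print(f"{low_mem:.5f} vs {high_mem:.5f}")
    # The higher-memory run wins slightly per hour, despite taking longer.
    ```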

  29. #229
    Senior Member
    Join Date
    Dec 2002
    Location
    Madrid, Spain
    Posts
    132
    More benchmarks:
    sbfactor 4847 5000007 100 150 [mem]

    ***********
    With mem=32
    ***********
    Memory used (kB)=36700 (1.1); 36704 (1.1 dev)
    Memory allocated (kB)=37816 (1.1); 37856 (1.1 dev)
    Stage 1 transforms=270
    Stage 2 transforms=1230
    Time used for stage 1 (s)=15
    Time used for stage 2 (s)=158
    Total factoring time=3 min

    ************
    With mem=256
    ************
    Memory used (kB)=67364 (1.1); 67368 (1.1 dev)
    Memory allocated (kB)=68452 (1.1); 68492 (1.1 dev)
    Stage 1 transforms=270
    Stage 2 transforms=184
    Time used for stage 1 (s)=15
    Time used for stage 2 (s)=97
    Total factoring time=2 min

    As you can see:
    - There is an improvement with more RAM, if the bounds are the same.
    - sbfactor 1.1 dev is as fast as sbfactor 1.1 and takes a few more kb of memory.

  30. #230
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    ************
    With mem=512
    ************
    Out of interest, how much physical RAM did this machine have?

  31. #231
    Senior Member
    Join Date
    Dec 2002
    Location
    Madrid, Spain
    Posts
    132
    640 Mb

  32. #232
    Senior Member Frodo42's Avatar
    Join Date
    Nov 2002
    Location
    Jutland, Denmark
    Posts
    299
    For some weird reason the factorer stops when it's done with stage 1 of 28435^4575913 .
    It's only with this one that it happens, so I've skipped it for now.
    Could someone perhaps check if it also happens on another box?
    Last edited by Frodo42; 09-04-2003 at 05:19 AM.

  33. #233
    Member
    Join Date
    Feb 2003
    Location
    Lucerne, Switzerland
    Posts
    30
    Hi,

    At the moment only a part of all the candidates are prp tested. So there is a lot of choice.

    Does anyone know where the limits for FFT sizes (SSE2 and non-SSE2) are? In that case, the few people who do prp tests can avoid taking ranges just a bit above these limits.
    That might increase the efficiency a bit.

    just an idea

    biwema

  34. #234
    Member
    Join Date
    Feb 2003
    Location
    Lucerne, Switzerland
    Posts
    30
    Hi,

    I have now tried to find the FFT limit for SSE2 processors.

    It is somewhere between 4780000 and 4781000.
    At that point the B1 increases from 15000 to 20000 (1.1; 256M RAM), increasing the chance of finding a factor by 25%, but it takes almost twice as long to calculate.

    biwema
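    The trade-off biwema describes can be put in numbers: if the higher B1 raises the per-test chance of a factor by 25% but doubles the time per test, the factor throughput per CPU-hour actually drops. A one-liner illustration:

    ```python
    # Relative throughput = (relative chance) / (relative time).
    chance_ratio = 1.25   # +25% chance of finding a factor at the higher B1
    time_ratio = 2.0      # almost twice as long per test
    throughput = chance_ratio / time_ratio
    print(throughput)     # → 0.625, i.e. ~38% fewer factors per CPU-hour
    ```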

  35. #235
    Does that mean that very soon the time required to do a prp test will increase drastically?

  36. #236
    Member
    Join Date
    Feb 2003
    Location
    Lucerne, Switzerland
    Posts
    30
    I don't know exactly, but I think so.

    At 4650000 one test took me 50 minutes;
    at 4700000-4780000 it is 1:20; and
    at 4820000 it is 1:40.

    It seems as if there are two steps (one for the FFT size and one for the increase of B1), but I actually don't know what weight to choose (between 1 and 2) so that the throughput of factors is best. I am using 1.1.

    I think I have to investigate a bit further...

    we will see
    biwema

  37. #237
    Moderator ceselb's Avatar
    Join Date
    Jun 2002
    Location
    Linkoping, Sweden
    Posts
    224
    I'm getting almost exactly one factor per day at 4830000 (1h 40m per test on a PIV 1.5).

    Btw, has anyone tried to do a range over 5M? Sbfactor just exits if I try

  38. #238
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    I'm getting almost exactly one factor per day at 4830000 (1h 40m per test on a PIV 1.5).
    Using a P4C @ 3 GHz, 2 tests at bounds=1.5 take 103 minutes. I'm only getting a factor every 2 days on average, though...

    Does that mean that very soon the time required to do a prp test will increase drastically?
    There will be a slowdown (slightly dropping cEM/s rate). It has happened several times before already...

    Btw, has anyone tried to do a range over 5M? Sbfactor just exits if I try
    Seems like Troodon did:

    5000000 5000010 Troodon 1 0.008419 0 [completed]

  39. #239
    Senior Member Frodo42's Avatar
    Join Date
    Nov 2002
    Location
    Jutland, Denmark
    Posts
    299
    I found a weird factor:
    7 | 5359*2^4754206+1

    I can't submit this one, because of the boundaries imposed by the submission site...
    Shouldn't a factor this small have been removed by sieving long ago?
    It was found using stage 1 of P-1 factoring.

  40. #240
    Senior Member
    Join Date
    Feb 2003
    Location
    Sweden
    Posts
    158
    That factor is incorrect. Bug in sbfactor?

