
Thread: Double Checking discussion (1<n<3M)

  1. #41
    I started at 1M. I've run the P-1 up to 85E7, the P+1 up to 11E7, and the ECM up to 3M.
You need to run 1100 curves with B1=1M to find most 35-digit factors.
See the ECMNET page for more information.

    Edit: The following were on my list and not on yours:
Seems that I forgot to get all the K=55459 numbers. I've added them to my list (I found a factor for one of them already).

    Could you let me know which 12 N you've eliminated?
A few more than 12. I'll attach them (not all are submitted yet, since I can't get to that page, but I'll do that tonight).

Yes, I would appreciate some help setting it up. First of all, where can I get UBASIC for Windows?
I can't find it online at the moment, but I can send you a slightly changed version of the program (the only changes are to produce an output file with the found factors) tonight (NL time).

Is it okay to send a couple of hundred KB to your e-mail address?
Attached Files
    Last edited by smh; 03-19-2003 at 10:37 AM.

  2. #42
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
Is it okay to send a couple of hundred KB to your e-mail address?
    Yes, it is. Please ZIP it if you can. Or ARC it, or ....
I don't know what my ISP's limit is, but we'll find out. Perhaps you could break it up into convenient "chunks".

    As far as the iterations go, I'm using the following table:

The following table gives a set of near-to-optimal B1 and B2 pairs, with the corresponding expected number of curves to find a factor of a given size (this table does not take into account the "extra factors" found by Brent-Suyama's extension; see below).

digits D   optimal B1   B2       expected curves N(B1,B2,D)
15         2e3          1.2e5    30
20         11e3         1.4e6    90
25         5e4          1.2e7    240
30         25e4         1.1e8    500
35         1e6          8.4e8    1100
40         3e6          4.0e9    2900
45         11e6         2.6e10   5500
50         43e6         1.8e11   9000
55         11e7         6.8e11   22000
60         26e7         2.3e12   52000
65         85e7         1.3e13   83000
70         29e8         7.2e13   120000

Table 1: optimal B1 and expected number of curves to find a factor of D digits.

Important note: the expected number of curves is significantly smaller than the "classical" one we get with B2=100*B1. This is because this new version of gmp-ecm uses a default B2 that is much larger than 100*B1 (for large B1), thanks to the improvements in step 2.
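
Read as the mean of a geometric distribution, the "expected curves" column N translates into a success chance of about 1 - (1 - 1/N)^k after running k curves, assuming a factor of that size actually exists. A minimal Python sketch of that reading (my illustration, not from the GMP-ECM documentation):

Code:
def p_found(k, n_expected):
    """Chance that k independent curves have found a factor of the
    target size, when the table says n_expected curves on average
    (geometric model; assumes such a factor exists)."""
    return 1.0 - (1.0 - 1.0 / n_expected) ** k

# The recommended 1100 curves at B1=1e6 give roughly a 63% chance
# (about 1 - 1/e) of catching a 35-digit factor; doubling the
# curve count raises that to about 86%.
print(p_found(1100, 1100))   # ~0.632
print(p_found(2200, 1100))   # ~0.865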
    Joe O

  3. #43
Joe, the only reason I asked is because you first wrote that you had run 336 curves, and later you said you were at 3M.

I sent a zip to the e-mail account I used the other day. It's 487KB, so that shouldn't be a real problem.

I hope I made it clear how to use the program; otherwise just ask.

Let us know the results.

  4. #44
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752

    Re: Moo_the_cow

    Originally posted by Joe O
    "However, if I do only this sieve 24/7, I'll be delaying the other project, and some guys are going to be pretty mad about it. I'm reluctant to change my reserved range, since I feel that a lower range is more productive."

Well, I certainly don't want anyone to get mad. And there is absolutely no need for you to change your reserved range. Just keep going the way you are and continue to use the SOB.DAT file that you are currently using until you finish your range. That way we will know that all the n < 3M will have been sieved to 1.1T.

Now if we could get some "volunteers" to agree to do the range 1100 to 1300 with the current SOB.DAT file, I will volunteer to do 1300 to 1500 with the current SOB.DAT file. Then we will have all the n < 3M sieved to 1.5T!
    Ok. It's a deal.

    Let's make the changeover at 1.5T.

    Moo_the_cow (10xx-1100), Halon50 (1100-1300) and Joe O (1300-1500) will use the current SoB.dat file for their ranges, and all others will use the updated one.

    Happy sieving all, and welcome back Halon50.

    EDIT: BTW, I'm reserving 1500-1600. Will start when the updated file is ready.

  5. #45
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    and I'll reserve

    1600 - 1700 MikeH

I'll have an updated sob.dat ready within the hour.

    EDIT - the updated sob.dat is ready.

All the factors posted on the forum since the last update, as well as those in today's results.txt, have been removed. The nmin has then been lifted to 300K.

    The resulting file actually has nmin=300020, and nmax=2999967, since it is now optimised to the lowest and highest remaining n values.

    I've also included a file sob.txt which has the factors 'clear'. This is just for information.
    Last edited by MikeH; 03-19-2003 at 03:37 PM.

  6. #46
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643

    SMH

    Originally posted by smh
Joe, the only reason I asked is because you first wrote that you had run 336 curves, and later you said you were at 3M.

I sent a zip to the e-mail account I used the other day. It's 487KB, so that shouldn't be a real problem.

I hope I made it clear how to use the program; otherwise just ask.

Let us know the results.
I've received your e-mail and unzipped the attachment. Your instructions are very clear. I will try them tomorrow, as it has been a very long day.
    Joe O

  7. #47
    Thanks for the welcome back Nuri!

    I have a question for when the current range (1100-1300G) is finished. Do you still need the SoBStatus.dat files for this range in addition to normal submission to the database?

  8. #48
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    Halon50,
    Normal submission of the results should be enough. You can post them here as well if you want. It would make my life easier, but would only be necessary if Louie changes the range of the daily results.txt file. At the moment his cutoff is 1T so your results will show there. Moo_the_cow and I, on the other hand need to post our results here since we are below the 1T mark.
    Joe O

  9. #49
Ok, thanks for the info; I'll compile and zip the results file tomorrow when I'm a little more awake and post it in the other thread!

  10. #50
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643

    Nuri,

I did reply to your post in the other thread. Did you see it before it was deleted? Anyway, there were 6 factors found in the range 9440 - 9450 for 1 < n < 3M. So far, I have found 10 factors for 3M < n < 20M. Currently at 9446900000000 and counting. What is the formula for the expected number of factors to be found?


    Update: Now 12 factors, 9447160000000 and counting.

    Update: Total 20 factors.
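
One common rule of thumb for that last question: by Mertens' theorem, the fraction of candidates surviving a sieve to depth p falls off like 1/ln(p), so a file with S surviving candidates already sieved to p1 should yield about S * (1 - ln(p1)/ln(p2)) new factors on the way to p2. A rough Python sketch, with a made-up survivor count:

Code:
import math

def expected_factors(survivors, p1, p2):
    """Heuristic count of new factors when a file with `survivors`
    candidates, already sieved to depth p1, is sieved on to p2.
    Assumes survivor density ~ 1/ln(p) (Mertens); rule of thumb only."""
    return survivors * (1.0 - math.log(p1) / math.log(p2))

# e.g. a hypothetical file with 100,000 survivors, sieved from 9T to 10T:
print(expected_factors(100_000, 9e12, 1e13))   # ~350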
    Last edited by Joe O; 03-25-2003 at 09:50 AM.
    Joe O

  11. #51
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Thanks for the response Joe.

Your data for both of the sieves seem within the normal limits I would expect (although the figure for the DC sieve is slightly higher, and the one for the normal sieve slightly lower, than my expected averages).

I was just wondering if the density of factors in the double-check sieve is somewhere close to 3/17 of the density of factors in the normal sieve. I haven't worked on it yet, but my logic suggests that, since the candidates at DC are smaller, it should be slightly more than 3/17.

I also wanted to get some clues about how deep a double-check sieve (and what p and nmin values for the following changeovers) would be reasonable.

    Anyway, thx again for the feedback.
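
A quick back-of-the-envelope version of that 3/17 (my arithmetic, under the assumption that factor density per G scales with the number of surviving candidates, which roughly scales with the width of the n range): DC/main ≈ (3M - 0) / (20M - 3M) = 3/17 ≈ 0.176, and once nmin is raised to 300K, (3M - 0.3M) / 17M = 2.7/17 ≈ 0.159. The small upward correction Nuri mentions, from the DC candidates being smaller numbers, is ignored here.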

  12. #52
It's been two weeks now... any updates on when the range through 1.5T will be finished?

  13. #53
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
I guess it's a bit early. Yes, two weeks have passed, but we haven't come to 1.7T yet.

    Here's a quick and dirty analysis that I made last week, to have a better guesstimate of the second changeover timing.

I don't have data for every entry in the table, so I made some assumptions and checked against actual data where I have it. Sieve and prp speeds are based on a P4-1700, so results might vary significantly based on hardware.

    Assumptions in the table:
    - DC Sieve speed increases 1.5% as p doubles.
- Time to prp test a candidate increases in proportion to the square of the increase in n.
- The number of factors in a given range decreases in inverse proportion to the increase in p value.
    - Factors are distributed evenly within the n min and n max range.
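
To make those assumptions concrete, here is a toy version of the break-even test behind such a table, in Python. Every constant is a placeholder of my own choosing (loosely anchored to figures mentioned elsewhere in this thread), not a value from Nuri's attachment:

Code:
import math

SIEVE_RATE_P0  = 100_000      # p/sec at depth P0 on the reference box (assumed)
P0             = 1e12
FACTORS_PER_G0 = 6.0          # factors per G at P0 over the whole 0-3M file (assumed)
PRP_SEC_AT_3M  = 14 * 86400   # seconds to PRP test an n=3M candidate (assumed)

def sieve_sec_per_g(p):
    # Assumption 1: sieve speed rises 1.5% each time p doubles.
    return 1e9 / (SIEVE_RATE_P0 * 1.015 ** math.log2(p / P0))

def factors_per_g(p, n_lo, n_hi, n_max=3e6):
    # Assumptions 3 and 4: density falls like 1/p, factors uniform in n.
    return FACTORS_PER_G0 * (P0 / p) * (n_hi - n_lo) / n_max

def prp_sec(n):
    # Assumption 2: PRP time grows with the square of n.
    return PRP_SEC_AT_3M * (n / 3e6) ** 2

def slice_still_pays(p, n_lo, n_hi):
    """True while a G of sieving at depth p saves more PRP time in the
    n_lo..n_hi slice than the G of sieving costs (the changeover test)."""
    saved = factors_per_g(p, n_lo, n_hi) * prp_sec((n_lo + n_hi) / 2)
    return saved > sieve_sec_per_g(p)

# On these made-up numbers, at p = 5T the 300K-600K slice no longer
# pays its way, while the 2.4M-3M slice still clearly does:
print(slice_still_pays(5e12, 3e5, 6e5))    # False
print(slice_still_pays(5e12, 2.4e6, 3e6))  # True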

I guess deciding at which p value (and up to what nmin) to change over depends on how conservative a stand we want to take. Of course, the most conservative stand would be not to make any changeovers at all (which is still fine, as long as we're aware of the consequences and the alternatives).

    Please feel free to comment on and/or correct the calculations and assumptions of the table.

    Regards,

    Nuri
Attached Images

  14. #54
Errr, sorry, what I meant was: when will the new sieve file that has the factors through 1.5T removed be up? I've been waiting for it before starting my machines on sieving again.

    (Of course, if your graph answers exactly that, then I apologize!)

  15. #55
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    I see what you mean now, no need for apologies. As you know, there are two types of updates possible.

One is the kind that removes the found factors, but as far as I know, that does not affect the sieve speed. This was discussed and explained by Paul and Phil in the sieve client thread. As far as I understand, the effect of such an update is that you start to find fewer duplicates, but the number of unique factors does not change. I guess this is why the SoB.dat file is not updated frequently in this sense.

The other one is the one I mentioned above. At a certain point, sieving some n ranges becomes meaningless, as PRP testing the candidates in that range becomes much faster per candidate. The benefit of such an update is that, since the n range becomes smaller after the update, the sieving speed increases, and this enables us to find more factors among the larger n candidates, as we can sieve deeper for a given time and computing effort. My post was related to that: to get an idea of at what point (and up to which nmin) it might be meaningful to have an update.

Anyway, Halon50, I'd be happy to hear that you're continuing sieving. Since we have a new version of the sieve client now, you will get a 15% speed increase anyway.

  16. #56
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    Halon,

Are you looking for the revised sob.dat file where the bottom end of the n range was raised to 300K? If so, it can be found in this post.

The agreement was that the 100<n<3M sob.dat would be used to sieve to p=1.5T, then the 300K<n<3M sob.dat would be used from there onwards.

    If you are looking for a newer version of the 300K sob.dat file, sorry there isn't one. We'll probably be running with this one for a while yet.

  17. #57
    Aha, I got it; thanks for the explanations! I was waiting all this time for nothing it seems...

    I'll pick up the 300k-3M file tomorrow and start my machines back up soon. Thanks again!

  18. #58
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Happy sieving Halon.

  19. #59
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    I have generated an alternative sob.dat file. This file covers the range 300K<n<20M, and therefore allows simultaneous main and DC sieving.

    I accept Nuri's comment that this introduces another aspect of confusion, but for anyone that is interested, the file is now available.

    When using this file, please reserve your range from the 3M<n<20M co-ordination thread, then post the same range on the 1<n<3M co-ordination thread. Obviously the same when you complete.

Again, do not feel in any way obliged to use this alternative sob.dat file; if you are happy with how you currently work, keep doing just that, don't change.

    Mike.

  20. #60
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Thanks for the alternative file Mike. I'm switching to alternative SoB.dat wherever it makes sense.

  21. #61
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    Having seen today in the sieving stats that I have four excluded factors, I have been investigating what has gone wrong. I was really concerned that something was very wrong with the alternative sob.dat file I created at the weekend.

    After some investigation, my worries were over. The sob.dat file I used as my base was one that was not fully sieved to 1G. As a result, there are candidates in the alternative sob.dat file that would have fallen out before p=1G, thus the excluded factors.

    This means there are no problems with the results that will have been generated with this sob.dat file - no factors have been missed. However, I have now generated a new alternative sob.dat file (the old one is overwritten), which is one based on a 1G sieved sob.dat.

For information, the number of candidates has been reduced from 785315 (duff sob.dat) to 719699 (good sob.dat).

Sorry for any confusion this may have caused anyone else.

    Mike.

  22. #62
It should also be noted that I did a little sieving between 2.7 and 3 million before we reached the 3M barrier, to eliminate more tests. That might also cause small factors to be duplicates. I don't really remember which ranges I factored, though.

    -Louie

  23. #63
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Originally posted by jjjjL
It should also be noted that I did a little sieving between 2.7 and 3 million before we reached the 3M barrier, to eliminate more tests. That might also cause small factors to be duplicates. I don't really remember which ranges I factored, though.

    -Louie
I did a quick analysis of the results.txt file, and in case anyone is interested, here's the data on the factors submitted by Louie for n<3M.
Attached Files

  24. #64
This double sieving is really hard to understand for outsiders who visit the project only now and then. Could someone please give a short explanation of why double sieving is necessary? From the numbers it seems that all those ranges have already been sieved. Is it worth the effort? And how come so many new factors are found?

If you explain, please begin from the beginning, or else I will not understand. And I think I am not the only one having problems with this.

    Many Thanks,

    Ola

  25. #65
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Hi Ola,

To say it first, the DC sieve is not double-checking the sieve; it sieves for the double check. But I guess this sounds strange, so I'll try to start from the beginning.

I'm sure you know most of the things I'll write below, but I hope they will help with the explanation of the DC sieve.

As you know, the seventeenorbust project is testing numbers of the form --- k * 2^n +1 --- to find primes for each of the k values. The project started with 17 k values, and as primes for 5 k values were found in 2002, we are left with 12 k values now.

    There are mainly two different procedures used for the project, namely sieving and PRP testing.

Sieving, as its name implies, looks at the remaining candidates and eliminates the ones that are divisible by a smaller prime number. Candidates divisible by a smaller prime obviously cannot be prime themselves, so they are eliminated from the set of candidates that should be PRP tested.

On the other hand, PRP testing takes each candidate one by one and tests whether it is a (probable) prime or not.

Sieving cannot find primes, but it is very useful because it can eliminate candidates much faster than a PRP test can (still much faster at the p values we are sieving right now). So it helps the project proceed faster by eliminating candidates before they are PRP tested. In a way, sieving clears the way for PRP testing.
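
To make that concrete, here's a toy Python sketch (mine; nothing like the real sieve client, which uses discrete logarithms) of how one small prime p eliminates candidates k * 2^n + 1. The eliminated n values form an arithmetic progression, which is why one cheap computation per p removes many candidates at once:

Code:
def eliminated_n(k, p, n_max):
    """Brute-force toy: all n <= n_max with p dividing k * 2^n + 1."""
    hits = []
    r = k % p                    # r tracks k * 2^n mod p
    for n in range(n_max + 1):
        if (r + 1) % p == 0:     # p | k * 2^n + 1
            hits.append(n)
        r = (r * 2) % p          # step n -> n + 1
    return hits

# One of the project's 12 remaining k values against the prime 97:
print(eliminated_n(67607, 97, 200))   # [47, 95, 143, 191]
# The hits repeat with period ord_97(2) = 48, so the real sieve only
# needs to find the first hit to knock out the whole progression.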

    Now, the Sieve and DC Sieve part:

Our main sieving sub-project started a couple of weeks before the main project started PRP testing values where the n in the k * 2^n +1 formula exceeds 3 million. The candidates the main sieve tries to eliminate start at k * 2^3,000,000 +1 and end at k * 2^20,000,000 +1, for each of the 12 remaining ks.

On the other hand, the DC sieve aims to decrease the number of candidates from k * 2^1 + 1 up to k * 2^3,000,000 + 1, in case we need to double check them by PRP testing in the future.

Here comes the question: were they not sieved in the first place, when the main project was testing n values smaller than 3,000,000?

Yes, they were sieved, but not very deep. Here's the trick: just a couple of months ago, the sieve client was much more primitive than the one we are using right now. Thanks to the admirable efforts of Paul and Phil (and lately Mikael as well), it's far better now.

To compare: when our main project was PRP testing numbers with n smaller than 3,000,000, the sieve client could only test one k value at a time, whereas now it can test all 12 k values at the same time. Also, it was much, much slower (more than 30 times) than our client now.

Therefore, despite Louie's computing efforts, they could not be sieved deep enough. With the new client, the DC sieve is now trying to reduce the number of candidates for n smaller than 3,000,000, in case we need to double check their PRP tests in the future.

Then the question comes to mind: will we ever need to double check those numbers? I'm not sure when we should start that, but I'm sure it will be worth the effort when the time comes.

    So, I hope this explanation helped you.

    Please feel free to ask if you have further questions.

    And others, please feel free to add your comments / corrections.

    Regards,

    Nuri

  26. #66
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    Well...
It's not a second sieve run; it's sieving for the double-check run.
So far, the range was only searched by Louie. And sieving speed has increased greatly in the last months - I guess it's roughly 30 times faster now.
So it's possible to search a much greater range now.

  27. #67
    Thanks a lot Mystwalker and especially Nuri!!



Finally I understand what you are doing with "double sieving", and it is indeed useful (although sieving itself is much more important).
I have a slow PIII 450MHz but a permanent connection to the internet, so I do PRP testing. Currently it takes me about 14 days for one test. That's why sieving is so important (each potentially sievable candidate is 14 days wasted for me)! In the long run, we will find a prime much faster if we only PRP test deeply sieved numbers. I wanted to state that once more, so big thanks to the sievers.

    And thanks again for the explanation!!!!

    Ola

  28. #68
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
Here's the first graph of the distribution of factors vs. p value for the DC sieve.

As expected, the number of factors per G is 3/17 that of the main sieve up to 1.5T, and 2.7/17 of it thereafter.

If the formula is right, from where we stand it suggests:

A total of 1,600 new factors up to 5T,
A total of 3,800 new factors up to 10T (2,200 of which are for 5T-10T),
A total of 5,950 new factors up to 20T (2,150 of which are for 10T-20T),
A total of 9,050 new factors up to 50T (3,100 of which are for 20T-50T), and
A total of 13,400 new factors up to 200T (4,350 of which are for 50T-200T).
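
As a rough cross-check on those totals, the 1/ln(p) survivor heuristic lands in the same ballpark. A small Python sketch (the survivor count S is fitted here to match the 5T-10T segment; it is not a number from this thread):

Code:
import math

# Factors found from p1 to p2 ~ S * (1 - ln(p1)/ln(p2)); S fitted so
# the 5T-10T segment matches the 2,200 above (gives S ~ 95,000).
S = 2200 / (1 - math.log(5e12) / math.log(1e13))

for p1, p2 in [(5e12, 1e13), (1e13, 2e13), (2e13, 5e13), (5e13, 2e14)]:
    est = S * (1 - math.log(p1) / math.log(p2))
    print(f"{p1/1e12:g}T-{p2/1e12:g}T: ~{round(est, -1):,.0f}")
# ~2,200 / ~2,150 / ~2,760 / ~4,000 vs. 2,200 / 2,150 / 3,100 / 4,350 above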

    Going back to the graph:

The red and blue lines do not fit because the data includes the sieving effort by Louie (mainly for the range 2.7m<n<3.0m) when we were approaching n=3m in PRP testing. If we ignored those factors, the two lines would fit perfectly, but I just wanted to show where those factors stand relative to our current effort.

    Next update for DC sieve graph will come when we finish everything below 5T.

    Regards,

    Nuri
Attached Images
    Last edited by Nuri; 05-05-2003 at 12:55 PM.

  29. #69
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
In order to try to encourage more sievers to move to the 300K-20M sieve effort, and because the whole area of DC sieving is becoming blurred (since 0.6M of the 3M-20M area is also now DC), I am implementing a good suggestion by Joe O to normalise the scores. As a result, the sieve scoring will be changed from

n < 3M, score = p/1T * 0.5 * ((n*n)/(3M * 3M))
3M < n < 20M, score = p/1T
n > 20M, score = p/1T * 0.05
duplicates score = score * 0.01

    to

n < 300K, score = p/1T * 0.5 * ((n*n)/(300K * 300K))
300K < n < 20M, score = p/1T
n > 20M, score = p/1T * 0.05
duplicates score = score * 0.01

As a result, anyone who has performed any sieving in the range 100<n<3M will see their score increase (a little) tomorrow.

  30. #70
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    to

n < 300K, score = p/1T * 0.5 * ((n*n)/(300K * 300K))
300K < n < 20M, score = p/1T
n > 20M, score = p/1T * 0.05
duplicates score = score * 0.01
    and another minor change resulting in

n < 300K, score = p/1T * ((n*n)/(300K * 300K))
300K < n < 20M, score = p/1T
n > 20M, score = p/1T * 0.05
duplicates score = score * 0.01
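
Spelled out as code, my reading of that final version (a sketch; the handling of the exact 300K and 20M boundaries is my assumption):

Code:
def sieve_score(p, n, duplicate=False):
    """Score for a factor at sieve depth p (absolute) and exponent n,
    per the revised rules above."""
    score = p / 1e12                      # base: p/1T
    if n < 300_000:
        score *= (n / 300_000) ** 2       # low-n factors scaled down
    elif n > 20_000_000:
        score *= 0.05                     # n > 20M worth 5%
    if duplicate:
        score *= 0.01                     # duplicates worth 1%
    return score

# A unique factor at p = 5T: full credit for n = 1M, quadratic
# scale-down for n = 150K.
print(sieve_score(5e12, 1_000_000))   # 5.0
print(sieve_score(5e12, 150_000))     # 1.25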

  31. #71
    Senior Member
    Join Date
    Jan 2003
    Location
    U.S
    Posts
    123
Thanks for the stats scoring change, Mike; it improved my score by more than 200 points.
Anyway, it occurred to me that since the "secret" account has already finished checking 3 of the 12 k's, the SoB.dat file (for the DC sieve) should only contain the 9 k's that are not yet completely checked by "secret". It appears that changing the DC sieve to include only 9 k's instead of 12 would result in a 25% speedup.
Well, what do you guys think?

  32. #72
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    Well, what do you guys think?
Depends on the aim of the DC - if it goes further than checking the computations of the "prior" projects (which I'm almost sure it does), then it should continue.

BTW, is there anyone running DC-only with the new client on a fast PC? There should be some astronomic kp/sec values.

  33. #73
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    How does 95999 p/sec grab you? PIII/500 Win98SE


    90278 p/sec Celeron/450 Win NT 4
    Last edited by Joe O; 05-12-2003 at 06:53 PM.
    Joe O

  34. #74
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Well, what do you guys think?
    I think we should not remove any k values up until we find the primes for those ks.



    is there anyone running DC-only with the new client on a fast PC? There should be some astronomic kp/sec values.
I tried that with my PIII-1000 at work on one of my DC patches at around 19T. Unfortunately it slowed down from ~260k p/sec to ~225k p/sec. I don't think I did anything wrong, but maybe (it was late, and I was tired from work). I'll try again tomorrow and let you know the result.

  35. #75
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    How does 9599 p/sec grab you? PIII/500 Win98SE
    Sounds a bit low - typo?

  36. #76
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Unfortunately it slowed down from ~260k p/sec to ~225k p/sec.
That's really strange. It's the same for my P4-1700. Speed at DC dropped from 160k to 137k.

    BTW, it shows 32% increase for alternative SoB.dat (from 72k to 95k).

    PS: All tests are for v1.28 vs. v1.32.

    Anyone else tried 1.32 at DC sieve?

  37. #77
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    Originally posted by Mystwalker
    Sounds a bit low - typo?
    Yes, it was a typo!

    95999 p/sec PIII/500 Win98SE 5.5T for the p range


    90278 p/sec Celeron/450 Win NT 4 3.3T for the p range


    Both of these are for the lower range sieving, i.e. n < 3M
    Joe O

  38. #78
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    I'm going back to V1.30, at least until I have time to play with the alpha setting.

    103593 p/sec PIII/500 Win98SE 5.5T for the p range for V1.30
    97576 p/sec was the best for V1.32 and not even that for v1.33
    Joe O

  39. #79
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    I'm going back to V1.28 for now.

    113232 p/sec V1.28
    108353 p/sec V1.30
    92915 p/sec V1.32
    90278 p/sec V1.33
    Celeron/450 Win NT 4 3.3T for the p range
    This is for lower range sieving, i.e. n < 3M
    Joe O

  40. #80
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    I have been tweaking the parameters on the gap finder for low n. We now have everything in 10T strips (a la regular sieving), and the lower 1.5T as well (from here)

    While sorting out the 0-1.5T area, I found a gap of 0.24G right down at 2.69G. I quickly sieved this with the current low n sob.dat - no factors. I then did the same with a 1G sieved sob.dat, and found 591 factors (all reported as new at submission). I've checked, and 563 are new unique factors.

Since these factors were not present in the current sob.dat file, it means I used them to build the sob.dat but somehow failed to submit them (I did have thousands to do, but that's no excuse). Sorry.

With that sorted, I am now intrigued as to why the sieve stats (from tomorrow) show a difference of about 300 factors per million n in the p<3T column compared with n>3M. Strange. Since Nuri is still working on the range 2500-3100, I guess some of those 300 could be in there (even though I show no gaps). Hope so.

    Mike.

EDIT: Taking a detailed look at the current submissions, it appears Nuri still has 2.762T-3T to go. I'm not sure this will give us (300 * 2.7) factors, but it should be a reasonable number. The gap finder doesn't show these gaps because (I think) Louie covered this area when PRP had n<3M, but only for a very narrow n range (say 2.5-3M).
    Last edited by MikeH; 06-08-2003 at 06:54 PM.
