Page 3 of 3
Results 81 to 96 of 96

Thread: Double Checking discussion (1<n<3M)

  1. #81
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
Mike, you're right. I still haven't finished my 2500-3100 range, and the factors in my range were the ones previously found by Louie.

I was away from the PC sieving that range for the last couple of days. I guess the remaining part (2762 - 3000) is about to finish today. I'll take a look and submit the factors tomorrow. My estimate is that there should be roughly 300 factors there, so there are still 500 factors missing.

    Let me take a look at possible gaps below 3T too. I'll let you know if I find something.

EDIT: Mike, my internet connection has been very bad for the last couple of weeks since the earthquake, and I could not download the lowresults file. But I found a version of it that I had previously downloaded, and checked for holes there. There are four instances where the density of factors is abnormally low. These might have been patched later, but since they are very small ranges (1.1G in total), I guess they're worth resieving.

Mike, could you please check these out (especially the first one)?
    1.486G-1.619G (or 1.48-1.62)
    4.769G-4.803G
    49.957G-50.576G
    55.668G-56.019G

    PS: 1T-3T seems hole free.
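The kind of density check described above can be automated: bucket the factors in a results file by p-range and flag buckets whose count falls well below the average. This is only a sketch; the `p | k*2^n+1` line format, the bucket width, and the threshold are assumptions, not the actual lowresults file layout.

```python
# Sketch: bucket factors into fixed-width p-ranges and flag buckets whose
# factor count is far below the average, suggesting a possible unsieved hole.
# The "p | k*2^n+1" line format is an assumption about the results file.

def find_sparse_ranges(lines, bucket_width=10**9, threshold=0.25):
    """Return (bucket_start, count) pairs whose factor count is below
    threshold * average over the covered range."""
    counts = {}
    for line in lines:
        try:
            p = int(line.split('|')[0].strip())
        except ValueError:
            continue  # skip headers or malformed lines
        b = p // bucket_width
        counts[b] = counts.get(b, 0) + 1
    if not counts:
        return []
    lo, hi = min(counts), max(counts)
    avg = sum(counts.values()) / (hi - lo + 1)
    return [(b * bucket_width, counts.get(b, 0))
            for b in range(lo, hi + 1)
            if counts.get(b, 0) < threshold * avg]

# Synthetic example: a factor every ~0.1G, except a hole at 3G-4G.
sample = ["%d | 4847*2^100+1" % p
          for p in range(10**9, 10**10, 10**8)
          if not (3 * 10**9 <= p < 4 * 10**9)]
print(find_sparse_ranges(sample))  # -> [(3000000000, 0)]
```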

    Last edited by Nuri; 06-09-2003 at 07:39 PM.

  2. #82
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    I've copied these from the coordination thread:

    Originally posted by biwema
Sorry, I installed the 300k-20m file later, so only 33050-33200 and 33270-33300 in the 33000-33300 range contain the factors of exponents smaller than 3M.

Also, be careful about the accepted gap between 33190 and 33280: it is not tested between 33200 and 33270.

    biwema

    Originally posted by chvo
    22500-22550 chvo [complete]

I found 7 factors (using the original double-sieving SoB.dat, not the one that starts at 300K). Is that amount expected, or is it low?

    chvo
    Joe O

  3. #83
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    I've copied this from the coordination thread:

    Originally posted by MikeH - Edited by Joe O
    After some analysis of the results.txt file (new scoring is almost ready), I believe the following reservations / completions for 300K - 3M have been forgotten.

    14700-15000 priwo [complete] - Done with 3M-20M
    15600-15700 Slatz [complete]
    15700-15850 cmprince [complete]
    17400-17420 alexr [complete] - Done with 3M-20M
    21300-21320 Titalatas [complete]
    22710-22800 cmprince [complete]
    39500-39538 geeknik [complete]

Could the users check and confirm what the n range of their sob.dat files was for these ranges? Thanks.
    Mike.
    Last edited by Joe O; 07-10-2003 at 09:51 PM.
    Joe O

  4. #84
    Moderator ceselb's Avatar
    Join Date
    Jun 2002
    Location
    Linkoping, Sweden
    Posts
    224
    Moved to the sieve section of the forum.

  5. #85
    Moderator ceselb's Avatar
    Join Date
    Jun 2002
    Location
    Linkoping, Sweden
    Posts
    224
    Moved from the coordination thread.

    Originally posted by chvo
    To Keroberts1:
    this isn't the place to discuss that, but in short: there have been a lot of PRP tests, but some of these tests will surely have returned wrong results. We double-sieve to reduce the number of candidates that will be retested (to check the results of the previous PRP tests).

  6. #86

    Low n sieving

I saw that quite a lot of new supersecret tests have been added. Wouldn't it be worthwhile to keep sieving n < 1M until no tests (or at least only a few) for n < 1M remain undoublechecked?

  7. #87
These tests are very quick to run and for the most part have a very low error rate. In fact, these tests don't need to be redone at all: it is very unlikely that we have missed a prime there, and the goal of our project is not to find the smallest prime but to find a prime for each K. In most cases it would be easier to test larger numbers and hope to find a prime there. Eventually, when the main effort gets far enough ahead of the double check, it will once again be beneficial to do the double check. However, for the range between 300000 and 1000000 we have not yet found a single test that was reported wrong. This leads me to believe that it would not be worth the effort to retest them when there appears to be almost no chance of finding a prime there.

Besides, DC sieving is much less efficient than regular sieving: eliminating a single test in regular sieving is like eliminating almost a hundred of the 500000-range tests. To me it just doesn't make sense to spend the resources to sieve it out any more. I don't even think we should be sieving the 1 million to 5 million range. However, because I've been told the speed gain from dropping it would be very small, I use the 1-20 million dat. Adding 300000 to 1 million to that would probably make almost no difference in speed, but would offer almost no benefit to the project.
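The "almost a hundred" figure can be sanity-checked with a back-of-the-envelope cost model: a PRP test on k*2^n+1 performs about n modular squarings of n-bit numbers, so its cost grows roughly like n^2 (ignoring log factors from fast multiplication). This is a simplified model, not an exact timing:

```python
# Rough sanity check of the "almost a hundred" claim: under a ~n^2 cost
# model for a PRP test on k*2^n+1 (about n squarings of n-bit numbers),
# one test at the main-effort level costs as much as many double-check
# tests. The exponents below are illustrative.

def relative_prp_cost(n_main, n_doublecheck):
    """Approximate cost of one main-effort test, measured in
    double-check tests, under the ~n^2 cost model."""
    return (n_main / n_doublecheck) ** 2

# One test at n = 5,000,000 vs. one at n = 500,000:
print(relative_prp_cost(5_000_000, 500_000))  # -> 100.0
```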

  8. #88
    Keroberts1 wrote:
I don't even think we should be sieving the 1 million to 5 million range. However, because I've been told the speed gain from that would be very small, I use the 1-20 million dat. Adding 300000 to 1 million to that would probably make almost no difference in speed, but would offer almost no benefit to the project.
I wasn't talking about the main .dat file, but about continuing the 300K-3M sieving a little while longer. Just wondering whether it's worth it, really.

  9. #89
In my opinion, no, because the same effort could be used to sieve a higher range for the first time, and that would find more useful factors. Perhaps there is some benefit to sieving 300000 to 3000000, but it is much smaller than the benefit of sieving with the main effort.

  10. #90
54450-54500 range done - 6 factors found & submitted via the sieve page.

I think I have done something wrong with the sieve page.

    factors found:
    54458662105433 | 33661*2^564648+1
    54467953065341 | 55459*2^2230798+1
    54472098648923 | 5359*2^949302+1
    54472893350251 | 24737*2^606727+1
    54478211796511 | 22699*2^1154278+1
    54497220425873 | 4847*2^1575231+1


    Thnx!
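For what it's worth, factor lists like the one above can be double-checked locally with modular exponentiation: p divides k*2^n+1 exactly when k*2^n is congruent to -1 (mod p). A small Python sketch:

```python
# Each reported factor can be checked independently: p divides k*2^n+1
# exactly when (k * 2^n) % p == p - 1. Python's three-argument pow()
# computes 2^n mod p quickly even for n over a million.

def divides(p, k, n):
    """True if p divides k*2^n + 1."""
    return (k * pow(2, n, p)) % p == p - 1

# The six (p, k, n) triples reported above:
factors = [
    (54458662105433, 33661, 564648),
    (54467953065341, 55459, 2230798),
    (54472098648923, 5359, 949302),
    (54472893350251, 24737, 606727),
    (54478211796511, 22699, 1154278),
    (54497220425873, 4847, 1575231),
]
print(all(divides(p, k, n) for p, k, n in factors))
```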

  11. #91
42600-42650 range complete => 6 factors found (submitted via the sieve page)

    42602795068649 | 22699*2^1609678+1
    42606060519859 | 55459*2^1492510+1
    42606124542061 | 28433*2^417553+1
    42631715602241 | 67607*2^1968651+1
    42638849820577 | 22699*2^1836838+1
    42646326562783 | 55459*2^2448058+1

  12. #92
    Moderator ceselb's Avatar
    Join Date
    Jun 2002
    Location
    Linkoping, Sweden
    Posts
    224
What happened? Was it something like this?

  13. #93
No, it wasn't that...
I logged in:
username
password

I waited 2-5 minutes (I saw "preferences").

When I returned,

I submitted the results.

Then I saw "Log In" instead of "preferences".

    Last edited by cedricvonck; 03-02-2004 at 12:46 PM.

  14. #94
    Moderator ceselb's Avatar
    Join Date
    Jun 2002
    Location
    Linkoping, Sweden
    Posts
    224
    well, I don't know then. Try resubmitting today and tomorrow. If it doesn't work by then, contact louie (jjjjL on the forums)

  15. #95
OK, thank you.
IMHO, there was (is) a cookie problem?

    0 factors were new.
    12 results verified.
    Last edited by cedricvonck; 03-02-2004 at 03:07 PM.

  16. #96
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Originally posted by cedricvonck
No, it wasn't that...
I logged in:
username
password

I waited 2-5 minutes (I saw "preferences").

When I returned,

I submitted the results.

Then I saw "Log In" instead of "preferences".

The same thing happened to me twice within the last week. But I later checked the factors I submitted (through Mike's individual stats page), and everything seemed fine.

In fact, it seems fine for you too. All 12 factors you mention above seem to have been submitted. Unluckily, two of them, in the 54450-54500 range, were duplicates, but that's another issue.


On the other hand, it would be much better if this issue with logging in were resolved. It really makes one feel like something is wrong.

