Page 2 of 4 (Results 41 to 80 of 131)

Thread: Sieve Double Checking (1<n<3M)

  1. #41
    ginaguy18p0r
    Guest
    This post is GONE.

    Edited by admin for inappropriate content.

  2. #42
    Moderator Joe O
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    Moo_the_cow,
    It might be better if we all used Nuutti's file for the double checking. I know, I'd like to play around with creating my own file too, but I decided it would be better for the project to use Nuutti's file. That way we would all be consistent.
    Joe O

  3. #43
    110 - 200G started

  4. #44
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752

    Am I missing something?

    Just out of curiosity, I checked Nuutti's SoB.dat file (which, I guess, has the candidates left after sieving up to 5G) and compiled the factors-found files for 5G to 110G.

    The SoB.dat file has 146,586 candidates remaining for 12 k values, for 1 < n < 3,000,000. The distribution of the candidates seems to fit with the proth weights.

    However, factors found between 5G - 110G shows a totally different profile.

    First of all, there are no factors found for 6 of the 12 ks.
    Secondly, k=21181 and k=33661 have roughly 7 times fewer factors than I would normally expect.

    There are 3,379 factors for 5G-110G, but unless my reasoning is off, there should have been a total of ~9,060 factors across all 12 of the ks, distributed roughly in line with the proth weights (or with the distribution of candidates in the 5G SoB.dat file).

    Is there something I am missing? Can anyone please explain what happened? (why are there no factors for 6 ks, etc.)
    Attached Images

  5. #45
    Moderator Joe O
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    The k's with no factors are those for which the exponents (n's) would be odd.
    Is everyone using NbeGon?
    Joe O

  6. #46
    Yes, I'm using NbeGon. I just checked the SoB.del files in progress for 110-200G and see the same problem. Is it a quirk in the siever, or a problem with the .dat file?

  7. #47
    Moderator Joe O
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    I don't know. I'm rerunning with SobSieve 1.24 to see if it picks up more factors than NbeGon did. I checked out the first few values for each k and they looked OK. Tomorrow I'll do a more thorough check.

    Halon50, what SoB.dat file are you using? Nuutti's, I hope? Joe.
    Joe O

  8. #48
    Yes, I am using the one posted in this thread by Nuutti.

    EDIT: I am restarting the 110-200G range using SoBSieve. NbeGon appears to have problems coping with ranges above 140G or so on my machines.
    Last edited by Halon50; 02-24-2003 at 12:33 AM.

  9. #49
    Moderator Joe O
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    Halon50, Thanks for the reply. Let us know if you get any odd exponents, or for that matter, any new ones not reported by NbeGon! Thanks, Joe.
    Joe O

  10. #50
    Sure thing. There is definitely a problem somewhere. Here are some results of SoBSieve and NbeGon side by side, from what I have so far:

    NbeGon 0.10
    Code:
    110124872311 | 22699*2^75862+1
    110141391641 | 55459*2^1284646+1
    110187393007 | 5359*2^2696542+1
    110401926199 | 19249*2^1801922+1
    110459686193 | 55459*2^759010+1
    110504531873 | 5359*2^1558702+1
    110650187057 | 19249*2^2069978+1
    110883693353 | 22699*2^1774630+1
    110905272631 | 5359*2^657582+1
    111008295841 | 55459*2^842554+1
    111130595831 | 55459*2^2380846+1
    111232165727 | 5359*2^83206+1
    111416448439 | 19249*2^1402718+1
    111479364287 | 5359*2^1286982+1
    111690703367 | 5359*2^2141902+1
    111783172703 | 5359*2^676942+1
    111807521113 | 55459*2^2502994+1
    111950247623 | 5359*2^804262+1
    111968902271 | 22699*2^951382+1
    112003105783 | 19249*2^1442786+1
    SobSieve 1.24
    Code:
    110187393007 | 5359*2^2696542+1
    110194466563 | 21181*2^1368572+1
    110280749687 | 33661*2^2263200+1
    110290454363 | 21181*2^13940+1
    110310414127 | 21181*2^2067644+1
    110379014047 | 21181*2^1030340+1
    110401926199 | 19249*2^1801922+1
    110504531873 | 5359*2^1558702+1
    110517704443 | 21181*2^175484+1
    110532992543 | 21181*2^1168940+1
    110587726357 | 5359*2^256590+1
    110623372469 | 33661*2^1983816+1
    110650187057 | 19249*2^2069978+1
    110672279719 | 33661*2^1940280+1
    110748547333 | 33661*2^606336+1
    110905272631 | 5359*2^657582+1
    110924314247 | 21181*2^2463068+1
    110959837349 | 21181*2^2186300+1
    111052863797 | 33661*2^1945944+1
    111061477931 | 5359*2^2291302+1
    111204786283 | 21181*2^382340+1
    111232165727 | 5359*2^83206+1
    111321631303 | 33661*2^2752800+1
    111410601251 | 21181*2^1988828+1
    111416448439 | 19249*2^1402718+1
    111438448507 | 5359*2^2979822+1
    111479364287 | 5359*2^1286982+1
    111504162893 | 5359*2^624510+1
    111564158773 | 5359*2^383326+1
    111579443959 | 33661*2^1686024+1
    111612066089 | 33661*2^1950576+1
    111650361041 | 33661*2^407880+1
    111690703367 | 5359*2^2141902+1
    111734982563 | 33661*2^801240+1
    111759086029 | 33661*2^43416+1
    111783172703 | 5359*2^676942+1
    111836892373 | 21181*2^330740+1
    111950247623 | 5359*2^804262+1
    112003105783 | 19249*2^1442786+1
    Neither appears to be covering the full range of k values. The few factors that match between the two clients are for k=5359 and k=19249 only.

    I hope these results aren't indicative of what's happening in the normal Sieve project!

  11. #51
    Moderator Joe O
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    Originally posted by Halon50

    I hope these results aren't indicative of what's happening in the normal Sieve project!
    That's my fear also. I think it's high time to get the big guns in here!
    Joe O

  12. #52
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    This is very strange.

    Probably there is a problem with the SoB.dat file, but looking into it everything seems normal. I don't really even want to think about the probability of both of the clients giving false results.

    Anyway, as you mentioned Halon50, there is a pattern in your factors above (as far as I can see).

    (0/0) For k=4847: Neither NbeGon 0.10 nor SobSieve 1.24 found a single factor.
    (8/13) For k=5359: This is strange. NbeGon 0.10 found 8 factors, while SobSieve 1.24 found all of those 8 and an additional 5.
    (0/0) For k=10223: Neither NbeGon 0.10 nor SobSieve 1.24 found a single factor.
    (4/4) For k=19249: Both NbeGon 0.10 and SobSieve 1.24 found exactly the same 4 factors.
    (0/11) For k=21181: NbeGon 0.10 found no factors at all, while SobSieve 1.24 found 11 factors.
    (3/0) For k=22699: NbeGon 0.10 found 3 factors, while SobSieve 1.24 found no factors at all.
    (0/0) For k=24737: Neither NbeGon 0.10 nor SobSieve 1.24 found a single factor.
    (0/0) For k=27653: Neither NbeGon 0.10 nor SobSieve 1.24 found a single factor.
    (0/0) For k=28433: Neither NbeGon 0.10 nor SobSieve 1.24 found a single factor.
    (0/11) For k=33661: NbeGon 0.10 found no factors at all, while SobSieve 1.24 found 11 factors.
    (5/0) For k=55459: NbeGon 0.10 found 5 factors, while SobSieve 1.24 found no factors at all.
    (0/0) For k=67607: Neither NbeGon 0.10 nor SobSieve 1.24 found a single factor.
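
    For anyone who wants to reproduce a tally like the one above, the breakdown can be scripted from the raw factor lists. Below is a minimal Python sketch (my own illustration, not anything the project provides); it assumes factor lines in the "p | k*2^n+1" form shown in the code blocks above, and the file names are made up.
    Code:
    import re
    from collections import Counter

    # Hypothetical file names; each file holds lines like
    # "110187393007 | 5359*2^2696542+1"
    FILES = {"NbeGon 0.10": "nbegon_110G.txt", "SobSieve 1.24": "sobsieve_110G.txt"}

    LINE_RE = re.compile(r"^\s*(\d+)\s*\|\s*(\d+)\*2\^(\d+)\+1\s*$")

    for client, path in FILES.items():
        counts = Counter()
        with open(path) as f:
            for line in f:
                m = LINE_RE.match(line)
                if m:
                    p, k, n = (int(g) for g in m.groups())
                    counts[k] += 1          # tally factors per k for this client
        print(client)
        for k in sorted(counts):
            print(f"  k={k}: {counts[k]} factors")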

  13. #53
    Hi guys.

    Set nmin to something greater than 1 - I would recommend 100. All of your problems will go away!

    This is because the code has to work out, for each p, (1/2)^nmin mod p. My code definitely assumes that nmin>1.

    Regards,

    Paul.
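
    As an aside for anyone following along: the quantity Paul mentions is just the modular inverse of 2^nmin mod p. Here is a small Python sketch of that computation, purely as an illustration (this is not Paul's code, and the thread doesn't say exactly how the clients go wrong at nmin=1):
    Code:
    # (1/2)^nmin mod p, i.e. the inverse of 2^nmin modulo an odd prime p.
    def half_pow(nmin, p):
        inv2 = (p + 1) // 2      # 2 * (p+1)/2 = p + 1 = 1 (mod p), so this is 1/2 mod p
        return pow(inv2, nmin, p)

    # Example with a p taken from the factor lists earlier in this thread.
    print(half_pow(100, 110187393007))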


  15. #55
    By the way - this will also apply to the SoB.dat file that was created in the first place, which may explain why there are other problems.

    Sorry about that. Since the program was designed for sieving the Sierpinski problem for n=3M-20M, I did no testing with different values of k (I know it will fail where 3 divides k) or for this range.



    Regards,

    Paul.

  16. #56
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Thanks for the info, Paul. That explains everything.

  17. #57
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    Before Paul's suggestions, I had been trying a few experiments.

    I created a new sob.dat file using SobSieve 1.06 (but with nmin=1). For file creation, I sieved to 1G.

    I have then taken the resultant sob.dat, and tried NbeGon_010, SobSieve 1.22 and SobSieve 1.06.

    It would appear that only SobSieve 1.06 works correctly.

    NbeGon_010 and SobSieve 1.22 are not only missing whole k ranges (different in each case), but are also missing individual factors within the k ranges where they do find factors (factors which SobSieve 1.06 does find).

    Paul, I'll take your advice, and start again with nmin=100, and see what happens.
    Last edited by MikeH; 02-24-2003 at 02:52 PM.

  18. #58
    Moderator Joe O
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    So it will take 13 changes to Nuutti's file.

    Line 2:
    100 replaces 1

    After k=4847,
    111 replaces the two lines:
    1
    +110

    After k=5359,
    366 replaces the two lines:
    1
    +365

    After k=10223,
    101 replaces the two lines:
    1
    +100

    After k=19249,
    266 replaces the two lines:
    1
    +265

    After k=21181,
    620 replaces the two lines:
    1
    +619

    After k=22699,
    190 replaces the two lines:
    1
    +189

    After k=24737,
    607 replaces the two lines:
    1
    +606

    After k=27653,
    177 replaces the two lines:
    1
    +176

    After k=28433,
    265 replaces the two lines:
    1
    +264

    After k=33661,
    168 replaces the two lines:
    1
    +167

    After k=55459,
    706 replaces the two lines:
    1
    +705

    After k=67607,
    531 replaces the two lines:
    1
    +530

    The new file will be 12 lines shorter.
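
    For anyone who would rather script these edits than make them by hand, here is a rough Python sketch. It assumes the layout implied by the description above: nmin on line 2, and each k section starting with an absolute n followed by "+delta" lines, so a leading "1" and the "+delta" right after it collapse into their sum once nmin becomes 100. Treat it as an illustration only, and check the output against the 13 changes listed above (it should come out 12 lines shorter).
    Code:
    # Raise nmin from 1 to 100 in a SoB.dat-style file, per the layout described
    # in the post above (not an official format specification).
    lines = open("SoB.dat").read().splitlines()
    lines[1] = "100"                       # line 2 holds nmin

    out, i = [], 0
    while i < len(lines):
        # A bare "1" followed by a "+delta" line is the n=1 candidate plus the
        # jump to the next candidate; with nmin=100 they merge into one value.
        if lines[i] == "1" and i + 1 < len(lines) and lines[i + 1].startswith("+"):
            out.append(str(1 + int(lines[i + 1][1:])))
            i += 2
        else:
            out.append(lines[i])
            i += 1

    open("SoB_new.dat", "w").write("\n".join(out) + "\n")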

    I can email the changed file to anyone who wants it, just PM me with your email address. Perhaps Nuutti can host the changed file.

    EDIT:
    MikeH is creating a new file and has offered to make it available after he checks it out. See his post below.
    Last edited by Joe O; 02-24-2003 at 02:22 PM.
    Joe O

  19. #59
    Moderator Joe O
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    MikeH,
    You posted while I was preparing my post. Do you want to sieve to 10G or so and make the file available after you check it out?
    Last edited by Joe O; 02-24-2003 at 02:02 PM.
    Joe O

  20. #60
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    Joe O,

    No problem. I'll do that. I'll check the resultant file at 1G against my previous attempt. Hopefully the diffs should be simple, as you suggest.

    I'll then quickly give the same 3 clients a run through to see if the problems I observed are now resolved.

    Mike.

  21. #61
    Ok, I'll stop sieving here and wait for the new .dat file, then restart the 20-110G range.

  22. #62
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    OK, good news.

    I used 100<n<3M and sieved 0-1G with SobSieve 1.06. The resultant sob.dat file was as expected. I then tried sieving from 1G onwards with SobSieve 1.06, 1.22 and 1.24, and NbeGon 0.10, then compared the first 50 factors from each client - all the same.

    I've now sieved (with SobSieve 1.24) up to 5G (and have submitted the factors for 1G - 5G).

    I've then generated a new SoB.dat file and compared it against the original 5G file from Nuutti. The changes are exactly as Joe predicted. This is really good news, and again shows that this is all now working fine.

    I'd like to attach the sob.dat file and the SobStatus.dat file (eliminated factors for 1G-5G), but even zipped, they are too big to post here.

    Nuutti, I will e-mail you a copy of the files, I hope you are able to host them. If anyone else can help host, drop me a private message.

    I will now go on and sieve 5G-20G.

    Mike.

    P.S. The client of choice (for p=5G) is SobSieve 1.24; at ~300 kp/s it is about 1.75x faster than NbeGon (I have an AMD XP 2100+).
    Last edited by MikeH; 02-25-2003 at 04:06 PM.
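
    Mike's cross-check (the same first 50 factors from every client) is also easy to script. A quick sketch, with made-up output file names, might look like:
    Code:
    # Compare the first 50 factor lines produced by several sieve clients.
    paths = ["sobsieve106.txt", "sobsieve122.txt", "sobsieve124.txt", "nbegon010.txt"]

    def first_factors(path, count=50):
        with open(path) as f:
            return [line.strip() for line in f if "|" in line][:count]

    reference = first_factors(paths[0])
    for path in paths[1:]:
        status = "match" if first_factors(path) == reference else "DIFFER"
        print(f"{path}: {status}")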

  23. #63
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    Ok, I just uploaded the file.
    Grab it here

  24. #64
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    5 - 20 MikeH Complete

    Reserving

    110 - 200 MikeH

    I've attached the factors for 5G-20G.

    In order to try to replicate the 3M<n<20M sieving effort, when 200G is complete, I can generate a new sob.dat, which can be used from then onwards.

    Also from 200G onwards, hole finding will be possible, since Louie's daily updated results.txt contains all values of n (which I only noticed today whilst adding k and n discrimination to my sieve hole finder).

    Mike.
    Attached Files
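
    As an aside on the hole finding Mike mentions: presumably a "hole" is a stretch of p with no submitted factors inside a range that is supposed to be done. Purely as an illustration (the results.txt line format and the gap threshold are my assumptions), a finder could be as simple as:
    Code:
    # Flag unusually large gaps between consecutive factor p values; a big gap
    # inside a supposedly finished range suggests a sieving hole.
    # Assumed line format: "p | k*2^n+1", as in the factor lists in this thread.
    GAP_LIMIT = 100_000_000          # 0.1G; arbitrary threshold for this sketch

    ps = sorted(int(line.split("|")[0]) for line in open("results.txt") if "|" in line)
    for prev, cur in zip(ps, ps[1:]):
        if cur - prev > GAP_LIMIT:
            print(f"possible hole: {prev} .. {cur} ({(cur - prev) / 1e9:.2f}G with no factors)")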

  25. #65
    Moderator Joe O
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643

    Status

    Just thought that I'd gather the following information together:

    0 - 5 MikeH [complete]
    5 - 20 MikeH [complete]
    20 - 110 Halon50
    110 - 200 MikeH
    200 - 250 frmky


    5000 - 5050 Joe O

    Are there any others?

    I'll continue to edit this post till the time limit and then repost below. Please check all newer posts just in case.
    Last edited by Joe O; 02-27-2003 at 09:12 AM.
    Joe O

  26. #66
    Thanks guys! I'll get cracking on the 20-110G range tonight.

  27. #67
    I'll take the 200-250G range.

    Greg

  28. #68
    200-250G done and submitting. The factors are zipped and attached to this message. I'll take 250-300G now.
    Attached Files

  29. #69
    250-300G done, submitted, and attached. Working on 300-350G now. Is anyone else still working on these?

    Greg
    Attached Files

  30. #70
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    Originally posted by frmky

    Is anyone else still working on these?
    I'm still working on the 110-200 range (on a slow PC). I'm currently treating the 3M<n<20M sieving as higher priority.

    Please keep up the good work.

    Mike.

  31. #71
    Originally posted by MikeH

    I'm still working on the 110-200 range (on a slow PC). I'm currently treating the 3M<n<20M sieving as higher priority.
    Just checking. Thanks!
    Greg

  32. #72
    Currently submitting 20-110G. There are thousands of factors here, so it's taking a while to chunk the data up and submit.

    While I wait for the sieve submission, I'll try to post zipped portions of the data files. Here's 20-40G:
    Attached Files

  33. #73
    40-75G:
    Attached Files

  34. #74
    75-110G:
    Attached Files

  35. #75
    Moderator Joe O
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643

    Status

    Just thought that I'd gather the following information together:

    0 - 5 MikeH [complete]
    5 - 20 MikeH [complete]
    20 - 110 Halon50 [complete]
    110 - 200 MikeH
    200 - 250 frmky [complete]
    250 - 300 frmky [complete]
    300 - 350 frmky [complete]
    350 - 450 frmky


    5000 - 5500 Joe O

    Are there any others?

    I'll continue to edit this post till the time limit and then repost below. Please check all newer posts just in case.
    Last edited by Joe O; 03-05-2003 at 08:57 AM.
    Joe O

  36. #76
    Whew, just finished up submission.

    In the 20-110G range:

    9898 factors found
    8001 were new factors (not counting the 20-21G range, for which I lost the statistics on the 322 factors submitted)


    I'll resume sieving once y'all set up and post a new SoB.dat file for factors >200G.

  37. #77
    300-350 is done. I'll do 350-450 now.

    Greg
    Attached Files

  38. #78
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    I'm taking 450-500

  39. #79
    Moderator Joe O
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643

    Halon50,

    I'm not so sure that there will be a new file. From a post by Phil (Carmody):

    Originally posted by jjjjL

    This takes the sieve file from ~815000 n values to 746961 values. This is a linear reduction of 8.3%. However, I believe the speed of the sieve is related to the sqrt(n)
    ...

    Phil's reply:

    The n in that is the whole range, not the number of candidates remaining. So it won't speed up anything apart from the file load part of the restart times.

    So for the 3M<n<20M effort Louie said:

    ahh, i had a feeling the speed dependence might be on the range of n and not the # of candidates. in that case, the sieve file will probably not be updated too often.

    What does everyone else think?

    Edit: The current range is 2999900; we would have to reduce it to 2429919 in order to achieve a 10% speed-up.
    Last edited by Joe O; 03-07-2003 at 03:02 PM.
    Joe O
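
    A quick sanity check on that 2429919 figure, under the sqrt(range) model and reading "10% speed up" as 10% less time per p (a Python sketch, nothing more):
    Code:
    old_range = 3_000_000 - 100      # current nmax - nmin = 2999900
    time_ratio = 0.9                 # 10% less time per p

    # If time per p scales with sqrt(nmax - nmin), cutting time by 10%
    # means shrinking the range by a factor of 0.9^2 = 0.81.
    print(int(old_range * time_ratio ** 2))   # 2429919, matching Joe's figure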

  40. #80
    The speed of the sieve is proportional to the square root of the range, maxn-minn.

    The way that it works is that for each p, it finds the n values - if any - in the range nmin to nmax that that p will remove. It then looks to see if they are present in the sieve, and if so they are removed.

    This means that a sieve containing just two values, n=3 million and n=20 million, will run as fast as one containing lots of other values (well, that is not quite true as other optimisations would kick in, but for the sake of this argument that is irrelevant).
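
    To make that description concrete, here is a rough Python sketch of the per-p step (my own illustration of the idea, not the actual client code; it finds one removable n in the window, where the real sievers find them all). The baby-step/giant-step search is what makes the work per p grow like sqrt(nmax - nmin), and the (1/2)^nmin term from Paul's earlier post shows up when the window is shifted to start at nmin.
    Code:
    from math import isqrt

    def removable_n(p, k, nmin, nmax):
        # We need k*2^n + 1 = 0 (mod p), i.e. 2^n = -1/k (mod p).
        # Substitute n = nmin + y:  2^y = t, with t = (-1/k) * (1/2)^nmin (mod p).
        t = (-pow(k, -1, p)) * pow(pow(2, nmin, p), -1, p) % p
        if t == 1:
            return nmin                       # y = 0 solves it
        m = isqrt(nmax - nmin) + 1            # roughly sqrt of the window size

        # Baby steps: remember t * 2^j for j = 0 .. m-1 (keep the largest j per value).
        baby, cur = {}, t
        for j in range(m):
            baby[cur] = j
            cur = cur * 2 % p

        # Giant steps: look for 2^(i*m) = t * 2^j, which gives y = i*m - j.
        giant = pow(2, m, p)
        cur = giant
        for i in range(1, m + 1):
            if cur in baby:
                y = i * m - baby[cur]
                if y <= nmax - nmin:
                    return nmin + y
            cur = cur * giant % p
        return None                           # p removes nothing in this window

    # Check against a factor from the lists above: 110187393007 | 5359*2^2696542+1
    n = removable_n(110187393007, 5359, 100, 3_000_000)
    print(n, (5359 * pow(2, n, 110187393007) + 1) % 110187393007 == 0)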

    Regards,

    Paul.

    BTW I see that exponents >3 million are now being tested - congratulations! I have every hope that SoB will soon find one of the largest primes known to man. SoB may also knock a Mersenne off the top spot, which would be a real coup.
