
Thread: Max P value

  1. #1
    Senior Member
    Join Date
    Jan 2003
    Location
    U.S
    Posts
    123

    Max P value

    Anyone know the max p value that SoBSieve can handle?
    I think we may be reaching it soon...

  2. #2
    NBeGon may be limited to 4,503,599,627,370,496.
    SoBSieve can likely run to 1,152,921,504,606,846,976.

    Both these limits are more than enough to sieve more or less indefinitely.

    -Louie
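
    As an aside (my observation, not something stated in the thread): both quoted limits are exact powers of two, which hints at where they come from - 2^52 matches the integer precision of an IEEE 754 double, while the reason for 2^60 is a guess. A quick check:

    ```python
    # Both limits quoted above are exact powers of two (my observation,
    # not stated in the thread). 2^52 matches the mantissa precision of an
    # IEEE 754 double; the origin of the 2^60 limit is a guess on my part.
    nbegon_limit = 4_503_599_627_370_496
    sobsieve_limit = 1_152_921_504_606_846_976

    assert nbegon_limit == 2**52
    assert sobsieve_limit == 2**60
    print(f"NbeGon limit   = 2^{nbegon_limit.bit_length() - 1}")    # 2^52
    print(f"SoBSieve limit = 2^{sobsieve_limit.bit_length() - 1}")  # 2^60
    ```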

  3. #3
    Only 6 or 7 (base 10) digits left!?!?!?!?


  4. #4
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    We're already at 800,000,000,000, ain't we?
    So *only* 4 or 7 digits left, respectively. We urgently need an improvement - in like 5 years...

  5. #5
    Member
    Join Date
    Dec 2002
    Location
    new york
    Posts
    76
    On my system, NbeGon 0.10 starts rehashing at 2.25P (peta) and exits at 2.3P.

    SoBSieve's rate climbs suspiciously after 4E (exa) and it overflows somewhere around 10E.
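
    Purely speculative on my part (the poster doesn't say this): the observed break points also sit close to powers of two, which would fit the same kind of internal representation limits:

    ```python
    # Speculative mapping of the observed break points to powers of two;
    # these are my guesses, not claims from the thread.
    print(2**51)  # 2251799813685248   ~= 2.25P, where NbeGon 0.10 starts rehashing
    print(2**63)  # 9223372036854775808 ~= 9.2E, near SoBSieve's ~10E overflow

    assert abs(2**51 - 2.25e15) / 2.25e15 < 0.001
    assert abs(2**63 - 9.2e18) / 9.2e18 < 0.01
    ```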

  6. #6
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    How long are we planning to sieve?

    I feel like we should continue sieving even after SoB reaches max n = 3 million. The 3 - 20 million range is huge, and sieving at 805G I am still removing numbers at more than 100 times the speed at which I could prp test a number within that range.

    I guess Louie could easily pull out some numbers (up to 3.5 m, for example) from the lower bound of the SoB.dat file, and when we reach max n = 3.5 m, pull out more numbers and update SoB.dat to cover 4 - 20 m, etc.

    I know, of course, it depends on the relative speeds of sieving vs. prp testing. But with the current sieve client speeds, I feel it would make sense to go forward a bit more.

    So, two questions:

    To Louie: What is your game plan?

    To all fellow sievers: for how much longer (and with how much of your resources) would you be willing to commit if the sieving effort were to continue for, say, the next 12 months?

    Regards,

    Nuri

  7. #7
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    Unaware of Louie's exact plans (at least officially), I have to confine myself to answering the second question - for myself:

    I signed up for this project after RC5-64 was completed and I was looking around for a suitable successor project. I'll work for it as long as I'm interested - and I stayed @ RC5-64 for several years. So I guess I won't lose interest in this project that early.
    As for whether sieving or prp testing is preferable for me, I'll do whatever Louie (or other people with the proper insight) thinks is best for the project, as they know pretty well what they're talking about.
    (at least compared to me)

    Plus, I don't share resources across several projects if it's avoidable. So I'll put as much computing power into this project as I can reasonably acquire.

  8. #8
    Moderator ceselb's Avatar
    Join Date
    Jun 2002
    Location
    Linkoping, Sweden
    Posts
    224
    Unless I find some other interesting project, I'll continue to sieve.

  9. #9
    Right now, I'd say we're under-sieved. It's nothing critical, but I think that any amount of sieving could be justified for at least the next 2-3 months.

    Let's do a quick analysis: my dual Celeron 300A@504MHz.

    It just got a test at 1 pm two days ago, and it's only 53% done. A little math shows what a full test on a single proc is taking:

    (1/.53) * (2*24+7) = ~103 hours / test.

    For happy round numbers, and since 504MHz is pretty arbitrary anyway, let's say that 100 hrs is the average test time. This ignores the fact that tests will take EVEN longer when we start testing above n = 3 million... but anyway...

    Now let's compare to the sieving I'm doing right now. The newest siever is running at 117k p/s on my Athlon 1.33GHz. Just to make it fair, I'll assume it would be 1/3 that speed (40k p/s) on one of the Celeron procs. In the range I'm currently sieving, the density of factors is found like this:

    total factors / (range high - range low)

    43 / (613406406359 - 612237314749) = ~3.7 x 10^-8 or 37 factors / billion

    Now you can use the rate and the density to calculate a rough removal rate:

    ( 37 factors / billion ) * ( 40000 x 10^-9 billion / sec) = 0.00148 removals / sec

    converting to hours for easy comparison, we have:

    0.00148 * 3600 sec/hr = 5.328 factors / hour

    Now realize that the rates are so drastically different that they aren't even expressed in the same fashion. We talk about "hours per test" vs. "factors per hour"... there's definitely a reason for that. If we invert the prp test number, we get:

    100 hours / test = 0.01 prp tests / hour.

    Just to be as fair as possible, there should probably be a small "handicap" applied to factoring to account for the fact that some factors do not actually remove tests (because they are for a number that already has a known factor). Conservatively, that would only be around a 10% decrease, making the corrected factoring speed ~4.8 crucial factors / hour.

    So factoring at the 600G level is about 48,000% more effective than prp testing. This is not taking into account the obvious slowdown in prp speed as n grows, OR the fact that a factor is 100% proof of a composite, while a prp test may eventually need to be redone to verify residues against rare computation errors.

    So in the future, two things will be working against each other... finding new factors will get harder as their density quickly depletes... and prp tests will get slower as n climbs. However, for the foreseeable future, putting whatever resources you want into sieving makes sense. The question of "should we still be sieving?" is WAY premature at this point. The answer is a resounding "Yes!". The question should probably be raised again when the factor removal rate falls below 1 factor / day, which will likely not happen until around 25,000-50,000G or so.

    Hope that helps those of you who are interested in the heuristics of determining optimal sieve depths. Many things could change the above limits, though... if a faster sieve comes out, or a k value is eliminated, or SB itself speeds up, all of what I calculated above would have to be redone.

    -Louie
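
    The arithmetic in the post above can be replayed in a few lines. All inputs are the figures quoted there; the 40k p/s Celeron rate and the 10% handicap are the poster's assumptions, not measurements:

    ```python
    # Replaying the sieve-vs-PRP comparison, using the figures quoted in the post.
    factors = 43
    p_low, p_high = 612_237_314_749, 613_406_406_359
    density = factors / (p_high - p_low)     # ~3.7e-8 factors per p sieved
    density_rounded = 37e-9                  # the quoted "37 factors / billion"

    sieve_rate = 40_000                      # p/s assumed for one Celeron proc
    removals_per_hour = density_rounded * sieve_rate * 3600
    crucial_per_hour = removals_per_hour * 0.9   # ~10% duplicate-factor handicap

    prp_tests_per_hour = 1 / 100             # one test every ~100 hours

    print(f"{removals_per_hour:.3f} factors/hour")             # 5.328
    print(f"{crucial_per_hour:.1f} crucial factors/hour")      # 4.8
    print(f"sieving ~= {crucial_per_hour / prp_tests_per_hour:.0f}x one PRP test")
    ```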

  10. #10
    Member
    Join Date
    Sep 2002
    Location
    London
    Posts
    94
    OK, I think there is one problem in our analysis: our factor removal rate is affected by our sieving range. We are now sieving n = 3-20 million, and if we increased this to 3-100 million we would remove factors a lot faster. But we will also find several primes under n = 8 million, so some of the factoring is wasted even when we only sieve up to 20 million.
    If we compare how many numbers we are removing in the range 3-5 million, the analysis is quite different: our rate there is only 2/17 * the current rate. But it seems that we should sieve at least up to 10T before prp testing reaches n = 3 million, and continue sieving all the time. We have to remember that sieving in the future will be faster than it is now, and it will take years to get over 10 million in prp testing.

    Yours,

    Nuutti
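
    The 2/17 figure above follows directly from the widths of the n ranges (a sketch; it assumes removals are spread roughly uniformly over n, which is what the argument implicitly does):

    ```python
    from fractions import Fraction

    # Share of sieve removals that land in n = 3-5 million, out of the
    # full 3-20 million range, assuming removals are roughly uniform in n.
    near_term = Fraction(5 - 3, 20 - 3)
    print(near_term)         # 2/17
    print(float(near_term))  # ~0.118: only ~12% of factors help the next 2M of n
    ```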

  11. #11
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Thanks for the responses.

    It's nice to hear that we agree we're only at the beginning of the sieving effort.

  12. #12
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    For comparative purposes, can someone remind me exactly how deep the existing numbers (n < 3M) were sieved? I know that k=4748 went to about 1.5T. How about the rest (which Louie did)?

    I realise the new sieves are much faster than NewPGen was, but I'd just like to get a feel for the potential difference.

    As to the question of when to stop sieving, I assume you'll now keep the factor submission page up for good. Now that I've put together a (fairly) simple batch file that automates the distribution of p ranges to my small number of sieving PCs, I plan to keep them going indefinitely - they're not too fast, cumulatively ~150K p/s, but since they can't get to the internet (so can't run SB), they might as well be doing something! When everyone else has given up, I'll still be picking out one factor per week!

  13. #13
    Senior Member
    Join Date
    Jan 2003
    Location
    U.S
    Posts
    123
    quote:
    __________________________________

    Only 6 or 7 (base 10) digits left!?!?!?!?
    __________________________________

    edit: Only 5 or 6 (base 10) digits left

    A P.S. that is MUCH longer than the message:

    Only 5 days ago, when I reserved my 10000 G to 10001 G range for testing (which only has 1 factor), the idea of everyone sieving to 10T seemed completely absurd. At that time, my sort-of outdated PIII was crunching a mere 8000 p/s and my antique collectible P-90 was crunching an even more pathetic 670 p/s. Now things have drastically changed, with my total sieving rate being a (then) shocking 65,000 p/s. I realize that a sieve limit of up to 12 T is possible, especially if the sieving versions keep improving. So, the main point of this P.S. is to keep up the sieving, and we should try to make 10T our goal.
    btw, I'll update this "countdown" to reaching the max p value regularly (though I don't know how often that may be)
