
Thread: The dat file p<25T vs p<300T

  1. #1
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331

    The dat file p<25T vs p<300T

    There was a comment made in the sieve section about changing the dat file to reflect all p<300T instead of the current p<25T.

    What's this all about and will/does it need to be updated?

  2. #2
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    The dat file was last updated when we found our last prime. The 90% sieve point was at 162T at that time. I think it's a bit early to update the file.


    On the other hand, I guess they're talking about the results.txt file. This is important only if you are regularly downloading that file for factoring reasons. The purpose, I guess, is to lower the bandwidth needed for the process. The current size of the file (zipped) is 1.56 MB.


    I dunno about going up to 300T, but I think we can safely increase the lower limit of the results.txt file to 132T* (or even 100T is just fine). Roughly 45% of the factors in the results.txt file are in the range 25T<p<100T, and there is an additional 10% between 100T and 132T.

    * 132T is the 100% sieve point as of the last dat file.

    If we want to increase the lower limit to 300T, that's also fine. We'd get rid of 80% of the results.txt file size. On the other hand, we'd have to update the dat file to avoid some unnecessary P-1 tests being performed. (The impact would be less than 1%, but each unnecessary test is a waste of computer time anyway.)
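
    For anyone who wants to trim their local copy rather than wait for a new lower limit, here is a minimal sketch of the filtering idea. It assumes each data line of results.txt starts with the factor p in decimal followed by whitespace (the real layout may differ, so adjust the parsing), and the output filename is just a placeholder.

        # Sketch: keep only factors with p above a chosen lower limit.
        # Assumption: each data line starts with the factor p, then whitespace;
        # other lines (headers/comments) are copied through unchanged.

        LOWER_LIMIT = 132_000_000_000_000  # 132T; use 300_000_000_000_000 for 300T

        kept = dropped = 0
        with open("results.txt") as src, open("results_trimmed.txt", "w") as dst:
            for line in src:
                fields = line.split()
                if not fields or not fields[0].isdigit():
                    dst.write(line)  # not a factor line, keep as-is
                    continue
                p = int(fields[0])
                if p >= LOWER_LIMIT:
                    dst.write(line)
                    kept += 1
                else:
                    dropped += 1

        print(f"kept {kept} factors, dropped {dropped} below {LOWER_LIMIT}")

    Running it against a current results.txt would also give the exact percentage removed for whichever lower limit is chosen, rather than the rough 45%/10%/80% estimates above.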

  3. #3
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Thanks Nuri,

    Let's just hope for that next prime, then. My bet, although optimistic, is by 7.25M.
