Page 4 of 4
Results 121 to 141 of 141

Thread: Sieve coordination discussion

  1. #121
    Senior Member
    Join Date
    Apr 2004
    Location
    Florianopolis - Santa Catarina - Brazil
    Posts
    114
I am using proth_sieve (cmov version) for sieving. Which result file should I send to SoB: factexcl.txt, factrange.txt, or both?

  2. #122
The only really necessary file is fact.txt.
If you don't have one, it means you haven't found any unique factors yet; keep on sieving, and one will appear eventually!
Generally, I think it's a good idea (for completeness) to also include the factors from factexcl.txt and factrange.txt, but these won't save any tests...
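For anyone scripting their submissions, here is a minimal sketch of merging the three result files while keeping only the first factor seen per k/n pair. It assumes the "p|k*2^n+1" line format these files use; the sample lines reuse factors quoted later in this thread, plus one hypothetical duplicate.

```python
# Minimal sketch: merge sieve result files, keeping one factor per k/n pair.
# Assumes each line has the "p|k*2^n+1" format written by proth_sieve;
# the sample lines below are illustrative, not a real submission.

def unique_factors(lines):
    """Return lines whose k/n pair has not been seen before."""
    seen = set()
    out = []
    for line in lines:
        line = line.strip()
        if "|" not in line:
            continue  # skip blanks and malformed lines
        p, expr = line.split("|", 1)
        k, rest = expr.split("*2^", 1)
        n = rest.split("+", 1)[0]
        key = (int(k), int(n))
        if key not in seen:
            seen.add(key)
            out.append(line)
    return out

# Example: a duplicate k/n pair (as from a second file) is dropped.
merged = unique_factors([
    "4758824368237|22699*2^2421118+1",
    "699598206253927|4847*2^12764247+1",
    "999999999999999|22699*2^2421118+1",  # hypothetical duplicate k/n
])
```

In practice you would feed it the concatenated lines of fact.txt, factexcl.txt, and factrange.txt.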

  3. #123
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
But your factrange.txt file can help us. Please send your factrange.txt files to factrange@yahoo.com. It would be nice, but not necessary, to put "factrange" somewhere in the subject line. Please zip it or compress it with your program of choice. We read .zip, .bz2, .bzip2, .7z, .rar, .tar, .gz, .gzip, etc., and of course .txt. Thank you.
    Joe O

  4. #124
    Senior Member
    Join Date
    Apr 2004
    Location
    Florianopolis - Santa Catarina - Brazil
    Posts
    114
    OK, Joe O, I will send it to you.

  5. #125
    Senior Member engracio's Avatar
    Join Date
    Jun 2004
    Location
    Illinois
    Posts
    237
    Joe O & vjs,


Weekly, I have been sending the factrange.txt and factexcl.txt to this URL: http://www.seventeenorbust.com/sieve/ (the sieve results submission page). Do you guys have access to those .txt files?

For your purposes, do you still need those .txt files sent to factrange@yahoo.com?


    e



  6. #126
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    e,

Yes, we still need them sent to factrange, and we do apply all SoB factors to the high-n dat on a roughly weekly basis; quite a few people are sending...

Any factors above 20M are actually ignored by the server, so if you are only sending factrange to SoB we can't get those >20M.

PM me with your e-mail and I'll put you on the e-mail list: graphs, stats, etc.

Hmm, have you been e-mailing those files to factrange? You might be one of the users we don't know the identity of???

If you only sent them to SoB and not factrange@yahoo.com, then deleted them, don't worry too much. A lot of the factors in factrange are actually excluded or duplicates. Last time I checked, only about 5-10% of those were unique; I'd have to check with Joe, who does a wonderful job with the db and can probably answer your question better than I can.
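To illustrate the >20M point above, here is a small sketch that splits factrange lines into what the server keeps and what only the factrange mailbox can use. It assumes the same "p|k*2^n+1" line format; the 20M cutoff comes from vjs's description, and the second p value below is hypothetical.

```python
# Sketch: partition factor lines by exponent n, using the n = 20M server
# cutoff described above. Line format "p|k*2^n+1" is assumed.

N_CUTOFF = 20_000_000

def split_by_n(lines, cutoff=N_CUTOFF):
    """Return (kept_by_server, ignored_by_server) based on n."""
    kept, ignored = [], []
    for line in lines:
        line = line.strip()
        if "|" not in line:
            continue
        n = int(line.split("*2^", 1)[1].split("+", 1)[0])
        (kept if n <= cutoff else ignored).append(line)
    return kept, ignored

kept, ignored = split_by_n([
    "699598206253927|4847*2^12764247+1",   # n = 12764247, below cutoff
    "123456789012345|67097*2^21156785+1",  # hypothetical p, n above cutoff
])
```

Anything landing in the second list would be lost if it were only submitted to the SoB page.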

  7. #127
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
VJS's 5-10% may be a little low. The last two files I processed were 4% and 40%:
Input 97 factors, added 4 factors.
Input 1963 factors, added 79 factors.
    Joe O

  8. #128
    Senior Member engracio's Avatar
    Join Date
    Jun 2004
    Location
    Illinois
    Posts
    237
    Joe O & vjs,


I was mainly sending the factrange.txt and factexcl.txt to SoB to keep track of the ranges I have completed. I'll keep on sending those .txt files to SoB and e-mail them to factrange@yahoo.com every couple of weeks. vjs, does that sound like a plan? As for the previous data, it's history.


    e

  9. #129
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    engracio,
    Sounds like a plan to me!
    Joe O

  10. #130
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
Joe's really the person to answer questions regarding where the factors are coming from.

But I do have a hacked-together stat of what's been going on; I hope this one is the corrected version.

    Code:
    Lower	Upper	k/n's	k/n's	Factors	Found 	Found 	Found
    (n>)	(n<)	Original	Remain	Found	by 10K	by 2.5T	by 3T+
    0	1	28187	27992	195	0	39	156
    1	3	53908	53787	121	0	23	98
    3	8	131984	131700	284	0	0	284
    8	10	53115	52991	124	0	0	124
    10	20	265330	264755	575	0	240	335
    20	30	648872	311109	337763	331271	3871	2621
    30	40	648663	311598	337065	330829	3604	2632
    40	50	649463	312441	337022	330923	3500	2599
    50	60	649117	319329	329788	318159	5780	5849
    60	70	648603	320929	327674	315355	6131	6188
    70	80	648590	321341	327249	310861	10282	6106
    80	90	648497	320569	327928	310689	11080	6159
    90	100	648923	321483	327440	310061	11187	6192
    							
    							
    0	1	28187	27992	195	0	39	156
    dat	%	100	99.31	0.69	0.00	0.14	0.55
    1	20	450429	449446	983	0	240	743
    dat	%	100	99.78	0.22	0.00	0.05	0.16
    20	50	1946998	935148	1011850	993023	10975	7852
    dat	%	100	48.03	51.97	51.00	0.56	0.40
    0	50	2479522	1466373	1013149	993023	11277	8849
    dat	%	100	59.14	40.86	40.05	0.45	0.36
    50	100	3243730	1603651	1640079	1565125	44460	30494
    dat	%	100	49.44	50.56	48.25	1.37	0.94
    0	100	5723252	3070024	2653228	2558148	55737	39343
    dat	%	100	53.64	46.36	44.70	0.97	0.69
The dat %'s are a comparison to the number of k/n pairs in the original dat.
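The dat % rows can be reproduced with a one-liner; a sketch using the 0<n<1M row of the table (28187 original pairs) as input. The function name `pct` is mine, not from the stats script.

```python
# Sketch: reproduce a "dat %" row from the table above. Each column is
# expressed as a percentage of the original k/n count for that n-range.

original = 28187          # 0 < n < 1M row, "Original" column
remain, found = 27992, 195

def pct(x, base=original):
    return round(100 * x / base, 2)

row = [pct(remain), pct(found)]  # the 99.31 / 0.69 dat % row
```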

Please note that the 10K, 2.5T, and 3T+ references don't accurately point out the true p-level.

Found by 2.5T (for example): we have not sieved to 100M with all p<2.5T. Also included are the factrange submissions as they come in, values from the main effort, other higher ranges which have been completed, "frodo's missed factor range", etc...

I'd like to add at this point that people should not consider sieving at higher T values; our best bet at the moment is to continue with the main effort until the limit of proth is reached, or we are advised otherwise by the project owners.

I think it's also important to point out the n<20M k/n pairs and factors. The original number is not the "original number of total k/n pairs possible"; it's just the number we started with, thanks to the main effort. If you compare 10M<n<20M to 20M<n<30M you can see how much deeper the main effort has sieved.

Also, a lot of those k/n pairs eliminated below 20M are actually ones included from the main effort. Those less than 1M are our effort, and factrange of course. (Joe, correct me here if my comments are incorrect.)

    Last edited by vjs; 01-28-2005 at 12:52 PM.

  11. #131
    Unholy Undead Death's Avatar
    Join Date
    Sep 2003
    Location
    Kyiv, Ukraine
    Posts
    907
    Blog Entries
    1

    not logged

    699598206253927|4847*2^12764247+1
    699599049182837|22699*2^15108238+1

well, everything is all right
    Last edited by Death; 02-01-2005 at 04:29 AM.
    wbr, Me. Dead J. Dona \


  12. #132
    TeamRetro Siever
    Join Date
    Oct 2004
    Location
    Lebanon, NH
    Posts
    39
    I was just reading the thread in the main forum regarding error testing.
    Do new users of the sieve need to start with a known small range to verify that their computers are working properly? Or is the math not as much a problem with sieving as it is in the PRP effort?

    Stromkarl

  13. #133
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    Originally posted by Stromkarl
    I was just reading the thread in the main forum regarding error testing.
    Do new users of the sieve need to start with a known small range to verify that their computers are working properly? Or is the math not as much a problem with sieving as it is in the PRP effort?

    Stromkarl
That's a good idea, but we haven't done it. Just reserve a decent-sized range and "start your engines". Welcome to sieving! And yes, it is not as much of a problem with sieving as in the PRP effort: here you may just miss a factor, and if you miss too many, gap analysis will find it and someone will post about it. In PRP, if you miss a prime, it will be a long time before doublechecking finds it.
    Joe O

  14. #134
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
Doublechecking is really something that Joe, myself, and a few nameless others are working on.

We are not truly doublechecking, but sieving 991<n<50M, whereas the main effort is currently sieving the second-pass range of roughly 1.5M<n<20M. In the past, several dats have been used: 300K<n<3M, 3M<n<20M, and 1M<n<20M. Our large dat spans all of these and is about 15% slower than the current dat. In doing so, our large range does in fact find factors missed by previous programs and dats, as well as factors below and above those ranges.

There have been quite a few missed factors found; nothing to get excited about, but enough to keep Joe and me going and interested. Currently the worst thing to do would be to recheck one of the old ranges using the current 1.5M<n<20M dat; your chances of finding a factor are better reserving a new range.

Welcome to sieving! There is a lot to learn in this project if you're interested, and a lot of questions still remain.

  15. #135
    TeamRetro Siever
    Join Date
    Oct 2004
    Location
    Lebanon, NH
    Posts
    39
    Originally posted by vjs
    <snip>
    If people are interested in where you and others are Mike has a fantastic page...

    http://www.aooq73.dsl.pipex.com/gaps...n3_20_ps0u.htm
The above link shows all the gaps..
    <snip>
I have gone to this site and it doesn't show anything for 678000-678500 yet. How often is it updated? Does it include all 78 factors I have already submitted from factexcl.txt, the 1 from factrange.txt that was below 1T, and the 1 I have submitted from fact.txt?

    I have also found 5 factors above 20T. I will submit them to the factrange[at]yahoo[dot]com email address when the range is done, if Joe_O and vjs still want them.

    Stromkarl

  16. #136
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
Try these pages for a more detailed look at the gaps...

http://www.aooq73.dsl.pipex.com/gaps...n3_20_p04u.htm

I have also found 5 factors above 20T.
Are you sieving in that range, or how did you find them? Joe, myself, and a few others are already working on all ranges less than 50T.

Please e-mail the factors, but if they are within the range of 20-23T it's already been sieved and we have found quite a few between 1M<n<20M.

If anyone is planning on resieving any range, please check with us first; we have a better system and may have done the range already... Thanks.

  17. #137
VJS, I think Stromkarl means he has found 5 factors with n > 20 million in his factrange.txt.

  18. #138
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
Ahhh...

20M and 20T are totally different. I hate these k, K, M, m, N, n, p, P, T, G, Y prefixes; people, including myself, get them confused.

Yes, if you found factors for numbers like


678XXXXXXXXXXXXXXXX | 67097*2^21156785+1


then yes, we still need them; send them to factrange@yahoo.com. Thanks for helping out.

We also need factors like

    678XXXXXXXXXXXXXXXX | 24737*2^991+1

as well, so just send your entire factrange.txt file.
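Before mailing a results file, each line can be sanity-checked: p divides k*2^n+1 exactly when k*2^n+1 ≡ 0 (mod p), which Python's three-argument pow makes cheap even for very large n. A sketch with tiny illustrative numbers (not real project factors); the function name `is_factor` is mine.

```python
# Sketch: verify that a reported factor p divides k*2^n + 1.
# pow(2, n, p) computes 2^n mod p without building the huge number.

def is_factor(p, k, n):
    return (k * pow(2, n, p) + 1) % p == 0

# Tiny illustrative check: 7 divides 3*2^1 + 1 = 7, but 5 does not.
assert is_factor(7, 3, 1)
assert not is_factor(5, 3, 1)
```

Running every line of factrange.txt through a check like this catches typos and corrupted lines before they reach the db.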

  19. #139
    TeamRetro Siever
    Join Date
    Oct 2004
    Location
    Lebanon, NH
    Posts
    39
Sorry, brain lapse there. It is indeed 20M, not 20T. I also found one around 186K. I submitted that one to SoB, but I will submit it to you also.

    Stromkarl

  20. #140

Stopping Sieving

I have to stop sieving. I am not sure how well I did, but I will post my results here.

    file name: SoB.bat <contents below>
    ./NbeGon_010_osx -s=SoB.dat -f=SoB.del -d=1.42 -p=630688573751303-631000000000000

    file name: SoB.del <contents below>
    4758824368237 | 22699*2^2421118+1

    I am assuming that I did not finish my range. Please assign the remainder to another siever.

    thxbai

  21. #141
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    ceselb,

In the future, could you mark with *, +, >, or a note in the main thread those ranges reserved with a 991<n<50M dat, please?

740000-742000 Nuri (991<n<50M dat) was omitted, and it's possible that I will eventually miss one of these reservations.

    Examples


747000-761600 engracio (on 991<n<50M sob.dat)
    738500-739000 Silverfish (with 991-50M dat - ETA:End of April)

Also, do you have a working copy of Chuck's new sieve program for testing, or has this effort been dropped entirely? I haven't heard anything from him this year!!!

I'd like to test it against our dat and other dats for missed factors etc. I could use the most recent build of either the 32-bit or 64-bit client... if there is a difference.

    If you don't want to give it out would you test a small range for me?

P.S. Sorry for messing up the thread; I just want to make sure you see this post.
