Page 3 of 5
Results 81 to 120 of 194

Thread: Factor bragging

  1. #81
    Hater of webboards
    Join Date
    Feb 2003
    Location
    København, Denmark
    Posts
    205
    Sorry for causing confusion. I don't remember having turned off GCD at the end of stage 1, but if you say it's an option, that's probably right.

  2. #82
    Hater of webboards
    Join Date
    Feb 2003
    Location
    København, Denmark
    Posts
    205
    Notice the n of this factor:

    204324288555810331 | 33661*2^6900000+1


    204324288555810330 = 2*3^5*5*97*1543*6007*93523
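P-1 finds a factor p exactly when p-1 is smooth, which is what makes this n notable. A minimal sketch (plain Python trial division, my own illustration, not the project's tooling) that reproduces the factorization above:

```python
def factorize(n):
    """Trial-divide n; return its prime factorization as {prime: exponent}."""
    factors = {}
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors[d] = factors.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        factors[n] = factors.get(n, 0) + 1
    return factors

p = 204324288555810331  # divides 33661*2^6900000+1
print(factorize(p - 1))
# {2: 1, 3: 5, 5: 1, 97: 1, 1543: 1, 6007: 1, 93523: 1}
```

Every prime power here is below even a modest B1, so this factor was squarely within reach of stage 1.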

  3. #83
    Senior Member Frodo42's Avatar
    Join Date
    Nov 2002
    Location
    Jutland, Denmark
    Posts
    299
    Found in stage 1
    14201216740129537 | 10223*2^6983885+1

    14201216740129536 = 2^8 x 3^3 x 13 x 29 x 271 x 2671 x 7529

    My smoothest factor yet

  4. #84
    Senior Member Frodo42's Avatar
    Join Date
    Nov 2002
    Location
    Jutland, Denmark
    Posts
    299
    P-1 found a factor in stage #2, B1=40000, B2=440000.
    4847*2^7160703+1 has a factor: 1043098013353843

    1043098013353842 = 2 x 3^5 x 5387 x 7127 x 55903

    My first factor found with George's new code.
    And I got my 9th place back, but not for all that long unless I find more factors.

  5. #85
    Forgotten Member
    Join Date
    Dec 2003
    Location
    US
    Posts
    64
    22699*2^7247638+1 has a factor: 581445854728607717657

    This seems pretty large, am I wrong for thinking that?

    581445854728607717656 = 2^3 x 17^2 x 37 x 2029 x 4643 x 11117 x 64901
    Last edited by pixl97; 10-19-2004 at 06:02 PM.
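A 21-digit factor is indeed on the large side for P-1. The claim can be sanity-checked without ever constructing the huge Proth number, using Python's built-in modular exponentiation (a quick sketch with the values from this post):

```python
# p | 22699*2^7247638 + 1  iff  22699*2^7247638 + 1 is 0 (mod p);
# pow(2, n, p) keeps all intermediate values p-sized.
k, n = 22699, 7247638
p = 581445854728607717657
print("divides" if (k * pow(2, n, p) + 1) % p == 0 else "does not divide")

# The smoothness that let P-1 reach it: p-1 multiplies out from the primes above.
assert p - 1 == 2**3 * 17**2 * 37 * 2029 * 4643 * 11117 * 64901
```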

  6. #86
    Member
    Join Date
    Feb 2003
    Location
    Lucerne, Switzerland
    Posts
    30
    Very smooth:

    67607*2^9107451+1 has a factor: 4921636623093246299

    4921636623093246298 = 2 x 11 x 29 x 127 x 1553 x 2161 x 2531 x 7151

  7. #87
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Wow,

    That's great, but may I ask why you are factoring at such a high n-level?

    There is nothing wrong with factoring a particular k at some n level, but please let us know that you have. I was personally thinking of doing some factoring around 13M to remove some of those 33661 tests.

  8. #88
    Member
    Join Date
    Feb 2003
    Location
    Lucerne, Switzerland
    Posts
    30
    Sorry for not reserving, but I thought that my range did not fit into the main range.

    At the moment, I work on 67607 starting from 9 million. Depending on how much time it takes, I will continue until 9.3..10 million.

    At the moment I use B1= 60000

    Reason why I factor in that area:

    No special reason. I like to play around with numbers.
    In this case I want to bring the entry for 67607 at 9..10M in MikeH's database below 1000.
    It's just for fun.



    I will keep the factoring details, so you will have that information, when the main factoring effort reaches that area.

    Reto

  9. #89
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Thought you were trying to get an early start on the 200K point score ...

    As for your choice of B1, ... others will have to comment, sieving will probably reach 2^50 by then, but your goal is to reduce it to <1000 ...

    I'd just worry about P-1'ing very lightly and having to redo them again later, otherwise I'd say knock yourself out.

    I was considering doing the same sort of thing...

    Trying to eliminate the k/n's in the range of 13460000 to 13470000 for those tests reported dropped at http://www.seventeenorbust.com/secret.

    I wish Alien88 would remove those 9999999 tests and place those 33661 tests back where they should be.

  10. #90
    Member
    Join Date
    Feb 2003
    Location
    Lucerne, Switzerland
    Posts
    30
    Originally posted by vjs
    I'd just worry about P-1'ing very lightly and having to redo them again later, otherwise I'd say knock yourself out.
    Do you think I should increase my bounds so that it is not necessary to redo the work?

    At the moment I am happy with the efficiency. I got 3 factors out of 100. Unfortunately the biggest one disappeared.

    P-1 found a factor in stage #2, B1=60000, B2=660000.
    67607*2^9151371+1 has a factor: 17603629608134545337

    I assume it is excluded because of a smaller factor of that number...

  11. #91
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    unfortunately the biggest one disappeared.
    I just posted that factor into the large factor submission form and it was accepted and added as new. I'd logged out, so all you need to do now is reserve the range in the factor forum and you'll get the credit.

    Largest factor of the year.

    EDIT: Just looked at the posts above, after noticing that all your last findings were 67607. Don't worry about reserving, I'll sort it on my side.
    Last edited by MikeH; 01-10-2005 at 05:27 PM.

  12. #92
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    To me those bounds look good, perhaps someone with a second opinion...

    I'd probably look more at factors found per unit of time than factors per number of tests. If you can get the same number of factors in the same time with fewer tests, that would be better of course. I think this can be done by increasing B1 and leaving B2 as is?

    I'm not a factoring guy, but if I remember correctly, each time you factor a number a file is created. You can use this file later to go back and choose different bounds, and it takes less time etc... I wouldn't delete those files just yet.

    We should be testing those numbers by mid to late year, so I don't think your efforts are in vain. Also, when reservations get close or you're finished with 9M-10M, you should make a post in the reservation section. And if you get to 10M before the main effort gets there, you could always go back with different bounds if you keep those files...

  13. #93
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    Largest factor of the year.
    Make that second largest

  14. #94
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    Originally posted by vjs
    I'd probably look more at the factor found per time then factors per number of tests. If you can get the same amount of factors in the same time with less tests that would be better of course. I think this can be done by increasing B1 and leaving B2 as is???
    The bounds are quite high (I'm not saying they are too high, though). Maybe factors/time is higher with lower bounds, but I don't know.
    I wouldn't lower B1 only, as the optimal ratio between B1 and B2 is important for efficient factoring. As soon as two prime factors of p-1 are > B1, the factor won't be found by P-1.

    I'm not a factoring guy, but if I member correctly each time you factor a number a file will be created. You can use this file later to go back and choose different bounds and it takes less time etc...I wouldn't delete those files just yet.
    Those files can be re-used, right. But only under certain circumstances (which I can't remember) is there no loss of efficiency.
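The rule Mystwalker states can be made concrete: stage 1 finds p when every prime-power factor of p-1 is at most B1, and stage 2 additionally allows one extra prime between B1 and B2. A simplified model in Python (my sketch; it ignores Brent-Suyama extensions and other implementation details of real clients):

```python
def pm1_would_find(pm1_factors, B1, B2):
    """pm1_factors: {prime: exponent} factorization of p-1.
    Returns True if P-1 with bounds (B1, B2) would find p under the
    simplified stage-1/stage-2 model described above."""
    # prime powers not covered by stage 1
    big = [q for q, e in pm1_factors.items() if q ** e > B1]
    if not big:
        return True                      # stage 1 alone suffices
    if len(big) == 1:
        q = big[0]
        # stage 2 catches one extra prime (to the first power) up to B2
        return pm1_factors[q] == 1 and q <= B2
    return False                         # two factors beyond B1: not found

# The 67607 factor from post #86: p-1 = 2*11*29*127*1553*2161*2531*7151
f = {2: 1, 11: 1, 29: 1, 127: 1, 1553: 1, 2161: 1, 2531: 1, 7151: 1}
print(pm1_would_find(f, 60000, 660000))  # True: everything is below B1
print(pm1_would_find(f, 2000, 660000))   # False: 2161, 2531 and 7151 all exceed B1
```

With B1=3000 the same p would still be found, because only 7151 exceeds B1 and stage 2 covers it.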

  15. #95
    Member
    Join Date
    Feb 2003
    Location
    Lucerne, Switzerland
    Posts
    30
    Interesting,

    17603629608134545337 is less than 64 bits and should be submittable by the normal form. I thought it was accepted.

  16. #96
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    You're right, it should have been accepted with the <64-bit form; everyone should remember to check their factor submissions regardless.

    Since you are actually working quite a bit ahead of PRP, why don't you try slightly higher B1/B2 for 10 pairs, for example. See how many more you get and what the increased time expenditure is.

    I've tried factoring some smaller numbers with bounds as high as

    B1=7000000
    B2=17000000

    I think this might be a little too large for those n>9M numbers... it would take ages etc.

    If I had a fast P4 with a lot of memory, I'd try B1=1000000 B2=5000000 just to see how long it would take and the chances etc.

    If I understand correctly, your chances of finding a factor with P-1 increase with n as well; it's the old time vs. chance of a factor trade-off. Since we are not at 9M yet, you have the time to experiment.

  17. #97
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Originally posted by MikeH
    EDIT: Just looked at the posts above, after noticing that all your last findings were 67607. Don't worry about reserving, I'll sort it on my side.
    @ Biwema: Still, it might be a good idea to simply open a thread under factoring subforum to announce the work (ranges) you've done on 67607. That way, we can simply skip those 67607/n pairs that you've already tested. But, do this only if you're doing some massive work over there. I dunno, may be an n range of, say 1 million or more. If not, do not bother at all.

  18. #98
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    If I understand correctly he has started

    k=67607 from 9M up and is currently around 9.3M

    He is using bounds of

    B1=60000, B2=660000

    These bounds seem pretty reasonable... perhaps he should clarify that he did start at 9M and is increasing systematically...

  19. #99
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Ooops, my bad. I should have read previous posts more carefully.

  20. #100
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    No problem Nuri,

    I'm also really guessing that he has done everything in between as well...

    Can you comment on the bounds and the possibility of increasing them for better success?

    His main goal is to reduce the 9M<n<10M range for k=67607 to fewer than 1000 PRP tests.

    Sounds like as good a goal as any, IMHO.

    Also

    17603629608134545336 = 2^3 x 19 x 97 x 157 x 2473 x 18661 x 164789

    So I guess B1=20000, B2=200000 would have found this.
    Last edited by vjs; 01-18-2005 at 12:07 PM.

  21. #101
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Well, it looks like someone has already tried one of the large k/n pairs and gave up in less than a week.

    Just wondering if anyone has tried factoring any of these...

    10223 13467677 (now in the dropped queue; lowest one)
    24737 13467703 (looks like this one may be assigned but not dropped)
    55459 13467718 (looks like this one may be assigned but not dropped)
    24737 13467727 (Next to be assigned)
    24737 13467751

  22. #102
    24737 13467703 (looks like this one may be assigned but not dropped)
    I got this one; it will be done in a few hours, actually.

  23. #103
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Wow!!!

    How long did it take you to finish, and on what type of machine etc...?

  24. #104
    About 27 days on an Athlon XP 2400,
    running almost constantly, but there was some sieving going on too.

  25. #105
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    1363473963746084778807 | 22699*2^8600311+1


    I've also got this! 79.9T factor

    79897555161219 | 22699*2^8600263+1


    and this!!!! 70K factor (which was not verified by the submission page, as one might expect)

    70173 | 22699*2^8600217+1

  26. #106
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Nuri, sorry to say this, but here are the k/n pairs that actually exist between n=8600215 and n=8600265:

    k=27653 n=8600217
    k=4847 n=8600223
    k=4847 n=8600247
    k=33661 n=8600256
    k=24737 n=8600263

    I think you may have a problem with your worktodo.ini

  27. #107
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Ooops!!!

    I guess I know why..

  28. #108
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Corrected & restarted the range..

  29. #109
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    It happens...

    I had a question for the factoring guys...

    How does one get the newest version of Prime95 to do stage 1-only factoring with a set B1 bound of, say, 2M?

  30. #110
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    Either assign too little memory for stage 2 to run, or set the stage-2 value in the workfile to 1.
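For the workfile route, and assuming this Prime95 version accepts Pminus1 lines in the same k,b,n,c style as the Pfactor lines shown later in this thread (the exact syntax is my assumption; check the undoc.txt that ships with the client), a stage-1-only entry with B1=2M might look like:

```
Pminus1=67607,2,9200000,1,2000000,1
```

Here the trailing 1 is the B2 value; a B2 at or below B1 means stage 2 is skipped. The k/n pair is just an example.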

  31. #111
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    I'm not really even sure about the worktodo.ini values etc.

    Perhaps we can make a sticky post with all of the factoring clients, with links and examples of the files to create and their contents.

    I'm not sure what to write in the worktodo.ini.

  32. #112
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Here's a sample...

    Pfactor=21181,2,8601140,1,49,1.5
    Pfactor=22699,2,8601238,1,49,1.5
    Pfactor=19249,2,8601278,1,49,1.5
    Pfactor=55459,2,8601286,1,49,1.5


    or, did I get you wrong?
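For reference, the fields in those lines appear to be as annotated below (my annotation, based on how Prime95 reads these NewPGen-style Pfactor entries; as far as I know worktodo.ini has no comment syntax, so the `#` lines are explanation only, not something to paste):

```
Pfactor=21181,2,8601140,1,49,1.5
# field order: k, b, n, c, sieve_depth, tests_saved
#   k*b^n+c     = the candidate (here 21181*2^8601140+1)
#   sieve_depth = how far the pair is already sieved, in bits (49, i.e. ~2^49 ~ 563T)
#   tests_saved = PRP tests a factor would save; together with sieve_depth,
#                 this is what Prime95 uses to pick its "optimal" B1/B2
```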

  33. #113
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    No, but thanks Nuri, this is a good example of how a typical worktodo.ini should look...

    But what if you only want to run the first stage with a particular B1?

    Or potentially run P+1; I'm not sure if Prime95 will do this, but the latest ECM program will. I just don't know how to add the switches etc.

  34. #114
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Hmmm. I found 645160797731449 | 4847*2^8799543+1 through P-1, and it was a bit disappointing to see it had already been sieved...

    Maybe it's time for me to move the 49 (~563T) setting to something like 49.2 (~647T) to match the 98% sieve point...



    Any suggestions?

    Are you all using 49, or is anyone using another cutoff point?

  35. #115
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    And by the way, 49.3 seems very close to the current 95% sieve point.

    Maybe it's a good cut-off as well...

  36. #116
    I'm using

    Pfactor=xxxxx,2,8xxxxxx,1,49,1.6

    with a Celeron 2.0 GHz, 110-170 MB assigned.
    I usually find 1 factor per 100 tests (rough estimate). One test takes two and a half hours.

    With Prime95, the cutoff doesn't support floating point numbers; 49.x isn't accepted. (Right?) And I don't think it is time to go to 50, as even like this we are slower than the main effort. (Or should I try?)

    What is the relation between memory need and the numbers? I don't have a lot and would like to use it as well as possible. Don't tell me to PRP -> registry.

  37. #117
    Senior Member Frodo42's Avatar
    Join Date
    Nov 2002
    Location
    Jutland, Denmark
    Posts
    299
    Garo wrote an excellent explanation about these things.
    hc_grove added the explanation to this page

  38. #118
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    (from Feb 2004)
    I was wrong, I just coded up a quick trialfactorer and it factors both your numbers and Louie's so fast the time command can't measure it (on my 2GHz laptop that does regular factoring in the background).

    You can download a UNIX version of the program (called trial) and the source code (trial.c) here (the same place as where you can get my version of the factorer).

    You can give the program either p or p-1 as input; it will figure that out.

    It only works with numbers < 2^64.
    hc_grove,

    I don't see this program or the source on the linked page. If it's still available (in particular the source) I'd be very interested. Thanks.
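The program hc_grove described is simple enough to re-sketch while the original download is missing. A rough Python stand-in (my reconstruction from the description above, not the actual trial.c): it accepts either p or p-1 by noting that a factor p of a Proth number is always odd, so an even input must already be p-1.

```python
def trial_factor(x):
    """Factor x by trial division; fast when x is smooth, as P-1 factors are.
    A large prime cofactor near 2^64 would make this loop impractically long."""
    out = []
    d = 2
    while d * d <= x:
        while x % d == 0:
            out.append(d)
            x //= d
        d += 1 if d == 2 else 2   # after 2, test odd candidates only
    if x > 1:
        out.append(x)
    return out

def factor_p_minus_1(p_or_pm1):
    """Accept p or p-1: an odd input is taken as p, so we factor p-1."""
    n = p_or_pm1 - 1 if p_or_pm1 % 2 else p_or_pm1
    return trial_factor(n)

# the factor from post #84, given either way:
print(factor_p_minus_1(1043098013353843))
print(factor_p_minus_1(1043098013353842))
# both print [2, 3, 3, 3, 3, 3, 5387, 7127, 55903]
```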

  39. #119
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    Originally posted by hhh
    With prime95, the cutoff doesn't support floating point numbers; 49.x isn't accepted. (Right?)
    The latest, and fastest version for most machines, is 24.6. And yes it does accept floating point numbers for the last two parameters.
    Joe O

  40. #120
    Frodo42, thanks for the plug

    The latest version of Prime95 DOES support floating point arguments so using 49.3 is not a problem.

    P-1 does not do a complete check in the same way that sieving does. The P-1 limits are defined by B1 and B2. The 49 or 49.3 is just used to help determine the "optimal" B1 and B2, as I explained in the above-mentioned post. Note that your factor minus 1 has these as its two largest prime factors: 2393 and 118343. So any limits where B1 was greater than 2393 and B2 was greater than 118343 would have found this factor.

    It is unfortunate that you wasted your time finding this factor, but the solution is to filter out numbers that have already been factored, not to change the P-1 limits (though that may be necessary for other reasons).

    Bottom line: sieving and P-1 find factors in different ways, so messing with limits will not ensure NO overlap. P-1 should NOT be done on numbers that have already been factored.
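Garo's numbers are easy to confirm with a few lines of Python (a standalone check of my own against the factor Nuri reported in post #114):

```python
def factorize(n):
    """Trial division; returns the prime factors of n with multiplicity,
    in ascending order."""
    out = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            out.append(d)
            n //= d
        d += 1
    if n > 1:
        out.append(n)
    return out

p = 645160797731449       # 645160797731449 | 4847*2^8799543+1
primes = factorize(p - 1)
print(primes)             # [2, 2, 2, 3, 3, 3, 53, 199, 2393, 118343]
print(primes[-2:])        # the two largest: [2393, 118343]
```

So any B1 above 2393 paired with a B2 above 118343 covers this factor, including the B1=60000, B2=660000 bounds used earlier in this thread.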

