
Thread: Small n factoring

  1. #1
    Yet the same thing happens with
    Pminus1=168451,2,1116,1,4294967295,1,0

    I had to stop it. This time, 400 MB were definitely allocated.
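    In case memory control is the main issue: gmp-ecm also does P-1 and takes an explicit stage-2 memory cap. A minimal sketch, assuming the entry above refers to the candidate 168451*2^1116+1 and that the gmp-ecm binary is installed as "ecm"; the B1 value here is only illustrative:
    Code:
    # Hedged example: P-1 on 168451*2^1116+1 with stage-2 memory capped at roughly 400 MB.
    echo "168451*2^1116+1" | ecm -pm1 -maxmem 400 1e9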
    ___________________________________________________________________
    Sievers of all projects unite! You have nothing to lose but some PRP-residues.

  2. #2
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    How many hours is "a few"? Less than 10?

    I think 2^991, which is roughly the same size, took me something like 3 days on a 2.4 GHz P4...

  3. #3
    Yes, about 5 hours.
    Today I tried the factorisation with B1=2G; it worked, though the GCD took 0 seconds. Then I extended to 3G, but had to leave before the end.

    Normally one gets a residue, doesn't one? I think I got none.
    We'll see on Monday.
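    For this kind of "extend B1 later" workflow, gmp-ecm's save/resume mechanism is one way to continue stage 1 without redoing it. A rough sketch only, assuming gmp-ecm's P-1 mode and the same candidate as in the earlier post; the file name is made up:
    Code:
    # Run P-1 stage 1 to B1=2e9 and keep the stage-1 residue in a save file.
    echo "168451*2^1116+1" | ecm -pm1 -save pm1.sav 2e9
    # Later, resume from that file with a larger B1 to continue stage 1 up to 3e9.
    ecm -pm1 -resume pm1.sav 3e9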
    ___________________________________________________________________
    Sievers of all projects unite! You have nothing to lose but some PRP-residues.

  4. #4
    Is anyone still doing ECM factoring?

  5. #5
    Old Timer jasong's Avatar
    Join Date
    Oct 2004
    Location
    Arkansas(US)
    Posts
    1,778
    I'm assuming you're interested in participating, so my response to
    Quote Originally Posted by SlicerAce
    Is anyone still doing ECM factoring?
    is:

    Who cares? If you want to get involved in ECM factoring of Sierpinski numbers, go for it.

    My advice is to go to the gmp-ecm forum at http://www.mersenneforum.org/ and tell them your intentions. Or you could simply browse that sub-forum and probably figure things out on your own, then come back here to get some numbers.

    On second thought, your first stop should be the User Guides in the 'Information and Answers' Forum at that same website I listed.

    Good luck.

  6. #6
    There may not be much of a point to working on the lower n values, but I still think it's kind of fun to get rid of them. Anyway, I believe I may be the first person to have found a factor for 10223*2^1181+1. It came up on the 2260th curve I ran.

    2869295942753555058435842630879466239475749080003 | 10223*2^1181+1
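    A quick way to double-check a reported factor like this, sketched with bc (any arbitrary-precision tool works just as well):
    Code:
    # Prints 0 if the factor reported above really divides 10223*2^1181+1.
    echo "(10223 * 2^1181 + 1) % 2869295942753555058435842630879466239475749080003" | bc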


  7. #7
    Quote Originally Posted by SlicerAce View Post
    Is anyone still doing ECM factoring?
    I have been playing a bit with gmp-ecm lately, and have run a few more rounds on 24737*2^991+1:

    an unreasonable number of rounds with B1 < current levels,
    ca. 22000 rounds at B1=1.08e9, B2 ~ 22e12 (54558 recommended for 65-digit factors),
    ca. 7000 rounds at B1=2.52e9, B2 ~ 78e12 (118242 recommended for 70-digit factors).

    No luck. A smallest factor of fewer than 60 digits is now very unlikely. I will probably give up soon. This number may break with SNFS some day, but it is much too large for me.
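    For anyone curious what such a run looks like on the command line, a rough gmp-ecm sketch using the curve count and B1 from this post; the -maxmem value and the exact B2 depend on the machine, so treat the numbers as illustrative:
    Code:
    # Roughly the kind of run described above: many ECM curves on 24737*2^991+1
    # with B1=1.08e9, letting gmp-ecm pick B2 from the allowed stage-2 memory.
    echo "24737*2^991+1" | ecm -v -c 22000 -maxmem 4000 1.08e9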
    Last edited by sturle; 01-06-2009 at 07:30 PM.

  8. #8
    Quote Originally Posted by sturle View Post
    an unreasonable number of rounds with B1 < current levels.
    Wow. You meant B1 > current levels, though, right? I thought about attacking a bit of the 55-digit level with my limited horsepower, but I will not mess with sturle, of course. Please post your progress when you are definitely fed up.

    EDIT: How did you come up with your B1/B2 bounds? The readme states:
    Code:
           digits D  optimal B1   default B2           expected curves
                                                           N(B1,B2,D)
                                                  -power 1         default poly
              45       11e6        3.5e10           4949             4480 [D(12)]
              50       43e6        2.4e11           8266             7553 [D(12)]
              55       11e7        7.8e11          20158            17769 [D(30)]
              60       26e7        3.2e12          47173            42017 [D(30)]
              65       85e7        1.6e13          77666            69408 [D(30)]
    Yours seem way too high...

    BTW, Kman1293, how many curves did you run, and on which numbers? Just so work isn't duplicated if someone wants to continue. Did you submit your factor? n=1181 is still in the .dat file...

    H.
    Last edited by hhh; 01-07-2009 at 02:55 AM.
    ___________________________________________________________________
    Sievers of all projects unite! You have nothing to lose but some PRP-residues.

  9. #9
    Quote Originally Posted by hhh View Post
    Wow. You meant B1 > current levels, though, right?
    I meant rounds with B1 less than 1.08e9.
    Quote Originally Posted by hhh View Post
    How did you come up with your B1/B2 bounds?
    I did something like this on the particular CPU / memory combination I am mainly using:
    Code:
    # Sweep B1 from 1e7 up to 300e7 (3e9) in steps of 1e7 and log the verbose output.
    for i in `seq 1 300`; do
      ./ecm-time -v -inp 24737.991.inp -maxmem 4000 ${i}e7 >> timings-4g.txt
    done
    And similarly for 8000 MiB of RAM. Among the output, gmp-ecm reports the following interesting lines:
    Code:
    Using B1=2520000000, B2=77978684123358, polynomial Dickson(30), sigma=1683006743
    dF=1048576, k=6, d=11741730, d2=19, i0=196
    Expected number of curves to find a factor of n digits:
    40      45      50      55      60      65      70      75      80      85
    30      96      338     1303    5454    24566   118242  608303  3288378 1.9e+07
    [...]
    Expected time to find a factor of n digits:
    40      45      50      55      60      65      70      75      80      85
    18.06d  57.24d  201.29d 2.13y   8.91y   40.12y  193.09y 993.37y 5370y   30569y
    I selected the B1 levels and default B2 levels which gave the lowest expected time to find a factor of a given size. For the machines I have been using, the optimal B1 is 1080000000 for 65-digit factors using up to 4 GB of RAM for stage 2, and 2520000000 for 70-digit factors using up to 8 GB of RAM for stage 2. This test uses a modified gmp-ecm to get expected times for factors > 65 digits.
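    To compare the sweep results afterwards, something along these lines pulls the relevant tables back out of the log written by the loop above (only a sketch; adjust the file name to whichever log you wrote):
    Code:
    # List each B1/B2 combination together with its expected-time table,
    # so the B1 with the lowest expected time for the target factor size stands out.
    grep -E -A 2 "Using B1=|Expected time" timings-4g.txt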
