Notice the n of this factor:
204324288555810331 | 33661*2^6900000+1
204324288555810330 = 2*3^5*5*97*1543*6007*93523
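The smoothness claim is easy to sanity-check. Below is a small Python sketch (mine, not any poster's tooling) that multiplies out the claimed factorization of p-1 and confirms the divisibility with three-argument pow:

```python
# Verify the reported factor and the smoothness of p-1.
p = 204324288555810331
factors = [2, 3, 3, 3, 3, 3, 5, 97, 1543, 6007, 93523]

prod = 1
for q in factors:
    prod *= q
assert prod == p - 1  # the factorization of p-1 checks out

# p really divides 33661*2^6900000+1: modular exponentiation keeps
# this fast even though the full number has about 2 million digits.
assert (33661 * pow(2, 6900000, p) + 1) % p == 0
print("largest prime factor of p-1:", max(factors))
```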
Sorry for causing confusion. I don't remember having turned off GCD at the end of stage 1, but if you say it's an option, that's probably right.
Found in stage 1
14201216740129537 | 10223*2^6983885+1
14201216740129536 = 2^8 x 3^3 x 13 x 29 x 271 x 2671 x 7529
My smoothest factor yet
P-1 found a factor in stage #2, B1=40000, B2=440000.
4847*2^7160703+1 has a factor: 1043098013353843
1043098013353842 = 2 x 3^5 x 5387 x 7127 x 55903
My first factor found with George's new code.
And I got my 9th place back, but not for all that long unless I find more factors.
22699*2^7247638+1 has a factor: 581445854728607717657
This seems pretty large, am I wrong for thinking that?
581445854728607717656 = 2^3 x 17^2 x 37 x 2029 x 4643 x 11117 x 64901
Last edited by pixl97; 10-19-2004 at 06:02 PM.
Very smooth:
67607*2^9107451+1 has a factor: 4921636623093246299
4921636623093246298 = 2 x 11 x 29 x 127 x 1553 x 2161 x 2531 x 7151
WoW,
That's great, but may I ask why you are factoring at such a high n-level?
There is nothing wrong with factoring a particular k at some n level, but please let us know that you have. I was personally thinking of doing some factoring around 13M to remove some of those 33661 tests.
Sorry for not reserving, but I thought my range did not fit into the main range.
At the moment, I work on 67607 starting from 9 million. Depending on how much time it takes, I will continue until 9.3..10 million.
At the moment I use B1= 60000
Reason why I factor in that area:
No special reason. I like to play around with numbers.
In this case I want to bring the entry for 67607 at 9..10M in MikeH's database below 1000.
It's just for fun
I will keep the factoring details, so you will have that information, when the main factoring effort reaches that area.
Reto
Thought you were trying to get an early start on the 200K point score ...
As for your choice of B1, ... others will have to comment, sieving will probably reach 2^50 by then, but your goal is to reduce it to <1000 ...
I'd just worry about P-1'ing very lightly and having to redo them again later, otherwise I'd say knock yourself out.
I was considering doing the same sort of thing...
Trying to eliminate the k/n's in the range of 13460000 to 13470000 for those tests reported dropped http://www.seventeenorbust.com/secret.
I wish Alien88 would remove those 9999999 tests and place those 33661 tests back where they should be.
Originally posted by vjs
I'd just worry about P-1'ing very lightly and having to redo them again later, otherwise I'd say knock yourself out.
Do you think I should increase my bounds so that it is not necessary to redo the work?
At the moment I am happy with the efficiency. I got 3 factors out of 100. Unfortunately, the biggest one disappeared.
P-1 found a factor in stage #2, B1=60000, B2=660000.
67607*2^9151371+1 has a factor: 17603629608134545337
I assume it is excluded because of a smaller factor of that number...
unfortunately the biggest one disappeared.
I just posted that factor into the large factor submission form and it was accepted and added as new. I'd logged out, so all you need to do now is reserve the range in the factor forum and you'll get the credit.
Largest factor of the year.
EDIT: Just looked at the posts above, after noticing that all your last findings were 67607. Don't worry about reserving, I'll sort it on my side.
Last edited by MikeH; 01-10-2005 at 05:27 PM.
To me those bounds look good; perhaps someone else can give a second opinion...
I'd probably look more at the factors found per unit time than at factors per number of tests. If you can get the same number of factors in the same time with fewer tests, that would be better, of course. I think this can be done by increasing B1 and leaving B2 as is?
I'm not a factoring guy, but if I remember correctly, each time you factor a number a save file is created. You can use this file later to go back and choose different bounds, and it takes less time etc... I wouldn't delete those files just yet.
We should be testing those numbers by mid to late year, so I don't think your efforts are in vain. Also, when reservations get close or you're finished with 9M-10M, you should make a post in the reservation section. And if you get to 10M before the main effort gets there, you could always go back with different bounds if you keep those files...
Largest factor of the year.
Make that second largest.
Originally posted by vjs
I'd probably look more at the factors found per unit time than at factors per number of tests. If you can get the same number of factors in the same time with fewer tests, that would be better, of course. I think this can be done by increasing B1 and leaving B2 as is?
The bounds are quite high (I'm not saying they are too high, though). Maybe factors/time is higher with lower bounds, but I don't know.
I wouldn't lower B1 only, as the optimal ratio between B1 and B2 is important for efficient factoring. As soon as two prime factors of p-1 are > B1, the factor won't be found by P-1 (stage 2 can handle only one prime between B1 and B2).
Originally posted by vjs
I'm not a factoring guy, but if I remember correctly, each time you factor a number a save file is created. You can use this file later to go back and choose different bounds, and it takes less time etc... I wouldn't delete those files just yet.
Those files can be re-used, right. But only under certain circumstances (which I can't remember) will there not be a lowered efficiency.
Interesting,
17603629608134545337 is less than 64 bits and should be submittable by the normal form. I thought it was accepted.
You're right, it should have been accepted by the <64-bit form; everyone should remember to check their factor submissions regardless.
Since you are actually working quite a bit ahead of PRP, why don't you try slightly higher B1/B2 for 10 pairs, for example. See how many more you get and what the increased time expenditure is.
I've tried factoring some smaller numbers with bounds as high as
b1=7000000
b2=17000000
I think this might be a little too large for those n>9M numbers... it would take ages etc.
If I had a fast P4 with a lot of memory, I'd try B1=1000000, B2=5000000 just to see how long it would take and the chances etc.
If I understand correctly, your chances of finding a factor with P-1 increase with n as well; it's the old time vs. chance-of-factor trade-off. Since we are not at 9M yet, you have time to experiment.
Originally posted by MikeH
EDIT: Just looked at the posts above, after noticing that all your last findings were 67607. Don't worry about reserving, I'll sort it on my side.
@ Biwema: Still, it might be a good idea to simply open a thread in the factoring subforum to announce the work (ranges) you've done on 67607. That way, we can simply skip those 67607/n pairs that you've already tested. But do this only if you're doing some massive work over there, maybe an n range of, say, 1 million or more. If not, don't bother at all.
If I understand correctly he has started
k=67607 from 9M up and is currently around 9.3M
He is using bounds of
B1=60000, B2=660000
These bounds seem pretty reasonable... perhaps he should clarify that he did start at 9M and is increasing systematically...
Ooops, my bad. I should have read previous posts more carefully.
No problem Nuri,
I'm also really guessing that he has done everything in between as well...
Can you comment on the bounds and the possibility of increasing them for better success?
His main goal is to reduce 9M<n<10M for k=67607 to fewer than 1000 PRP tests.
Sounds like just as good a goal as any, IMHO.
Also
17603629608134545336 = 2^3 x 19 x 97 x 157 x 2473 x 18661 x 164789
So I guess B1=20000, B2=200000 would have found this.
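For what it's worth, vjs's guess can be checked mechanically: P-1 finds p when every prime factor of p-1 is at most B1, except that the single largest may lie between B1 and B2. A quick Python sketch of that simplified rule (mine, ignoring prime-power subtleties):

```python
# Factorization of p-1 as reported above, in ascending order.
p = 17603629608134545337
factors = [2, 2, 2, 19, 97, 157, 2473, 18661, 164789]

prod = 1
for q in factors:
    prod *= q
assert prod == p - 1  # the factorization checks out

# Confirm the factor really divides 67607*2^9151371+1.
assert (67607 * pow(2, 9151371, p) + 1) % p == 0

B1, B2 = 20000, 200000
# Stage 1 must cover all but the largest prime factor; stage 2
# catches the largest one, provided it is below B2.
assert all(q <= B1 for q in factors[:-1]) and factors[-1] <= B2
print("B1=20000, B2=200000 would have found", p)
```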
Last edited by vjs; 01-18-2005 at 12:07 PM.
Well, looks like someone has already tried one of the large k/n pairs and gave up in less than a week.
Just wondering if anyone has tried factoring any of these...
10223 13467677 (now in the dropped queue, lowest one)
24737 13467703 (looks like this one may be assigned but not dropped)
55459 13467718 (looks like this one may be assigned but not dropped)
24737 13467727 (Next to be assigned)
24737 13467751
24737 13467703 (looks like this one may be assigned but not dropped)
I got this one; it will be done in a few hours, actually.
Wow!!!
How long did it take you to finish on what type of machine etc...
About 27 days on an Athlon XP 2400, running almost constantly, but there was some sieving going on too.
1363473963746084778807 | 22699*2^8600311+1
I've also got this! A 79.9T factor:
79897555161219 | 22699*2^8600263+1
And this!!!! A 70K factor (which was not verified by the submission page, as one might expect):
70173 | 22699*2^8600217+1
Nuri, sorry to say this, but here are the factors that actually exist between 8600215 and 8600265:
k=27653 n=8600217
k=4847 n=8600223
k=4847 n=8600247
k=33661 n=8600256
k=24737 n=8600263
I think you may have a problem with your worktodo.ini
Ooops!!!
I guess I know why..
Corrected & restarted the range..
It happens...
I had a question for the factoring guys...
How does one get the newest version of Prime95 to do stage-1-only factoring with a set B1 bound of, say, 2M?
Either assign too little memory for stage 2 to run, or set the stage 2 value in the workfile to 1.
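As far as I recall (this is an assumption; the authoritative reference is the undoc.txt file shipped with Prime95), explicit bounds are given with a Pminus1 line rather than a Pfactor line, and setting B2 to 1 (or equal to B1) skips stage 2. A sketch, reusing the 4847/8799543 pair from this thread:

```ini
; Hypothetical worktodo.ini entry: Pminus1=k,b,n,c,B1,B2
; B2=1 (or B2=B1) should run stage 1 only. Verify the exact
; syntax against undoc.txt for your Prime95 version first.
Pminus1=4847,2,8799543,1,2000000,1
```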
I'm not really even sure about the worktodo.ini values etc.
Perhaps we can make a sticky post with all of the factoring clients, with links and examples of the files to create and their contents.
I'm not sure what to write in the worktodo.ini
Here's a sample...
Pfactor=21181,2,8601140,1,49,1.5
Pfactor=22699,2,8601238,1,49,1.5
Pfactor=19249,2,8601278,1,49,1.5
Pfactor=55459,2,8601286,1,49,1.5
or, did I get you wrong?
No, but thanks Nuri, this is a good example of the way a typical worktodo.ini should look...
But what if you only want to run the first stage with a particular B1?
Or potentially run P+1; I'm not sure if Prime95 will do this, but the latest ECM program will. I just don't know how to add the switches etc.
Hmmm. I found 645160797731449 | 4847*2^8799543+1 through P-1 and it was a bit disappointing to see it was already sieved...
Maybe it's time for me to move my 49 (~563T) setting to something like 49.2 (~647T) to match the 98% sieve point..
Any suggestions?
Are you all using 49, or is anyone using another cutoff point?
And by the way, 49.3 seems very close to the current 95% sieve point.
Maybe it's a good cut-off as well...
I'm using
Pfactor=xxxxx,2,8xxxxxx,1,49,1.6
with a Celeron 2.0 GHz, 110-170 MB assigned.
I usually find 1 factor per 100 tests (rough estimate). One test takes two and a half hours.
With Prime95, the cutoff doesn't support floating point numbers; 49.x isn't accepted. (Right?) And I don't think it is time to go to 50, as even like this we are slower than the main effort. (Or should I try?)
What is the relation between memory and the numbers? I don't have a lot and would like to use it as well as possible. Don't tell me to PRP -> registry.
Garo wrote an excellent explanation about these things.
hc_grove added the explanation to this page
hc_grove (from Feb 2004):
I was wrong. I just coded up a quick trial factorer, and it factors both your numbers and Louie's so fast that the time command can't measure it (on my 2GHz laptop, which does regular factoring in the background).
You can download a UNIX version of the program (called trial) and the source code (trial.c) here (the same place as where you can get my version of the factorer).
You can give the program either p or p-1 as input; it will figure that out.
It only works with numbers < 2^64.
I don't see this program or the source on the linked page. If it's still available (in particular the source) I'd be very interested. Thanks.
Originally posted by hhh
With Prime95, the cutoff doesn't support floating point numbers; 49.x isn't accepted. (Right?)
The latest, and fastest for most machines, version is 24.6. And yes, it does accept floating point numbers for the last two parameters.
Joe O
Frodo42, thanks for the plug
The latest version of Prime95 DOES support floating point arguments so using 49.3 is not a problem.
P-1 does not do a complete check in the same way that sieving does. The P-1 limits are defined by B1 and B2. The 49 or 49.3 is just used to help determine the "optimal" B1 and B2, as I explained in the above-mentioned post. Note that your factor minus 1 has these two largest prime factors: 2393 x 118343. So any limits where B1 was greater than 2393 and B2 was greater than 118343 would have found this factor.
It is unfortunate that you wasted your time on finding this factor but the solution is to filter out numbers that have already been factored and not change the P-1 limits (though that may be necessary for other reasons).
Bottom line: sieving and P-1 find factors in different ways, so messing with limits will not ensure NO overlap. P-1 should NOT be done on numbers that have already been factored.
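The rule described above (B1 must exceed the second-largest prime factor of p-1, B2 the largest) can be wrapped in a few lines of Python. This is my sketch, not anyone's actual tooling; it uses plain trial division, so it is only sensible for the small (< 2^64) factors discussed here, and it ignores prime-power subtleties:

```python
def p1_bounds_needed(p):
    """Return (B1, B2) that P-1 would need to find the factor p:
    B1 >= second-largest prime factor of p-1, B2 >= the largest.
    Simplified rule; trial division, so small p only."""
    n = p - 1
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # factors ends up in ascending order
    second = factors[-2] if len(factors) > 1 else 1
    return second, factors[-1]

# The smooth stage-1 factor from earlier in the thread:
print(p1_bounds_needed(14201216740129537))  # -> (2671, 7529)
# The factor hhh asked about; the post above reports 2393 x 118343
# as the two largest prime factors of p-1:
print(p1_bounds_needed(645160797731449))
```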