
View Full Version : P-1 factorer




Mystwalker
10-14-2003, 04:50 PM
The output looks good until we reach "sieve finished".
But then, the # of transforms should be in the 10,000s.

The calling syntax is
sbfactor.exe <start> <end> <whatWasThisFor?> <boundChanger> <allowedMemoryConsumption>

But I suspect yours is just fine...

Nuri
10-14-2003, 05:04 PM
Here's an example from one of the alternative ranges I tried. (This was to test if it finds the factor frodo42 found).

sbfactor.exe 4929741 4929743 46 1.5 128

46 is the sieve depth - 2^46 - (we've finished sieving almost everything below 70T, so I tried 46 instead of 45)

or another one; sbfactor.exe 4980000 4982000 45 2 192

Both give similar output (i.e. # transforms < 1,000)

Which version of the client are you using?

Frodo42
10-15-2003, 03:04 AM
sbfactor.exe 4929741 4929743 46 1.5 128

For most of the factors I found I used 45 sieve depth and 196 memory.
Anyway, I run this client on Linux, but it also works fine with Win98.
I use client version 1.1
I have not yet gone to ranges above 4983000.
Sadly, I don't see what is going wrong here.

Mystwalker
10-15-2003, 05:56 AM
Is there perhaps a difference between 2 and 2.0?

Nuri
10-15-2003, 05:44 PM
That's strange. I've downloaded the client again, and it seems to work now (for n below 4981000. above that, it still exits). Unlike previous efforts, it now shows percent completion info. I am reserving a range close to that (4980000 4982000) to see what will happen.

Thanks for the ideas.

PS: I hope the exit problem for ranges above 4981000 is something related to what I might be doing wrong. Otherwise, it might be due to a bug in the program. While browsing through the pages, I noticed that ceselb also reported such a problem.

Originally posted by ceselb
Btw, anyone tried to do a range over 5M? Sbfactor just exits if I try :( :confused:

Keroberts1
10-15-2003, 08:06 PM
Where does the program output factors when they are found?

Mystwalker
10-15-2003, 08:32 PM
It writes them into a separate text file. I don't know the exact file name ATM, as I haven't found any factors the last couple of days. :(
But I'm sure you'll find the file once factors are found.

edit:

Originally posted by Joe O
fact.txt


Oh, and I also have the same problem (crash) when trying to factor something bigger than 5359*2^4980726+1...

Joe O
10-15-2003, 08:35 PM
.

Joe O
10-15-2003, 08:56 PM
Originally posted by Nuri
That's strange. I've downloaded the client again, and it seems to work now (for n below 4981000. above that, it still exits). Unlike previous efforts, it now shows percent completion info. I am reserving a range close to that (4980000 4982000) to see what will happen.

Thanks for the ideas.

PS: I hope the exit problem for ranges above 4981000 is something related to what I might be doing wrong. Otherwise, it might be due to a bug in the program. While browsing through the pages, I noticed that ceselb also reported such a problem.

The executable dated 7/2/03 4:35am seems to work for ranges above 5000000. At least on a PIII/500 under Win98SE.

Keroberts1
10-15-2003, 09:24 PM
I just started running the client today and it finished the first part of the first test in 3791 seconds, but the second part is taking a lot longer. I understand this is right, but how long should the second part take? It has only reported about 2% progress and it's been running for a long while now. I have an AMD 2000+ with 256 MB DDR. The resource manager says I'm using very little of my CPU and a lot of memory. Is this supposed to take this long?

Joe O
10-15-2003, 10:44 PM
The short answer is "yes".

The long answer is:
What is your command line? especially the last parameter.
e.g.
sbfactor.exe 4929741 4929743 46 1.5 128
The last parameter 128 says to use 128MBytes of ram for the second stage of the P-1. You *might* be able to specify 128 on your 256MB machine, as long as you are not running anything else. Anything higher and you will *thrash*, i.e. page back and forth between memory and the swap file on your hard drive without doing any useful work.

I almost forgot to ask: Do you have a separate video card, or is the video built in? If built in, how much "shared" memory is it using?

Keroberts1
10-15-2003, 10:48 PM
I think I was running 256. I'll try changing it.

Nuri
10-16-2003, 03:41 PM
Originally posted by Joe O
The executable dated 7/2/03 4:35am seems to work for ranges above 5000000. At least on a PIII/500 under Win98SE.

This is the same executable that I downloaded on my last trial. It's probably either not working on PIVs, or the problem is specific to my box.

I'll try it on AMD Athlon and post the case for it.

Mystwalker
10-16-2003, 04:00 PM
Originally posted by Nuri
This is the same executable that I downloaded on my last trial. It's probably either not working on PIVs, or the problem is specific to my box.


Seems like it's a problem with P4s. I have the mentioned version of sbfactor, too. And it does work on my P3...

Nuri
10-16-2003, 05:01 PM
Yes, it seems so. I've just tried on Athlon 2400, and it works fine.

As far as I remember, it was mentioned somewhere that PIV is the most efficient machine for this task. Too bad if it actually is the case.

I hope Louie can have the time to take a look at it.

Joe O
10-16-2003, 06:18 PM
Nuri, have you had a chance to try out the two "beta" (http://www-personal.engin.umich.edu/~lhelm/sbfactor11-15dev.zip) versions? They are described in Louie's July 14th post in this thread. I would be very interested to know if either or both of them worked.

Just above that is the following from Mystwalker:
"btw. version 1.1 did work on the P4 system that crashed with the old version."
Is that still true? Mystwalker?

Mystwalker
10-16-2003, 08:55 PM
Originally posted by Joe O
Just above that is the following from Mystwalker:
"btw. version 1.1 did work on the P4 system that crashed with the old version."
Is that still true? Mystwalker?

Unfortunately, I don't have access to that system anymore. But I guess it's the same as other P4 systems: Up to almost 5M, it works.
I'll try the developer version tomorrow. Maybe that one can cross this frontier...

Keroberts1
10-17-2003, 12:58 AM
Does anyone know the name of the file that factors are written to?

Frodo42
10-17-2003, 02:28 AM
The factors are written in cut-'n-paste format to:
fact.txt
It appends to the end of the file, so if your siever has left a fact.txt in the same directory, the new results will be at the end of that file.

Keroberts1
10-17-2003, 02:36 AM
When it finds a factor, is it at the end of the factoring process, or does it stop in the middle, say at 56%? Just wondering.

Frodo42
10-17-2003, 02:46 AM
It reports factors at the end of the GCD process (at the end of the stage 1 and stage 2 tests).
I have been wondering if it would pay off to do a GCD in the middle of the tests.
But I don't think it would pay off, since relatively few factors are found and it takes precious time away from factoring.
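To make the point concrete, here is a minimal Python sketch of stage 1 (this is not sbfactor's code; the function names, the B1 value and the toy composite 2311 * 104729 are made up purely for illustration). The factor only becomes visible at the final GCD, which is why the program can only report it at the end of a stage:

from math import gcd

def small_primes(limit):
    # Primes up to limit by a plain sieve of Eratosthenes.
    sieve = bytearray([1]) * (limit + 1)
    sieve[0] = sieve[1] = 0
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = bytearray(len(sieve[i * i :: i]))
    return [i for i, flag in enumerate(sieve) if flag]

def pminus1_stage1(N, B1, base=3):
    # Raise base to every prime power <= B1 modulo N, then take one GCD.
    # Any prime factor p of N whose p-1 is built only from prime powers <= B1
    # shows up here; stage 2 (not shown) allows one extra prime up to B2.
    x = base
    for q in small_primes(B1):
        e = q
        while e * q <= B1:          # largest power of q not exceeding B1
            e *= q
        x = pow(x, e, N)            # intermediate values give no hint of the factor
    return gcd(x - 1, N)            # only this GCD reveals it

# Toy example: 2311 - 1 = 2*3*5*7*11 is 11-smooth, while 104729 - 1 is not,
# so stage 1 with B1 = 11 pulls out exactly 2311.
print(pminus1_stage1(2311 * 104729, 11))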

Joe O
10-17-2003, 11:50 AM
Originally posted by Keroberts1
does anyone know thename of the file that factors are written to?

The answer is still the same as it was when you asked this 2 days ago: "Where does the program output factors when they are found?"

FACT.TXT

Nuri
10-17-2003, 03:10 PM
Keroberts, I guess you asked that question because you are running the P-1 client in a different folder, and could not find the fact.txt file. Just a wild guess.

The client creates the fact.txt file right after it finds its first factor. If it has not found any factors since you started, there will not be a fact.txt yet.

I hope this helps.

Keroberts1
10-17-2003, 03:37 PM
My main question was whether or not it would write to the same file as proth sieve. Frodo pointed out that it will write to the same file, and I think I'm good with that. I just need to remember to keep checking for any new factors at the end of that file.

Nuri
10-17-2003, 04:15 PM
Originally posted by Joe O
Nuri, have you had a chance to try out the two "beta" (http://www-personal.engin.umich.edu/~lhelm/sbfactor11-15dev.zip) versions? They are described in Louie's July 14th post in this thread. I would be very interested to know if either or both of them worked.

That was one of the versions that I could not make work (1.1dev, and also 1.0, 0.9, 0.8, and 0.6). I did not notice previously that there was another 1.1 apart from 1.1dev. So, when I noticed that it was something different, I downloaded 1.1 and it simply worked (below 4.9 million etc).

Having read the post you mentioned a couple more times, I now understand that the client I should have tried is ecm.exe, not 1.1dev. "If" I understood the post correctly, 1.1dev is for finding larger factors of the k/n pairs that we know have some smaller factors (which we should write in the badsieve.txt file so that the client skips them). Although it is cool, that's not something I am particularly trying to do right now, so I'll skip 1.1dev. (PS: I might be wrong on this, but this is how I understood it.)

Anyways, I'll give ecm.exe a try and let you know the result.

Also, I'll try all these versions (1.1 and ecm) on another P4 - a notebook - to see if it will work for larger numbers. If either of them works, that would be great. This machine is not connected to the net, and all I could do previously was to crunch the sieve client at 130kp/sec. It would definitely perform much better on P-1.

Joe O
10-17-2003, 04:31 PM
Nuri, I agree that ecm.exe is the one to try. 1.1dev without the badsieve.txt file should behave like 1.1. In fact it may be the same as 1.1. If ecm.exe doesn't work, then 1.1dev without the file may be worth a try, just in case it is different from 1.1. (Oops, I see that you already tried it to no avail!) This is looking more and more like an FFT boundary issue. Looking forward to hearing your results.

Nuri
10-17-2003, 07:18 PM
Made the tests for two different P4 machines as posted above.

mostly bad news... :(


- The notebook was the third P4 machine that reports 1.1 exiting for n >= 4980726.

- Strangely, 1.1dev without badsieve does not behave like 1.1. I tried again, and it behaves as I explained in my post above (total factoring time < 3 min, # transforms < 1,000, and, not surprisingly, it cannot find the factors that 1.1 can find with the same factor value, factor depth, and memory settings), and it exits for n>=4980726.

- I'm not sure if I could run ecm correctly. I tried two alternatives.

ecm.exe <n low> <n high> <factor depth> <factor value> (mem usage at local.ini) ==> exits starting at 4980726.

ecm.exe k n B1 B2 #curves ==> seems to work for all n (but one cannot enter a range; also, I'm not sure what values might be reasonable for B1, B2, and #curves. I used 120, 500 and 100, respectively. Maybe I misunderstood the usage. Any help is welcome).

MikeH
10-19-2003, 06:49 AM
Originally posted by Nuri
Made the tests for two different P4 machines as posted above.

I have done similar tests, with the same results. I've tried bigger n values to see if this is an FFT boundary issue; I tried 0.5M steps up to 7M, and all fail in the same way.

Is there anyone out there that has P-1 working for n>5M? If not, then it would appear that for a short while at least, we have lost a very valuable tool.

Joe O
10-19-2003, 10:39 AM
Originally posted by MikeH
Is there anyone out there that has P-1 working for n>5M? If not, then it would appear that for a short while at least, we have lost a very valuable tool.

Originally posted by Joe O
The executable dated 7/2/03 4:35am seems to work for ranges above 5000000. At least on a PIII/500 under Win98SE.

Mike, I think that the problem is only with PIVs. I have a PIII/500 merrily chugging away at 5007000 to 5008000. Joe.

Nuri
10-19-2003, 10:52 AM
Yes, that's true, the problem exists for P4 only.

It also works on AMD Athlon without any problem. But, to be honest, I do not want to give up 515 kp/sec for P-1 at the moment.

Still, I think it would be nice if this bug is repaired. P-1 needs P4 power as well. ;)


Originally posted by Nuri
Yes, it seems so. I've just tried on Athlon 2400, and it works fine.

As far as I remember, it was mentioned somewhere that PIV is the most efficient machine for this task. Too bad if it actually is the case.

I hope Louie can have the time to take a look at it.

Joe O
10-19-2003, 08:48 PM
Originally posted by Nuri
Still, I think it would be nice if this bug is repaired. P-1 needs P4 power as well. ;)
I most heartily concur!

Mystwalker
10-20-2003, 07:51 AM
My P4 does a P-1 test at 1.5 in < 45 minutes.
I assume P4s are quite an important factor in P-1 factoring.

Joe O
10-21-2003, 11:10 AM
Posted in the coordination thread:

Originally posted by Frodo42
4930000 4935000 Frodo42 169 2.483284 3 [completed]

5100000 5110000 Frodo42 317 5.049642 ? [reserved]

I don't seem to have any problem for n>4980670 with ver. 1.1 on P4 .

Could everyone with PIVs please post the details of their machine and the point at which it seems to start failing.

Joe O
10-21-2003, 11:23 AM
Nuri, Did you actually compute 4980670, or is that the one that failed?

Nuri
10-21-2003, 02:18 PM
I did. Maybe it was not clear, so I had to emphasize it.


From reservation thread:


Please replace

4980000 4982000 Nuri 91 1.624419 ? [reserved]
4982000 4990000 ? ???? ?.?????? ? [soon to be passed]

with

4980000 4980670 Nuri 28 0.295272 1 [completed]
4980726 4990000 ? ???? ?.?????? ? [soon to be passed]

PS: 4980670 is the upper limit for my P4.

Joe O
10-21-2003, 03:27 PM
I didn't want to miss any, so I tried the interval 4980670 4981000, and the first one it tried was 4980670. I wasn't sure if the endpoint was included in the range or the start point. I know that Louie posted it somewhere, but I haven't been able to find it. If you make your endpoint 4980700 and I make that my start point, then it won't look like there is a gap between our ranges, and there should be no confusion.

On another note, have you tried ranges way above 5000000? Frodo42 has posted that he has no difficulty with 5100000 5110000 on a PIV.

Frodo42
10-21-2003, 05:06 PM
My success might be because I use the Linux version?
I have now done something like 15 tests at 5.1M; they take 1.5 times the time needed at 4.9M with the same parameters, but that's probably because a factor counts that much more at this point; their probability of success is also somewhat higher.

Nuri
10-21-2003, 06:34 PM
Originally posted by Joe O
If you make your endpoint 4980700 and i make that my start point, then it won't look like a there is a gap between our ranges, and there should be no confusion.
Sure, I agree with you. Of course that should be the case normally.
Since that was a specific case, I wanted to post it that way in the first place to show the exact maximum n in the dat file that it can handle, and the exact minimum n at which it fails to run (to make it clear where exactly the problem starts). But I see it created some confusion on your side.

Originally posted by Joe O
On another note, have you tried ranges way above 5000000? Frodo42 has posted that he has no difficulty with 5100000 5110000 on a PIV.
Yes, I tried many alternatives. It takes just a few seconds for the client to exit for those ranges, so it was very easy to try tens of them up to 20M in a couple of minutes per client version. :D

Joe O
10-21-2003, 08:32 PM
Originally posted by Nuri
ecm.exe <n low> <n high> <factor depth> <factor value> (mem usage at local.ini) ==> exits starting at 4980726.

I get the same behaviour on an ATHLON!

Mystwalker
10-22-2003, 05:15 AM
Originally posted by Joe O
Let's everyone with PIVs please post the details of their machine and the point at which it seems to start failing.

It seems like everyone with the P4/Windows combination has the problem.
Maybe someone can fire up a Knoppix version and start the P-1 factorer? (I won't be able to get to my PC in the next few days...)

Keroberts1
10-23-2003, 03:53 AM
what happened here?

3 | 33661*2^4993368+1

bounds 20000 140000 with 128 and 1.5

Keroberts1
10-23-2003, 08:57 AM
I ran the same test again and got nothing. Should I play with the bounds? Is there a factor that I'm just barely missing, or is this just a bug?

Joe O
10-23-2003, 09:47 AM
Originally posted by Keroberts1
I ran the same test again and got nothing. Should I play with the bounds? Is there a factor that I'm just barely missing, or is this just a bug?

Probably a bug. This has happened recently to Frodo42 (http://www.free-dc.org/forum/showthread.php?s=&threadid=3160&perpage=25&pagenumber=10) , and one other time as well.

Keroberts1
10-24-2003, 04:21 AM
same thing again

now i have two

3 | 33661*2^4993368+1
3 | 10223*2^4993901+1

Does anyone know what causes this?

Frodo42
10-24-2003, 07:40 AM
Does anyone know when we can expect Louie back? We kind of need him, as he is probably also the only one who can fix these bugs in the P-1 factorer.
If he does not expect to be able to give some time to this project within a reasonable time, perhaps it would be a good idea if someone else began looking at the source code for the P-1 factorer (not me, I study physics, not maths).

Keroberts1
10-26-2003, 10:10 PM
I think it happened again, except this time it's

49 | 21181*2^4995332+1

That's three in a range of about two thousand. Is anyone else getting this error this often? I don't know what's wrong with my computer; maybe it's because I only use 128 MB of RAM. Could someone tell me if there is something wrong with it happening this often? For now I'm stopping on my ranges, done up to 4997000. I haven't found any factors, but I have had that error three times. I think I'd be better off putting all of my resources into sieving.

m0ti
10-28-2003, 06:33 PM
Please check out my last post here (http://www.free-dc.org/forum/showthread.php?s=&threadid=4565) and let me know if this is a good idea (should give a speed boost of 2x)

hc_grove
11-28-2003, 09:24 AM
I'm having a little trouble reporting the last factor that I have found.

261252098359999236987971694398316499 | 28433*2^5314225+1

The factor is larger than 2^117 and the Sieve Result Submission page only accepts factors less than 2^64.

Without the limit to the bias in the score calculation this factor would have scores around 2.3*10^25 points.

I think a factor of that size calls for a couple of smilies:
:|party|: :cheers: :jester: :elephant: :drink:

Keroberts1
11-28-2003, 09:48 AM
wow is that 36 digits?

What is the upper limit of a factor size that can be found using the P-1 factorer?

That one's a real trophy.

MikeH
11-28-2003, 09:50 AM
I assume you've tried submitting to

http://www.seventeenorbust.com/largesieve/

hc_grove
11-28-2003, 10:00 AM
Originally posted by MikeH
I assume you've tried submitting to

http://www.seventeenorbust.com/largesieve/

Now I have. (I had completely forgotten about that page).


Originally posted by Keroberts1
wow is that 36 digits?


That's what I counted it to. :thumbs:

mklasson
11-28-2003, 10:54 AM
Congratulations, but that factor isn't prime (Mr. Killjoy here). :)

261252098359999236987971694398316499 =
1695263162018308849 * 154107105146414851

MikeH
11-28-2003, 11:47 AM
Keep this up, and you'll make it into the record books :)

http://www.loria.fr/~zimmerma/records/Pminus1.html

Oh, and I might need to make some hasty changes to the sieve scoring, not sure if it can cope with displaying that many digits. :blush:

hc_grove
11-28-2003, 12:15 PM
Just in case anyone is interested, here is the factorization of p-1:

2*3*457*86491*8622413*13809557*9251514854849

mklasson
11-28-2003, 12:31 PM
But that's not really what the P-1 algorithm found. Rather it found two factors (simultaneously, since the bounds covered both):
1695263162018308849-1:
2^4 * 3^4 * 7^2 * 157 * 317 * 2237 * 239779
and
154107105146414851-1:
2 * 3 * 5^2 * 17 * 37 * 3217 * 10271 * 49433
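Just to make mklasson's point concrete, here is a small Python check (my own, not one of the project's tools) that factors p-1 for each of the two primes and shows which bounds each one needed on its own; since the bounds in use covered both, both factors came out in the same GCD:

def prime_factors(n):
    # Prime factorization by trial division (plenty fast at this size).
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

for p in (1695263162018308849, 154107105146414851):
    f = prime_factors(p - 1)
    # Ignoring prime powers (all tiny here), stage 1 must cover everything but
    # the largest prime of p-1, and stage 2 must reach that largest prime.
    # Per the factorizations above: 2237 / 239779 and 10271 / 49433.
    print(p, "needs roughly B1 >=", f[-2], "and B2 >=", f[-1])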

ceselb
11-28-2003, 12:55 PM
Very smooth indeed. :cheers:

rosebud
11-29-2003, 07:17 AM
I was trying to redirect the output of sbfactor to a file with something like

./run.sh 5322000 5323000 > progress

but there is nothing written to progress.

The idea behind this is to start sbfactor at system startup, but later monitor it under kde with tail -f ...

Can anybody help me with this?

hc_grove
11-29-2003, 08:52 AM
Originally posted by rosebud
I was trying to redirect the output of sbfactor to a file with something like

./run.sh 5322000 5323000 > progress

but there is nothing written to progress.

The idea behind this is to start sbfactor at system startup, but later monitor it under kde with tail -f ...

Can anybody help me with this?

I redirect both STDOUT and STDERR with

./sbfactor 5134000 5136000 45 1.5 512 > uddata 2>&1

and that works.

Frodo42
11-29-2003, 09:30 AM
./sbfactor 5134000 5136000 45 1.5 512 > uddata 2>&1
I use 47 instead of 45, as 2^47=140.74*10^12 and the sieve 90% point has passed.
As I understand it, this just makes sure that the P-1 factorer does not check for or find duplicates.

rosebud
11-29-2003, 10:59 AM
Originally posted by hc_grove
I redirect both STDOUT and STDERR with

./sbfactor 5134000 5136000 45 1.5 512 > uddata 2>&1

and that works.

Well, I tried that too, but it likewise just creates an empty file and doesn't write anything into it.




I use 47 instead of 45 as 2^47=140.74*10^12 and the sieve 90% point has passed.
As I understand this just makes sure that the P-1 factorer does not check for or find duplicates.


I thought about that, too, but it gives me lower bounds than with 45. Isn't it actually less likely to find factors with smaller bounds, especially when the sieving point goes up? So shouldn't the bounds be increased instead? Please correct me if I'm wrong...

garo
11-29-2003, 01:06 PM
Rosebud,
run.sh invokes sbfactor. So you need to redirect the output of sbfactor. Try to edit run.sh and it should work.

rosebud
11-29-2003, 01:49 PM
Thanks for your help; I just found out that I was not patient enough.
Apparently some buffer is written to the output file in 4 KB blocks; I just didn't wait long enough to see it happen before I killed the program again. :bang:

Citrix
12-01-2003, 02:45 PM
I used version 1.1dev.

This is what I found.

[Mon Dec 01 11:50:44 2003]
ECM found a factor in curve #1, stage #1
Sigma=7999200192601744, B1=2000, B2=200000.
29567966167154775638017847917911703792658688246294608644159465271302712647174668122482338029971328475423878344513 | 265711*2^199920+1
[Mon Dec 01 13:21:36 2003]
ECM found a factor in curve #1, stage #1
Sigma=636312383509189, B1=2000, B2=200000.
453696229622164192714306768330898875898888916174816736249304693155093236170413062337 | 265711*2^199920+1


both the factors are wrong.

k=265711 is being used on the PSP project.
(http://www.geocities.com/eharsh82/)

Citrix
:cool: :cool: :cool:

Keroberts1
12-08-2003, 12:43 PM
What is the limit on the size of a factor we could expect to find using P-1? Could we find a record factor? Very soon sieving will run past its usefulness and many sievers will probably be switching CPUs towards P-1. Or at least I will be. At that point we'll have a rather significant amount of CPU power running there. Just wondering what our chances are of hitting a record.

rosebud
12-08-2003, 01:22 PM
Hypothetically the maximum size is unlimited. But I think the odds of finding a record factor are pretty small. I run sbfactor with B1=30000 and B2=320000, which is what the bounds optimizer gives me.
All record P-1 factors (40+ digits) were found using much higher bounds: B1 > 10^7, B2 > 10^8 :notworthy



http://www.loria.fr/~zimmerma/records/Pminus1.html

hc_grove
12-08-2003, 03:34 PM
Originally posted by rosebud
Hypothetically the maximum size is unlimited. But I think the odds of finding a record factor are pretty small. I run sbfactor with B1=30000 and B2=320000, which is what the bounds optimizer gives me.
All record P-1 factors (40+ digits) were found using much higher bounds: B1 > 10^7, B2 > 10^8 :notworthy

But my 36-digit factor was found using B1=35000, B2=420000, so B1 and B2 don't have to be that big.

Having said that, I do think my factor will be our largest for quite a while. Maybe MikeH should change the stats to show the 3 or 5 largest factors so others have a chance of having big factors recognized?

Keroberts1
12-08-2003, 03:53 PM
I'm sorry, but I thought that had been found to be a composite of two factors? Does that matter?

mklasson
12-08-2003, 07:06 PM
Yes, that matters. The reason hc_grove's bounds didn't need to be so high is that the bounds only had to be sufficient for a 19-digit factor (and an 18-digit one).

The chance that a set of small factors completely factors a number decreases as the number gets bigger.

And if you're aiming for a record I'm sure nothing but a prime factor is accepted.

Nuri
01-11-2004, 08:38 AM
Are there any news on fixing the P-1 bug for PIVs?

I am aware that it is not of top priority, but some feedback would be useful.

ceselb
01-11-2004, 09:13 AM
I know that some work is being done/planned, but I suspect that a new client and a functional factor submission form is top priority right now.

jjjjL
01-12-2004, 03:36 AM
Originally posted by Nuri
Are there any news on fixing the P-1 bug for PIVs?

I am aware that it is not of top priority, but some feedback would be useful.


I can release a new factoring version before v 3 of the client is finished. Now is not a good time to do it because I'm not at home where I have a P4 (thanks Fritz!) to test with.

Wednesday will be the first day I get to work on it. Sometime this week should be a reasonable ETA for a new version.

Also, I think I understand the problem, but if someone could summarize what the issue is and give me examples so I can reproduce the problem, that will help me make sure the next version doesn't suffer from the same deficiency. Email me at [email protected] with the info. Thanks.

Cheers,
Louie

Nuri
01-12-2004, 06:11 AM
Thanks for the feedback, and good luck.

Simply speaking, the client exits for n>4980700 on PIVs. As far as I recall, this is valid for all versions (and under various settings/parameters).

MikeH
01-12-2004, 07:52 AM
Originally posted by Nuri
As far as I recall, this is valid for all versions

I thought it was only the Windows version that was faulty. That was certainly where I saw the problem (Win2K). Someone please correct me if I'm wrong.

hc_grove
01-12-2004, 08:43 AM
Originally posted by MikeH
I thought it was only the Windows version that was faulty. That was certainly where I saw the problem (Win2K). Someone please correct me if I'm wrong.

You're absolutely right. All my p-1 factoring is done on P4's running Linux.

Nuri
01-12-2004, 12:00 PM
Sorry for the misunderstanding :blush: and thanks for the clarification.

What I meant by "all versions" was including the previous releases of the client for Windows.

Frodo42
01-25-2004, 12:34 PM
I'm having trouble finding/making the SoB.dat file to use for P-1 factoring.

When using the updated SoB.dat file (11 k's), result.txt does not remove any tests, so I have made my own SoB.dat file by adding a few zeros to the line after k=5359, thereby making the tests for k=5359 totally out of range and preserving the result.txt ability. This has worked fine for some time, but now suddenly the P-1 factorer wants to do some weird out-of-bounds tests for k=4847 when I start it.

I can't figure out why this happens. Anyone got a clue?

jjjjL
01-26-2004, 05:16 AM
Link to SBFactor v1.2 (http://www-personal.engin.umich.edu/~lhelm/sbfactor12.zip)

Been tinkering with the factorer.

This version is similar to v1.1 except:
-Doesn't crash on P4s for n > 5M
-Newer gw code so probably 5-10% faster (have done validation tests but no benches)
-Bundled w/ new 11k dat file
-Bundled w/ new, smaller results.txt file (p > 25T instead of p > 3T)
-Updated run.bat so factor depth is 47 (140T) instead of 45 (35T)
-No longer reads lowresults.dat (because .dat file has those values removed)
-Minor cosmetic differences
-*EDIT* LINUX BINARY NOW INCLUDED. *EDIT*

Those are the biggies. Hopefully now that P4s can run in Windows again we can get a little more P-1 work done. It'd be nice to not abandon large ranges of numbers due to lack of factoring resources.

About the result.txt files... changed the ranges this morning. Should save a little bandwidth and save download time for anyone getting the results.txt file since it was creeping up to 2MB compressed. Now it's down much lower. However, there's a good chance this broke sieve stats because until MikeH downloads the new lowresults.txt (http://www-personal.engin.umich.edu/~lhelm/lowresults.txt.bz2), it may appear as though all results between 3T and 25T vanished. Sorry. Not to worry though, it's a temporary thing.

The important part is the new version of SBFactor (http://www-personal.engin.umich.edu/~lhelm/sbfactor12.zip) is out.

Let me know how it performs:
1) On machines where v1.1 crashes (P4, Centrino, Mobile P4)
2) Relative speedwise to v1.1 (any increase?)
3) On tests for numbers where it returned false factors (like 3 or 7)
4) And in general.

Looking forward to hearing from people and seeing more factoring activity.

Cheers,
Louie

Frodo42
01-26-2004, 05:32 AM
Nice. I haven't got any Windows machines so I can't say if it works; I'm just waiting for a Linux version. If there is a speed increase, then let's have it; there are after all more Linux boxes doing some P-1 factoring than Windows boxes at present (although that will probably change with the new Windows version).

Anyway, fixing result.txt down to 11 k's solved my problem, so thanks for that.

Maybe a link to the new factorer in the coordination thread would be a good idea.

hc_grove
01-26-2004, 07:06 AM
Well, I don't have any windows machines either, so I'd like a Linux version of 1.2 too.

After the release of 1.1 there was some talk about manually setting B1 and B2; what happened to those plans?

jjjjL
01-26-2004, 07:17 AM
Turns out building the new version of the factorer went really smoothly so I already uploaded a new zip file that has both Linux and Windows versions in it.

Cheers,
Louie

hc_grove
01-26-2004, 09:51 AM
Originally posted by jjjjL

1) On machines where v1.1 crashes (P4, Centrino, Mobile P4)


1.1 didn't crash in Linux, 1.2 doesn't either.



2) Relative speedwise to v1.1 (any increase?)


Nothing noticeable.



3) On tests for numbers where it returned false factors (like 3 or 7)


Haven't had any of those myself, and haven't tested.



4) And in general.


It's nice not to need lowresults.txt anymore.

MikeH
01-26-2004, 03:10 PM
Hi Louie,

Sieving and P-1 scores are now sorted.

Just one little question. Why are the following factors at the end of the LowResults.txt file?


11431293770421995731 27653 5103393 584 61 0
14500316478379213543 33661 5343552 960 9 0


These ones are 11431293T (and bigger), which isn't usually less than 25T ;)

It's not a problem, I just removed them from the file, and now all is OK.

Many thanks for fixing the P-1, I'm now happily crunching again. :)

Mystwalker
01-26-2004, 03:57 PM
Originally posted by jjjjL
Let me know how it performs:
1) On machines where v1.1 crashes (P4, Centrino, Mobile P4)

Works without a glitch (P4, Windows).


2) Relative speedwise to v1.1 (any increase?)


I'm not sure I remember the correct time I needed with 1.1, but I think the time hasn't changed at all.


Good to be able to do factoring again. Thanks, Louie! :cheers:

Joe O
01-26-2004, 09:35 PM
Total factoring Time: 640 minutes for 27653*2^5689005+1 on a PIII/500 under Win98SE.

If memory serves me right, this is about a 10% improvement. I'll try to verify this later in the week. But it definitely is an improvement.

I'll try it on a PIV this weekend.

Edit: Overnight results:

Total factoring Time: 617 minutes for 27653*2^5689041+1 on a PIII/500 under Win98SE

jjjjL
01-27-2004, 02:59 AM
So I'm looking into making the client run P+1 factoring again.

I'm having a lot of trouble finding any explanations of exactly how P+1 factoring works.

MathWorld has almost nothing http://mathworld.wolfram.com/WilliamspPlus1FactorizationMethod.html

and most other sites I find are nearly as sparse in details. The only somewhat technical document is the README and comments in the source code of ECMNET.

However, it's a little challenging translating the P+1 code from this program into gw code for Woltman's multiplication routines.

Anyone know of any useful documents that describe the P+1 algorithm outside of ECMNET?

Cheers,
Louie

ceselb
01-27-2004, 09:54 AM
Originally posted by Frodo42
Maybe a link to the new factorer in the coordination thread would be a good idea.

Good idea, added.

Joe O
01-27-2004, 10:36 AM
Have you tried Williams, H. C. "A Method of Factoring." Math. Comput. 39, 225-234, 1982?
If you still have access to the University Library it should be possible to get this in PDF format.

Or you could try "A Survey of Modern Integer Factorization Algorithms" by Peter L. Montgomery (http://citeseer.ist.psu.edu/cache/papers/cs/17549/ftp:zSzzSzftp.cwi.nlzSzpubzSzCWIQuarterlyzSz1994zSz7.4zSzMontgomery.pdf/montgomery94survey.pdf/)

Then there is "Prime Numbers: A Computational Perspective", by Richard Crandall
and Carl Pomerance or
"Prime Numbers and Computer Methods for Factorization (Progress in Mathematics, Vol 126) " by Hans Riesel
Both of these are available from Amazon.com

jjjjL
01-27-2004, 04:46 PM
Originally posted by Joe O
Have you tried Williams, H. C. "A Method of Factoring." Math. Comput. 39, 225-234, 1982?
If you still have access to the University Library it should be possible to get this in PDF format.

Yeah, I grabbed that paper from JSTOR. If anyone needs it, I can send it to you. It's dense but it may help.

BTW, I just got two new factors:
5013169446620738903 | 4847*2^5621151+1
1042382178159409 | 21181*2^5621468+1

5013169446620738903-1 = 2 x 191 x 709 x 797 x 1619 x 2273 x 6311
1042382178159409-1 = 2 ^ 4 x 3 ^ 2 x 103 x 337 x 9649 x 21613

smooth :)

Cheers,
Louie

hc_grove
01-27-2004, 05:21 PM
Originally posted by jjjjL
Yeah, I grabbed that paper from JSTOR. If anyone needs it, I can send it to you. It's dense but it may help.


I've done some unsuccessful searches for P+1 factoring too, so I'd like a copy. As I have a master's degree in mathematics, I can live with it being a bit dense.

As an administrator you should be able to send me email, but if not, tell me and I'll send you a PM with the address.



BTW, I just got two new factors:


:hifi:Congratulations, now I just wish that I could find an interesting factor again.

Joe O
01-27-2004, 05:58 PM
Originally posted by jjjjl
Yeah, I grabbed that paper from JSTOR. If anyone needs it, I can send it to you. It's dense but it may help.

Louie, I'd like a copy. Maybe I can translate it into English from Mathematics.

jjjjL
01-27-2004, 06:00 PM
Originally posted by hc_grove
As an administrator you should be able to send me email, but if not, tell me and I'll send you a PM with the address.

I'm actually not a free-dc administrator, I'm only a moderator of this particular board so I can't lookup your email that way... but I can look it up on SB so I'll email it to you there. :) PM me if you don't get it or you need me to send it to an address other than the one you registered for SB with.


Cheers,
Louie

hc_grove
01-28-2004, 02:34 PM
Originally posted by jjjjL
I'm actually not a free-dc administrator, I'm only a moderator of this particular board


Well, that was what I meant, and I thought that was enough.



so I can't lookup your email that way... but I can look it up on SB so I'll email it to you there. :) PM me if you don't get it or you need me to send it to an address other than the one you registered for SB with.


It's the same address, and I have received it, and will look at it later (I just have a news server that ran out of disk space :( to fix first).

mklasson
01-31-2004, 06:54 AM
I just noticed something odd:
3 | 460139*2^361-1 (9 | too)
but sbfactor (v1.25 at least) is unable to find that factor in stage 1, no matter how high you go.

EDIT:
and another thing:
172599179 | 460139*2^380-1
172599179-1 = 2*86299589
b1=30, b2=100M finds it successfully
b1=86M, b2=1G does _not_ find it
b1=86M, b2=1.5G finds it
b1=86M, b2=2G does _not_ find it

I don't know whether the errors are due to something related to saving and restoring the state. I.e. I haven't tried going to 86M,1G from scratch.

jjjjL
01-31-2004, 08:50 AM
Originally posted by mklasson
I just noticed something odd:
3 | 460139*2^361-1 (9 | too)
but sbfactor (v1.25 at least) is unable to find that factor in stage 1, no matter how high you go.

EDIT:
and another thing:
172599179 | 460139*2^380-1
172599179-1 = 2*86299589
b1=30, b2=100M finds it successfully
b1=86M, b2=1G does _not_ find it
b1=86M, b2=1.5G finds it
b1=86M, b2=2G does _not_ find it

I don't know whether the errors are due to something related to saving and restoring the state. I.e. I haven't tried going to 86M,1G from scratch.

That is bad. As I pointed out in the other thread, I did test several other numbers with no problems but not many test points were available (ie I had to sieve most of my own using your program ;)).

hc_grove is hinting that he may have fixed this but I don't really know what would have been wrong.

Cheers,
Louie

Joe O
01-31-2004, 09:14 AM
Originally posted by mklasson
I just noticed something odd:
3 | 460139*2^361-1 (9 | too)
but sbfactor (v1.25 at least) is unable to find that factor in stage 1, no matter how high you go.

EDIT:
and another thing:
172599179 | 460139*2^380-1
172599179-1 = 2*86299589
b1=30, b2=100M finds it successfully
b1=86M, b2=1G does _not_ find it
b1=86M, b2=1.5G finds it
b1=86M, b2=2G does _not_ find it

I don't know whether the errors are due to something related to saving and restoring the state. I.e. I haven't tried going to 86M,1G from scratch.

Not to panic yet. This could be normal. P-1 does not find all small factors, and sometimes giving it too large a B1 hides small factors. This comment is based on experience with P-1 in GIMPS (Prime95). I'm not sure if the theory backs that up, but at least one other implementation of the theory does. I'm up to my elbows in P-1 and P+1 theory papers now, and I'll report if they back this up.

hc_grove
01-31-2004, 09:29 AM
Originally posted by jjjjL
hc_grove is hinting that he may have fixed this but I don't really know what would have been wrong.


I'm sorry to say but my version doesn't find that factor 3 (or 9) either. :(

And to Joe_O: As I have understood the theory, it doesn't back your claim. :(

Joe O
01-31-2004, 12:17 PM
Originally posted by hc_grove
As I have understood the theory, it doesn't back your claim. :(

Not a claim, just an observation that certain implementations fail to find small factors. While looking for a discussion where this behaviour was observed, I came across this excellent description of P-1 and ECM that was posted by Phil Moore. It's too long to quote here, so this is the link. (http://www.mersenneforum.org/showthread.php?t=194) It is followed by additional insights from E W Mayer and again by Phil Moore.

Addendum: Please check the last paragraph of section 6.3 on page 346 (10th page of the PDF) in "A Survey of Modern Integer Factorization Methods" (http://citeseer.ist.psu.edu/cache/papers/cs/17549/ftp:zSzzSzftp.cwi.nlzSzpubzSzCWIQuarterlyzSz1994zSz7.4zSzMontgomery.pdf/montgomery94survey.pdf/) by Peter L. Montgomery. The point is made that only factors of a certain form are found by P-1.

jjjjL
02-01-2004, 04:48 AM
Originally posted by cedricvonck
sbfactor 6000100 6000110 47 .... (ram) 128

Ahh, even without the whole command I think I see what happened. Must have dropped an argument and sent the program into single number test mode instead of testing the range. 47 is probably the B1 you used there....

and as it turns out, that factor you found is actually correct:

6000110 * 2^6000150+1 is divisible by 9 (I checked). In fact, that number is divisible by 3... but you happened to find that it was divisible by 3^2.

If you post the full command line you used, we could probably set you straight. Be aware v1.25 for Riesel sieve takes an extra argument of " + " before the mem size to signify that you use the +1 transforms instead of -1 (for Riesel). Most people on the forum here are still using v1.20, which doesn't have this. It's just unfortunate that the commands are very similar (yet different) and the program accepts the command with one less argument but just runs in a mode other than the one you wanted.

If you want to use run.bat, either use it w/ v1.20 or edit run.bat to include a " + " between 1.5 and 128 like " ... 1.5 + 128".

Cheers,
Louie

jjjjL
02-01-2004, 04:55 AM
Originally posted by Citrix
I am looking for all the P-1 factors (not the factors by sieve) found by SOB. where can I find these numbers. Please let me know.

Thanks,
Citrix
:cool: :cool: :cool:

There is no file for just P-1 factors. A list of all factors from sieving and P-1 and any other method is in http://www.seventeenorbust.com/sieve/results.txt.bz2 . Uncompress it and look at the end for the large factors which were most likely found by P-1. By comparing it to the sieve list, you can probably determine which factors were found in which way.

Also, in the future, don't post general questions in the coordination thread. This thread is meant only for reserving and checking in completion data for ranges of numbers. A more appropriate thread for discussion is the P-1 factorer (http://www.free-dc.org/forum/showthread.php?s=&threadid=3160&perpage=25&pagenumber=14) thread. I'm going to delete your original message and leave only this response so as to minimize the clutter in the thread before I remove this message in a few days. Nobody's mad, but just a kind reminder for next time. :)


Cheers,
Louie

cedricvonck
02-01-2004, 05:32 AM
I used the following cmd:

sbfactor.exe %1 %2 47 1.5 + 128

name of zip file: sbfactor12.zip:blush: :blush:

MJX
02-01-2004, 05:22 PM
Originally posted by Joe O
The point is made that only factors of a certain form are found by P-1.

The only point I see is related to smoothness: to find a factor p with parameters B1 (for stage 1) and B2 (for stage 2), p-1 must be "B1-B2 semismooth": all its factors must be below B1 except one that can be between B1 and B2 (if it is below B1, the factor is found in stage 1; if p-1 is not semismooth, the factor, even if it is small, is not found).


Originally posted by mklasson
I just noticed something odd:
3 | 460139*2^361-1 (9 | too)
but sbfactor (v1.25 at least) is unable to find that factor in stage 1, no matter how high you go.

Concerning ECM, I remember that P. Zimmermann said that gmp-ecm didn't work well if the number to factorize was divisible by 2 or 3... a fact that can be avoided by simple trial division... (maybe this can happen with P-1 too, since p-1 (in that case 3-1=2) and the base number "a" that you exponentiate (surely a=2, because 2^(p-1) is easy to compute) should be coprime, if I am right about Fermat's theorem...)
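For what it's worth, here is a small Python sketch (mine, not taken from any of the clients) of the "B1-B2 semismooth" condition described above, glossing over prime powers, with the 172599179 example from earlier in the thread:

def is_b1_b2_semismooth(p, B1, B2):
    # True if every prime factor of p-1 is <= B1, except possibly the single
    # largest one, which may lie anywhere up to B2.
    n, d, factors = p - 1, 2, []
    while d * d <= n:               # plain trial division, fine for small p
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return all(q <= B1 for q in factors[:-1]) and factors[-1] <= B2

# 172599179 - 1 = 2 * 86299589, so B1 barely matters but B2 must reach 86299589:
print(is_b1_b2_semismooth(172599179, 30, 100000000))   # True
print(is_b1_b2_semismooth(172599179, 30, 50000000))    # False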

mklasson
02-01-2004, 05:56 PM
Ah, you're right! sbfactor uses a=3, so a^b-1 will never be a multiple of 3.

And the 172599179 issue turns out to be a non-issue as well. It wasn't found because I probably ran up to 86M,100M first and found the factor, and then just raised the B2 bound causing sbfactor to do B2 from 100M upwards, skipping the already passed 86299589... Don't know why 86M,1.5G found it again though. Perhaps Brent-Suyama kicked in?

Ho hum. Sorry about all the confusion. Must learn to think farther. :)
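Regarding the a=3 point: here is a quick way to see it (my own check, nothing from sbfactor). Whenever 3 divides N, 3^E mod N is also a multiple of 3, so 3^E - 1 is 2 mod 3 and the GCD can never contain the factor 3:

from math import gcd

N = 460139 * 2**361 - 1        # divisible by 3 (and by 9), as noted above
E = 2**5 * 3**3 * 5 * 7        # any stage-1 style exponent will do
print(N % 3)                           # 0
print(gcd(pow(3, E, N) - 1, N) % 3)    # never 0, so 3 is never reported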

MJX
02-04-2004, 01:50 PM
I use low, suboptimal bounds B1=10000 B2=97500 to be able to test more numbers (is it better to test as many numbers as possible with low bounds, or to keep large gaps of untested numbers between optimally tested ones?)

(removed reservation /ceselb )

garo
02-04-2004, 07:40 PM
MJX,
Given the fact that a large number of "numbers" are not getting any P-1, I think you are on the right track. It is more efficient to do a little P-1 on all the numbers than to do a lot on only some numbers. Of course, ideally one would want to do a lot on all the numbers, but since that is not possible....

Frodo42
02-08-2004, 01:02 AM
I think there is a problem with either the Linux-version of the factorer or the result.txt file.

For some time I have had the problem that when I download a new result.txt file and start the factorer:
- I get the output: "Removed 0 numbers using the factor file"
- The last estimated number is "Estimating for k=4847 n=13338904", no matter which range I set it to do. From there on there is no more output, but it keeps using CPU power. I haven't tried letting it run long enough to see if it generates files, but have instead used the result.txt that comes with the sbfactor12.zip file, which does not give any problems.

This problem was also present in the last factorer, so I think it is a problem with the result.txt file after the last prime.
Damned be those primes, they only cause problems ;)

hc_grove
02-08-2004, 08:06 AM
Originally posted by Frodo42
I think there is a problem with either the Linux-version of the factorer or the result.txt file.

For some time I have had the problem that when I download a new result.txt file and start the factorer:
- I get the output: "Removed 0 numbers using the factor file"
- The last estimated number is "Estimating for k=4847 n=13338904", no matter which range I set it to do. From there on there is no more output, but it keeps using CPU power. I haven't tried letting it run long enough to see if it generates files, but have instead used the result.txt that comes with the sbfactor12.zip file, which does not give any problems.

This problem was also present in the last factorer, so I think it is a problem with the result.txt file after the last prime.


I don't think results.txt changed because of the prime. (There are still factors for k=5359 in it).

I can't see why this would happen. Can I get you to try out my version (see the source code thread)?



Damned be those primes they only cause problems ;)

:)

Frodo42
02-10-2004, 12:13 AM
Originally posted by hc_grove
I can't see why this would happen. Can I get you to try out my version (see the source code thread)?

I just tried it out and it does the same thing.

used the range 5600000 5620000 as I know there should be some factors to remove with result.txt
...
Removed 0 numbers using the factor file
...
Estimating for k=10223 n=5619929
Estimating for k=4847 n=8654994
Estimating for k=4847 n=1930939537
from here it just freezes.

Another thing, Henrik: in your latest tar.gz version, the sbfactor.cnf file has type = ECM. It took me a while to figure out that I had to change it to P-1, as there are no possibilities other than ECM listed in the comments :rolleyes:

hc_grove
02-10-2004, 06:05 AM
Originally posted by Frodo42
I just tried it out and it does the same thing.

used the range 5600000 5620000 as I know there should be some factors to remove with result.txt
...
Removed 0 numbers using the factor file
...
Estimating for k=10223 n=5619929
Estimating for k=4847 n=8654994
Estimating for k=4847 n=1930939537


That n is strange. I can't figure out where that should come from.

Could you make the "good" and one of the "bad" results.txt's and the sbfactor.log from my version available or mail them to me (I'll PM you my email)?



Another thing, Henrik: in your latest tar.gz version, the sbfactor.cnf file has type = ECM. It took me a while to figure out that I had to change it to P-1, as there are no possibilities other than ECM listed in the comments :rolleyes:

Ooops! :bang: That wasn't even the only option that was set to some strange value. It's fixed now. (And I'll try to remember to put a normal configuration in future releases.)

Frodo42
02-18-2004, 11:44 AM
I have come to miss a little program that tells me how big a range I should reserve if I, for example, want to do 6 tests with the P-1 factorer with given parameters.

When I'm having classes with a P4 computer put in front of me I can't resist the temptation of doing a few tests, but when my class finishes before the tests do, it's a little annoying.

I don't think it would be all that hard to make that kind of code from the current prediction part of the current factorer. Actually the option just to run the prediction part without starting the factoring part would probably be good enough.

If it's not too difficult, it would be cool to be able to tell the factorer that I have approx. x hours of CPU time available, please factor an appropriate range starting from y.

Btw. hc_grove, thanks for teaching me a lesson on unpacking files, which was the reason for my previous problem with result.txt. Instead of unpacking result.txt.bz2, I packed it an extra time and renamed it to result.txt :o

dmbrubac
03-16-2004, 07:57 AM
Is it me or did we lose a bunch of factorers?

(EDIT: removed reserved ranges from post. /ceselb)

Mystwalker
03-16-2004, 09:17 AM
Originally posted by dmbrubac
Is it me or did we lose a bunch of factorers?

Well, I'm currently using my P4 for Riesel PRPing. I'll likely switch back when my range there is complete - which should still take approx. 2 more weeks...

Frodo42
03-17-2004, 05:03 PM
I just found another of these blasted factors ...
3 | 55459*2^5769706+1
I had the hope that this bug was removed with the new version of the factorer, but sadly that does not seem to be the case ...

Keroberts1
04-01-2004, 04:17 PM
I would like to set a P4 on factoring, but the file won't run, and I'm not sure what I need to do to get it running. What files do I need with it? Can I just download the factoring package from the link above, and will it run after that? What about updating the results.txt file? When I run it, it just opens and then shuts the window before I see what it says. Am I maybe missing something? Do I need to edit the bat file or perhaps something else?

dmbrubac
04-01-2004, 04:37 PM
I assume you are on Windows.
Open a command prompt and navigate to the directory with sobsieve. Better yet, use the "Command Window Here" powertoy available from Microsoft.
Type run start end, then press Enter.
For instance, on one of my machines I've typed run 5880000 5882000

Nuri
04-02-2004, 05:40 AM
As dmdrubac suggested, you should:

Click start,
Click run,
Browse, and find run.bat, click open
Add your nmin and nmax at the very end of the command line in the Run window,
Click ok.

I guess that would solve your problem.


But, before that, if you want, you can also edit the run.bat file. I guess it should be as below in the original file.

@ECHO OFF
IF !%2==! GOTO BADFORMAT
sbfactor.exe %1 %2 47 1.5 256
:BADFORMAT

Things to think about changing:

47: As there is only 30% of the 2^27 to 2^48 range left at sieving, I think you might consider changing it to 48 (the effect will be lower B1 and B2 determined by the program, and thus a faster finish time per test. If you change it to 48, it will not look for possible factors below 2^48, so you will end up with fewer factors per test. As I've mentioned above, since more than 70% of the range between 2^47 and 2^48 is already sieved, that will not be much of a problem in terms of the project.)

1.5: I guess anything between 1.2 and 1.5 would be fair for the moment.

256: I think you should lower this to something below 200 if your machine has 256 MB of RAM. If it's more than that, no need to change.


PS: Do not forget to update your results.txt file from the link above. This might save you from running one (or more) tests for k/n pairs for which a factor was found in the couple of days since the package was compiled.

Keroberts1
04-10-2004, 03:17 PM
Can the factorer be modified so that stage one and stage two can be done separately? Perhaps on different machines, so that a machine that has large amounts of memory can concentrate on only doing the second stage, and other machines that don't have a lot of memory can perform the first stage and still be productive once sieving becomes a little less favorable than it is now. I've been noticing myself that the benefit of sieving has decreased and will continue to do so. Once we reach about 500T it'll be pretty pointless to continue, except for the fact that machines that don't have internet connections and have low memory would otherwise be useless to the project.

hc_grove
04-10-2004, 05:42 PM
Originally posted by Keroberts1
Can the factorer be modified so that stage one and stage two can be done separately?


Yes it can. As I see it the main problem is that the current format for the save files doesn't support 'stage 1 done' as a state.


Perhaps on different machines, so that a machine that has large amounts of memory can concentrate on only doing the second stage, and other machines that don't have a lot of memory can perform the first stage


There's been a little talk about that in the 'SOURCE CODE RELEASED FOR FACTORER' thread (which hasn't been active in the last 60 days, so you might need to change that setting to see it in the thread listing). We can even split stage 2 among several machines.

(Well, if I'm going to do it, the main problem is that the code is terrible to work with: it's C with way too many global variables and .c files including other .c files. The fact that I got practically no comments on the changes I made doesn't help my wish to work on it.)

Keroberts1
04-10-2004, 06:04 PM
Well, I have some experience programming in C but no knowledge of P-1 factoring. If I can get a copy of the source, maybe some time when I'm bored I can spend a few hours poring over it and try to neaten it up a little. I can't promise I won't muck things up though. Exactly how many files, and of what size, are there?

How much work are we talking about? Hours/days/weeks/not in my lifetime?

hc_grove
04-10-2004, 06:46 PM
Originally posted by Keroberts1
Well, I have some experience programming in C but no knowledge of P-1 factoring.


I don't think much is needed. You don't have to modify the function doing the factoring.



If I can get a copy of the source


There should be plenty of links in the source code thread, but otherwise the newest version I've uploaded is
here (http://www.sslug.dk/~grove/sbfactor/sbfactor1255.zip)



maybe some time when I'm bored I can spend a few hours poring over it
and try to neaten it up a little. I can't promise I won't muck things up though. Exactly how many files, and of what size, are there?


There are quite a few files in that zip, but only 5-7 relevant, each containing about a few hundred lines of code. (AFAIR)



How much work are we talking about? Hours/days/weeks/not in my lifetime?

That probably depends on your ambitions. I guess a lot could be done in a couple of hours.

Keroberts1
04-10-2004, 06:57 PM
Well, that is encouraging. This weekend I'm of course busy with the holidays, and next week I will be working nearly 60 hours, so I can't hope to get much done, but next weekend I should have a couple of days of peace and quiet, so I'll probably sit down and work on it then. Unless of course anyone familiar with the code would like to tackle it before then.

Troodon
05-03-2004, 08:46 AM
Input: sbfactor 6110000 6111000 [depth] [value] [mem]

If [depth]=48 and [value]<1.2 it gives this error (sbfactor 1.2 and 1.25.5):

B1=0 B2=0 Success=0.000000 Squarings=0
P-1 factoring doesn't make sense for this input.
If [depth]=47 it works fine with any [value]. Could anybody please explain to me why? Thanks!

Edit: Another question. Why does sbfactor 1.25.5 give a lower number of expected factors than sbfactor 1.2 (using the same parameters, of course)?

hc_grove
05-03-2004, 09:17 AM
Originally posted by Troodon
Edit: Another question. Why does sbfactor 1.25.5 give a lower number of expected factors than sbfactor 1.2 (using the same parameters, of course)?

Same n's too?

I haven't touched the bounds estimation code which is what calculates the expectation values.

Which set of parameters do you use?

I just tried your range (6110000-6111000) with 47 1.6 640 and got an expectation of 0.317969 from both 1.2 and 1.25.5.

Troodon
05-03-2004, 10:01 AM
Originally posted by hc_grove
Same n's too?

Which set of parameters do you use?


My bad!!! The results.txt files were not the same - sbf 1.2 had one a bit older!!! :blush:

Keroberts1
05-03-2004, 05:41 PM
Troodon, as sieving depth deepens, factoring becomes less efficient because the regions where factors are most likely to be found have already been searched. Hence, as sieving gets deeper, factoring will become less and less necessary. Eventually it won't make sense (at least to the bounds optimizer) to check any n values for factors because the sieve will get too deep. I believe that there is a problem with this, however, because many of the factors found from P-1 factoring would never have been found from regular sieving. Perhaps the optimal bounds selection could be improved. Is there a way to control the bounds so that they only search for factors larger than the sieve depth, rather than just using the sieve depth to pick optimal bounds?

hc_grove
05-03-2004, 06:27 PM
Originally posted by Keroberts1
Troodon, as sieving depth deepens, factoring becomes less efficient because the regions where factors are most likely to be found have already been searched. Hence, as sieving gets deeper, factoring will become less and less necessary. Eventually it won't make sense (at least to the bounds optimizer) to check any n values for factors because the sieve will get too deep. I believe that there is a problem with this, however, because many of the factors found from P-1 factoring would never have been found from regular sieving. Perhaps the optimal bounds selection could be improved. Is there a way to control the bounds so that they only search for factors larger than the sieve depth, rather than just using the sieve depth to pick optimal bounds?

No, the nature of the P-1 factoring algorithm makes it find any sufficiently smooth factor regardless of its size.

The n's also affect the optimal bounds, so while some parameters might not make sense with the current n's, they might make a lot of sense for larger n.

Keroberts1
05-03-2004, 10:34 PM
How does the sieve depth determine the optimal bounds? What do these bounds mean?

Paperboy
05-09-2004, 02:01 PM
I just started factoring last night because I was reading how a couple more people were needed to factor and I like to try new things.
Does this look like a good way to factor on my P4 3.2 GHz with 512 MB of RAM, or should I change 47 to 48 for the next range I reserve?

sbfactor.exe 6182000 6183000 47 1.5 256


This computer is just being used for distributed computing projects and to encode video files.

Frodo42
05-09-2004, 02:56 PM
Does this look like a good way to factor on my P4 3.2 GHz with 512 MB of RAM, or should I change 47 to 48 for the next range I reserve?

sbfactor.exe 6182000 6183000 47 1.5 256


This computer is just being used for distributed computing projects and to encode video files.

Looks fine to me. I changed from 47 to 48 a while ago, but 47 does still make sense since there are factors remaining in this range.
With a 3.2 GHz I would guess that range is done in less than two days.
If you find any factors, they will be placed in fact.txt and should then be submitted to http://www.seventeenorbust.com/sieve (remember to log in before submitting), and you should then be able to follow your scores on http://www.aooq73.dsl.pipex.com/ (use the scores link).

Good luck with factoring, your resources are needed if we are to P-1 factor ahead of prp.

Paperboy
05-09-2004, 07:08 PM
Looks like my last 2 k/n pairs took 48 and 47 minutes to complete and there are 42 k/n pairs in the range I specified so that under 2 days estimate looks about right.

I was wondering what these other files in my sbfactor dir are.
276536182025
554596182266
276536182241
etc

Frodo42
05-10-2004, 12:08 AM
I was wondering what these other files in my sbfactor dir are.
276536182025
554596182266
276536182241
etc
They are files created for each test. They let you pick up stage 1 again if you want to redo a test with higher bounds, or if you for some reason stop the client; it's a save, so that you don't have to start all over.
276536182025 is the test for 27653*2^6182025.

You can safely delete these files when you are done with your range, or else after some time of factoring they can begin to fill quite a big part of your HD.
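In case it helps, here is a tiny Python helper (my own, not part of sbfactor) that splits those save-file names back into k and n. It assumes the name really is just k followed by n, and that these are the eleven remaining k values (adjust the list if the dat file changes):

KNOWN_K = [4847, 10223, 19249, 21181, 22699, 24737, 27653, 28433, 33661, 55459, 67607]

def split_savefile(name):
    # Try each known k as a prefix; the rest of the name is n.
    for k in KNOWN_K:
        prefix = str(k)
        if name.startswith(prefix):
            return k, int(name[len(prefix):])
    raise ValueError("no known k matches " + name)

print(split_savefile("276536182025"))   # (27653, 6182025)
print(split_savefile("554596182266"))   # presumably (55459, 6182266)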

Paperboy
05-10-2004, 01:04 AM
Originally posted by Frodo42
They are files created for each test. They let you pick up stage 1 again if you want to redo a test with higher bounds, or if you for some reason stop the client; it's a save, so that you don't have to start all over.
276536182025 is the test for 27653*2^6182025

You can safely delete these files when you are done with your range, or else after some time of factoring they can begin to fill quite a big part of your HD.


Thanks for the reply. Your explanation was nice and easy for me to understand. I am not going to try to understand what exactly I am doing besides eliminating k/n pairs so they don't have to be PRP tested. I'd have no idea how to figure out that 276536182025 is the test for 27653*2^6182025 even though I worked on it.

Troodon
05-21-2004, 04:21 PM
Originally posted by jjjjL
That is bad. As I pointed out in the other thread, I did test several other numbers with no problems but not many test points were available (ie I had to sieve most of my own using your program ;)).

hc_grove is hinting that he may have fixed this but I don't really know what would have been wrong.

Cheers,
Louie

Just got


33661*2^6195480+1 stage 2 complete. 18930 transforms. Time: 6570 seconds
Starting stage 2 GCD - please be patient.
P-1 found a factor in stage #2, B1=15000, B2=120000.
5 | 33661*2^6195480+1
Total factoring Time: 110 minutes
:confused: Using v 1.25.5.

hc_grove
05-22-2004, 05:59 AM
Originally posted by Troodon
Just got


33661*2^6195480+1 stage 2 complete. 18930 transforms. Time: 6570 seconds
Starting stage 2 GCD - please be patient.
P-1 found a factor in stage #2, B1=15000, B2=120000.
5 | 33661*2^6195480+1
Total factoring Time: 110 minutes

:confused: Using v 1.25.5.

I've never said that I solved that problem, but as I've moved a little forward in my understanding of the code in the last couple of days, I guess I could take a look. Did you use manual or optimal bounds, and in the latter case, what parameters?

Troodon
05-22-2004, 08:20 AM
The input was sbfactor 6195200 6196000 and the parameters were optimal bounds, 1.3, 48, 325 Mb. Anyway, I'll try to refactor it when I have some free time to see if it's reproducible.

hc_grove
05-22-2004, 03:40 PM
Originally posted by Troodon
The input was sbfactor 6195200 6196000 and the parameters were optimal bounds, 1.3, 48, 325 Mb. Anyway, I'll try to refactor it when I have some free time to see if it's reproducible.

I can't reproduce it either with those settings (which give me different bounds from those reported) or with the reported bounds; that makes it impossible to debug, sorry.