Is it me or did we lose a bunch of factorers?
(EDIT: removed reserved ranges from post. /ceselb)
Last edited by ceselb; 03-16-2004 at 03:25 PM.
Originally posted by dmbrubac:
Is it me or did we lose a bunch of factorers?

Well, I'm currently using my P4 for Riesel PRPing. I'll likely switch back when my range there is complete, which should still take approx. two more weeks...
I just found another of these blasted factors ...
3 | 55459*2^5769706+1
I had hoped this bug was fixed in the new version of the factorer, but sadly that does not seem to be the case ...
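(Side note for anyone who wants to sanity-check a line like that before submitting: a genuine factor f of k*2^n+1 must satisfy k*2^n+1 ≡ 0 (mod f), which is cheap to test with modular exponentiation. A small illustrative Python sketch, independent of the factorer; the helper name is made up:)

# Check whether a reported factor f really divides k*2^n + 1.
def is_real_factor(f, k, n):
    return (k * pow(2, n, f) + 1) % f == 0

# The report above: "3 | 55459*2^5769706+1"
print(is_real_factor(3, 55459, 5769706))   # False -- the reported factor is bogus
# A contrived true case for comparison: 3 really does divide 55459*2^5769707+1
print(is_real_factor(3, 55459, 5769707))   # True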
I would like to set a P4 on factoring, but the file won't run and I'm not sure what I need to do to get it running. What files do I need? Can I just download the factoring package from the link above, and will it run after that? What about updating the results.txt file? When I run it, the window just opens and then shuts before I can see what it says. Am I maybe missing something? Do I need to edit the .bat file, or perhaps something else?
I assume you are on Windows.
Open a command prompt and navigate to the directory with sobsieve. Better yet, use the "Command Window Here" PowerToy available from Microsoft.
Type run start end, then press Enter.
For instance, on one of my machines I've typed run 5880000 5882000.
As dmbrubac suggested, you should:
Click Start,
Click Run,
Click Browse, find run.bat, and click Open,
Add your nmin and nmax at the very end of the command line in the Run window,
Click OK.
I guess that would solve your problem.
But before that, if you want, you can also edit the run.bat file. In the original package it should look like this:
@ECHO OFF
IF !%2==! GOTO BADFORMAT
sbfactor.exe %1 %2 47 1.5 256
:BADFORMAT
Things to think about changing:
47: As only about 30% of the 2^47 to 2^48 range is left to sieve, I think you might consider changing this to 48. The effect will be lower B1 and B2 bounds chosen by the program, and thus a faster finish time per test. If you change it to 48, it will not look for possible factors below 2^48, so you will end up with fewer factors per test; but since more than 70% of the range between 2^47 and 2^48 is already sieved, that will not be much of a problem in terms of the project.
1.5: I guess anything between 1.2 and 1.5 would be fair for the moment.
256: I think you should lower this to something below 200 if your machine has 256 MB of RAM. If it has more than that, there is no need to change it.
PS: Do not forget to update your results.txt file from the link above. This might save you from running one or more tests on k/n pairs for which a factor has already been found in the days since the package was compiled.
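For anyone who would rather not fiddle with the batch file, here is a small illustrative wrapper (just a sketch; the script name runfactor.py is made up, and it assumes sbfactor.exe sits in the same directory and takes the five arguments in the order used above: nmin, nmax, sieve depth, factor value, memory in MB):

import subprocess
import sys

# Defaults matching the run.bat above; tweak them as discussed in this post.
DEPTH = 47     # assumed sieve depth (consider 48 once the 2^47-2^48 range is done)
VALUE = 1.5    # factor-value weighting (anything from 1.2 to 1.5 seems fair)
MEM_MB = 200   # stage 2 memory; keep it below your physical RAM

def run_range(nmin, nmax, depth=DEPTH, value=VALUE, mem=MEM_MB):
    """Invoke sbfactor.exe on one n range, the same way run.bat would."""
    cmd = ["sbfactor.exe", str(nmin), str(nmax), str(depth), str(value), str(mem)]
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    if len(sys.argv) != 3:
        print("usage: python runfactor.py nmin nmax")
        sys.exit(1)
    run_range(int(sys.argv[1]), int(sys.argv[2]))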
Can the factorer be modified so that stage one and stage two can be done separately? Perhaps on different machines, so that a machine with a large amount of memory can concentrate on only doing the second stage, and machines that don't have a lot of memory can perform the first stage and still be productive once sieving becomes a little less favorable than it is now. I've noticed myself that the benefit of sieving has decreased and will continue to do so. Once we reach about 500T it will be pretty pointless to continue, except for the fact that machines without internet connections and with low memory would otherwise be useless to the project.
Originally posted by Keroberts1:
Can the factorer be modified so that stage one and stage two can be done separately?

Yes, it can. As I see it, the main problem is that the current format for the save files doesn't support 'stage 1 done' as a state.

Originally posted by Keroberts1:
Perhaps on different machines, so that a machine with a large amount of memory can concentrate on only doing the second stage, and machines that don't have a lot of memory can perform the first stage.

There's been a little talk about that in the 'SOURCE CODE RELEASED FOR FACTORER' thread (which hasn't been active in the last 60 days, so you might need to change that setting to see it in the thread listing). We can even split stage 2 among several machines.

(Well, if I'm going to do it, the main problem is that the code is terrible to work with: it's C with way too many global variables and .c files including other .c files. The fact that I got practically no comments on the changes I made doesn't help my wish to work on it.)
Last edited by hc_grove; 04-10-2004 at 05:48 PM.
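To make the stage split a little more concrete, here is an illustrative Python sketch of textbook P-1 (not the sbfactor code, and with none of its optimizations; all names are made up). The point is that stage 1 boils down to a single residue, and that residue plus N is everything stage 2 needs - which is exactly why a 'stage 1 done' state in the save file would let the two stages run on different machines:

from math import gcd

def primes_up_to(limit):
    """Simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i*i::i] = bytearray(len(sieve[i*i::i]))
    return [i for i, flag in enumerate(sieve) if flag]

def p1_stage1(N, B1, base=3):
    """Stage 1: raise base to every prime power <= B1, working mod N.
    Returns (gcd, residue); the residue is the only state stage 2 needs."""
    a = base
    for p in primes_up_to(B1):
        pk = p
        while pk * p <= B1:      # largest power of p not exceeding B1
            pk *= p
        a = pow(a, pk, N)
    return gcd(a - 1, N), a

def p1_stage2(N, residue, B1, B2):
    """Simple stage 2 continuation: for each prime q in (B1, B2], multiply
    (residue^q - 1) into an accumulator, then take one gcd at the end."""
    g = 1
    for q in primes_up_to(B2):
        if q > B1:
            g = g * (pow(residue, q, N) - 1) % N
    return gcd(g, N)

# Tiny demo: N = 21211 * 503, where 21211 - 1 = 2*3*5*7*101 and 503 - 1 = 2*251.
N = 21211 * 503
f, residue = p1_stage1(N, B1=10)
print("stage 1:", f)                                     # 1 (the prime 101 is beyond B1)
print("stage 2:", p1_stage2(N, residue, B1=10, B2=200))  # 21211, picked up at q = 101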
Well, I have some experience programming in C but no knowledge of P-1 factoring. If I can get a copy of the source, maybe some time when I'm bored I can spend a few hours poring over it and try to neaten it up a little. I can't promise I won't muck things up, though. Exactly how many files are there, and of what size?
How much work are we talking about? Hours, days, weeks, not in my lifetime?
Originally posted by Keroberts1:
Well, I have some experience programming in C but no knowledge of P-1 factoring.

I don't think much is needed. You don't have to modify the function doing the factoring.

Originally posted by Keroberts1:
If I can get a copy of the source

There should be plenty of links in the source code thread, but otherwise the newest version I've uploaded is here.

Originally posted by Keroberts1:
Maybe some time when I'm bored I can spend a few hours poring over it and try to neaten it up a little. I can't promise I won't muck things up, though. Exactly how many files are there, and of what size?

There are quite a few files in that zip, but only 5-7 are relevant, each containing a few hundred lines of code. (AFAIR)

Originally posted by Keroberts1:
How much work are we talking about? Hours, days, weeks, not in my lifetime?

That probably depends on your ambitions. I guess a lot could be done in a couple of hours.
Well, that is encouraging. This weekend I'm of course busy with the holidays, and next week I will be working nearly 60 hours, so I can't hope to get much done, but next weekend I should have a couple of days of peace and quiet, so I'll probably sit down and work on it then. Unless, of course, anyone familiar with the code would like to tackle it before then.
Input: sbfactor 6110000 6111000 [depth] [value] [mem]
If [depth]=48 and [value]<1.2, it gives this error (with both sbfactor 1.2 and 1.25.5):
B1=0 B2=0 Success=0.000000 Squarings=0
P-1 factoring doesn't make sense for this input.
If [depth]=47, it works fine with any [value]. Could anybody please explain why? Thanks!
Edit: Another question. Why does sbfactor 1.25.5 give a lower number of expected factors than sbfactor 1.2 (using the same parameters, of course)?
Last edited by Troodon; 05-03-2004 at 08:56 AM.
Originally posted by Troodon:
Edit: Another question. Why does sbfactor 1.25.5 give a lower number of expected factors than sbfactor 1.2 (using the same parameters, of course)?

Same n's too?
I haven't touched the bounds estimation code which is what calculates the expectation values.
Which set of parameters do you use?
I just tried your range (6110000-6111000) with 47 1.6 640 and got an expectation of 0.317969 from both 1.2 and 1.25.5.
Last edited by hc_grove; 05-03-2004 at 09:35 AM.
Originally posted by hc_grove:
Same n's too? Which set of parameters do you use?

My bad!!! The results.txt files were not the same - sbf 1.2 had one a bit older!!!
Last edited by Troodon; 05-03-2004 at 10:40 AM.
Troodon, as the sieving depth deepens, factoring becomes less efficient because the regions where factors are most likely to be found have already been searched. Hence, as sieving gets deeper, factoring will become less and less necessary. Eventually it won't make sense (at least to the bounds optimizer) to check any n values for factors, because the sieve will get too deep. I believe there is a problem with this, however, because many of the factors found by P-1 factoring would never have been found by regular sieving. Perhaps the optimal bounds selection could be improved. Is there a way to control the bounds so that they only target factors larger than the sieve depth, rather than just using the sieve depth to pick optimal bounds?
Originally posted by Keroberts1:
Troodon, as the sieving depth deepens, factoring becomes less efficient because the regions where factors are most likely to be found have already been searched. Hence, as sieving gets deeper, factoring will become less and less necessary. Eventually it won't make sense (at least to the bounds optimizer) to check any n values for factors, because the sieve will get too deep. I believe there is a problem with this, however, because many of the factors found by P-1 factoring would never have been found by regular sieving. Perhaps the optimal bounds selection could be improved. Is there a way to control the bounds so that they only target factors larger than the sieve depth, rather than just using the sieve depth to pick optimal bounds?

No, the nature of the P-1 factoring algorithm makes it find any sufficiently smooth factor regardless of its size.
The n's also affect the optimal bounds, so while some parameters might not make sense with current n's they might make a lot of sense for larger n.
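To illustrate that point (an aside, not taken from the factorer's code): P-1 rests on Fermat's little theorem. If p divides N and p-1 divides the exponent E built from the bounds, then 3^E ≡ 1 (mod p), so gcd(3^E - 1, N) exposes p no matter how large p is - only the smoothness of p-1 matters, not the size of p. A quick Python check with a 20-digit prime whose p-1 is very smooth:

from math import gcd

p = 2**64 - 2**32 + 1        # a 20-digit prime; p - 1 = 2^32 * 3 * 5 * 17 * 257 * 65537
q = 2**61 - 1                # cofactor: a Mersenne prime whose q - 1 is not smooth
N = p * q

E = 2**32 * 3 * 5 * 17 * 257 * 65537   # any multiple of p - 1 will do
print(gcd(pow(3, E, N) - 1, N) == p)   # True: p is found, although p is far above 2^48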
How does the sieve depth determine the optimal bounds? What do these bounds mean?
I just started factoring last night because I was reading how a couple more people were needed to factor and I like to try new things.
Does this look like a good way to factor on my P4 3.2 GHz with 512 MB of RAM, or should I change 47 to 48 for the next range I reserve?
sbfactor.exe 6182000 6183000 47 1.5 256
This computer is just being used for distributed computing projects and to encode video files.
Does this look like a good way to factor on my P4 3.2 GHz with 512 MB of RAM, or should I change 47 to 48 for the next range I reserve?
sbfactor.exe 6182000 6183000 47 1.5 256
This computer is just being used for distributed computing projects and to encode video files.

Looks fine to me. I changed from 47 to 48 a while ago, but 47 does still make sense, since there are factors remaining in that range.
With a 3.2 GHz I would guess that range is done in less than two days.
If you find any factors, they will be placed in fact.txt and should then be submitted to http://www.seventeenorbust.com/sieve (remember to log in before submitting). You should then be able to follow your scores at http://www.aooq73.dsl.pipex.com/ (use the scores link).
Good luck with the factoring; your resources are needed if we are to P-1 factor ahead of PRP.
Looks like my last two k/n pairs took 48 and 47 minutes to complete, and there are 42 k/n pairs in the range I specified, so that under-two-days estimate looks about right.
I was wondering what these other files in my sbfactor dir are.
276536182025
554596182266
276536182241
etc
I was wondering what these other files in my sbfactor dir are.
276536182025
554596182266
276536182241
etc

They are files created for each test. They let you pick up stage 1 again if you want to redo a test with higher bounds, and if you stop the client for some reason they act as a save, so that you don't have to start all over.
276536182025 is the test for 27653*2^6182025
You can safely delete these files when you are done with your range; otherwise, after some time of factoring they can begin to fill up quite a big part of your HD.
Originally posted by Frodo42:
They are files created for each test. They let you pick up stage 1 again if you want to redo a test with higher bounds, and if you stop the client for some reason they act as a save, so that you don't have to start all over.
276536182025 is the test for 27653*2^6182025
You can safely delete these files when you are done with your range; otherwise, after some time of factoring they can begin to fill up quite a big part of your HD.
Thanks for the reply. Your explanation was nice and easy for me to understand. I am not even going to start to understand what exactly I am doing, besides eliminating k/n pairs so they don't have to be PRP tested. I'd have no idea how to figure out that 276536182025 is the test for 27653*2^6182025, even though I worked on it.
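For what it's worth, those file names are just the k and the n glued together, so they can be decoded mechanically. A small illustrative Python sketch (the list below only contains the k values that happen to appear in this thread, not the project's full set, and the function name is made up):

# Decode a save-file name like "276536182025" into (k, n).
# The name is simply str(k) followed by str(n), so try each known k as a prefix.
KNOWN_K = [27653, 33661, 55459]   # only the k's seen in this thread

def decode_savefile(name):
    for k in KNOWN_K:
        prefix = str(k)
        if name.startswith(prefix):
            return k, int(name[len(prefix):])
    return None

for fname in ["276536182025", "554596182266", "276536182241"]:
    k, n = decode_savefile(fname)
    print(fname, "->", "%d*2^%d+1" % (k, n))   # e.g. 276536182025 -> 27653*2^6182025+1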
Originally posted by jjjjL:
That is bad. As I pointed out in the other thread, I did test several other numbers with no problems, but not many test points were available (i.e. I had to sieve most of my own using your program).
hc_grove is hinting that he may have fixed this, but I don't really know what would have been wrong.
Cheers,
Louie

Just got this, using v 1.25.5:
33661*2^6195480+1 stage 2 complete. 18930 transforms. Time: 6570 seconds
Starting stage 2 GCD - please be patient.
P-1 found a factor in stage #2, B1=15000, B2=120000.
5 | 33661*2^6195480+1
Total factoring Time: 110 minutes
Originally posted by Troodon:
Just got this, using v 1.25.5:
33661*2^6195480+1 stage 2 complete. 18930 transforms. Time: 6570 seconds
Starting stage 2 GCD - please be patient.
P-1 found a factor in stage #2, B1=15000, B2=120000.
5 | 33661*2^6195480+1
Total factoring Time: 110 minutes

I've never said that I solved that problem, but as I've moved a little forward in my understanding of the code in the last couple of days, I guess I could take a look. Did you use manual or optimal bounds, and in the latter case, what parameters?
The input was sbfactor 6195200 6196000 and the parameters were optimal bounds, 1.3, 48, 325 MB. Anyway, I'll try factoring it again when I have some free time, to see if it's reproducible.
Originally posted by Troodon:
The input was sbfactor 6195200 6196000 and the parameters were optimal bounds, 1.3, 48, 325 MB. Anyway, I'll try factoring it again when I have some free time, to see if it's reproducible.

I can't reproduce it, neither with those settings (which give me different bounds from the ones reported) nor with the reported bounds. That makes it impossible to debug, sorry.