I have just found
148603027109277394117 | 67607*2^4090091+1
with sbfactor but the submission script doesn't accept it (because p>2^64 ?). What should I do?
Ah, I remember. Ok. The linux version of SBFactor is in the same zip file as the windows version. The "sbfactor" and "run.sh" files should do the trick.
btw. on another P4 machine, it _does_ run.
What a monster, congratulations!
SBFactor v1.1
http://www-personal.engin.umich.edu/...sbfactor11.zip
New Features:
-re-sorts tests by n instead of by k and then n
-prints k/n values before testing
-allows intelligent range splitting
-run.sh now uses the same factor value (1.5)
-newer results.txt file
the first two are self-explanatory. the re-sorting by n makes sense so you can more accurately guess where you are and break up ranges after starting them. the printing of k/n values was asked for by mklasson.
the range splitting is a new feature i added for my own purposes but i made it a documented feature since others may want it. when doing a range of n values, you can just add two values before the amount of ram to signify the computer # and the total number of computers splitting the range.
so to split a process like
./sbfactor 4200000 4205000 45 1.5 256
across 3 computers, you'd use the commands
./sbfactor 4200000 4205000 45 1.5 1 3 256
./sbfactor 4200000 4205000 45 1.5 2 3 256
./sbfactor 4200000 4205000 45 1.5 3 3 256
the reason i say the splitting is "intelligent" is because it doesn't just break the range into 3 even chunks of width range/3. instead, it starts each computer at a different offset in the list of all values to test and steps by the total number of computers. this means that not only is the range processed from lowest to highest across all computers but each computer will get as close to an even amount of work as possible even if the values aren't evenly distributed by n.
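In other words, computer i of m takes every m-th test from the list sorted by n. A small Python sketch of the idea (not sbfactor's actual code; the test list below is made up for illustration, though the k values are real):

```python
# Sketch of the "intelligent" range split: computer i of m (1-based)
# takes every m-th candidate from the list sorted by n, instead of a
# contiguous chunk of width range/m.
def split_candidates(candidates, computer, total):
    """candidates: (k, n) pairs sorted by n; returns this computer's share."""
    return candidates[computer - 1 :: total]

# hypothetical test list, sorted by n
tests = [(67607, 4200011), (10223, 4200035), (21181, 4200044),
         (4847, 4200063), (19249, 4200071), (22699, 4200087),
         (55459, 4200095)]
for i in (1, 2, 3):
    print(i, split_candidates(tests, i, 3))
```

Each machine starts at a different offset and steps by the total, so all of them work the range from lowest n upward and end up with shares that differ by at most one test, even when the n values cluster unevenly.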
right now, I'm using the splitting feature to run sbfactor on multiple computers remotely with my little awk script i created:
hostinfo -ALL -LINUX | awk '/0.00/ {i=1;while(match(substr($1,i++,1),"[a-z0-9]"));tmp = "ssh -f " substr($1,1,i-2) " ./sbfactor 4041200 4045000 45 1.5 " 1+comps++ " xXx 256" tmp} END {gsub(/xXx/, comps "\n", tmp);print tmp}' | bash
i don't think many people can use that but hopefully the new version helps those who are doing a lot of factoring. if you're happy with v1.0, there's no real need to upgrade. the next version will have manual bound setting for ranges and will use improved P-1 code.
-Louie
Last edited by jjjjL; 07-02-2003 at 05:42 AM.
Very nice, Louie.
Just to make sure: the mod operation does use something clever like Gallot's idea, right? I'm assuming it does, but better safe than sorry.
Oh, and I found a really, really smooth factor last night:
40315798264717 | 21181*2^4031084+1
2^2*3^3*13*211*367*601*617+1
If only we could use B1=601,B2=617 all the time...
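A quick way to see just how smooth it is (a throwaway Python check, nothing to do with sbfactor itself):

```python
def factorize(n):
    """Trial division; fine for a 14-digit p-1."""
    fs, d = [], 2
    while d * d <= n:
        while n % d == 0:
            fs.append(d)
            n //= d
        d += 1
    if n > 1:
        fs.append(n)
    return fs

p = 40315798264717
fs = factorize(p - 1)
print(fs)       # [2, 2, 3, 3, 3, 13, 211, 367, 601, 617]
print(max(fs))  # 617: one prime in (B1, B2], everything else within B1=601
```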
Hi Mike and Louie,
The factors I submitted without logging in have not been credited yet. Any ETA on this?
thanks
Question:
If a factor is found by sieving, is the number still checked by P-1 factoring if the factor was found after the dat file for P-1 was made?
I ask because i just found
40112001560353 | 10223*2^4322789+1
to be tested very soon
Another question for you Louie:
don't factors > 2^64 get included in results.txt?
In the new scoring I'll pick up the P-1 co-ordination thread so that I can identify any factors that were submitted without being logged in. This has always worked well for sieving, and looking at the typical size of the P-1 factors, there shouldn't be confusion. Sorry all, I haven't had a lot of time recently to look at the scoring, but should have some free time soon.
Shortly after I reserved 4055000 4060000 I noticed that the main effort has already caught up to that number.
I'm guessing there's not much use to do that range now, or will the server stop a test while in progress (if partial blocks are reported)?
As far as I can see, it's only the 67607 range that has a correct "max n tested" value at the moment. The others have had values above 4 million several tests before I got my first test above 4 million. I'm not sure how many tests there are below 5 million, but I guess that there are a few (1100 above 4 million are pending at the moment).
My very uneducated guess is that most tests out there are still below 4010000-4020000.
I've been watching the progress for some days now and the "real" assigned max usually goes up by 8-10000 a day. With that in mind I think a safe guess is 4020000.
So do your range
I want to help with P-1 factoring, but I'm a bit lost. I've downloaded the last version of the factorer, and my computer has 128 MB of physical RAM. After choosing a range, what parameters should I enter?
I'd suggest something like
sbfactor #start #end 45 1.5 128
If you don't mind the lag, you might even try 256 at the end.
45 is sieve depth, (2^45 ~= 35T )
1.5 is the factor value, i.e. how many PRP tests a found factor is worth to you. Most people use 1.5 afaik.
128 is memory used (in the B2 stage I think).
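As a quick check of that 2^45 figure (plain Python arithmetic, just for reference):

```python
# sieving to depth 45 means factors below 2^45 have already been ruled out
print(2 ** 45)  # 35184372088832, roughly 35.2T
```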
I'll try, thanks! Does it save the progress?
It's good to see that the effects of the P-1 factoring are finally making their mark in the stats.
Over the last two weeks the 4M<n<5M band has averaged 15 candidates removed per day. Compare that with the 10-13 per day of the other 1M bands, and we see that P-1 is generating a useful extra 2-3 factors per day in the most useful area.
Since this is an active thread, IMHO it should be "stickied" along with the P-1 coordination thread.
Threads are now sticky. I hesitated to make it myself, since Louie unstuck the double checking coordination a while back.
Maybe we could even split the forum into two parts.
My computer is doing a range. What sequence should I follow to power it off without data loss? Thanks!
No real need to do anything special, it saves the progress every few minutes.
If you want to minimize the work lost, wait until the xxxxxxxxxx file updates and then ctrl-c.
And when I want to restart it? Should I enter all parameters again? Thanks!
Yes.
But you could create a bat file and place it in the startup folder. I don't do this myself, but it's mentioned in this thread one or two pages back iirc.
I've entered sbfactor 4154000 4300000 45 1.5 128 and it has factored up to 27653*2^4154145+1. Now I want to change some parameters: the end of the range (to 4155000), the factor value (2.0) and memory (256). Could I enter sbfactor 4154146 4155000 45 2.0 256 or will there be any problem? Thanks!
Troodon,
Yes it should be fine. However, since you are changing the factor value parameter to 2.0 the bounds will increase and some work will need to be redone for the numbers you have already done. It may be a better idea to stick with 1.5 for this range.
That shouldn't be a problem, as he raised start_n as well. It might be better to use start_n = 4154145 though -- some other k could be untested for that n. With that start_n some additional work would be done for 27653*2^4154145+1.
Originally posted by garo:
Troodon,
Yes it should be fine. However, since you are changing the factor value parameter to 2.0 the bounds will increase and some work will need to be redone for the numbers you have already done.
is the number of factors estimated in the coordination thread a dependable number? there definitely seem to be more getting found than the estimates say. is P-1 more effective than we previously thought?
the estimated # of factors assumes the number has been completely pre-sieved to 2^45. this would cause some underestimating since it's not taking into account the probability of finding factors in the unsieved ranges below 2^45.
is the difference between estimates and actual statistically significant?
-Louie
only if you're trying to figure out the value of P-1 factoring and weighing where you should devote your resources
which brings up the issue of a resource allocation model. I know this has been tried before, but could P-1 be introduced? just a thought. i doubt it would have much of an impact, but i was wondering how the resource allocation affects the timeline for the discovery of the last prime, since that is truly the only discovery that makes this a complete success. I'm referring to the original resource allocation model created earlier, but if a new, more accurate one could be constructed that would be great too. I don't expect anyone to build a completely new model though, because obviously that would be a lot of work.
garo, mklasson
I've decided to redo the whole range, thanks!
When you still have the old files, it takes less time to compute the new bounds.
Yeah! My first factor! 1918126591518655393 | 4847*2^4155831+1 where 1918126591518655393=2^5*3*7*137*191*541*7013*28751+1. I'll submit it to the database tomorrow.
Is anyone keeping the intermediate files? They seem useful only if someone wishes to later try a larger B1 or B2.
Greg
As it seems like we're factoring faster than Proth testing right now (at least it seems that way for me), maybe we should set the bound to a higher value?
Can someone approximate what bound factor would be best?
btw. version 1.1 did work on the P4 system that crashed with the old version.
If you have 128MB of physical memory, then you shouldn't tell the program to use more than 96MB or so (assumes OS and various programs need 32MB). If you tell the program to use too much memory, you will thrash in stage 2 and run very slowly.
Originally posted by Troodon:
I want to help with P-1 factoring, but I'm a bit lost. I've downloaded the last version of the factorer, and my computer has 128 MB of physical RAM. After choosing a range, what parameters should I enter?
i have compiled a couple new copies of SBFactor.
one uses the newer GIMPS code and should be faster. However, tests seem to indicate that it is slower (??). i'll post it in a zip file for brave souls to try out. i wouldn't use it for regular testing until it's a bit more polished, but if you want to speed test it you can have it. i also didn't strip all the normal GIMPS functionality out of it so it reads the standard ini files that the Prime95 client uses to set its memory (Day and Night) along with other settings. it also writes to factors.txt and it writes messages to the log even when no primes are found, which i know someone wanted. anyway, it's called ecm.exe in the zip; use local.ini to set memlevel usage (not on the command line anymore).
the other new version (sbfactor.exe) has a special feature that lets you overwrite the internal sieve data. this lets you do quirky things like quickly check that factors will be found by only using the factors you know you need. for instance, all i do is write a file called 'badsieve.txt' and put this in it:
2
7
23
29
53
181
2699
6473
then i run sbfactor with the command
sbfactor 55459 4132918 7000 8000 256
SBFactor v1.1dev
P-1 and ECM factoring for number of the form k*2^n+1.
Adapted from GIMPS v23.4 by George Woltman and Louis Helm
AMD Athlon(tm) processor detected.
256MB of memory avilable for stage 2
Reading false sieve data from badsieve.txt
P-1 on 55459*2^4132918+1 with B1=7000, B2=8000
initializing test
sieve finished
55459*2^4132918+1 stage 1 complete. 170 transforms. Time: 32 seconds
Starting stage 1 GCD - please be patient.
1565008878285119 | 55459*2^4132918+1
Total factoring Time: 3 minutes
And that was while I was running two other background processes so assuming you know all the factors of P-1, you can get a factor a minute.
in reality this is probably only useful for testing and doing absurd things like "finding" 40-digit factors in under a minute.
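A quick sanity check on the badsieve run above: the primes written to badsieve.txt are exactly the factors of p-1 for the factor that was found, so their product plus one is the factor itself (plain Python, just verifying the transcript):

```python
# the primes from the badsieve.txt example above
badsieve = [2, 7, 23, 29, 53, 181, 2699, 6473]

prod = 1
for p in badsieve:
    prod *= p
# p-1 is fully covered by the supplied primes, so P-1 finds p immediately
print(prod + 1)  # 1565008878285119, the factor reported for 55459*2^4132918+1
```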
the assumption with the B1/B2 levels is that the lower the prime, the more likely it is to be a factor of P-1. that's why you'd never want to add primes to the factorization in any order but from lowest to highest. that's also why B2 starts at the place you leave off with B1. i agree that this is normally a safe assumption, but if you know better, feel free to prove us inside-the-box thinkers wrong.
if someone found a way to prove that certain factors could not be part of P-1 or that certain factor sets were mutually exclusive, it may be possible to improve efficiency. btw, the sbfactor.exe is still "old code" unlike ecm.exe which is "new code". use sbfactor.exe to test out the badsieve features and ecm.exe to test out the new code's speed, extra ini features (look at GIMPS ini files for ideas), and the new bound setter.
feel free to post whatever you discover using these but at the same time, caveat emptor. i know for a fact these aren't well documented, easy to use, or feature rich. don't bother requesting features, pointing out shortcomings, etc for these versions... use v1.1 if you want something stable. these dev versions are for testing only. anyway, i'm going to be out of town for a week and couldn't update them even if i wanted to. good luck!
-Louie
http://www-personal.engin.umich.edu/...or11-15dev.zip
Hi
I am using sbfactor for prothsearch. It is really marvelous.
Many thanks to everyone involved in its development.
But I have encountered a small problem with B1.
For detecting the prime
27711198115583 = 2 * 47^2 *229 * 619 *44249 + 1
which is a factor of 15*2^270458+1,
B1=619 must be enough.
But in practice it does not work with any limit less than 2251.
Regards,
-Payam
Last edited by Samidoost; 07-20-2003 at 10:30 AM.
Contrary to my previous post
which was essentially a bug report,
here is some bonus:
B1=10000
B2=100000
539336741813170023987503 | 15*2^4000+1
p-1 = 2 * 7 * 17 * 17599 * 42406229 * 3036444899
B1=100000
B2=1000000
1259443246573 | 15*2^2035+1
p-1 = 2^2 * 3^2 * 7 * 29 * 53 * 3251653
B1=100000
B2=10000000
163718110671719644337719 | 15*2^374+1
p-1 = 2 * 3 * 191 * 142860480516334768183
Note that the last factor(s!) of p-1 are greater than B2.
An explanation for both bug and bonus is needed.
- Payam
Last edited by Samidoost; 07-21-2003 at 07:55 AM.
Samidoost:
539336741813170023987503 = 836228323141 * 644963494883
836228323141 = 2^2*3*5*7*23*8443*10253+1
644963494883 = 2*7*1097*1811*23189+1
so both factors are found at the same time with your bounds.
With that said, factors of p-1 that are bigger than the bounds can apparently sometimes be found when using "Brent-Suyama's extension" (which I don't really know anything about...). That's probably what happened in your second case. Your third factor is again composite, but with one of the p-1 having a bigger factor.
EDIT: oh, btw, regarding your earlier post about the B1 bound needing to be unnecessarily high: I figure it has to do with the fact that your p-1 has two 47 factors. You'll need a B1 bound >= 47^2 = 2209 then.
Mikael
Last edited by mklasson; 07-21-2003 at 08:26 AM.
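Both effects are easy to reproduce with a bare-bones two-stage P-1 in Python (a sketch of the method only, not sbfactor's code, and with a plain stage 2, so no Brent-Suyama bonus factors). Running it with Payam's bounds B1=10000, B2=100000 on 15*2^4000+1 turns up a factor divisible by both 836228323141 and 644963494883 at once, since each needs just one stage-2 prime (10253 and 23189 respectively) on top of stage 1:

```python
from math import gcd

def primes_up_to(n):
    """Sieve of Eratosthenes returning all primes <= n."""
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for m in range(p * p, n + 1, p):
                is_prime[m] = False
    return [p for p in range(2, n + 1) if is_prime[p]]

def pminus1(N, B1, B2, base=3):
    """Bare-bones two-stage P-1 (no Brent-Suyama extension)."""
    # stage 1: raise base to every prime power <= B1, lowest primes first
    x = base
    for p in primes_up_to(B1):
        pk = p
        while pk * p <= B1:
            pk *= p
        x = pow(x, pk, N)
    g = gcd(x - 1, N)
    if 1 < g < N:
        return g
    # stage 2: allow one extra prime factor q with B1 < q <= B2
    acc = 1
    for q in primes_up_to(B2):
        if q > B1:
            acc = acc * (pow(x, q, N) - 1) % N
    return gcd(acc, N)

N = 15 * 2 ** 4000 + 1
f = pminus1(N, 10000, 100000)
print(f % 836228323141 == 0 and f % 644963494883 == 0)  # True
```

Because both primes become "visible" at the same gcd, the sketch returns their product rather than either prime alone, which is exactly the phenomenon mklasson describes.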
I can't tell anything about the bug, but the bonus is probably due to the Brent-Suyama extension, aka Suyama's powers.
Originally posted by Samidoost:
An explanation for both bug and bonus is needed.
There is a post from Alexander Kruppa about this on the Mersenne mailing list.
Sander
mklasson, smh
Thank you for your useful comments.
You have almost fixed both of the problems.
There are still primes 2213, 2221, 2237, 2239, 2243, 2251 between 47^2 and my practical B1 limit 2251.
Originally posted by mklasson:
regarding your earlier post about the B1 bound needing to be unnecessarily high: I figure it has to do with the fact that your p-1 has two 47 factors. You'll need a B1 bound >= 47^2 = 2209 then.
It appears that B1=48*47 is a better choice.
- Payam
Just compare B1=48*47 with k=4847