Shouldn't it be within that B1-100*B1 range or something, since it will only work if exactly one factor is between B1 and B2?
Yes, but all the "factors" can be less than B1 as well, i.e. found in stage 1. Alternatively, everything except one large factor can be below B1, and that one factor only has to be below B2, so in that respect the bigger the B2 the better. It's just a time trade-off for when to start another curve.
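The success condition described above (every prime factor of the curve's group order at most B1, except for at most one prime in the (B1, B2] window) can be sketched as a toy check. This is only an illustration: the function name is my own, and it ignores prime powers (real stage 1 requires B1-powersmoothness) and everything else a real ECM client does.

```python
def prime_factors(n):
    """Trial-division factorization; fine for toy-sized numbers."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def ecm_stage_that_finds(order, B1, B2):
    """Which ECM stage would 'catch' a curve with this group order
    (simplified: prime powers ignored):
      stage1 - every prime factor <= B1
      stage2 - exactly one prime factor in (B1, B2], rest <= B1
      miss   - anything else; start another curve
    """
    big = [p for p in prime_factors(order) if p > B1]
    if not big:
        return "stage1"
    if len(big) == 1 and big[0] <= B2:
        return "stage2"
    return "miss"
```

For example, with B1 = 50 and B2 = 100, an order of 2*2*3*97 is missed by stage 1 but caught by stage 2, since 97 is the single factor in the (B1, B2] window.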


The choice of B2 = 100*B1 was based upon the probability of finding a factor that was B1-smooth except for one factor within B1 < factor < B2, weighed against the time spent doing stage 1 and stage 2.

What this means is that at one point in time, with some client, B1 and B2 = 100*B1 was the optimal choice.

Now with ecm6.0, stage 2 is much faster than it used to be. So in order to spend the same time doing stage 2 as people did in the past, we simply extend the B2 bound.

At least this is the way I think of it.

Perhaps it might be time to look at this again once the prime95 client gets to a final release. BTW, 24.12 version 3 is out.

I guess I shouldn't, because they do such a great job with this client. It's also stress-tested to the extreme IMHO, debugged like crazy, and optimized to the point of making a drag racer envious. Perhaps MS should take a little notice of the way things should be done?