These tests are very quick to run and for the most part have a very low error rate. In fact, these tests don't need to be redone at all: it is very unlikely that we missed a prime there, and the goal of our project is not to find the smallest prime but to find a prime for each K. In most cases it would be easier to test larger numbers and hope to find a prime there. Eventually, when the main effort gets far enough ahead of the double check, it will once again be worthwhile to do the double check. However, for the range between 300000 and 1000000 we have not yet found a single test that was reported wrong. That leads me to believe it would not be worth the effort to retest them when there appears to be almost no chance of finding a prime there.

Besides, DC sieving is much less efficient than regular sieving. Eliminating a single test in regular sieving is like eliminating almost a hundred of the 500000-range tests. To me it just doesn't make sense to spend the resources to sieve that range any further. I don't even think we should be sieving the 1 million to 5 million range, but because I've been told the speed gain from dropping it would be very small, I use the 1-20 million dat. Adding 300000 to 1 million to that would probably make almost no difference in speed, but it would also offer almost no benefit to the project.
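As a rough sketch of where a figure like "almost a hundred" could come from (this scaling is my assumption, not a measured number): if the time for one test grows roughly with the square of the exponent n, since there are about n iterations and each works on an n-bit number, then a candidate near n = 5,000,000 costs on the order of (5,000,000 / 500,000)^2 = 100 tests at n = 500,000.

```python
# Rough cost comparison between tests at two exponents.
# Assumption (not a measured figure): test time scales roughly as n^2,
# ignoring the extra log factor from FFT multiplication.

def relative_cost(n_big: int, n_small: int) -> float:
    """Estimated cost of one test at n_big, measured in tests at n_small."""
    return (n_big / n_small) ** 2

# One test near the current sieving range vs. one test in the DC range.
print(relative_cost(5_000_000, 500_000))  # -> 100.0
```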