How long will it really take for us to reach 1 million in the double check? If the chance of a missed prime is only one in a hundred, then we should only carry the double check to the point where the likelihood of finding a prime per amount of time devoted is equal for both efforts. I'm not sure exactly where that point is, but finding it would require considering the likelihood of finding a prime at the current first-test level, the likelihood of finding a prime at the double-check level, the odds of a prime having been misreported (i.e., missed by the first test) at the double-check level, and the difference in time required to perform a test at each level. I don't have any of these values, but if someone who does would like to take some time and run the numbers, I would be very interested in seeing the results. If the double check doesn't look like it will ever reach very far, then perhaps we should just trim the sieve range to 5,000,000 and up. Of course, I'd want to see the numbers before making up my mind about which side of the issue I'm on. Perhaps we'll find that the double-check effort really needs to be expanded greatly to reach optimal depths.
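Just to make the comparison concrete, here is a rough sketch of the break-even calculation. Every number in it (the prime odds at each level, the miss rate, the test times) is a made-up placeholder, not a measured project value, so treat it only as a template for whoever has the real figures:

```python
# Rough break-even sketch: first tests vs. double checks.
# All numeric values below are placeholder assumptions, not measured data.

p_prime_first  = 1 / 150_000   # assumed chance a leading-edge candidate is prime
p_prime_double = 1 / 60_000    # assumed chance a double-check candidate is prime
                               # (smaller candidates are somewhat more likely to be prime)
p_missed       = 1 / 100       # assumed chance the original test missed a prime
t_first        = 10.0          # assumed hours per test at the leading edge
t_double       = 2.0           # assumed hours per test at the double-check level

# Expected newly discovered primes per hour of work on each effort.
rate_first  = p_prime_first / t_first
rate_double = (p_prime_double * p_missed) / t_double

print(f"first tests : {rate_first:.3e} expected primes/hour")
print(f"double check: {rate_double:.3e} expected primes/hour")

if rate_double >= rate_first:
    print("Under these assumptions, double checking is still worth the time.")
else:
    print("Under these assumptions, first tests are the better use of time.")
```

The interesting question is which way that inequality points once real values are plugged in, and how it shifts as the double-check front climbs toward 1,000,000.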