
Thread: Sieve coordination discussion

  1. #1
This is horrible. These were found by 4:13 pm yesterday but not submitted until now, so I'm a few hours late.

    450034195420763 | 33661*2^7544760+1
    450077622239711 | 33661*2^12431160+1
    450092349108361 | 27653*2^5865873+1
    450093208218179 | 4847*2^19462671+1
    450521306741677 | 67607*2^9058811+1
    450557372899177 | 55459*2^6366298+1

  2. #2
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    I'm in the same situation

Just started sieving (newbie) and I have three files:
    fact
    factexcl
    factrange

Not totally 100% sure what they are, etc.

But fact has a k/n pair with n=6.45M. I don't think I'll finish the range before we reach 6.45M. Can we simply submit factors before the range is done???

If we can pre-submit, which file contains the ones we need to submit, etc.?

  3. #3
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    Originally posted by vjs
    I'm in the same situation

Just started sieving (newbie) and I have three files:
    fact
    factexcl
    factrange

Not totally 100% sure what they are, etc.

But fact has a k/n pair with n=6.45M. I don't think I'll finish the range before we reach 6.45M. Can we simply submit factors before the range is done???

If we can pre-submit, which file contains the ones we need to submit, etc.?
    You can submit them as you go, I do. The fact.txt file is the one to submit. Just keep track of which ones you have submitted, so that eventually you submit them all.
    Joe O
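
A minimal sketch of one way to do that bookkeeping, assuming you keep a plain-text log of everything already pasted into the submission page (submitted.txt is just a made-up name here); it prints only the fact.txt lines not yet in the log:
Code:
# Sketch: list fact.txt lines that are not yet in a local submission log.
# "submitted.txt" is a hypothetical log you append to after each successful submission.
import os

def unsubmitted(fact_path="fact.txt", log_path="submitted.txt"):
    submitted = set()
    if os.path.exists(log_path):
        with open(log_path) as log:
            submitted = {line.strip() for line in log if line.strip()}
    with open(fact_path) as fact:
        return [line.strip() for line in fact
                if line.strip() and line.strip() not in submitted]

if __name__ == "__main__":
    for line in unsubmitted():
        print(line)  # paste these into the submission page, then append them to submitted.txt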

  4. #4
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
Joe, I did as you said, but I also submitted factexcl.

    The database said 22 new factors verified and accepted????

    562901010463619 | 24737*2^1514407+1
    562902758879399 | 10223*2^3798137+1
    562902807708023 | 67607*2^1627955+1
    562902827483363 | 33661*2^11136072+1
    562904037724211 | 19249*2^16016558+1
    562904473884997 | 33661*2^5351616+1
    562905183512027 | 10223*2^15699641+1
    562908456832559 | 28433*2^7558201+1
    562911958989661 | 19249*2^17539502+1
    562914314479537 | 4847*2^18026559+1
    562914516462629 | 28433*2^4348081+1
    562919384671031 | 22699*2^16826446+1
    562919391957341 | 33661*2^8934024+1
    562922843517563 | 10223*2^15113021+1
    562923979286951 | 33661*2^14977296+1
    562924049930783 | 21181*2^18544796+1
    562926713563607 | 55459*2^4042534+1
    562928171203031 | 10223*2^9530921+1
    562932700877003 | 67607*2^3545627+1
    562933349297351 | 55459*2^10552894+1
    562934652822127 | 55459*2^8704054+1
    562936448049823 | 55459*2^7839202+1

    along with

    562906036826687 | 28433*2^2890417+1

    from fact

  5. #5
    Senior Member
    Join Date
    Dec 2002
    Location
    Madrid, Spain
    Posts
    132
· fact.txt - Here go the new factors.
· factexcl.txt - Duplicate factors (there is already another factor for that k/n pair, but at another p).
· factrange.txt - Here go the factors found outside the n range (1M < n < 20M).

- You should submit ALL the factors from fact.txt.
- From factrange.txt you should submit those with n < 1M, as they can be very useful for the double-checking accounts. As currently only factors with n < 20M are accepted, you don't have to search for n < 1M yourself; you can select all the factors and submit them. Or you can use this program by Mystwalker (or see the sketch after this list).
- There isn't any need to submit the factors from factexcl.txt.
- And the most important thing: you don't have to wait until you finish the range you picked up. Even more, every time a factor with n within the current PRP active windows (see the stats), or close to them, is found, you should submit it ASAP.
- http://www.seventeenorbust.com/sieve/ is the page for submitting factors. You just have to copy them and press the submit button. Don't forget to be logged in!
- If you're on Windows, you can also use SoBistrator by mklasson to monitor your sieve (range, rate, factors, etc.) and automatically submit the found factors.
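
A minimal sketch of the factrange.txt split described above, assuming its lines use the same p | k*2^n+1 format shown elsewhere in this thread; it prints only the entries with n < 1M, which are the ones worth submitting for double-checking:
Code:
# Sketch: print the factrange.txt entries with n < 1,000,000 (the double-check range).
# Assumes lines look like "450034195420763 | 33661*2^7544760+1".
import re

LINE = re.compile(r"^\s*(\d+)\s*\|\s*(\d+)\*2\^(\d+)\+1\s*$")

with open("factrange.txt") as f:
    for line in f:
        m = LINE.match(line)
        if m and int(m.group(3)) < 1_000_000:
            print(line.rstrip())  # submit these; the n > 20M entries can be ignored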
    Last edited by Troodon; 06-14-2004 at 01:37 PM.

  6. #6
    Moderator Joe O's Avatar
    Join Date
    Jul 2002
    Location
    West Milford, NJ
    Posts
    643
    "(there is already another factor for that k/n pair but at another p)" This is exactly correct.

    But "· factexcl.txt - Duplicate factors ."
    This is not quite right. These are not the same as MikeH's duplicate factors. They could be, but they are also most likely to be what MikeH calls excluded factors. It depends on whether the originally found factor was or was not below 1G "Excluded factors (those factors not present after sieving 100<n<20M to p=1G) "

    "The database said 22 new factors verified and accepted????" Factors are accepted, as long as they are within the stated bounds, and have not been submitted before. The sieve submission script only considers them to be "new" as long as that P K N triple has not already been submitted. So your 22 factors were accepted. But when you look at , your stats on MikeH's page you will only see the unique factors. Later on, when MikeH manually adds your reservation range, you will see the counts for duplicate and excluded points.
    Joe O

  7. #7
    Unholy Undead Death's Avatar
    Join Date
    Sep 2003
    Location
    Kyiv, Ukraine
    Posts
    907
    Blog Entries
    1
    Originally posted by Ken_g6[TA]
    I'm a newbie to sieving, and I'm running it on a really slow and rarely used laptop, so I don't really want to reserve much.

    470001-470002 KenG6 (ETA: mid July)
Ken, what's your crunchin' speed???
    wbr, Me. Dead J. Dona \


  8. #8
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Originally posted by Joe O
    "The database said 22 new factors verified and accepted????" Factors are accepted, as long as they are within the stated bounds, and have not been submitted before. The sieve submission script only considers them to be "new" as long as that P K N triple has not already been submitted. So your 22 factors were accepted.
Not exactly, I guess. As far as I know, the submission page first checks if the submitted line p | k*2^n+1 contains an actual factor or not (i.e. if p really divides k*2^n+1). If so, it is verified. If not, it's not verified.

    If that specific p | k*2^n+1 was previously submitted to the database, then it is not a new factor.

    Regardless of being duplicate or not, if the factor is verified and it is submitted for the first time, then it is a new factor.

    p | k*2^n+1 becomes a duplicate if there was another p submitted, which divides that specific k*2^n+1.

    There might be different types of duplicates.

    If a factor for the specific k*2^n+1 was found before the sob.dat file you are using was created, then the client knows it's a duplicate (i.e. n is within 1m-20m limit, and the related k/n pair does not exist in the dat file). So, it is dumped into the factexcl.txt file. (there might be other cases of dumping into the factexcl.txt file, but I guess this is one of the dominant cases).

    If a factor for the specific k*2^n+1 was not found before (but found after) the sob.dat file you are using was created, then the client thinks it is a first time factor, although it is not. Such a factor also passes submission page without problem, but Mike's database catches it as duplicate, because it knows the other factor for that k/n pair, and when it was submitted.

Since the latest sob.dat was created after a deep sieving point, this can be considered a rare occurrence (roughly 1.2% of the factors the client thinks are unique are duplicates. This figure was as high as 10%-15% before the last update of sob.dat).
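
The verification step described above is easy to reproduce locally. A small sketch (not the actual submission script, just the same divisibility test), using modular exponentiation so even n in the millions is cheap:
Code:
# Sketch: check that p really divides k*2^n + 1, as the submission page reportedly does.
# pow(2, n, p) keeps the arithmetic modular, so huge n values cost almost nothing.
import re

def verifies(line):
    m = re.match(r"^\s*(\d+)\s*\|\s*(\d+)\*2\^(\d+)\+1\s*$", line)
    if not m:
        return False
    p, k, n = (int(g) for g in m.groups())
    return (k * pow(2, n, p) + 1) % p == 0

print(verifies("3 | 1*2^1+1"))  # toy check: 3 divides 1*2^1+1 = 3, so True
print(verifies("562906036826687 | 28433*2^2890417+1"))  # the accepted fact.txt line from post #4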

  9. #9
    Moderator ceselb's Avatar
    Join Date
    Jun 2002
    Location
    Linkoping, Sweden
    Posts
    224
    Originally posted by Death
Ken, what's your crunchin' speed???
    Oops, deleted that post before I saw this. He's on a laptop that doesn't get used much.

  10. #10
    Senior Member engracio's Avatar
    Join Date
    Jun 2004
    Location
    Illinois
    Posts
    237
    ceselb,


Just wondering if somebody has already done the range 322500-323000. The reason I am asking is that the first four (4) factors I found were reported as duplicates. Am I just resieving this range? Thanks.


    e

  11. #11
    Originally posted by engracio
    ceselb,


Just wondering if somebody has already done the range 322500-323000. The reason I am asking is that the first four (4) factors I found were reported as duplicates. Am I just resieving this range? Thanks.


    e
    A duplicate isn't really a duplicate factor, but means that some other factor has been found for that k/n pair before. This happens fairly often.

  12. #12
    Senior Member engracio's Avatar
    Join Date
    Jun 2004
    Location
    Illinois
    Posts
    237
    larsivi,


    A duplicate isn't really a duplicate factor, but means that some other factor has been found for that k/n pair before. This happens fairly often.

    Coolest, Thank you sir. Off I go then.


    e

  13. #13
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
The chance of 4 duplicates in a row is really low (a rough estimate would be 1 in 4 million, assuming 2.26% of factors are currently duplicates). Still, it's a possibility, of course.
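
A quick check of that back-of-the-envelope number, assuming the 2.26% duplicate rate and independence between factors:
Code:
# Back-of-the-envelope check of the "four duplicates in a row" estimate.
rate = 0.0226          # assumed current share of duplicate factors
print(1 / rate**4)     # ~3.8 million, i.e. roughly "1 in 4 million"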

Another (more likely) possibility is that, perhaps for some reason, you submitted them twice (e.g. reloaded the page thinking it did not work) and did not notice the first set of submission results.

    Also, there are other possibilities as well.

    It is possible to determine if the factors were duplicate or not (or if they were submitted by somebody else before).

    You'll find Mike's pages very helpful.

    If you want, we can help you sort out what the actual case was. Just post your four factors below.
    Last edited by Nuri; 06-28-2004 at 09:30 PM.

  14. #14
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    A lot more factors are in factexcl.txt than in fact.txt - it is very likely that 4 excluded factors occur before a needed one...

  15. #15
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    322500878487647 55459 9934462 7192 30 e
    322501126469329 22699 18540982 7192 30 d1
    322501193034653 10223 8420801 7192 30 e
    322501438517519 10223 16277129 7192 30 e


    So, you submitted the factors under factexcl.txt as well, right?

    Then, it's normal.

  16. #16
    Senior Member engracio's Avatar
    Join Date
    Jun 2004
    Location
    Illinois
    Posts
    237
    Nuri,


    Thanks for staying on top of this.



    "322500878487647 55459 9934462 7192 30 e
    322501126469329 22699 18540982 7192 30 d1
    322501193034653 10223 8420801 7192 30 e
    322501438517519 10223 16277129 7192 30 e


    So, you submitted the factors under factexcl.txt as well, right?

    Then, it's normal."


The more I read the forum and get factors in the ranges I have reserved, the more I understand how it is supposed to work. Unfortunately, yesterday and this morning I submitted (and had accepted) more factors from factexcl.txt. 3 factors from fact.txt were submitted also.


Now, I am kind of confused about which results to send. I know fact.txt must be sent ASAP, but what about the other two files, factexcl.txt and factrange.txt?

    If somebody can point me to the right area, I would be most thankful.


    e

  17. #17
    Senior Member
    Join Date
    Dec 2002
    Location
    Madrid, Spain
    Posts
    132
    Originally posted by engracio
Now, I am kind of confused about which results to send. I know fact.txt must be sent ASAP, but what about the other two files, factexcl.txt and factrange.txt?

    If somebody can point me to the right area, I would be most thankful.
    Please see my previous post in this thread.

  18. #18
    I love 67607
    Join Date
    Dec 2002
    Location
    Istanbul
    Posts
    752
    Short story:

    You can forget about factrange.txt and factexcl.txt. Their contribution is infinitesimal (if not none).


    Long story:

    Think of fact.txt as the actual product, and the other two as some by-products.

The factors we are aiming to find are stored in fact.txt. At current sieve levels, and with the sob.dat we're using, 97.6%* of the factors that get written to fact.txt will be unique (first factors of a k/n pair, where k = one of the 11 left, and 1m<n<20m).

    * Calculation of 97.6%:
    Currently, there are 547,950 k/n pairs without a factor, for 1m<n<20m and k = one of the 11 left (this 547,950 figure decreases at a speed of ~55 k/n pairs per day. See Mike's project stats page).

    At the time our new sob.dat was created, there were 561,291 of those.
    So, when the client finds a new factor, and if it divides one of the 561,291 k/n pairs, then the client assumes this is a first time factor for that k/n pair. However, we've already found factors for some of those pairs since the sob.dat was created. Assuming random distribution of factors among k/n pairs, 97.6% (=547,950/561,291) of the factors written in fact.txt today will be the first for their k/n pairs, and the remaining will be duplicates.
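
The same calculation in a couple of lines, with a rough projection of how the share drifts as ~55 pairs per day keep getting factored (both counts taken from the figures quoted above):
Code:
# Check of the 97.6% figure from the two counts quoted above.
remaining_now    = 547_950   # unfactored k/n pairs today (1M < n < 20M, 11 k's)
remaining_at_dat = 561_291   # unfactored pairs when the current sob.dat was built

print(remaining_now / remaining_at_dat)              # ~0.976 -> 97.6% of fact.txt entries are first factors
print((remaining_now - 55 * 30) / remaining_at_dat)  # rough share a month from now, at ~55 pairs/day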


factexcl.txt stores factors that the client already knows are duplicates (i.e. a factor was already found for their k/n pair when the sob.dat we're currently using was prepared). If you consider that there would have been many more k/n pairs for the 11 k's if there had been no sieving at all, it's easy to see why many more factors get written to factexcl.txt. In fact, the factexcl.txt file on my PC currently has 42 times more factors than fact.txt. Unfortunately, they're useless as far as the project's goal is concerned (i.e. that specific k/n pair was already taken out of the pool).


    factrange.txt is another story. It stores the factors which the client catches by chance, for n<1m and for n>20m.

    The factors where n>20m have little (or no) value for the project. There are two reasons: Firstly, when PRP effort becomes closer to 20m, we'll have to start sieving for n>20m from the beginning anyway (so, it will not save any sieving effort). And secondly, it's highly likely that most of those k/n pairs will be eliminated at the very beginning of the new sieve effort anyway.

The factors where n<1m might still have some value. The PRP double check (second-pass) effort is currently at n=820,000. So, if your lucky factor has an n value between 820,000 and 1,000,000, and if it is not a duplicate, then it will save a second-pass PRP test. Roughly speaking, there are 5,500 tests left below n=1m. Please also note that a PRP test of this size takes only a few hours on an average PC. And a last note: this is a narrowing window (as second-pass approaches 1m), and at the current speed it will reach n=1m within two months or so. So, two conclusions: this is a time-limited opportunity, and the chance of a factor in factrange.txt hitting the desired range gets smaller and smaller as second-pass proceeds.

Apart from the benefit I mentioned above, the only other benefit that comes to my mind is: if a relatively less active user reserves a range and does not show up at the forum for a long time, we'll know more precisely what portion (to what extent) of his range he sieved, if he left some footprints via the excluded and duplicate factors he submitted (remember, they are much more frequent). Thus, the decision to resieve his range or not (or to what extent) would be easier.


    Wrap up

There would be no harm in dumping the contents of factrange.txt and factexcl.txt (except that doing so inflates the results.txt file and costs some bandwidth for Mike and the P-1 factorers).

    Still, the contribution of doing so is not very significant. So, act as you wish in that respect.

  19. #19
    Senior Member engracio's Avatar
    Join Date
    Jun 2004
    Location
    Illinois
    Posts
    237
    Troodon


    Please see my previous post in this thread.
It's not that I did not see your post; there just seems to be a lot of conflicting info. As usual, I'm confused.


    Nuri


There would be no harm in dumping the contents of factrange.txt and factexcl.txt (except that doing so inflates the results.txt file and costs some bandwidth for Mike and the P-1 factorers).
    Thanks for the long and short. I think I will just upload the fact.txt unless requested otherwise.



    e

  20. #20
    Unholy Undead Death's Avatar
    Join Date
    Sep 2003
    Location
    Kyiv, Ukraine
    Posts
    907
    Blog Entries
    1

    up to you

    well, that's your decision, but I prefer to upload all three files.

    more information is better than lack of information.
    wbr, Me. Dead J. Dona \


  21. #21
    Senior Member engracio's Avatar
    Join Date
    Jun 2004
    Location
    Illinois
    Posts
    237
    I think I will just upload the fact.txt unless requested otherwise.

Thanks, Death. Like I said, unless requested by the powers that be, I won't. Unlike you, I believe pertinent information is better than the whole enchilada. I'm barely able to eat a burrito, let alone the whole enchilada.


    e

  22. #22
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
I always upload all of the files, simply b/c it shows Mike that a range has been completed... I think. If, for example, you were to sieve 500G and not find a factor in fact.txt - highly unlikely, but possible - the other submitted factors would still let Mike see the progress of the range.

  23. #23
    Senior Member engracio's Avatar
    Join Date
    Jun 2004
    Location
    Illinois
    Posts
    237
SOLD!!!!! For 5 cents. Since Mike is not saying anything (thinking it does not matter to him either way, better safe than sorry), I will upload the other two files as I complete the range. The fact.txt factors are uploaded as soon as they are found. Thanks for the info, Death, guess you were correct.



    e

  24. #24
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    engracio,

The other thing that is important to check for is those low out-of-range values.

Anything >830K right now is beneficial; however, if it's less than 1M it goes into the out-of-range file. Hopefully within the next 5 months this won't be the case, since supersecret should get to 1M by then.

Anyways, I guess I feel as though I'm pretty much alone in supersecret with only 40 results per day. I wish I could send those scores to my home team ARS, but regardless, it's being done nonetheless.

Perhaps everyone could submit their out-of-range results, especially those >830K; it would probably knock out at least a day or two worth of tests.

Remember, even if you have already reported the range as finished, you can still submit the out-of-range results.

  25. #25
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    Originally posted by vjs
Anything >830K right now is beneficial; however, if it's less than 1M it goes into the out-of-range file. Hopefully within the next 5 months this won't be the case, since supersecret should get to 1M by then.
    I'd like to see supersecret go up faster. This way, one could raise the lower bounds of the sieving range, gaining a small performance surplus...

Anyways, I guess I feel as though I'm pretty much alone in supersecret with only 40 results per day.
I did some tests from time to time, but at Riesel it does not take much longer to do a first-time test (currently @ n=1M). With >50 k's currently under investigation by B2, it's far more likely to find a prime there...

Perhaps everyone could submit their out-of-range results, especially those >830K; it would probably knock out at least a day or two worth of tests.

Remember, even if you have already reported the range as finished, you can still submit the out-of-range results.
I've rewritten my LowNFinder to take a parameter as the lower bound (and optionally another one for the upper bound; Riesel has 600K as the lower bound for sieving).
    Just try e.g.
    Code:
    LowNFinder.bat 830
to get all factors for 830K < n < 1M - very useful for big factrange files...
If there's interest, I can add the option to enter the bounds directly in the program instead of by parameters only (better when you use a window manager instead of a command shell). Well, maybe I'll do this anyway, it's just a matter of prioritization then...

  26. #26
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
I don't have Java installed on my machine; I just don't like it very much. I had a very early version installed, which was a mess. I'm sure it's improved by now.

Do you have a write-up on what your program does, exactly?

Can you simply enter a particular k, and then it searches for all of the lowest n between 830K and 1M, for example????

    If so I may try it out.

    I also have a little batch program that will allow you to run your own k/n's.
    Great for doing supersecret work while getting credit to your account.

    I'd like to see supersecret get up to 1m as well, there are something like 3600 k/n pairs between 900k and 1M.

    It would take about 25 good machines about 2 weeks to do all of these values.

The other (best) option would be to have those who manage the queues create 3-5 new accounts which assign credit to the top 3 or 5 teams.

    Say

    TeamPrimeRib-Secret
    Anandtech-Secret
    TeamRetro-Secret
    ExtremeDC-Secret
    DPC-Secret

    They could also do the same for the garbage account.

If someone were to do this, I could probably organize a gauntlet to get all n<1M done. If we have enough competition between teams here, we could have some type of inter-team race to 1M...

  27. #27
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
My program simply filters out every factor (the whole line, of course) of factrange.txt whose n is in the specified range.
    E.g. LowNFinder 830 will give you all lines with 830K < n < 1M - regardless of the k.

    It's just (pseudocode)
    Code:
    BEGIN
       Foreach line {
          if (lowerBound < n < higherBound) then {
             output line
             write line to new file
          }
       }
    END
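
For anyone without Java installed (as vjs mentions above), here is a hedged Python equivalent of the pseudocode, not Mystwalker's actual LowNFinder, assuming factrange.txt lines in the p | k*2^n+1 format and bounds given in thousands like the .bat example:
Code:
# Sketch: same filter as the pseudocode above, in Python (not Mystwalker's tool).
# Usage: python lown_filter.py 830 1000  ->  keep factors with 830K < n < 1000K
import re
import sys

lower_k = int(sys.argv[1]) if len(sys.argv) > 1 else 830    # lower bound, in thousands
upper_k = int(sys.argv[2]) if len(sys.argv) > 2 else 1000   # optional upper bound, in thousands

LINE = re.compile(r"\|\s*(\d+)\*2\^(\d+)\+1")

with open("factrange.txt") as src, open("lown.txt", "w") as out:
    for line in src:
        m = LINE.search(line)
        if m and lower_k * 1000 < int(m.group(2)) < upper_k * 1000:
            print(line.rstrip())   # "output line"
            out.write(line)        # "write line to new file"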
    Last edited by Mystwalker; 07-08-2004 at 07:31 AM.

  28. #28
    Sieve it, baby!
    Join Date
    Nov 2002
    Location
    Potsdam, Germany
    Posts
    959
    Just got an idea concerning the factors out of range:

    As proth_sieve has no SoB.dat for them, it cannot distinguish between factors for already known tests and for tests unknown so far.
    Thus, it is quite likely a factor outside the range will be an excluded one.
    Nevertheless, some are not - and those are worth a few hours of PRPing.

  29. #29
    Well, not sure if it's unique yet, but I just got:

    317864024961919 | 28433*2^969049+1


    Yay!

    Edit: Just realized that I found one like that before, only I know that it's unique.

    269.690T 21181 975668
    Last edited by royanee; 07-13-2004 at 11:18 PM.

  30. #30
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
I'd suggest you submit it anyways; if it is a unique factor you'll get something like 5 points for it. Not much, but hey, it will save an hour of computing time on one machine for 5 seconds' worth of cut and paste.

  31. #31
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
July 14th doesn't get any closer than your estimate of mid-July.

    Good work

    Ever think about sieving with those celerons???

  32. #32
    It's actually a few hundred points if it's unique. I did submit it, so...

    *checks the scores page* looks like it was a duplicate. Ah well. I got 481 points for the other one.

  33. #33
    Unholy Undead Death's Avatar
    Join Date
    Sep 2003
    Location
    Kyiv, Ukraine
    Posts
    907
    Blog Entries
    1
    Originally posted by vjs
July 14th doesn't get any closer than your estimate of mid-July.

    Good work

    Ever think about sieving with those celerons???
    huh =) I speed it up using some old pentiums like 300 MHz

and I can sieve only with some service starter, not running as usual. Don't want to scare users =)))
    wbr, Me. Dead J. Dona \


  34. #34
    Senior Member
    Join Date
    Dec 2002
    Location
    Madrid, Spain
    Posts
    132
    Originally posted by Death
and I can sieve only with some service starter, not running as usual. Don't want to scare users =)))
    Have you tried using runh.exe? It's a program that just hides the window.

  35. #35
    Moderator ceselb's Avatar
    Join Date
    Jun 2002
    Location
    Linkoping, Sweden
    Posts
    224
    Originally posted by vjs
    Please feel free to condense for space
    Another: 460000-470000 VJS [Late: Sept]

  36. #36
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
That's fine as long as Mike can keep them as 1T ranges on his page; it helps me keep track of the factors.

    http://www.aooq73.dsl.pipex.com/ui/5202.htm

Originally posted by ceselb
Originally posted by vjs
Please feel free to condense for space
Another: 460000-470000 VJS [Late: Sept]

  37. #37
    Moderator ceselb's Avatar
    Join Date
    Jun 2002
    Location
    Linkoping, Sweden
    Posts
    224
    Originally posted by vjs
    That's fine as long as mike can keep them as 1T ranges
I *think* they'll stay; if not, give Mike or me a shout.
    Impressive speed, btw. What are you running?

  38. #38
    Moderator vjs's Avatar
    Join Date
    Apr 2004
    Location
    ARS DC forum
    Posts
    1,331
    Originally posted by ceselb
    I *think* they'll stay, if not give mike or me a shout
    Impressive speed, btw. What are you running?
I posted a bunch of my scores in the benchmark thread; it's basically just a bunch of computers, but 3 are extremely fast at sieving.

    3 x 2500 Mhz bartons
    2 x 1800xp
    2 x 866 p3's
    1 x 800 p3
    2 x 533 Celeron
    1x 1000 mhz athlon

Totals somewhere around 3500 kps; hopefully I can hold on to all these machines for a while.

  39. #39
    Moderator ceselb's Avatar
    Join Date
    Jun 2002
    Location
    Linkoping, Sweden
    Posts
    224
    Originally posted by vjs
I was looking at the gap ranges last night and was wondering: is there any reason to resieve or check the larger gaps for missed factors once all of the low-n sieve is done???

    http://www.aooq73.dsl.pipex.com/gaps...n3_20_p01u.htm
    MikeH is checking some suspicious holes himself. You'll need to talk to him if you want to do something.

  40. #40
    Senior Member
    Join Date
    Jan 2003
    Location
    UK
    Posts
    479
    MikeH is checking some suspicious holes himself. You'll need to talk to him if you want to do something.
    I'm not checking anything right now, all the current gaps look acceptable to me. Nuri was the last one to organise a hole searching effort, but that is complete and I think we're all happy again now.
    Last edited by ceselb; 09-06-2004 at 12:14 PM.

